Sample records for lakes tier-2 computing

  1. The Legnaro-Padova distributed Tier-2: challenges and results

    NASA Astrophysics Data System (ADS)

    Badoer, Simone; Biasotto, Massimo; Costa, Fulvia; Crescente, Alberto; Fantinel, Sergio; Ferrari, Roberto; Gulmini, Michele; Maron, Gaetano; Michelotto, Michele; Sgaravatto, Massimo; Toniolo, Nicola

    2014-06-01

    The Legnaro-Padova Tier-2 is a computing facility serving the ALICE and CMS LHC experiments. It also supports other High Energy Physics experiments and virtual organizations of other disciplines, which can opportunistically harness idle resources when available. The unique characteristic of this Tier-2 is its topology: the computational resources are spread across two sites about 15 km apart, the INFN Legnaro National Laboratories and the INFN Padova unit, connected through a 10 Gbps network link (soon to be upgraded to 20 Gbps). Nevertheless, these resources are seamlessly integrated and are exposed as a single computing facility. Despite this intrinsic complexity, the Legnaro-Padova Tier-2 ranks among the best Grid sites in terms of reliability and availability. The Tier-2 comprises about 190 worker nodes, providing about 26000 HS06 in total. These computing nodes are managed by the LSF local resource management system and are accessible through a Grid-based interface implemented via multiple CREAM CE front-ends. dCache, xrootd and Lustre are the storage systems in use at the Tier-2: about 1.5 PB of disk space is available to users in total, through multiple access protocols. A 10 Gbps network link, planned to be doubled in the coming months, connects the Tier-2 to the WAN. This link carries LHC Open Network Environment (LHCONE) traffic as well as other general-purpose traffic. In this paper we discuss the experience gained at the Legnaro-Padova Tier-2: the problems that had to be addressed, the lessons learned, and the implementation choices. We also present the tools used for daily management operations. These include DOCET, a Java-based web tool designed, implemented and maintained at the Legnaro-Padova Tier-2, and also deployed at other sites, such as the Italian LHC Tier-1. DOCET provides a uniform interface for managing all the information about the physical resources of a computing center. It is also used as a documentation repository available to the Tier-2 operations team. Finally, we discuss the foreseen developments of the existing infrastructure, in particular the evolution from a Grid-based resource towards a Cloud-based computing facility.
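
    The headline capacity quoted above is just the sum of per-node HEP-SPEC06 scores across the two sites, exposed as one number. A minimal sketch of such an inventory roll-up (node counts and scores here are illustrative, and this is not DOCET's actual data model):

```python
from dataclasses import dataclass

@dataclass
class WorkerNode:
    site: str     # "LNL" (Legnaro) or "PD" (Padova) -- illustrative labels
    hs06: float   # HEP-SPEC06 benchmark score of the whole node

def facility_capacity(nodes):
    """Roll per-node benchmark scores up into one facility-wide figure,
    mirroring how the two sites are exposed as a single Tier-2."""
    by_site = {}
    for n in nodes:
        by_site[n.site] = by_site.get(n.site, 0.0) + n.hs06
    return sum(by_site.values()), by_site

# illustrative inventory: 100 Legnaro nodes + 90 Padova nodes
nodes = [WorkerNode("LNL", 150.0) for _ in range(100)] + \
        [WorkerNode("PD", 130.0) for _ in range(90)]
total, by_site = facility_capacity(nodes)
```

    Either site's contribution can still be inspected separately, which is what a site-spanning inventory tool needs for daily operations.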

  2. Large scale commissioning and operational experience with tier-2 to tier-2 data transfer links in CMS

    NASA Astrophysics Data System (ADS)

    Letts, J.; Magini, N.

    2011-12-01

    Tier-2 to Tier-2 data transfers have been identified as a necessary extension of the CMS computing model. The Debugging Data Transfers (DDT) Task Force in CMS was charged with commissioning Tier-2 to Tier-2 PhEDEx transfer links beginning in late 2009, originally to serve the needs of physics analysis groups for the transfer of their results between the storage elements of the Tier-2 sites associated with the groups. PhEDEx is the data transfer middleware of the CMS experiment. For analysis jobs using CRAB, the CMS Remote Analysis Builder, the challenges of remote stage out of job output at the end of analysis jobs led to the introduction of a local fallback stage out, and will eventually require the asynchronous transfer of user data over essentially all of the Tier-2 to Tier-2 network using the same PhEDEx infrastructure. In addition, direct file sharing of physics and Monte Carlo simulated data between Tier-2 sites can relieve the operational load of the Tier-1 sites in the original CMS Computing Model, and already represents an important component of CMS PhEDEx data transfer volume. The experience, challenges and methods used to debug and commission the thousands of data transfer links between CMS Tier-2 sites world-wide are explained and summarized. The resulting operational experience with Tier-2 to Tier-2 transfers is also presented.
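
    Commissioning a link of this kind typically reduces to checking that test transfers sustain some minimum rate with an acceptable success fraction. A sketch of such a pass/fail decision, with thresholds that are illustrative rather than the DDT task force's actual criteria:

```python
def link_commissioned(samples, min_rate_mbs=20.0, min_success=0.8):
    """Pass/fail decision for one transfer link.
    samples: (bytes_moved, seconds, succeeded) per test transfer.
    Thresholds are illustrative, not the actual DDT criteria."""
    if not samples:
        return False
    ok = [s for s in samples if s[2]]
    if len(ok) / len(samples) < min_success:
        return False
    total_bytes = sum(b for b, _, _ in ok)
    total_secs = sum(t for _, t, _ in ok)
    return total_secs > 0 and total_bytes / total_secs / 1e6 >= min_rate_mbs

healthy = [(90e9, 3600, True)] * 5                   # ~25 MB/s, all succeed
flaky = [(90e9, 3600, True)] + [(0, 60, False)] * 4  # only 20% success
```

    With thousands of site pairs, a mechanical rule like this is what makes it feasible to track which links are usable and which still need debugging.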

  3. 26 CFR 31.3221-2 - Rates and computation of employer tax.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...-2 Rates and computation of employer tax. (a) Rates—(1)(i) Tier 1 tax. The Tier 1 employer tax rate... disability insurance, and section 3111(b), relating to hospital insurance. The Tier 1 employer tax rate is... Federal Insurance Contributions Act. (ii) Example. The rule in paragraph (a)(1)(i) of this section is...

  4. 26 CFR 31.3201-2 - Rates and computation of employee tax.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...-2 Rates and computation of employee tax. (a) Rates—(1)(i) Tier 1 tax. The Tier 1 employee tax rate... disability insurance, and section 3101(b), relating to hospital insurance. The Tier 1 employee tax rate is... Federal Insurance Contributions Act. (ii) Example. The rule in paragraph (a)(1)(i) of this section is...

  5. From the CMS Computing Experience in the WLCG STEP'09 Challenge to the First Data Taking of the LHC Era

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Gutsche, O.

    The Worldwide LHC Computing Grid (WLCG) project decided in March 2009 to perform scale tests of parts of its overall Grid infrastructure before the start of the LHC data taking. The "Scale Test for the Experiment Program" (STEP'09) was performed mainly in June 2009, with further selected tests in September and October 2009, and emphasized the simultaneous testing of the computing systems of all four LHC experiments. CMS tested its Tier-0 tape writing and processing capabilities. The Tier-1 tape systems were stress tested using the complete range of Tier-1 work-flows: transfer from Tier-0 and custody of data on tape, processing and subsequent archival, redistribution of datasets amongst all Tier-1 sites as well as burst transfers of datasets to Tier-2 sites. The Tier-2 analysis capacity was tested using bulk analysis job submissions to backfill normal user activity. In this talk, we report on the tests performed and present their post-mortem analysis.

  6. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    NASA Astrophysics Data System (ADS)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.
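
    The dynamic-allocation idea can be sketched as a simple sizing rule: compare the queued-job backlog with the slots the running VMs provide, and request the difference from the cloud. The function below is a toy version of that logic (slot counts and the cap are invented; the real site used custom launch scripts and a Torque extension):

```python
import math

def vms_to_launch(queued_jobs, running_vms, slots_per_vm=8, max_vms=50):
    """How many extra VM instances to request from the cloud so queued
    jobs fit in the dynamic batch cluster. Parameters are illustrative,
    not the site's actual configuration."""
    needed = math.ceil(queued_jobs / slots_per_vm)  # VMs the backlog requires
    extra = needed - running_vms                    # shortfall vs. running VMs
    headroom = max_vms - running_vms                # tenancy quota remaining
    return max(0, min(extra, headroom))
```

    The same rule run with a shrinking queue naturally returns zero, at which point idle instances can be retired and their cores handed back to the shared cloud.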

  7. 10. INTERIOR OF OUTLET TOWER LOOKING DOWN TO TIER #1 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. INTERIOR OF OUTLET TOWER LOOKING DOWN TO TIER #1 OF SLIDE GATES. STRUCTURE HAS LEVELS ENABLING OPERATORS TO CHOOSE LEVEL WITH BEST QUALITY WATER. OVERHANGING DEVICE THAT LOOKS LIKE A LIGHT STANDARD IS ACTUALLY A METER FOR MEASURING WATER LEVELS. - Lake Mathews, East of Route 15, Riverside, Riverside County, CA

  8. CMS results in the Combined Computing Readiness Challenge CCRC'08

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Bauerdick, L.; CMS Collaboration

    2009-12-01

    During February and May 2008, CMS participated in the Combined Computing Readiness Challenge (CCRC'08) together with all other LHC experiments. The purpose of this worldwide exercise was to check the readiness of the computing infrastructure for LHC data taking. Another set of major CMS tests, the Computing, Software and Analysis challenge (CSA'08), as well as CMS cosmic runs, ran at the same time: CCRC augmented the load on computing with additional tests to validate and stress-test all CMS computing workflows at full data-taking scale, also extending this to the global WLCG community. CMS exercised most aspects of the CMS computing model with very comprehensive tests. During May 2008, CMS moved more than 3.6 Petabytes among more than 300 links in the complex Grid topology. CMS demonstrated that it is able to safely move data out of CERN to the Tier-1 sites, sustaining more than 600 MB/s as a daily average for more than seven days in a row, with enough headroom and with hourly peaks of up to 1.7 GB/s. CMS ran hundreds of simultaneous jobs at each Tier-1 site, re-reconstructing and skimming hundreds of millions of events. After re-reconstruction the fresh AOD (Analysis Object Data) has to be synchronized between Tier-1 centers: CMS demonstrated that the required inter-Tier-1 transfers are achievable within a few days. CMS also showed that skimmed analysis data sets can be transferred to Tier-2 sites for analysis at sufficient rate, regionally as well as inter-regionally, achieving all goals in about 90% of the >200 links. Simultaneously, CMS ran a large Tier-2 analysis exercise in which realistic analysis jobs were submitted to a large set of Tier-2 sites by a large number of people to produce a chaotic workload across the systems, with more than 400 analysis users in May. Taken all together, CMS routinely achieved submissions of 100k jobs/day, with peaks of up to 200k jobs/day. The results achieved in CCRC'08, focusing on the distributed workflows, are presented and discussed.

  9. Morphology and gene sequence of Levicoleps biwae n. gen., n. sp. (Ciliophora, Prostomatida), a proposed endemic from the ancient Lake Biwa, Japan.

    PubMed

    Foissner, Wilhelm; Kusuoka, Yasushi; Shimano, Satoshi

    2008-01-01

    Levicoleps biwae n. gen., n. sp. was discovered in organic mud on the shore of Lake Biwa, Japan. Its morphology and small subunit rRNA gene sequence were studied with standard methods. Further, we established a terminology for the colepid armour and selected four features for genus recognition: the number of armour tiers, the structure of the tier plates, the presence/absence of armour spines, and the number of adoral organelles (three or five). The Japanese colepid, a barrel-shaped ciliate with an average size of 75 × 45 µm, has six armour tiers and hirtus-type tier plates, but lacks armour spines, both in the environment and in laboratory culture. Thus, it is considered to represent a new genus. This rank is supported by the considerable genetic distance (7%) from the common Coleps hirtus. Although L. biwae looks quite similar to C. hirtus in vivo, it is very likely most closely related to Coleps amphacanthus, a species with conspicuous armour spines, as indicated by body size, the number of ciliary rows and, especially, the multiple caudal cilia. Lake Biwa is about four million years old and inhabited by many endemic organisms, ranging from algae to large fish. Thus, we suspect that L. biwae is restricted to Lake Biwa or, at least, to Asia. Based on literature data and the generic features established, we also propose the new genus Reticoleps for Coleps remanei Kahl, 1933, and resurrect the genus Pinacocoleps Diesing, 1865 to include Coleps incurvus Ehrenberg, 1833, Coleps pulcher Spiegel, 1926, Coleps tessalatus Kahl, 1930 and, probably, Baikalocoleps quadratus Obolkina, 1995a. Nine colepid genera are diagnosed and dichotomously keyed.

  10. Morphology and Gene Sequence of Levicoleps biwae n. gen., n. sp. (Ciliophora, Prostomatida), a Proposed Endemic from the Ancient Lake Biwa, Japan

    PubMed Central

    FOISSNER, WILHELM; KUSUOKA, YASUSHI; SHIMANO, SATOSHI

    2010-01-01

    Levicoleps biwae n. gen., n. sp. was discovered in organic mud on the shore of Lake Biwa, Japan. Its morphology and small subunit rRNA gene sequence were studied with standard methods. Further, we established a terminology for the colepid armour and selected four features for genus recognition: the number of armour tiers, the structure of the tier plates, the presence/absence of armour spines, and the number of adoral organelles (three or five). The Japanese colepid, a barrel-shaped ciliate with an average size of 75 × 45 μm, has six armour tiers and hirtus-type tier plates, but lacks armour spines, both in the environment and in laboratory culture. Thus, it is considered to represent a new genus. This rank is supported by the considerable genetic distance (7%) from the common Coleps hirtus. Although L. biwae looks quite similar to C. hirtus in vivo, it is very likely most closely related to Coleps amphacanthus, a species with conspicuous armour spines, as indicated by body size, the number of ciliary rows and, especially, the multiple caudal cilia. Lake Biwa is about four million years old and inhabited by many endemic organisms, ranging from algae to large fish. Thus, we suspect that L. biwae is restricted to Lake Biwa or, at least, to Asia. Based on literature data and the generic features established, we also propose the new genus Reticoleps for Coleps remanei Kahl, 1933, and resurrect the genus Pinacocoleps Diesing, 1865 to include Coleps incurvus Ehrenberg, 1833, Coleps pulcher Spiegel, 1926, Coleps tessalatus Kahl, 1930 and, probably, Baikalocoleps quadratus Obolkina, 1995a. Nine colepid genera are diagnosed and dichotomously keyed. PMID:18460156

  11. Optimising LAN access to grid enabled storage elements

    NASA Astrophysics Data System (ADS)

    Stewart, G. A.; Cowan, G. A.; Dunne, B.; Elwell, A.; Millar, A. P.

    2008-07-01

    When operational, the Large Hadron Collider experiments at CERN will collect tens of petabytes of physics data per year. The worldwide LHC computing grid (WLCG) will distribute this data to over two hundred Tier-1 and Tier-2 computing centres, enabling particle physicists around the globe to access the data for analysis. Although different middleware solutions exist for effective management of storage systems at collaborating institutes, the patterns of access envisaged for Tier-2s fall into two distinct categories. The first involves bulk transfer of data between different Grid storage elements using protocols such as GridFTP. This data movement will principally involve writing ESD and AOD files into Tier-2 storage. Secondly, once datasets are stored at a Tier-2, physics analysis jobs will read the data from the local SE. Such jobs require a POSIX-like interface to the storage so that individual physics events can be extracted. In this paper we consider the performance of POSIX-like access to files held in Disk Pool Manager (DPM) storage elements, a popular lightweight SRM storage manager from EGEE.
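
    The POSIX-like access pattern described here amounts to seeking to an event's offset and reading only its bytes, rather than copying the whole file. A toy stand-in using an ordinary local file (the real storage elements expose this interface over protocols such as rfio, dcap or xrootd):

```python
import os
import tempfile

def read_event(path, offset, length):
    """POSIX-style partial read: fetch one event's bytes without copying
    the whole file -- the access pattern analysis jobs need from the SE."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)

# toy "dataset": three fixed-width 4-byte events in one file
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"evt0evt1evt2")
second = read_event(path, 4, 4)   # extract only the second event
os.remove(path)
```

    Bulk GridFTP transfers, by contrast, always move whole files; the two access patterns stress a storage element in very different ways, which is what the paper's measurements compare.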

  12. LHCNet: Wide Area Networking and Collaborative Systems for HEP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, H. B.

    2007-08-20

    This proposal presents the status and progress in 2006-7, and the technical and financial plans for 2008-2010, for the US LHCNet transatlantic network supporting U.S. participation in the LHC physics program. US LHCNet provides transatlantic connections of the Tier1 computing facilities at Fermilab and Brookhaven with the Tier0 and Tier1 facilities at CERN, as well as Tier1s elsewhere in Europe and Asia. Together with ESnet, Internet2, the GEANT pan-European network, and NSF's UltraLight project, US LHCNet also supports connections between the Tier2 centers (where most of the analysis of the data will take place, starting this year) and the Tier1s as needed.

  13. Tier2 Submit Software

    EPA Pesticide Factsheets

    Download this tool for Windows or Mac, which helps facilities prepare a Tier II electronic chemical inventory report. The data can also be exported into the CAMEOfm (Computer-Aided Management of Emergency Operations) emergency planning software.

  14. 26 CFR 31.3211-2 - Rates and computation of employee representative tax.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...) Rates—(1)(i) Tier 1 tax. The Tier 1 employee representative tax rate equals the sum of the tax rates in... employer tax for hospital insurance. The Tier 1 employee representative tax rate is applied to compensation... Insurance Contributions Act. (ii) Example. The rule in paragraph (a)(1)(i) of this section is illustrated by...

  15. Grid Computing at GSI for ALICE and FAIR - present and future

    NASA Astrophysics Data System (ADS)

    Schwarz, Kilian; Uhlig, Florian; Karabowicz, Radoslaw; Montiel-Gonzalez, Almudena; Zynovyev, Mykhaylo; Preuss, Carsten

    2012-12-01

    The future FAIR experiments CBM and PANDA have computing requirements that fall into a category that cannot currently be satisfied by a single computing centre. A larger, distributed computing infrastructure is needed to cope with the amount of data to be simulated and analysed. Since 2002, GSI has operated a tier2 centre for ALICE@CERN. The central component of the GSI computing facility, and hence the core of the ALICE tier2 centre, is an LSF/SGE batch farm, currently split into three subclusters with a total of 15000 CPU cores shared by the participating experiments, accessible both locally and soon also completely via Grid. In terms of data storage, a 5.5 PB Lustre file system, directly accessible from all worker nodes, is maintained, as well as a 300 TB xrootd-based Grid storage element. Based on this existing expertise, and utilising ALICE's middleware ‘AliEn’, the Grid infrastructure for PANDA and CBM is being built. Besides a tier0 centre at GSI, the computing Grids of the two FAIR collaborations now encompass more than 17 sites in 11 countries and are constantly expanding. The operation of the distributed FAIR computing infrastructure benefits significantly from the experience gained with the ALICE tier2 centre. A close collaboration between ALICE Offline and FAIR provides mutual advantages. The use of a common Grid middleware as well as compatible simulation and analysis software frameworks ensures significant synergy effects.

  16. 26 CFR 1.902-2 - Treatment of deficits in post-1986 undistributed earnings and pre-1987 accumulated profits of a...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... earnings and pre-1987 accumulated profits of a first- or lower-tier corporation for purposes of computing... earnings and pre-1987 accumulated profits of a first- or lower-tier corporation for purposes of computing... would be a dividend if there were current or accumulated earnings and profits, then the post-1986...

  17. 20 CFR 225.21 - Survivor Tier I PIA.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... INSURANCE AMOUNT DETERMINATIONS PIA's Used in Computing Survivor Annuities and the Amount of the Residual Lump-Sum Payable § 225.21 Survivor Tier I PIA. The Survivor Tier I PIA is used in computing the tier I... Security Act using the deceased employee's combined railroad and social security earnings after 1950 (or...

  18. US LHCNet: Transatlantic Networking for the LHC and the U.S. HEP Community

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Harvey B; Barczyk, Artur J

    2013-04-05

    US LHCNet provides the transatlantic connectivity between the Tier1 computing facilities at the Fermilab and Brookhaven National Labs and the Tier0 and Tier1 facilities at CERN, as well as Tier1s elsewhere in Europe and Asia. Together with ESnet, Internet2, and other R&E Networks participating in the LHCONE initiative, US LHCNet also supports transatlantic connections between the Tier2 centers (where most of the data analysis is taking place) and the Tier1s as needed. Given the key roles of the US and European Tier1 centers as well as Tier2 centers on both continents, the largest data flows are across the Atlantic, where US LHCNet has the major role. US LHCNet manages and operates the transatlantic network infrastructure including four Points of Presence (PoPs) and currently six transatlantic OC-192 (10Gbps) leased links. Operating at the optical layer, the network provides a highly resilient fabric for data movement, with a target service availability level in excess of 99.95%. This level of resilience and seamless operation is achieved through careful design including path diversity on both submarine and terrestrial segments, use of carrier-grade equipment with built-in high-availability and redundancy features, deployment of robust failover mechanisms based on SONET protection schemes, as well as the design of facility-diverse paths between the LHC computing sites. The US LHCNet network provides services at Layer 1 (optical), Layer 2 (Ethernet) and Layer 3 (IPv4 and IPv6). The flexible design of the network, including modular equipment, a talented and agile team, and flexible circuit lease management, allows US LHCNet to react quickly to changing requirements from the LHC community. Network capacity is provisioned just-in-time to meet the needs, as demonstrated in the past years during the changing LHC start-up plans.

  19. CMS tier structure and operation of the experiment-specific tasks in Germany

    NASA Astrophysics Data System (ADS)

    Nowack, A.

    2008-07-01

    In Germany, several university institutes and research centres take part in the CMS experiment. Concerning the data analysis, a number of computing centres at different Tier levels, ranging from Tier 1 to Tier 3, exist at these places. The German Tier 1 centre GridKa at the research centre at Karlsruhe serves all four LHC experiments as well as four non-LHC experiments. With respect to the CMS experiment, GridKa is mainly involved in central tasks. The Tier 2 centre in Germany consists of two sites, one at the research centre DESY at Hamburg and one at RWTH Aachen University, forming a federated Tier 2 centre. Both parts cover different aspects of a Tier 2 centre. The German Tier 3 centres are located at the research centre DESY at Hamburg, at RWTH Aachen University, and at the University of Karlsruhe. Furthermore, a German user analysis facility is planned. Since the CMS community in Germany is rather small, good cooperation between the different sites is essential. This cooperation includes physics topics as well as technical and operational issues. All available communication channels such as email, phone, monthly video conferences, and regular personal meetings are used. For example, the distribution of data sets is coordinated globally within Germany. Also the CMS-specific services such as the data transfer tool PhEDEx and the Monte Carlo production are operated by people from different sites in order to spread the knowledge widely and increase the redundancy in terms of operators.

  20. The CMS Tier0 goes cloud and grid for LHC Run 2

    DOE PAGES

    Hufnagel, Dirk

    2015-12-23

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. Furthermore, this contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  1. The CMS Tier0 goes Cloud and Grid for LHC Run 2

    NASA Astrophysics Data System (ADS)

    Hufnagel, Dirk

    2015-12-01

    In 2015, CMS will embark on a new era of collecting LHC collisions at unprecedented rates and complexity. This will put a tremendous stress on our computing systems. Prompt Processing of the raw data by the Tier-0 infrastructure will no longer be constrained to CERN alone due to the significantly increased resource requirements. In LHC Run 2, we will need to operate it as a distributed system utilizing both the CERN Cloud-based Agile Infrastructure and a significant fraction of the CMS Tier-1 Grid resources. In another big change for LHC Run 2, we will process all data using the multi-threaded framework to deal with the increased event complexity and to ensure efficient use of the resources. This contribution will cover the evolution of the Tier-0 infrastructure and present scale testing results and experiences from the first data taking in 2015.

  2. ATLAS Distributed Computing Experience and Performance During the LHC Run-2

    NASA Astrophysics Data System (ADS)

    Filipčič, A.; ATLAS Collaboration

    2017-10-01

    ATLAS Distributed Computing during LHC Run-1 was challenged by steadily increasing computing, storage and network requirements. In addition, the complexity of processing task workflows and their associated data management requirements led to a new paradigm in the ATLAS computing model for Run-2, accompanied by extensive evolution and redesign of the workflow and data management systems. The new systems were put into production at the end of 2014, and gained robustness and maturity during 2015 data taking. ProdSys2, the new request and task interface; JEDI, the dynamic job execution engine developed as an extension to PanDA; and Rucio, the new data management system, form the core of the Run-2 ATLAS distributed computing engine. One of the big changes for Run-2 was the adoption of the Derivation Framework, which moves the chaotic, CPU- and data-intensive part of the user analysis into the centrally organized train production, delivering derived AOD datasets to user groups for final analysis. The effectiveness of the new model was demonstrated through the delivery of analysis datasets to users just one week after data taking, by completing the calibration loop, Tier-0 processing and train production steps promptly. The great flexibility of the new system also makes it possible to execute part of the Tier-0 processing on the grid when Tier-0 resources experience a backlog during high data-taking periods. The introduction of the data lifetime model, where each dataset is assigned a finite lifetime (with extensions possible for frequently accessed data), was made possible by Rucio. Thanks to this, the storage crises experienced in Run-1 have not reappeared during Run-2. In addition, the distinction between Tier-1 and Tier-2 disk storage, now largely artificial given the quality of Tier-2 resources and their networking, has been removed through the introduction of dynamic ATLAS clouds that group the storage endpoint nucleus and its close-by execution satellite sites. All stable ATLAS sites are now able to store unique or primary copies of the datasets. ATLAS Distributed Computing is further evolving to speed up request processing by introducing network awareness, using machine learning and optimisation of the latencies during the execution of the full chain of tasks. The Event Service, a new workflow and job execution engine, is designed around check-pointing at the level of event processing to use opportunistic resources more efficiently. ATLAS has been extensively exploring possibilities of using computing resources extending beyond conventional grid sites in the WLCG fabric to deliver as many computing cycles as possible and thereby enhance the significance of the Monte-Carlo samples to deliver better physics results. The exploitation of opportunistic resources was at an early stage throughout 2015, at the level of 10% of the total ATLAS computing power, but in the next few years it is expected to deliver much more. In addition, demonstrating the ability to use an opportunistic resource can lead to securing ATLAS allocations on the facility, hence the importance of this work goes beyond merely the initial CPU cycles gained. In this paper, we give an overview and compare the performance, development effort, flexibility and robustness of the various approaches.
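
    The lifetime model mentioned above can be illustrated with a toy catalogue in which every dataset receives a finite lifetime and each access renews it; datasets that are never touched simply age out and become eligible for deletion. This is only a sketch of the policy, not Rucio's actual interface:

```python
class DatasetCatalog:
    """Toy lifetime-based retention policy: each dataset gets a finite
    lifetime, and every access renews it (not Rucio's real API)."""
    def __init__(self, lifetime):
        self.lifetime = lifetime
        self.expiry = {}

    def register(self, name, now):
        self.expiry[name] = now + self.lifetime

    def access(self, name, now):
        # frequently accessed data keeps earning extensions
        self.expiry[name] = now + self.lifetime

    def expired(self, now):
        return sorted(n for n, t in self.expiry.items() if t <= now)

cat = DatasetCatalog(lifetime=10)
cat.register("AOD.popular", now=0)
cat.register("AOD.cold", now=0)
cat.access("AOD.popular", now=8)   # renewal pushes its expiry to 18
stale = cat.expired(now=12)        # only the untouched dataset has aged out
```

    Because unused replicas are reclaimed automatically, disk never fills up with data nobody reads, which is the mechanism behind the claim that Run-1-style storage crises did not recur.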

  3. Development and Evaluation of the Diagnostic Power for a Computer-Based Two-Tier Assessment

    ERIC Educational Resources Information Center

    Lin, Jing-Wen

    2016-01-01

    This study adopted a quasi-experimental design with follow-up interview to develop a computer-based two-tier assessment (CBA) regarding the science topic of electric circuits and to evaluate the diagnostic power of the assessment. Three assessment formats (i.e., paper-and-pencil, static computer-based, and dynamic computer-based tests) using…

  4. 20 CFR 225.10 - General.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... DETERMINATIONS PIA's Used in Computing Employee, Spouse and Divorced Spouse Annuities § 225.10 General. This subpart contains information about the PIA's that can be used in computing most employee, spouse and divorced spouse annuities. The Tier I PIA is used in computing the tier I component of an employee, spouse...

  5. 26 CFR 1.902-2 - Treatment of deficits in post-1986 undistributed earnings and pre-1987 accumulated profits of a...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... earnings and pre-1987 accumulated profits of a first- or lower-tier corporation for purposes of computing... undistributed earnings and pre-1987 accumulated profits of a first- or lower-tier corporation for purposes of... would be a dividend if there were current or accumulated earnings and profits, then the post-1986...

  6. An integrated approach to assess broad-scale condition of coastal wetlands - The Gulf of Mexico Coastal Wetlands pilot survey

    USGS Publications Warehouse

    Nestlerode, J.A.; Engle, V.D.; Bourgeois, P.; Heitmuller, P.T.; Macauley, J.M.; Allen, Y.C.

    2009-01-01

    The Environmental Protection Agency (EPA) and U.S. Geological Survey (USGS) initiated a two-year regional pilot survey in 2007 to develop, test, and validate tools and approaches to assess the condition of northern Gulf of Mexico (GOM) coastal wetlands. Sampling sites were selected from estuarine and palustrine wetland areas with herbaceous, forested, and shrub/scrub habitats delineated by the US Fish and Wildlife Service National Wetlands Inventory Status and Trends (NWI S&T) program and contained within northern GOM coastal watersheds. A multi-level, stepwise, iterative survey approach is being applied to multiple wetland classes at 100 probabilistically-selected coastal wetlands sites. Tier 1 provides information at the landscape scale about habitat inventory, land use, and environmental stressors associated with the watershed in which each wetland site is located. Tier 2, a rapid assessment conducted through a combination of office and field work, is based on best professional judgment and on-site evidence. Tier 3, an intensive site assessment, involves on-site collection of vegetation, water, and sediment samples to establish an integrated understanding of current wetland condition and validate methods and findings from Tiers 1 and 2. The results from this survey, along with other similar regional pilots from the Mid-Atlantic, West Coast, and Great Lakes Regions will contribute to a design and implementation approach for the National Wetlands Condition Assessment to be conducted by EPA's Office of Water in 2011. © Springer Science+Business Media B.V. 2008.

  7. Utilization of Herbicide Concentration/Exposure Time Relationships for Controlling Submersed Invasive Plants on Lake Gaston, Virginia/North Carolina

    DTIC Science & Technology

    2011-06-01

of efficacy in controlling invasive plant species, and are verified in aquatic and wetland field sites throughout the US. This multi-tiered...localized treatment sites, moving or still water. Chlorosis of stems and leaves with plant death in 7-10 days. 2,4-D 3 DMA liquid, BEE salt...with Eurasian watermilfoil. Formulations include a liquid dimethyl amine (DMA) and a granular clay butoxyethanol ester (BEE). Current Evaluations

  8. Integration of Russian Tier-1 Grid Center with High Performance Computers at NRC-KI for LHC experiments and beyond HENP

    NASA Astrophysics Data System (ADS)

    Belyaev, A.; Berezhnaya, A.; Betev, L.; Buncic, P.; De, K.; Drizhuk, D.; Klimentov, A.; Lazin, Y.; Lyalin, I.; Mashinistov, R.; Novikov, A.; Oleynik, D.; Polyakov, A.; Poyda, A.; Ryabinkin, E.; Teslyuk, A.; Tkachenko, I.; Yasnopolskiy, L.

    2015-12-01

The LHC experiments are preparing for the precision measurements and further discoveries that will be made possible by higher LHC energies from April 2015 (LHC Run 2). The need for simulation, data processing and analysis would overwhelm the expected capacity of the grid infrastructure computing facilities deployed by the Worldwide LHC Computing Grid (WLCG). To meet this challenge, the integration of opportunistic resources into the LHC computing model is highly important. The Tier-1 facility at the Kurchatov Institute (NRC-KI) in Moscow is part of the WLCG and will process, simulate and store up to 10% of the total data obtained from the ALICE, ATLAS and LHCb experiments. In addition, the Kurchatov Institute has supercomputers with a peak performance of 0.12 PFLOPS. Delegating even a fraction of these supercomputing resources to LHC computing would notably increase the total capacity. In 2014, development of a portal combining the Tier-1 and a supercomputer at the Kurchatov Institute was started to provide common interfaces and storage. The portal will be used not only for HENP experiments, but also by other data- and compute-intensive sciences, such as biology (genome sequencing analysis) and astrophysics (cosmic-ray analysis, antimatter and dark matter searches).

  9. Using normal ranges for interpreting results of monitoring and tiering to guide future work: A case study of increasing polycyclic aromatic compounds in lake sediments from the Cold Lake oil sands (Alberta, Canada) described in Korosi et al. (2016).

    PubMed

    Munkittrick, Kelly R; Arciszewski, Tim J

    2017-12-01

Since the publication of the Kelly et al. papers (2009, 2010) describing elevated contaminants in snow near the Alberta oil sands, there has been a significant expansion of monitoring efforts, enhanced by $50M a year contributed by industry to a regional Joint Oil Sands Monitoring (JOSM) program. In parallel with the intensification of research and monitoring efforts, including the expansion of measured indicators, techniques for chemical analysis have also become more sensitive. Both factors increase sensitivity and statistical power, improving our capacity to detect any change. This increase in capability requires a counterbalance to account for trivial change. This can be done using an interpretative approach that requires contextualization of differences to meaningfully inform environmental monitoring programs and provide focus for action. Experience from 25 years of involvement with Canada's Environmental Effects Monitoring (EEM) program has shown that a tiered program informed by triggers can provide the context needed to make decisions about monitoring priorities. Here we provide a potential interpretation framework using a case study around the Korosi et al. (2016) study, which found recent increases in alkylated polycyclic aromatic compounds (aPACs) in the Cold Lake in situ oil sands area. Public contaminant profiles from the JOSM studies in the oil sands region are used to evaluate the changes using an interpretation framework based on normal ranges estimated from existing data at site-specific, local and regional (distant) levels, modelled after the tiered Canadian EEM design. Copyright © 2017 Elsevier Ltd. All rights reserved.
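The tiered "normal range" trigger logic described above can be sketched numerically. This is an illustrative assumption of how such a framework might escalate monitoring tiers, not the actual statistics used in the study; the baseline data, the two-standard-deviation bounds and the regional limits are all invented.

```python
# Hypothetical sketch of a tiered "normal range" trigger, loosely modelled on
# the EEM-style framework described above. Data and thresholds are invented.
import statistics

def normal_range(baseline, k=2.0):
    """Return (low, high) as mean +/- k standard deviations of baseline data."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return mean - k * sd, mean + k * sd

def tier_for(value, site_rng, regional_rng):
    """Escalate the monitoring tier only when a result leaves its normal range."""
    if site_rng[0] <= value <= site_rng[1]:
        return 0  # within site-specific normal range: routine monitoring
    if regional_rng[0] <= value <= regional_rng[1]:
        return 1  # outside site range but within regional range: confirm
    return 2      # outside both: trigger focused follow-up work

baseline = [1.1, 0.9, 1.0, 1.2, 0.8, 1.05]   # e.g. historical aPAC levels (invented)
site = normal_range(baseline)
regional = (0.2, 2.5)                          # assumed regional bounds
print(tier_for(1.05, site, regional))          # within site range -> 0
print(tier_for(3.0, site, regional))           # outside both ranges -> 2
```

The point of the tiering is exactly the counterbalance the authors describe: a statistically detectable but trivial change stays at a low tier, while only results outside contextual normal ranges trigger further work.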

  10. SELECTION OF CANDIDATE EUTROPHICATION MODELS FOR TOTAL MAXIMUM DAILY LOADS ANALYSES

    EPA Science Inventory

    A tiered approach was developed to evaluate candidate eutrophication models to select a common suite of models that could be used for Total Maximum Daily Loads (TMDL) analyses in estuaries, rivers, and lakes/reservoirs. Consideration for linkage to watershed models and ecologica...

  11. Optimisation of the usage of LHC and local computing resources in a multidisciplinary physics department hosting a WLCG Tier-2 centre

    NASA Astrophysics Data System (ADS)

    Barberis, Stefano; Carminati, Leonardo; Leveraro, Franco; Mazza, Simone Michele; Perini, Laura; Perlz, Francesco; Rebatto, David; Tura, Ruggero; Vaccarossa, Luca; Villaplana, Miguel

    2015-12-01

We present the approach of the University of Milan Physics Department and the local INFN unit to allow and encourage the sharing of computing, storage and networking resources among different research areas (the largest resources being those composing the Milan WLCG Tier-2 centre, tailored to the needs of the ATLAS experiment). Computing resources are organised as independent HTCondor pools, with a global master in charge of monitoring them and optimising their usage. The configuration has to provide satisfactory throughput for both serial and parallel (multicore, MPI) jobs. A combination of local, remote and cloud storage options is available. The experience of users from different research areas operating on this shared infrastructure is discussed. The promising direction of improving scientific computing throughput by federating access to distributed computing and storage also fits well with the objectives listed in the European Horizon 2020 framework for research and development.
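One small piece of what a "global master" overseeing several pools might compute can be sketched as follows. The pool names, slot counts and ranking heuristic are purely illustrative assumptions, not the Milan configuration; real HTCondor exposes this information via `condor_status`, which is not modelled here.

```python
# Illustrative sketch: rank independent pools by free slots so a global
# master could steer opportunistic work to the most idle pool first.
# All pool data below is invented for illustration.

def pool_utilisation(pools):
    """Return (name, free_slots) pairs sorted most-idle first."""
    ranked = sorted(pools, key=lambda p: p["total"] - p["claimed"],
                    reverse=True)
    return [(p["name"], p["total"] - p["claimed"]) for p in ranked]

pools = [
    {"name": "atlas-tier2", "total": 2000, "claimed": 1900},
    {"name": "theory-mpi",  "total": 400,  "claimed": 150},
    {"name": "interactive", "total": 100,  "claimed": 20},
]
print(pool_utilisation(pools))
# [('theory-mpi', 250), ('atlas-tier2', 100), ('interactive', 80)]
```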

  12. 20 CFR 225.20 - General.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... DETERMINATIONS PIA's Used in Computing Survivor Annuities and the Amount of the Residual Lump-Sum Payable § 225.20 General. The Survivor Tier I PIA and the Employee RIB PIA are used in computing the tier I component of a survivor annuity. The Combined Earnings PIA, Social Security Earnings PIA and Railroad...

  13. 78 FR 65643 - Environmental Impact Statements; Notice of Availability

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-01

    ..., Gas Hills In-Situ Recovery Uranium Project, Review Period Ends: 12/02/2013, Contact: Tom Sunderland..., Tiering FEIS--U.S. Coast Guard Rulemaking for Dry Cargo Residue Discharges in the Great Lakes, Review... No. 20130290, Draft EIS, NPS, CA, Restoration of Native Species in High Elevation Aquatic Ecosystems...

  14. ATLAS WORLD-cloud and networking in PanDA

    NASA Astrophysics Data System (ADS)

    Barreiro Megino, F.; De, K.; Di Girolamo, A.; Maeno, T.; Walker, R.; ATLAS Collaboration

    2017-10-01

The ATLAS computing model was originally designed as static clouds (usually national or geographical groupings of sites) around the Tier 1 centres, which confined tasks and most of the data traffic. Since those early days, the sites' network bandwidth has increased by a factor of O(1000) and the difference in functionality between Tier 1s and Tier 2s has been reduced. After years of manual, intermediate solutions, we have now ramped up to full usage of World-cloud, the latest step in the PanDA Workload Management System to increase resource utilization on the ATLAS Grid, for all workflows (MC production, data (re)processing, etc.). We have based the development on two new site concepts. Nuclei sites are the Tier 1s and large Tier 2s, where tasks are assigned and the output aggregated, and satellites are the sites that execute the jobs and send the output to their nucleus. PanDA dynamically pairs nuclei and satellite sites for each task based on the input data availability, capability matching, site load and network connectivity. This contribution will introduce the conceptual changes for World-cloud, the development necessary in PanDA, an insight into the network model and the first half-year of operational experience.
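The nucleus-satellite pairing described above can be illustrated with a toy scoring function. The weights, site records and bandwidth figures below are invented assumptions for illustration; the real PanDA brokerage logic is considerably richer than this sketch.

```python
# Toy sketch of nucleus/satellite pairing: score each candidate nucleus on
# input-data availability, satellite free capacity and network bandwidth.
# Weights, sites and numbers are invented; real PanDA brokerage differs.

def pair_score(satellite, nucleus, weights=(0.4, 0.3, 0.3)):
    """Higher is better: favour data already at the nucleus, a lightly
    loaded satellite, and a fast link between the two (normalised /10 Gbps)."""
    w_data, w_load, w_net = weights
    return (w_data * nucleus["data_fraction"]             # input data on site
            + w_load * (1.0 - satellite["load"])          # free capacity
            + w_net * satellite["bandwidth_to"][nucleus["name"]] / 10.0)

nuclei = [
    {"name": "T1-A", "data_fraction": 0.9},
    {"name": "T1-B", "data_fraction": 0.4},
]
satellite = {"name": "T2-X", "load": 0.5,
             "bandwidth_to": {"T1-A": 8.0, "T1-B": 10.0}}  # Gbps, assumed

best = max(nuclei, key=lambda n: pair_score(satellite, n))
print(best["name"])   # T1-A wins: its data availability outweighs the slower link
```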

  15. Developing an Educational Computer Game for Migratory Bird Identification Based on a Two-Tier Test Approach

    ERIC Educational Resources Information Center

    Chu, Hui-Chun; Chang, Shao-Chen

    2014-01-01

    Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…

  16. 20 CFR 226.33 - Spouse regular annuity rate.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Spouse regular annuity rate. 226.33 Section... COMPUTING EMPLOYEE, SPOUSE, AND DIVORCED SPOUSE ANNUITIES Computing a Spouse or Divorced Spouse Annuity § 226.33 Spouse regular annuity rate. The final tier I and tier II rates, from §§ 226.30 and 226.32, are...

  17. CMS Readiness for Multi-Core Workload Scheduling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perez-Calero Yzquierdo, A.; Balcas, J.; Hernandez, J.

In the present run of the LHC, CMS data reconstruction and simulation algorithms benefit greatly from being executed as multiple threads running on several processor cores. The complexity of the Run 2 events requires parallelization of the code to reduce the memory-per-core footprint constraining serial execution programs, thus optimizing the exploitation of present multi-core processor architectures. The allocation of computing resources for multi-core tasks, however, becomes a complex problem in itself. The CMS workload submission infrastructure employs multi-slot partitionable pilots, built on HTCondor and GlideinWMS native features, to enable scheduling of single and multi-core jobs simultaneously. This provides a solution for the scheduling problem in a uniform way across grid sites running a diversity of gateways to compute resources and batch system technologies. This paper presents this strategy and the tools on which it has been implemented. The experience of managing multi-core resources at the Tier-0 and Tier-1 sites during 2015, along with the deployment phase to Tier-2 sites during early 2016, is reported. The process of performance monitoring and optimization to achieve efficient and flexible use of the resources is also described.

  18. CMS readiness for multi-core workload scheduling

    NASA Astrophysics Data System (ADS)

    Perez-Calero Yzquierdo, A.; Balcas, J.; Hernandez, J.; Aftab Khan, F.; Letts, J.; Mason, D.; Verguilov, V.

    2017-10-01

In the present run of the LHC, CMS data reconstruction and simulation algorithms benefit greatly from being executed as multiple threads running on several processor cores. The complexity of the Run 2 events requires parallelization of the code to reduce the memory-per-core footprint constraining serial execution programs, thus optimizing the exploitation of present multi-core processor architectures. The allocation of computing resources for multi-core tasks, however, becomes a complex problem in itself. The CMS workload submission infrastructure employs multi-slot partitionable pilots, built on HTCondor and GlideinWMS native features, to enable scheduling of single and multi-core jobs simultaneously. This provides a solution for the scheduling problem in a uniform way across grid sites running a diversity of gateways to compute resources and batch system technologies. This paper presents this strategy and the tools on which it has been implemented. The experience of managing multi-core resources at the Tier-0 and Tier-1 sites during 2015, along with the deployment phase to Tier-2 sites during early 2016, is reported. The process of performance monitoring and optimization to achieve efficient and flexible use of the resources is also described.
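The core idea of a partitionable pilot, carving dynamic slots for a mix of serial and multi-core jobs out of one resource claim, can be illustrated with a minimal model. This is purely a sketch of the concept: in reality HTCondor partitionable slots are a configuration feature, not hand-written code, and the job names here are invented.

```python
# Minimal model of a multi-slot partitionable pilot: a pilot holding N cores
# carves off single-core or multi-core dynamic slots on demand, so serial
# and multi-threaded jobs share the same pilot. Illustrative only.

class PartitionablePilot:
    def __init__(self, cores):
        self.free_cores = cores
        self.slots = []                     # list of (job_id, cores) claims

    def claim(self, job_id, cores):
        """Carve a dynamic slot of `cores` cores if the pilot has room."""
        if cores > self.free_cores:
            return False                    # job must wait or match elsewhere
        self.free_cores -= cores
        self.slots.append((job_id, cores))
        return True

    def release(self, job_id):
        """Return a finished job's cores to the partitionable slot."""
        for slot in self.slots:
            if slot[0] == job_id:
                self.slots.remove(slot)
                self.free_cores += slot[1]
                return

pilot = PartitionablePilot(cores=8)
assert pilot.claim("multicore-reco", 4)     # 4-thread reconstruction job
assert pilot.claim("serial-1", 1)           # serial jobs fill the remainder
assert pilot.claim("serial-2", 1)
assert not pilot.claim("multicore-sim", 4)  # only 2 cores left
pilot.release("multicore-reco")
assert pilot.claim("multicore-sim", 4)      # now it fits
print(pilot.free_cores)                     # 2
```

This also shows why the scheduling problem becomes non-trivial: mixing slot sizes fragments the pilot, so core ordering and back-filling policies matter.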

  19. Network monitoring in the Tier2 site in Prague

    NASA Astrophysics Data System (ADS)

    Eliáš, Marek; Fiala, Lukáš; Horký, Jiří; Chudoba, Jiří; Kouba, Tomáš; Kundrát, Jan; Švec, Jan

    2011-12-01

Network monitoring provides different views of the network traffic. Its output enables computing centre staff to make qualified decisions about changes in the organization of the computing centre network and to spot possible problems. In this paper we present the network monitoring framework used at the Tier-2 in Prague at the Institute of Physics (FZU). The framework consists of standard software and custom tools. We discuss our system for hardware failure detection using syslog logging and Nagios active checks, bandwidth monitoring of physical links, and analysis of NetFlow exports from Cisco routers. We present a tool for automatic detection of the network layout based on SNMP, which also records topology changes into an SVN repository. An adapted weathermap4rrd is used to visualize the recorded data, giving a fast overview of the current bandwidth usage of links in the network.
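The arithmetic behind link-bandwidth graphs of this kind is simple: utilisation is the delta of an interface octet counter over the polling interval. The sketch below assumes made-up counter samples; in practice the values would come from SNMP interface counters (e.g. the IF-MIB high-capacity octet counters), which this sketch does not query.

```python
# Sketch of the rate calculation behind SNMP-based bandwidth graphs:
# average rate = (counter delta * 8 bits) / polling interval.
# Counter values below are invented; one 64-bit counter wrap is handled.

def link_utilisation_mbps(prev_octets, curr_octets, interval_s,
                          counter_bits=64):
    """Average rate in Mbit/s between two octet-counter samples."""
    delta = curr_octets - prev_octets
    if delta < 0:                           # counter wrapped around
        delta += 2 ** counter_bits
    return delta * 8 / interval_s / 1e6

# Two samples 300 s apart (a typical 5-minute polling interval):
rate = link_utilisation_mbps(10_000_000_000, 10_037_500_000, 300)
print(round(rate, 1))                       # 1.0 (Mbit/s)
```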

  20. 20 CFR 228.10 - Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. 228.10... component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. The...

  1. 20 CFR 228.10 - Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. 228.10... component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. The...

  2. 20 CFR 228.10 - Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. 228.10... component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. The...

  3. 20 CFR 228.10 - Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. 228.10... component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. The...

  4. 20 CFR 228.10 - Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true Computation of the tier I annuity component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. 228.10... component for a widow(er), disabled widow(er), remarried widow(er), and a surviving divorced spouse. The...

  5. 76 FR 33380 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-08

    ... Two New Pricing Tiers, Investor Tier 1 and Investor Tier 2 June 3, 2011. Pursuant to Section 19(b)(1... Services (the ``Schedule'') to introduce two new pricing tiers, Investor Tier 1 and Investor Tier 2. The... proposes to introduce two new pricing tier levels, Investor Tier 1 and Investor Tier 2. Investor Tier 1...

  6. A Dashboard for the Italian Computing in ALICE

    NASA Astrophysics Data System (ADS)

    Elia, D.; Vino, G.; Bagnasco, S.; Crescente, A.; Donvito, G.; Franco, A.; Lusso, S.; Mura, D.; Piano, S.; Platania, G.; ALICE Collaboration

    2017-10-01

A dashboard devoted to computing at the Italian sites for the ALICE experiment at the LHC has been deployed. A combination of different complementary monitoring tools is typically used in most of the Tier-2 sites: this makes it somewhat difficult to figure out at a glance the status of a site and to compare information extracted from different sources for debugging purposes. To overcome these limitations a dedicated ALICE dashboard has been designed and implemented in each of the ALICE Tier-2 sites in Italy: in particular, it provides a single, interactive and easily customizable graphical interface where heterogeneous data are presented. The dashboard is based on two main ingredients: an open source time-series database and a dashboard builder tool for visualizing time-series metrics. Various sensors, able to collect data from the multiple data sources, have also been written. A first version of a national computing dashboard has been implemented using a specific instance of the builder to gather data from all the local databases.
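A "sensor" in such a setup essentially samples a metric and writes it to the time-series database. As one plausible shape for that write, the sketch below formats a sample in an InfluxDB-style line protocol; the measurement name, tags and values are invented for illustration, and the abstract does not say which database or protocol the sites actually use.

```python
# Sketch of a dashboard sensor's output: format one metric sample as an
# InfluxDB-style line-protocol string. Measurement, tags and values are
# invented; the actual database/protocol used by the sites may differ.
import time

def to_line_protocol(measurement, tags, fields, ts_ns=None):
    """Build one line: measurement,tag=val,... field=val,... timestamp"""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    ts = ts_ns if ts_ns is not None else time.time_ns()
    return f"{measurement},{tag_str} {field_str} {ts}"

line = to_line_protocol("running_jobs",
                        {"site": "T2_Example", "vo": "alice"},
                        {"value": 1250},
                        ts_ns=1500000000000000000)
print(line)
# running_jobs,site=T2_Example,vo=alice value=1250 1500000000000000000
```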

  7. 40 CFR Appendix C to Part 132 - Great Lakes Water Quality Initiative Methodologies for Development of Human Health Criteria and...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... organisms where higher doses or concentrations resulted in an adverse effect. Quantitative structure... probable or possible human carcinogen, when, because of major qualitative or quantitative limitations, the... quantitative risk assessment, but for which data are inadequate for Tier I criterion development due to a tumor...

  8. 40 CFR Appendix C to Part 132 - Great Lakes Water Quality Initiative Methodologies for Development of Human Health Criteria and...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... organisms where higher doses or concentrations resulted in an adverse effect. Quantitative structure... probable or possible human carcinogen, when, because of major qualitative or quantitative limitations, the... quantitative risk assessment, but for which data are inadequate for Tier I criterion development due to a tumor...

  9. 40 CFR Appendix C to Part 132 - Great Lakes Water Quality Initiative Methodologies for Development of Human Health Criteria and...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... organisms where higher doses or concentrations resulted in an adverse effect. Quantitative structure... probable or possible human carcinogen, when, because of major qualitative or quantitative limitations, the... quantitative risk assessment, but for which data are inadequate for Tier I criterion development due to a tumor...

  10. A New Generation of Networks and Computing Models for High Energy Physics in the LHC Era

    NASA Astrophysics Data System (ADS)

    Newman, H.

    2011-12-01

Wide area networks of increasing end-to-end capacity and capability are vital for every phase of high energy physicists' work. Our bandwidth usage, and the typical capacity of the major national backbones and intercontinental links used by our field, have grown by a factor of several hundred over the past decade. With the opening of the LHC era in 2009-10 and the prospects for discoveries in the upcoming LHC run, the outlook is for a continuation or an acceleration of these trends using next generation networks over the next few years. Responding to the need to rapidly distribute and access datasets of tens to hundreds of terabytes drawn from multi-petabyte data stores, high energy physicists working with network engineers and computer scientists are learning to use long range networks effectively on an increasing scale, and aggregate flows reaching the 100 Gbps range have been observed. The progress of the LHC, and the unprecedented ability of the experiments to produce results rapidly using worldwide distributed data processing and analysis, has sparked major, emerging changes in the LHC Computing Models, which are moving from the classic hierarchical model designed a decade ago to more agile peer-to-peer-like models that make more effective use of the resources at Tier2 and Tier3 sites located throughout the world. A new requirements working group has gauged the needs of Tier2 centers, and charged the LHCOPN group that runs the network interconnecting the LHC Tier1s with designing a new architecture interconnecting the Tier2s. As seen from the perspective of ICFA's Standing Committee on Inter-regional Connectivity (SCIC), the Digital Divide that separates physicists in several regions of the developing world from those in the developed world remains acute, although many countries have made major advances through the rapid installation of modern network infrastructures.
A case in point is Africa, where a new round of undersea cables promises to transform the continent.

  11. Future Approach to tier-0 extension

    NASA Astrophysics Data System (ADS)

    Jones, B.; McCance, G.; Cordeiro, C.; Giordano, D.; Traylen, S.; Moreno García, D.

    2017-10-01

    The current tier-0 processing at CERN is done on two managed sites, the CERN computer centre and the Wigner computer centre. With the proliferation of public cloud resources at increasingly competitive prices, we have been investigating how to transparently increase our compute capacity to include these providers. The approach taken has been to integrate these resources using our existing deployment and computer management tools and to provide them in a way that exposes them to users as part of the same site. The paper will describe the architecture, the toolset and the current production experiences of this model.

  12. Managing a tier-2 computer centre with a private cloud infrastructure

    NASA Astrophysics Data System (ADS)

    Bagnasco, Stefano; Berzano, Dario; Brunetti, Riccardo; Lusso, Stefano; Vallero, Sara

    2014-06-01

In a typical scientific computing centre, several applications coexist and share a single physical infrastructure. An underlying private cloud infrastructure eases the management and maintenance of such heterogeneous applications (multipurpose or application-specific batch farms, Grid sites, interactive data analysis facilities and others), allowing dynamic allocation of resources to any application. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques. Such infrastructures are being deployed in some large centres (see e.g. the CERN Agile Infrastructure project), but with several open-source tools reaching maturity this is becoming viable also for smaller sites. In this contribution we describe the private cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 centre, an Interactive Analysis Facility for the ALICE experiment at the CERN LHC and several smaller scientific computing applications. The private cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem and the OpenWRT Linux distribution (used for network virtualization); a future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and OCCI.

  13. 76 FR 40974 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing and Immediate Effectiveness of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-12

    ... Two New Pricing Tiers, Step-Up Tier 1 and Step-Up Tier 2 July 6, 2011. Pursuant to Section 19(b)(1) of... Services (the ``Schedule'') to introduce two new pricing tiers, Step-Up Tier 1 and Step-Up Tier 2. The text... Arca proposes to introduce two new pricing tier levels, Step-Up Tier 1 and Step-Up Tier 2. Step-Up Tier...

  14. The Use of Proxy Caches for File Access in a Multi-Tier Grid Environment

    NASA Astrophysics Data System (ADS)

    Brun, R.; Duellmann, D.; Ganis, G.; Hanushevsky, A.; Janyst, L.; Peters, A. J.; Rademakers, F.; Sindrilaru, E.

    2011-12-01

The use of proxy caches has been extensively studied in the HEP environment for efficient access of database data and showed significant performance with only very moderate operational effort at higher grid tiers (T2, T3). In this contribution we propose to apply the same concept to the area of file access and analyse the possible performance gains, operational impact on site services and applicability to different HEP use cases. Based on proof-of-concept studies with a modified XROOT proxy server, we review the cache efficiency and overheads for access patterns of typical ROOT based analysis programs. We conclude with a discussion of the potential role of this new component at the different tiers of a distributed computing grid.
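The cache-efficiency question studied above can be made concrete with a toy model: replay a file-access trace through a size-limited LRU cache and report the hit ratio. The trace and cache size are illustrative assumptions; the paper's proxy operates on real XROOT traffic, not a model like this.

```python
# Toy model of proxy-cache efficiency: replay an access pattern through a
# size-limited LRU cache and count hits vs misses. Trace and capacity are
# invented for illustration.
from collections import OrderedDict

class LRUFileCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.cache = OrderedDict()             # file -> True, in LRU order
        self.hits = self.misses = 0

    def access(self, filename):
        if filename in self.cache:
            self.cache.move_to_end(filename)   # refresh recency
            self.hits += 1
        else:
            self.misses += 1                   # proxy fetches from origin
            self.cache[filename] = True
            if len(self.cache) > self.capacity:
                self.cache.popitem(last=False) # evict least recently used

trace = ["a", "b", "a", "c", "a", "b", "d", "a"]  # assumed access pattern
cache = LRUFileCache(capacity=3)
for f in trace:
    cache.access(f)
print(cache.hits, cache.misses)                # 4 4
```

Whether such a cache pays off at a given tier depends on exactly the locality visible in real analysis access patterns, which is what the proof-of-concept studies measure.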

  15. The Use of Proxy Caches for File Access in a Multi-Tier Grid Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brun, R.; Dullmann, D.; Ganis, G.

    2012-04-19

The use of proxy caches has been extensively studied in the HEP environment for efficient access of database data and showed significant performance with only very moderate operational effort at higher grid tiers (T2, T3). In this contribution we propose to apply the same concept to the area of file access and analyze the possible performance gains, operational impact on site services and applicability to different HEP use cases. Based on proof-of-concept studies with a modified XROOT proxy server, we review the cache efficiency and overheads for access patterns of typical ROOT based analysis programs. We conclude with a discussion of the potential role of this new component at the different tiers of a distributed computing grid.

  16. FAST: A fully asynchronous and status-tracking pattern for geoprocessing services orchestration

    NASA Astrophysics Data System (ADS)

    Wu, Huayi; You, Lan; Gui, Zhipeng; Gao, Shuang; Li, Zhenqiang; Yu, Jingmin

    2014-09-01

Geoprocessing service orchestration (GSO) provides a unified and flexible way to implement cross-application, long-lived, and multi-step geoprocessing service workflows by coordinating geoprocessing services collaboratively. Usually, geoprocessing services and geoprocessing service workflows are data- and/or computing-intensive, which can make the execution of a workflow time-consuming. Since it initiates an execution request without blocking other interactions on the client side, an asynchronous mechanism is especially appropriate for GSO workflows. Many critical problems remain to be solved in existing asynchronous patterns for GSO, including difficulties in improving performance, tracking status, and clarifying the workflow structure. These problems pose a challenge for orchestrating efficiently, making statuses instantly available, and constructing clearly structured GSO workflows. A Fully Asynchronous and Status-Tracking (FAST) pattern that adopts asynchronous interactions throughout the whole communication tier of a workflow is proposed for GSO. The proposed FAST pattern includes a mechanism that actively pushes the latest status to clients instantly and economically. An independent proxy was designed to isolate the status-tracking logic from the geoprocessing business logic, which helps form a clear GSO workflow structure. A workflow was implemented in the FAST pattern to simulate the flooding process in the Poyang Lake region. Experimental results show that the proposed FAST pattern can efficiently tackle data/computing intensive geoprocessing tasks. The performance of all collaborative partners was improved due to the asynchronous mechanism throughout the communication tier. The status-tracking mechanism helps users retrieve the latest running status of a GSO workflow in an efficient and instant way.
The clear structure of the GSO workflow lowers the barriers for geospatial domain experts and model designers to compose asynchronous GSO workflows. Most importantly, it provides better support for locating and diagnosing potential exceptions.
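The two key ingredients of the FAST pattern, fully asynchronous execution plus an independent proxy that pushes status to the client, can be sketched with `asyncio`. The step names, timings and queue-based "push" are invented stand-ins for the paper's WPS-based implementation.

```python
# Minimal asyncio sketch of the FAST idea: long-running geoprocessing steps
# execute asynchronously while an independent proxy pushes status updates to
# the client instead of the client polling. Names and timings are invented.
import asyncio

async def geoprocessing_step(name, duration, status_queue):
    """Simulated long-lived workflow step that reports its own status."""
    await status_queue.put((name, "RUNNING"))
    await asyncio.sleep(duration)           # stands in for real computation
    await status_queue.put((name, "DONE"))

async def status_proxy(status_queue, log):
    """Independent proxy: forwards each status to the client as it arrives."""
    while True:
        update = await status_queue.get()
        log.append(update)                  # stands in for a push to the UI
        if update == (None, "FINISHED"):
            return

async def workflow():
    q = asyncio.Queue()
    log = []
    proxy = asyncio.create_task(status_proxy(q, log))
    # The client is never blocked: statuses stream out while steps run.
    await geoprocessing_step("flood-simulation", 0.01, q)
    await geoprocessing_step("result-mapping", 0.01, q)
    await q.put((None, "FINISHED"))
    await proxy
    return log

print(asyncio.run(workflow()))
```

Keeping `status_proxy` as a separate task mirrors the paper's design choice of isolating status-tracking from the geoprocessing business logic, so the workflow body stays clearly structured.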

  17. A distributed Tier-1

    NASA Astrophysics Data System (ADS)

    Fischer, L.; Grønager, M.; Kleist, J.; Smirnova, O.

    2008-07-01

    The Tier-1 facility operated by the Nordic DataGrid Facility (NDGF) differs significantly from other Tier-1s in several aspects: firstly, it is not located at one or a few premises, but instead is distributed throughout the Nordic countries; secondly, it is not under the governance of a single organization but instead is a meta-center built of resources under the control of a number of different national organizations. We present some technical implications of these aspects as well as the high-level design of this distributed Tier-1. The focus will be on computing services, storage and monitoring.

  18. Integrating Puppet and Gitolite to provide a novel solution for scalable system management at the MPPMU Tier2 centre

    NASA Astrophysics Data System (ADS)

    Delle Fratte, C.; Kennedy, J. A.; Kluth, S.; Mazzaferro, L.

    2015-12-01

In a grid computing infrastructure, tasks such as continuous upgrades, service installations and software deployments are part of an admin's daily work. In such an environment, tools to help with the management, provisioning and monitoring of the deployed systems and services have become crucial. As experiments such as the LHC increase in scale, the computing infrastructure also becomes larger and more complex. Moreover, today's admins increasingly work within teams that share responsibilities and tasks. Such a scaled-up situation requires tools that not only reduce the workload on administrators but also enable them to work seamlessly in teams. In this paper we present our experience of managing the Max Planck Institute Tier2 using Puppet and Gitolite in a cooperative way to help the system administrators in their daily work. In addition to describing the Puppet-Gitolite system, best practices and customizations are also shown.

  19. Status and Trends in Networking at LHC Tier1 Facilities

    NASA Astrophysics Data System (ADS)

    Bobyshev, A.; DeMar, P.; Grigaliunas, V.; Bigrow, J.; Hoeft, B.; Reymund, A.

    2012-12-01

The LHC is entering its fourth year of production operation. Most Tier1 facilities have been in operation for almost a decade, when development and ramp-up efforts are included. LHC's distributed computing model is based on the availability of high capacity, high performance network facilities for both WAN and LAN data movement, particularly within the Tier1 centers. As a result, the Tier1 centers tend to be on the leading edge of data center networking technology. In this paper, we analyze past and current developments in Tier1 LAN networking, as well as extrapolating where we anticipate networking technology is heading. Our analysis includes examination of the following areas:
• Evolution of Tier1 centers to their current state
• Evolving data center networking models and how they apply to Tier1 centers
• Impact of emerging network technologies (e.g. 10GE-connected hosts, 40GE/100GE links, IPv6) on Tier1 centers
• Trends in WAN data movement and emergence of software-defined WAN network capabilities
• Network virtualization

  20. Testing SLURM open source batch system for a Tier1/Tier2 HEP computing facility

    NASA Astrophysics Data System (ADS)

    Donvito, Giacinto; Salomoni, Davide; Italiano, Alessandro

    2014-06-01

    In this work we present the testing activities carried out to verify whether the SLURM batch system could be used as the production batch system of a typical Tier1/Tier2 HEP computing center. SLURM (Simple Linux Utility for Resource Management) is an open source batch system developed mainly by the Lawrence Livermore National Laboratory, SchedMD, Linux NetworX, Hewlett-Packard, and Groupe Bull. Testing focused both on verifying the functionalities of the batch system and on the performance that SLURM is able to offer. We first describe our initial set of requirements. Functionally, we started by configuring SLURM so that it replicates all the scheduling policies already used in production in the computing centers involved in the test, i.e. INFN-Bari and the INFN-Tier1 at CNAF, Bologna. Currently, the INFN-Tier1 is using IBM LSF (Load Sharing Facility), while INFN-Bari, an LHC Tier2 for both CMS and Alice, is using Torque as resource manager and MAUI as scheduler. We show how we configured SLURM in order to enable several scheduling functionalities such as hierarchical fairshare, Quality of Service, user-based and group-based priority, limits on the number of jobs per user/group/queue, job age scheduling, job size scheduling, and scheduling of consumable resources. We then show how different job typologies, such as serial, MPI, multi-threaded, whole-node and interactive jobs, can be managed. We also describe tests on the use of ACLs on queues and other resources. A further SLURM feature we verified is event triggers, which are useful for configuring specific actions in response to events in the batch system. We also tested highly available configurations for the master node. This feature is of paramount importance, since a mandatory requirement in our scenarios is to have a working farm cluster even in case of hardware failure of the server(s) hosting the batch system.
Among our requirements is also the ability to run pre-execution and post-execution scripts, with controlled handling of their failures. This feature is heavily used, for example, at the INFN-Tier1 to check the health status of a worker node before the execution of each job. Pre- and post-execution scripts are also important to let WNoDeS, the IaaS Cloud solution developed at INFN, use SLURM as its resource manager. WNoDeS has already been supporting the LSF and Torque batch systems for some time; in this work we show the work done so that WNoDeS supports SLURM as well. Finally, we present several performance tests that we carried out to verify SLURM scalability and reliability, detailing scalability tests both in terms of managed nodes and of queued jobs.
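SLURM's priority/multifactor plugin, responsible for the fairshare and priority functionalities discussed above, computes a job's priority as a weighted sum of normalized factors. A minimal sketch with illustrative weights (real deployments set them through the PriorityWeight* parameters in slurm.conf):

```python
# Sketch of SLURM-style multifactor job priority. Each factor is
# normalized to [0, 1], as in the priority/multifactor plugin; the
# weights (hypothetical here) set their relative importance.

def job_priority(age_factor, fairshare_factor, qos_factor, jobsize_factor,
                 w_age=1000, w_fairshare=10000, w_qos=2000, w_jobsize=500):
    return int(w_age * age_factor
               + w_fairshare * fairshare_factor
               + w_qos * qos_factor
               + w_jobsize * jobsize_factor)

# With fairshare weighted highest, a job from an under-served group
# outranks an older job from a heavily-used group.
p_underserved = job_priority(age_factor=0.1, fairshare_factor=0.9,
                             qos_factor=0.5, jobsize_factor=0.2)
p_heavy_user = job_priority(age_factor=0.8, fairshare_factor=0.1,
                            qos_factor=0.5, jobsize_factor=0.2)
```

The relative weights are site policy: a large fairshare weight, as sketched here, is one way to implement the hierarchical fairshare behaviour mentioned in the abstract.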

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bobyshev, A.; DeMar, P.; Grigaliunas, V.

    The LHC is entering its fourth year of production operation. Most Tier1 facilities have been in operation for almost a decade, when development and ramp-up efforts are included. LHC's distributed computing model is based on the availability of high capacity, high performance network facilities for both the WAN and LAN data movement, particularly within the Tier1 centers. As a result, the Tier1 centers tend to be on the leading edge of data center networking technology. In this paper, we analyze past and current developments in Tier1 LAN networking, as well as extrapolating where we anticipate networking technology is heading. Our analysis will include examination into the following areas: • Evolution of Tier1 centers to their current state • Evolving data center networking models and how they apply to Tier1 centers • Impact of emerging network technologies (e.g. 10GE-connected hosts, 40GE/100GE links, IPv6) on Tier1 centers • Trends in WAN data movement and emergence of software-defined WAN network capabilities • Network virtualization

  2. How MAP kinase modules function as robust, yet adaptable, circuits.

    PubMed

    Tian, Tianhai; Harding, Angus

    2014-01-01

    Genetic and biochemical studies have revealed that the diversity of cell types and developmental patterns evident within the animal kingdom is generated by a handful of conserved, core modules. Core biological modules must be robust, able to maintain functionality despite perturbations, and yet sufficiently adaptable for random mutations to generate phenotypic variation during evolution. Understanding how robust, adaptable modules have influenced the evolution of eukaryotes will inform both evolutionary and synthetic biology. One such system is the MAP kinase module, which consists of a 3-tiered kinase circuit configuration that has been evolutionarily conserved from yeast to man. MAP kinase signal transduction pathways are used across eukaryotic phyla to drive biological functions that are crucial for life. Here we ask the fundamental question, why do MAPK modules follow a conserved 3-tiered topology rather than some other number? Using computational simulations, we identify a fundamental 2-tiered circuit topology that can be readily reconfigured by feedback loops and scaffolds to generate diverse signal outputs. When this 2-kinase circuit is connected to proximal input kinases, a 3-tiered modular configuration is created that is both robust and adaptable, providing a biological circuit that can regulate multiple phenotypes and maintain functionality in an uncertain world. We propose that the 3-tiered signal transduction module has been conserved through positive selection, because it facilitated the generation of phenotypic variation during eukaryotic evolution.
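The kind of computational simulation described above can be illustrated with a minimal two-tier cascade model: each kinase is activated by the layer above and deactivated by a phosphatase, with Michaelis-Menten kinetics. This is a sketch with made-up rate constants, not the authors' model:

```python
# Euler integration of a hypothetical two-tier kinase cascade.
# x1, x2 are the active fractions of kinase 1 and kinase 2; each tier
# is activated by the layer above and deactivated by a phosphatase.

def simulate_cascade(signal, steps=20000, dt=0.01,
                     k_act=1.0, k_deact=0.3, km=0.1):
    x1 = x2 = 0.0
    for _ in range(steps):
        dx1 = (k_act * signal * (1 - x1) / (km + (1 - x1))
               - k_deact * x1 / (km + x1))
        dx2 = (k_act * x1 * (1 - x2) / (km + (1 - x2))
               - k_deact * x2 / (km + x2))
        x1 += dt * dx1
        x2 += dt * dx2
    return x2

# The small Michaelis constant makes the output switch-like: the
# second tier sits near zero for weak input and saturates for strong
# input, a simple illustration of circuit-level signal shaping.
low = simulate_cascade(signal=0.1)
high = simulate_cascade(signal=1.0)
```

Feedback loops and scaffolds, as the abstract notes, would reshape this input-output curve; the two-equation core above is the reconfigurable unit.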

  3. How MAP kinase modules function as robust, yet adaptable, circuits

    PubMed Central

    Tian, Tianhai; Harding, Angus

    2014-01-01

    Genetic and biochemical studies have revealed that the diversity of cell types and developmental patterns evident within the animal kingdom is generated by a handful of conserved, core modules. Core biological modules must be robust, able to maintain functionality despite perturbations, and yet sufficiently adaptable for random mutations to generate phenotypic variation during evolution. Understanding how robust, adaptable modules have influenced the evolution of eukaryotes will inform both evolutionary and synthetic biology. One such system is the MAP kinase module, which consists of a 3-tiered kinase circuit configuration that has been evolutionarily conserved from yeast to man. MAP kinase signal transduction pathways are used across eukaryotic phyla to drive biological functions that are crucial for life. Here we ask the fundamental question, why do MAPK modules follow a conserved 3-tiered topology rather than some other number? Using computational simulations, we identify a fundamental 2-tiered circuit topology that can be readily reconfigured by feedback loops and scaffolds to generate diverse signal outputs. When this 2-kinase circuit is connected to proximal input kinases, a 3-tiered modular configuration is created that is both robust and adaptable, providing a biological circuit that can regulate multiple phenotypes and maintain functionality in an uncertain world. We propose that the 3-tiered signal transduction module has been conserved through positive selection, because it facilitated the generation of phenotypic variation during eukaryotic evolution. PMID:25483189

  4. Perfluorooctane sulfonate (PFOS) contamination of fish in urban lakes: a prioritization methodology for lake management.

    PubMed

    Xiao, Feng; Gulliver, John S; Simcik, Matt F

    2013-12-15

    The contamination of urban lakes by anthropogenic pollutants such as perfluorooctane sulfonate (PFOS) is a worldwide environmental problem. Large-scale, long-term monitoring of urban lakes requires careful prioritization of available resources, focusing efforts on potentially impaired lakes. Herein, a database of PFOS concentrations in 304 fish caught from 28 urban lakes was used for development of an urban-lake prioritization framework by means of exploratory data analysis (EDA) with the aid of a geographical information system. The prioritization scheme consists of three main tiers: preliminary classification, carried out by hierarchical cluster analysis; predictor screening, fulfilled by a regression tree method; and model development by means of a neural network. The predictive performance of the newly developed model was assessed using a training/validation splitting method and determined by an external validation set. The application of the model in the U.S. state of Minnesota identified 40 urban lakes that may contain elevated levels of PFOS; these lakes were not previously considered in PFOS monitoring programs. The model results also highlight ongoing industrial/commercial activities as a principal determinant of PFOS pollution in urban lakes, and suggest vehicular traffic as an important source and surface runoff as a primary pollution carrier. In addition, the EDA approach was further compared to a spatial interpolation method (kriging), and their advantages and disadvantages were discussed. Copyright © 2013 Elsevier Ltd. All rights reserved.
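The predictor-screening tier can be illustrated in miniature: a regression tree ranks candidate predictors by the variance reduction of their best single split. A sketch on synthetic data (the predictor names and values below are illustrative, not from the study):

```python
# Rank candidate predictors by the variance reduction of their best
# single split on the response, as a regression tree would at its root.

def variance(values):
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

def best_split_reduction(x, y):
    """Largest variance reduction achievable by one threshold on x."""
    base = variance(y)
    best = 0.0
    for threshold in sorted(set(x))[:-1]:
        left = [yi for xi, yi in zip(x, y) if xi <= threshold]
        right = [yi for xi, yi in zip(x, y) if xi > threshold]
        weighted = (len(left) * variance(left)
                    + len(right) * variance(right)) / len(y)
        best = max(best, base - weighted)
    return best

# Synthetic lakes: the PFOS response tracks "industrial" activity,
# not "lake_area", so screening should select the former.
industrial = [0, 1, 2, 3, 4, 5, 6, 7]
lake_area  = [5, 2, 7, 1, 6, 3, 8, 4]
pfos       = [1, 1, 2, 2, 8, 9, 9, 10]

scores = {"industrial": best_split_reduction(industrial, pfos),
          "lake_area": best_split_reduction(lake_area, pfos)}
top_predictor = max(scores, key=scores.get)
```

In the study, the predictors surviving this screen feed the neural network model; the screen itself only needs the split logic above.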

  5. Understanding the T2 traffic in CMS during Run-1

    NASA Astrophysics Data System (ADS)

    Wildish, T.

    2015-12-01

    In the run-up to Run-1, CMS was operating its facilities according to the MONARC model, where data transfers were strictly hierarchical in nature. Direct transfers between Tier-2 nodes were excluded, being perceived as operationally intensive and risky in an era when the network was expected to be a major source of errors. By the end of Run-1, wide-area networks were more capable and stable than originally anticipated. The original data-placement model was relaxed, and traffic was allowed between Tier-2 nodes. Tier-2 to Tier-2 traffic in 2012 already exceeded the amount of Tier-2 to Tier-1 traffic, so it clearly has the potential to become important in the future. Moreover, while Tier-2 to Tier-1 traffic is mostly upload of Monte Carlo data, the Tier-2 to Tier-2 traffic represents data moved in direct response to requests from the physics analysis community. As such, problems or delays there are more likely to have a direct impact on the user community. Tier-2 to Tier-2 traffic may also traverse parts of the WAN that are at the 'edge' of our network, with limited network capacity or reliability compared to, say, the Tier-0 to Tier-1 traffic, which goes over the LHCOPN network. CMS is looking to exploit technologies that allow us to interact with the network fabric so that it can manage our traffic better for us; we hope to achieve this before the end of Run-2. Tier-2 to Tier-2 traffic would be the most interesting use-case for such traffic management, precisely because it is close to the users' analysis and far from the 'core' network infrastructure. As such, a better understanding of our Tier-2 to Tier-2 traffic is important. Knowing the characteristics of our data-flows can help us place our data more intelligently. Knowing how widely the data moves can help us anticipate the requirements for network capacity, and inform the dynamic data placement algorithms we expect to have in place for Run-2.
This paper presents an analysis of the CMS Tier-2 traffic during Run 1.
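The traffic breakdown described above amounts to aggregating transfer records by source and destination tier. A toy sketch (the node names follow the CMS site-naming convention, but the records and byte counts are hypothetical):

```python
# Aggregate transfer volume by (source tier, destination tier) link
# category, e.g. to compare T2->T2 against T2->T1 traffic.

from collections import defaultdict

def tier(node):
    """Extract the tier prefix from a CMS-style site name."""
    return node.split("_")[0]  # "T2_US_MIT" -> "T2"

transfers = [  # (source, destination, bytes moved) -- hypothetical
    ("T2_US_MIT", "T2_DE_DESY", 7_000_000_000),
    ("T2_US_MIT", "T1_US_FNAL", 2_000_000_000),
    ("T1_US_FNAL", "T2_US_MIT", 3_000_000_000),
    ("T2_DE_DESY", "T2_IT_Bari", 5_000_000_000),
]

volume_by_link = defaultdict(int)
for src, dst, nbytes in transfers:
    volume_by_link[(tier(src), tier(dst))] += nbytes

t2_to_t2 = volume_by_link[("T2", "T2")]
t2_to_t1 = volume_by_link[("T2", "T1")]
```

On real data the same aggregation, run over the experiment's transfer bookkeeping, yields the Run-1 comparison the paper reports.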

  6. Extending the farm on external sites: the INFN Tier-1 experience

    NASA Astrophysics Data System (ADS)

    Boccali, T.; Cavalli, A.; Chiarelli, L.; Chierici, A.; Cesini, D.; Ciaschini, V.; Dal Pra, S.; dell'Agnello, L.; De Girolamo, D.; Falabella, A.; Fattibene, E.; Maron, G.; Prosperini, A.; Sapunenko, V.; Virgilio, S.; Zani, S.

    2017-10-01

    The Tier-1 at CNAF is the main INFN computing facility, offering computing and storage resources to more than 30 different scientific collaborations, including the 4 experiments at the LHC. A huge increase in computing needs is also foreseen in the following years, mainly driven by the experiments at the LHC (especially starting with Run 3 in 2021) but also by other upcoming experiments such as CTA [1]. While we are considering the upgrade of the infrastructure of our data center, we are also evaluating the possibility of using CPU resources available in other data centres or even leased from commercial cloud providers. Hence, at the INFN Tier-1, besides participating in the EU project HNSciCloud, we have also pledged a small amount of computing resources (˜2000 cores) located at the Bari ReCaS data center [2] for the WLCG experiments for 2016, and we are testing the use of resources provided by a commercial cloud provider. While the Bari ReCaS data center is directly connected to the GARR network [3], with the obvious advantage of a low-latency and high-bandwidth connection, in the case of the commercial provider we rely only on the General Purpose Network. In this paper we describe the set-up phase and the first results of these installations, started in the last quarter of 2015, focusing on the issues that we have had to cope with and discussing the measured results in terms of efficiency.
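A common efficiency metric in this context is the ratio of consumed CPU time to wall-clock time, which drops when remote jobs wait on WAN-bound I/O. A minimal sketch with hypothetical job records (the numbers are illustrative, not the paper's measurements):

```python
# Per-site job efficiency: total CPU seconds over total wall-clock
# seconds for a batch of completed jobs.

def efficiency(jobs):
    """jobs: list of (cpu_seconds, wall_seconds) per completed job."""
    cpu = sum(c for c, _ in jobs)
    wall = sum(w for _, w in jobs)
    return cpu / wall

local_jobs = [(9000, 10000), (8500, 10000)]   # low-latency storage access
remote_jobs = [(7000, 10000), (6500, 10000)]  # WAN-bound I/O wait

eff_local = efficiency(local_jobs)
eff_remote = efficiency(remote_jobs)
```

Comparing such ratios between the CNAF farm and the remotely hosted cores is one way the latency difference between the GARR link and a general purpose network shows up in practice.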

  7. 77 FR 44528 - Dry Cargo Residue Discharges in the Great Lakes

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-30

    ...The Coast Guard proposes replacing its existing interim rule with a new rule to regulate the operation of U.S. and foreign vessels carrying bulk dry cargo such as limestone, iron ore, and coal on the U.S. waters of the Great Lakes, and the operation of U.S. bulk dry cargo vessels anywhere on the Great Lakes. Specifically, the Coast Guard proposes new requirements for the discharge of bulk dry cargo residue (DCR) on the U.S. waters of the Great Lakes. The Coast Guard also announces the availability of the tiered Draft Environmental Impact Statement (DEIS) prepared in support of this proposal. The proposed rule would continue to allow non-hazardous and non-toxic discharges of bulk DCR in limited areas of the Great Lakes. However, vessel owners and operators would need to minimize DCR discharges using methods they would be required to document in DCR management plans. The proposed rule would prohibit limestone and clean stone DCR discharges in some waters where they are now permitted. The proposed rule promotes the Coast Guard's strategic goals of maritime mobility and safety and protection of natural resources.

  8. Multi-Dimensional Optimization for Cloud Based Multi-Tier Applications

    ERIC Educational Resources Information Center

    Jung, Gueyoung

    2010-01-01

    Emerging trends toward cloud computing and virtualization have been opening new avenues to meet enormous demands of space, resource utilization, and energy efficiency in modern data centers. By being allowed to host many multi-tier applications in consolidated environments, cloud infrastructure providers enable resources to be shared among these…

  9. Changing the batch system in a Tier 1 computing center: why and how

    NASA Astrophysics Data System (ADS)

    Chierici, Andrea; Dal Pra, Stefano

    2014-06-01

    At the Italian Tier1 center at CNAF we are evaluating the possibility of changing the current production batch system. This activity is motivated mainly by the search for a more flexible licensing model, as well as by the desire to avoid vendor lock-in. We performed a technology-tracking exercise and, among many possible solutions, chose to evaluate Grid Engine as an alternative, because its adoption is increasing in the HEPiX community and because it is supported by the EMI middleware that we currently use on our computing farm. Another INFN site evaluated Slurm, and we will compare our results in order to understand the pros and cons of the two solutions. We present the results of our evaluation of Grid Engine, in order to understand whether it can fit the requirements of a Tier 1 center, compared to the solution we adopted long ago. We performed a survey and a critical re-evaluation of our farming infrastructure: much production software (accounting and monitoring above all) relies on our current solution, and changing it required us to write new wrappers and adapt the infrastructure to the new system. We believe the results of this investigation can be very useful to other Tier-1 and Tier-2 centers in a similar situation, where the effort of switching may appear too great to undertake. We provide guidelines to help understand how difficult this operation can be and how long the change may take.

  10. Bathymetric map and area/capacity table for Castle Lake, Washington

    USGS Publications Warehouse

    Mosbrucker, Adam R.; Spicer, Kurt R.

    2017-11-14

    The May 18, 1980, eruption of Mount St. Helens produced a 2.5-cubic-kilometer debris avalanche that dammed South Fork Castle Creek, causing Castle Lake to form behind a 20-meter-tall blockage. Risk of a catastrophic breach of the newly impounded lake led to outlet channel stabilization work, aggressive monitoring programs, mapping efforts, and blockage stability studies. Despite relatively large uncertainty, early mapping efforts adequately supported several lake breakout models, but have limited applicability to current lake monitoring and hazard assessment. Here, we present the results of a bathymetric survey conducted in August 2012 with the purpose of (1) verifying previous volume estimates, (2) computing an area/capacity table, and (3) producing a bathymetric map. Our survey found seasonal lake volume ranges between 21.0 and 22.6 million cubic meters with a fundamental vertical accuracy representing 0.88 million cubic meters. Lake surface area ranges between 1.13 and 1.16 square kilometers. Relationships developed by our results allow the computation of lake volume from near real-time lake elevation measurements or from remotely sensed imagery.
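An area/capacity table relates lake surface area to stage (elevation); the volume at a given stage then follows by integrating area over elevation, which is how near real-time elevation measurements translate into volume. A sketch using the trapezoidal rule on a hypothetical stage/area table:

```python
# Compute lake volume below a given elevation from an area/capacity
# relationship, integrating area over stage with the trapezoidal rule.

def capacity(stage_area, elevation):
    """stage_area: (elevation_m, area_m2) pairs sorted by elevation,
    starting at the lake bottom. Returns volume in cubic metres."""
    volume = 0.0
    for (z0, a0), (z1, a1) in zip(stage_area, stage_area[1:]):
        if elevation <= z0:
            break
        z_top = min(z1, elevation)
        # linearly interpolate the area at the top of this slice
        a_top = a0 + (a1 - a0) * (z_top - z0) / (z1 - z0)
        volume += 0.5 * (a0 + a_top) * (z_top - z0)
    return volume

# Toy basin whose area grows linearly from 0 to 2000 m^2 over 20 m.
table = [(0.0, 0.0), (10.0, 1000.0), (20.0, 2000.0)]
full = capacity(table, 20.0)
half = capacity(table, 10.0)
```

The published area/capacity table plays the role of `table` here, so a single stage reading (or remotely sensed lake extent) is enough to recover a volume estimate.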

  11. Sediment characteristics and sedimentation rates in Lake Michie, Durham County, North Carolina, 1990-92

    USGS Publications Warehouse

    Weaver, J.C.

    1994-01-01

    A reservoir sedimentation study was conducted at 508-acre Lake Michie, a municipal water-supply reservoir in northeastern Durham County, North Carolina, during 1990-92. The effects of sedimentation in Lake Michie were investigated, and current and historical rates of sedimentation were evaluated. Particle-size distributions of lake-bottom sediment indicate that, overall, Lake Michie is rich in silt and clay. Nearly all sand is deposited in the upstream region of the lake, and its percentage in the sediment decreases to less than 2 percent in the lower half of the lake. The average specific weight of lake-bottom sediment in Lake Michie is 73.6 pounds per cubic foot. The dry-weight percentage of total organic carbon in lake-bottom sediment ranges from 1.1 to 3.8 percent. Corresponding carbon-nitrogen ratios range from 8.6 to 17.6. Correlation of the total organic carbon percentages with carbon-nitrogen ratios indicates that plant and leaf debris are the primary sources of organic material in Lake Michie. Sedimentation rates were computed using comparisons of bathymetric volumes. Comparing the current and previous bathymetric volumes, the net amount of sediment deposited (trapped) in Lake Michie during 1926-92 is estimated to be about 2,541 acre-feet, or slightly more than 20 percent of the original storage volume computed in 1935. Currently (1992), the average sedimentation rate is 38 acre-feet per year, down from 45.1 acre-feet per year in 1935. To confirm the evidence that sedimentation rates have decreased at Lake Michie since its construction in 1926, sediment accretion rates were computed using radionuclide profiles of lake-bottom sediment. Sediment accretion rates estimated from radiochemical analyses of cesium-137 and lead-210 radionuclides in the lake-bottom sediment indicate that rates were higher in the lake's early years, prior to 1962.
Estimated suspended-sediment yields for inflow and outflow sites during 1983-91 indicate a suspended-sediment trap efficiency of 89 percent. An overall trap efficiency for the period of 1983-91 was computed using the capacity-inflow ratio. The use of this ratio indicates that the trap efficiency for Lake Michie is 85 percent. However, the suspended-sediment trap efficiency indicates that the actual overall trap efficiency for Lake Michie was probably greater than 89 percent during this period.
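The reported average sedimentation rate can be reproduced from the figures in the abstract: the net volume trapped, divided by the years between the 1926 impoundment and the 1992 survey.

```python
# Arithmetic check of the average sedimentation rate quoted above.

sediment_trapped_acre_ft = 2541   # net deposition, 1926-92
years = 1992 - 1926               # 66 years of reservoir operation

rate_acre_ft_per_yr = sediment_trapped_acre_ft / years  # about 38.5
```

The result, about 38.5 acre-feet per year, matches the "38 acre-feet per year" reported for 1992, and sits below the 45.1 acre-feet per year estimated for 1935, consistent with the decline the radionuclide profiles confirm.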

  12. Method and system for knowledge discovery using non-linear statistical analysis and a 1st and 2nd tier computer program

    DOEpatents

    Hively, Lee M. [Philadelphia, TN]

    2011-07-12

    The invention relates to a method and apparatus for simultaneously processing different sources of test data into informational data and then processing different categories of informational data into knowledge-based data. The knowledge-based data can then be communicated between nodes in a system of multiple computers according to rules for a type of complex, hierarchical computer system modeled on a human brain.

  13. Tier Two Interventions Implemented within the Context of a Tiered Prevention Framework

    ERIC Educational Resources Information Center

    Mitchell, Barbara S.; Stormont, Melissa; Gage, Nicholas A.

    2011-01-01

    Despite a growing body of evidence demonstrating the value of Tier 1 and Tier 3 interventions, significantly less is known about Tier 2 level treatments when they are added within the context of a tiered continuum of support. The purpose of this article is to systematically review the existing research base for Tier 2 small group intervention…

  14. Radio-echo sounding of 'active' Antarctic subglacial lakes

    NASA Astrophysics Data System (ADS)

    Siegert, M. J.; Ross, N.; Blankenship, D. D.; Young, D. A.; Greenbaum, J. S.; Richter, T.; Rippin, D. M.; Le Brocq, A. M.; Wright, A.; Bingham, R.; Corr, H.; Ferraccioli, F.; Jordan, T. A.; Smith, B. E.; Payne, A. J.; Dowdeswell, J. A.; Bamber, J. L.

    2013-12-01

    Repeat-pass satellite altimetry has revealed 124 discrete surface height changes across the Antarctic Ice Sheet, interpreted to be caused by subglacial lake discharges (surface lowering) and inputs (surface uplift). However, despite several attempts, few of these active lakes have been confirmed by radio-echo sounding (RES). Over the last 5 years, major geophysical campaigns have acquired RES data from several 'active' lake sites, including the US-UK-Australian ICECAP programme in East Antarctica and the UK survey of the Institute Ice Stream in West Antarctica. In the latter case, a targeted RES survey of one 'active' lake was undertaken. RES evidence of the subglacial bed beneath 'active' lakes in both East and West Antarctica will be presented, and the evidence for pooled subglacial water from these data will be assessed. Based on this assessment, the nature of 'active' subglacial lakes, their associated hydrology and their relationship with surrounding topography will be discussed, as will the likelihood of further 'active' lakes in Antarctica. Figure: hydraulic potential map of the Byrd Glacier catchment with contours at 5 MPa intervals. Predicted subglacial flowpaths are shown in blue. Subglacial lakes known from previous geophysical surveys are shown as black triangles, while the newly discovered 'Three-tier lakes' are shown in dashed black outline. Surface height change features within the Byrd subglacial catchment are shown in outline and are shaded to indicate whether they were rising or falling during the ICESat campaign. Those features are labelled in line with the numbering system of Smith et al. (J. Glac. 2009).
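Hydraulic potential maps like the one described above are commonly computed with Shreve's (1972) formulation, assuming subglacial water pressure equals the ice overburden pressure. A sketch (the elevations below are illustrative, not survey values):

```python
# Shreve subglacial hydraulic potential:
#   phi = rho_w * g * z_bed + rho_i * g * (z_surface - z_bed)
# Water at the bed flows down-gradient of phi.

RHO_WATER = 1000.0   # kg/m^3
RHO_ICE = 917.0      # kg/m^3
G = 9.81             # m/s^2

def hydraulic_potential(z_surface, z_bed):
    """Potential in pascals, assuming water pressure equals
    ice overburden pressure (elevations in metres)."""
    return RHO_WATER * G * z_bed + RHO_ICE * G * (z_surface - z_bed)

phi_ref = hydraulic_potential(2000.0, 0.0)
phi_lower_surface = hydraulic_potential(1900.0, 0.0)   # surface -100 m
phi_higher_bed = hydraulic_potential(2000.0, 100.0)    # bed +100 m
```

A consequence visible in the numbers: a 100 m change in ice surface elevation moves phi about eleven times more than a 100 m change in bed elevation (the factor rho_i / (rho_w - rho_i)), which is why surface slope dominates the predicted flowpaths and why lakes can pond behind subtle surface features.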

  15. Mega drought in the Colorado River Basin, water supply, and adaptive scenario planning for the Phoenix Metropolitan Area; simulations using WaterSim 5.

    NASA Astrophysics Data System (ADS)

    Sampson, D. A.

    2015-12-01

    The Decision Center for a Desert City (DCDC), a boundary organization, bridges science and policy to foster knowledge-based decision making; we study how decisions are made in the face of uncertainty. Our water policy and management model for the Phoenix Metropolitan Area (hereafter "Phoenix"), termed WaterSim, represents one such bridging mechanism. We evaluated the effect of varying the length of drought on water availability for Phoenix, examining droughts (starting in 2000) lasting 15, 25, and 50 years. We picked a 60-year window of runoff estimates from the paleo reconstruction data for the Colorado River (CO) (1121 through 1180 A.D.) and the two local rivers (1391 through 1450 A.D.), and assumed that the proportional difference in median flow between these periods and the long-term record represented an estimate of potential drought reductions on river flows. This resulted in a 12% and 19% reduction in flows for the CO River and the Salt-Verde (SV) Rivers, respectively. WaterSim uses 30-year trace periods from the historical flow records to simulate river flow for future projections. We used each 30-year trace from the historical record (1906 to present for the CO River; 1945 to present for the SV Rivers) and default settings to simulate 60-year projections of Lake Mead elevation and the accompanying Colorado River water shortages to Phoenix. Overall, elevations for Lake Mead fell below the 1st shortage-sharing tier (1075 ft) in 83% of the simulations; 74% of the simulations fell below the 2nd tier (1050 ft), and 64% fell below the 3rd (1025 ft). Length of drought, however, determined the shortage tiers met. Median elevations for droughts ending in 2015, 2025, and 2050 were 1036, 1019, and 967 feet msl, respectively. We present these plausible water futures with adaptive anticipatory scenario planning for the projected reductions in surface water availability, to demonstrate decision points for water conservation measures that effectively manage shortage conditions.
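The shortage-tier percentages above come from counting how many simulated Lake Mead elevations fall below each threshold. A sketch of that classification (the tier thresholds are those cited in the abstract; the simulated elevations below are hypothetical, not WaterSim output):

```python
# Fraction of simulated Lake Mead elevations below each
# shortage-sharing tier threshold (feet above mean sea level).

TIERS = [("tier 1", 1075), ("tier 2", 1050), ("tier 3", 1025)]

def fraction_below(elevations, threshold):
    return sum(e < threshold for e in elevations) / len(elevations)

# Hypothetical end-of-projection elevations from an ensemble of runs.
simulated = [1090, 1070, 1060, 1040, 1030, 1010, 990, 1100]

shares = {name: fraction_below(simulated, t) for name, t in TIERS}
```

Because the tiers are nested, the fractions necessarily decrease from tier 1 to tier 3, just as in the reported 83% / 74% / 64% result.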

  16. Snake River Sockeye Salmon Habitat and Limnological Research : 2008 Annual Progress Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohler, Andre E.; Griswold, Robert G.; Taki, Doug

    2009-07-31

    In March 1990, the Shoshone-Bannock Tribes petitioned the National Marine Fisheries Service (NMFS) to list Snake River sockeye salmon (Oncorhynchus nerka) as endangered. Snake River sockeye salmon were officially listed as endangered in November 1991 under the Endangered Species Act (56 FR 58619). In 1991, the Snake River Sockeye Salmon Habitat and Limnological Research Project was implemented. This project is part of an interagency effort to prevent the extinction of the Redfish Lake stock of Snake River sockeye salmon. The Shoshone-Bannock Tribal goal for this project is two-tiered: the immediate goal is to increase the population of Snake River sockeye salmon while preserving the unique genetic characteristics of the evolutionarily significant unit (ESU); the Tribes' long-term goal is to maintain a viable population that warrants delisting and provides Tribal harvest opportunities. The Bonneville Power Administration (BPA) provides funding for this interagency recovery effort. Collaborators in the recovery effort include the National Oceanic and Atmospheric Administration (NOAA), the Idaho Department of Fish and Game (IDFG), the University of Idaho (UI), and the Shoshone-Bannock Tribes (SBT). This report summarizes activities conducted by Shoshone-Bannock Tribal Fisheries Department personnel during the 2008 calendar year.
    Project tasks include: (1) monitor limnological parameters of the Sawtooth Valley lakes to assess lake productivity; (2) conduct lake fertilization in Pettit and Alturas lakes; (3) reduce the number of mature kokanee salmon spawning in Alturas Lake Creek; (4) monitor, enumerate, and evaluate sockeye salmon smolt migration from Pettit and Alturas lakes; (5) monitor spawning kokanee salmon escapement and estimate fry recruitment in Fishhook and Alturas Lake creeks; (6) conduct sockeye and kokanee salmon population surveys; (7) evaluate potential competition and predation between stocked juvenile sockeye salmon and a variety of fish species in Redfish, Pettit, and Alturas lakes; and (8) assist IDFG with captive broodstock production activities.

  17. BNL ATLAS Grid Computing

    ScienceCinema

    Michael Ernst

    2017-12-09

    As the sole Tier-1 computing facility for ATLAS in the United States and the largest ATLAS computing center worldwide Brookhaven provides a large portion of the overall computing resources for U.S. collaborators and serves as the central hub for storing,

  18. Utilization of ERTS-1 data to monitor and classify eutrophication of inland lakes

    NASA Technical Reports Server (NTRS)

    Rogers, R. H.; Smith, V. E. (Principal Investigator)

    1973-01-01

    The author has identified the following significant results: (1) one-acre lakes and one-acre islands are detectable; (2) removal of atmospheric parameters derived from RPMI measurements shows test lakes to have reflectances of 3.1 to 5.5% in band 4 and 0.3 to 2.3% in band 5; (3) failure to remove reflectance caused by the atmosphere results in errors up to 500% in computing lake reflectance from ERTS-1 data; (4) in band 4, up to seven reflectance levels were observed in test lakes; (5) reflectance patterns have been displayed on a color-coded TV monitor and on computer-generated gray scales; (6) deep and shallow water can be separated by a trained photointerpreter and by automatic machine processing, with estimates of water depth possible in some cases; (7) RPMI provides direct spectral signature measurements of lakes and lake features such as algal scums and floating plants; (8) a method is reported for obtaining lake color, as estimated by Forel-Ule standards, from ERTS-1 data; (9) a strong correlation was found between browner water color and diminishing water transparency; and (10) classifying lake eutrophication by observation of surface scums or macrophytes in shallow water seems straightforward.

  19. 20 CFR 228.19 - Reduction for a social security benefit.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Reduction for a social security benefit. 228... RETIREMENT ACT COMPUTATION OF SURVIVOR ANNUITIES The Tier I Annuity Component § 228.19 Reduction for a social security benefit. The tier I annuity component is reduced for the amount of any social security benefit to...

  20. 20 CFR 228.19 - Reduction for a social security benefit.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Reduction for a social security benefit. 228... RETIREMENT ACT COMPUTATION OF SURVIVOR ANNUITIES The Tier I Annuity Component § 228.19 Reduction for a social security benefit. The tier I annuity component is reduced for the amount of any social security benefit to...

  1. 20 CFR 228.19 - Reduction for a social security benefit.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true Reduction for a social security benefit. 228... RETIREMENT ACT COMPUTATION OF SURVIVOR ANNUITIES The Tier I Annuity Component § 228.19 Reduction for a social security benefit. The tier I annuity component is reduced for the amount of any social security benefit to...

  2. 20 CFR 228.19 - Reduction for a social security benefit.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Reduction for a social security benefit. 228... RETIREMENT ACT COMPUTATION OF SURVIVOR ANNUITIES The Tier I Annuity Component § 228.19 Reduction for a social security benefit. The tier I annuity component is reduced for the amount of any social security benefit to...

  3. 20 CFR 228.19 - Reduction for a social security benefit.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Reduction for a social security benefit. 228... RETIREMENT ACT COMPUTATION OF SURVIVOR ANNUITIES The Tier I Annuity Component § 228.19 Reduction for a social security benefit. The tier I annuity component is reduced for the amount of any social security benefit to...

  4. Mapping of the US Domestic Influenza Virologic Surveillance Landscape.

    PubMed

    Jester, Barbara; Schwerzmann, Joy; Mustaquim, Desiree; Aden, Tricia; Brammer, Lynnette; Humes, Rosemary; Shult, Pete; Shahangian, Shahram; Gubareva, Larisa; Xu, Xiyan; Miller, Joseph; Jernigan, Daniel

    2018-07-17

    Influenza virologic surveillance is critical each season for tracking influenza circulation, following trends in antiviral drug resistance, detecting novel influenza infections in humans, and selecting viruses for use in annual seasonal vaccine production. We developed a framework and process map for characterizing the landscape of US influenza virologic surveillance into 5 tiers of influenza testing: outpatient settings (tier 1), inpatient settings and commercial laboratories (tier 2), state public health laboratories (tier 3), National Influenza Reference Center laboratories (tier 4), and Centers for Disease Control and Prevention laboratories (tier 5). During the 2015-16 season, the numbers of influenza tests directly contributing to virologic surveillance were 804,000 in tiers 1 and 2; 78,000 in tier 3; 2,800 in tier 4; and 3,400 in tier 5. With the release of the 2017 US Pandemic Influenza Plan, the proposed framework will support public health officials in modeling, surveillance, and pandemic planning and response.

  5. Acute tier-1 and tier-2 effect assessment approaches in the EFSA Aquatic Guidance Document: are they sufficiently protective for insecticides?

    PubMed

    van Wijngaarden, René P A; Maltby, Lorraine; Brock, Theo C M

    2015-08-01

    The objective of this paper is to evaluate whether the acute tier-1 and tier-2 methods as proposed by the Aquatic Guidance Document recently published by the European Food Safety Authority (EFSA) are appropriate for deriving regulatory acceptable concentrations (RACs) for insecticides. The tier-1 and tier-2 RACs were compared with RACs based on threshold concentrations from micro/mesocosm studies (ETO-RAC). A lower-tier RAC was considered sufficiently protective if it was lower than the corresponding ETO-RAC. ETO-RACs were calculated for repeated (n = 13) and/or single pulsed applications (n = 17) of 26 insecticides to micro/mesocosms, giving a maximum of 30 insecticide × application combinations (i.e. cases) for comparison. Acute tier-1 RACs (for 24 insecticides) were lower than the corresponding ETO-RACs in 27 out of 29 cases, while tier-2 Geom-RACs (for 23 insecticides) were lower in 24 out of 26 cases. The tier-2 SSD-RAC (for 21 insecticides) using HC5/3 was lower than the ETO-RAC in 23 out of 27 cases, whereas the tier-2 SSD-RAC using HC5/6 was protective in 25 out of 27 cases. The tier-1 and tier-2 approaches proposed by EFSA for acute effect assessment are thus sufficiently protective for the majority of insecticides evaluated. Further evaluation may be needed for insecticides with more novel chemistries (neonicotinoids, biopesticides) and compounds that show delayed effects (insect growth regulators). © 2014 Society of Chemical Industry.

  6. Investigation of the Fractal Geometry of Tundra Lake Patterns using Historical Topographic Maps and Satellite Imagery.

    NASA Astrophysics Data System (ADS)

    Kariyawasam, T.; Essa, A.; Gong, M.; Sudakov, I.

    2017-12-01

    Greenhouse gas emissions from tundra lakes are a significant positive feedback to the atmosphere in a changing climate, and a pronounced growth in the number of tundra lake patterns has been observed in the Arctic region. Detailed knowledge of the spatial dynamics of lake patterns in a changing arctic tundra landscape, and of their geometrical properties, is therefore potentially valuable for understanding and accurately modeling the sources of greenhouse gas emissions from boreal permafrost. Our goal is to use a collection of historical topographic maps and satellite imagery of tundra lakes to conduct computational image analyses examining the spatial dynamics of tundra lake patterns. Our approach is based upon analyzing area-perimeter data of thousands of tundra lakes to compute the fractal dimension of the lake pattern geometry, an approach that has previously been used to classify pollen grains by textural patterning (Mander, 2016), vegetation in dryland ecosystems (Mander, 2017), and melt pond patterns (Hohenegger, 2012). By analyzing area-perimeter data for over 900 lakes we find that, for both historical topographic maps and current satellite imagery, the fractal dimension D is stable at 1.6 for tundra lakes with area less than about 100 km2. For tundra lakes larger than 100 km2, the fractal dimension takes values close to 2 or below 1, indicative of structural changes in tundra lake pattern geometry. Furthermore, the current study did not reveal any percolation transition above some critical threshold in tundra lake evolution. The results of the study will provide scientists with new data on these aspects of tundra lakes to help characterize the geomorphology of spatial patterns in arctic tundra lakes.
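The area-perimeter method described in the abstract can be sketched as follows: for a self-similar boundary, perimeter scales as P ∝ A^(D/2), so D is twice the slope of log P against log A. A minimal pure-Python sketch of that regression (the scaling relation is standard; the synthetic lakes below are illustrative, not the study's data):

```python
import math
import random

def fractal_dimension(areas, perimeters):
    """Estimate the area-perimeter fractal dimension D from the
    log-log regression P ~ A**(D/2): D = 2 * slope of log P vs log A."""
    xs = [math.log(a) for a in areas]
    ys = [math.log(p) for p in perimeters]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 2.0 * slope

# Synthetic check: lakes generated with a known dimension D = 1.6.
random.seed(42)
areas = [random.uniform(0.01, 100.0) for _ in range(900)]
perimeters = [a ** (1.6 / 2.0) for a in areas]  # exact P = A^(D/2)
print(round(fractal_dimension(areas, perimeters), 2))  # -> 1.6
```

A dimension near 1 indicates smooth, circle-like shorelines; values approaching 2 indicate highly convoluted, space-filling boundaries, which is why a shift in D signals structural change in the lake patterns.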

  7. Effects of Tier 2 and Tier 3 Mathematics Interventions for Second Graders with Mathematics Difficulties

    ERIC Educational Resources Information Center

    Dennis, Minyi Shih

    2015-01-01

    Two studies were conducted to examine the effects of Tier 2 and Tier 3 mathematics interventions on students with mathematics learning difficulties. In the first study, the work of Bryant et al. was replicated and expanded upon by documenting the sustained effects of a Tier 2 mathematics intervention on mathematics performance by second graders.…

  8. Accelerating chronically unresponsive children to tier 3 instruction: what level of data is necessary to ensure selection accuracy?

    PubMed

    Compton, Donald L; Gilbert, Jennifer K; Jenkins, Joseph R; Fuchs, Douglas; Fuchs, Lynn S; Cho, Eunsoo; Barquero, Laura A; Bouton, Bobette

    2012-01-01

    Response-to-intervention (RTI) approaches to disability identification are meant to put an end to the so-called wait-to-fail requirement associated with IQ discrepancy. However, in an unfortunate irony, there is a group of children who still wait to fail in RTI frameworks. That is, they must fail both general classroom instruction (Tier 1) and small-group intervention (Tier 2) before becoming eligible for the most intensive intervention (Tier 3). The purpose of this article was to determine how to accurately predict which at-risk children will be unresponsive to Tiers 1 and 2, thereby allowing unresponsive children to move directly from Tier 1 to Tier 3. As part of an efficacy study of a multitier RTI approach to the prevention and identification of reading disabilities (RD), 129 first-grade children who were unresponsive to classroom reading instruction were randomly assigned to 14 weeks of small-group, Tier 2 intervention. Nonresponders to this instruction (n = 33) were identified using local norms on first-grade word identification fluency growth linked to a distal outcome of RD at the end of second grade. Logistic regression models were used to predict membership in the responder and nonresponder groups. Predictors were entered as blocks of data from least to most difficult to obtain: universal screening data, Tier 1 response data, norm-referenced tests, and Tier 2 response data. Tier 2 response data were not necessary to classify students as responders or nonresponders to Tier 2 instruction, suggesting that some children can be accurately identified as eligible for Tier 3 intervention using only Tier 1 data, thereby avoiding prolonged periods of unsuccessful instruction.

  9. On implementation of DCTCP on three-tier and fat-tree data center network topologies.

    PubMed

    Zafar, Saima; Bashir, Abeer; Chaudhry, Shafique Ahmad

    2016-01-01

    A data center is a facility for housing computational and storage systems interconnected through a communication network called the data center network (DCN). Due to tremendous growth in computational power, storage capacity, and the number of interconnected servers, the DCN faces challenges concerning efficiency, reliability, and scalability. Although the transmission control protocol (TCP) is a time-tested transport protocol in the Internet, DCN challenges such as inadequate buffer space in switches and bandwidth limitations have prompted researchers to propose techniques to improve TCP performance or to design new transport protocols for the DCN. Data center TCP (DCTCP) has emerged as one of the most promising solutions in this domain; it employs the explicit congestion notification (ECN) feature of TCP to enhance the TCP congestion control algorithm. While DCTCP has been analyzed for a two-tier tree-based DCN topology with traffic between servers in the same rack, which is common in cloud applications, it has not been evaluated for the traffic patterns common in university and private enterprise networks, which traverse the complete network interconnect spanning the upper tier layers. We also recognize that DCTCP performance cannot remain unaffected by the underlying DCN architecture; hence there is a need to test and compare DCTCP performance when implemented over diverse DCN architectures. Some of the most notable DCN architectures are the legacy three-tier, fat-tree, BCube, DCell, VL2, and CamCube. In this research, we simulate the two switch-centric DCN architectures, the widely deployed legacy three-tier architecture and the promising fat-tree architecture, using a network simulator, and analyze the performance of DCTCP in terms of throughput and delay for realistic traffic patterns. We also examine how DCTCP prevents incast and outcast congestion when realistic DCN traffic patterns are employed in the above-mentioned topologies. Our results show that the underlying DCN architecture significantly impacts DCTCP performance. We find that DCTCP gives optimal performance in the fat-tree topology and is most suitable for large networks.
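DCTCP's congestion control law (as given in the original DCTCP proposal, not in this paper's simulations) maintains an EWMA estimate alpha of the fraction of ECN-marked packets and cuts the window in proportion to alpha rather than halving it. A minimal sketch, assuming the commonly cited gain g = 1/16:

```python
def dctcp_update(cwnd, alpha, ecn_marked_fraction, g=1.0 / 16):
    """One round of DCTCP congestion control.

    alpha is an exponentially weighted moving average of the fraction F
    of packets that carried an ECN mark in the last window; on congestion
    the window is reduced by cwnd * alpha / 2 instead of halved.
    """
    alpha = (1 - g) * alpha + g * ecn_marked_fraction
    if ecn_marked_fraction > 0:
        cwnd = max(1.0, cwnd * (1 - alpha / 2))
    return cwnd, alpha

# Mild congestion (10% of packets marked) trims the window only
# slightly, whereas standard TCP would halve it on any loss signal.
cwnd, alpha = 100.0, 0.0
cwnd, alpha = dctcp_update(cwnd, alpha, 0.1)
print(cwnd > 90)  # the cut is small under mild congestion
```

This proportional reaction is what lets DCTCP keep switch queues short without sacrificing throughput, and it is why its behavior depends on how much marking the underlying topology induces.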

  10. 75 FR 73166 - Publication of the Tier 2 Tax Rates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-29

    ... DEPARTMENT OF THE TREASURY Internal Revenue Service Publication of the Tier 2 Tax Rates AGENCY: Internal Revenue Service, Treasury. ACTION: Notice. SUMMARY: Publication of the tier 2 tax rates for...). Tier 2 taxes on railroad employees, employers, and employee representatives are one source of funding...

  11. 76 FR 71623 - Publication of the Tier 2 Tax Rates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-18

    ... DEPARTMENT OF THE TREASURY Internal Revenue Service Publication of the Tier 2 Tax Rates AGENCY: Internal Revenue Service (IRS), Treasury. ACTION: Notice. SUMMARY: Publication of the tier 2 tax rates for...). Tier 2 taxes on railroad employees, employers, and employee representatives are one source of funding...

  12. The advantage of calculating emission reduction with local emission factor in South Sumatera region

    NASA Astrophysics Data System (ADS)

    Buchari, Erika

    2017-11-01

    Greenhouse gases (GHG), which have different global warming potentials, are usually expressed in CO2 equivalent. Germany succeeded in reducing CO2 emissions in the 1990s, while Japan has increased the load factor of its public transport since 2001. The Indonesian National Medium Term Development Plan, 2015-2019, has set a target of a minimum 26% and a maximum 41% national emission reduction by 2019. The Intergovernmental Panel on Climate Change (IPCC) defines three levels of accuracy for counting GHG emissions: tier 1, tier 2, and tier 3. In tier 1, the calculation is based on fuel used and average (default) emission factors obtained from statistical data, while in tier 2 the calculation is based on fuel used and local emission factors. Tier 3 is more accurate than tiers 1 and 2, with the calculation based on fuel used derived from a modelling method or from direct measurement. This paper aims to evaluate the calculations with tier 2 and tier 3 in the South Sumatera region. In the 2012 Regional Action Plan for Greenhouse Gases of South Sumatera, the estimate for 2020 without mitigation is about 6,569,000 tons per year, while with tier 3 it is about 6,229,858 tons per year. It was found that the tier 3 calculation is more accurate in terms of the fuel used by different vehicle types, so that mitigation actions can be planned more realistically.
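The tier-1 versus tier-2 distinction described above reduces to which emission factor multiplies the fuel consumed. A toy sketch with purely illustrative fuel volumes and factors (these are not the paper's South Sumatera figures, and the factor values are hypothetical):

```python
def emissions_tco2(fuel_used_litres, ef_kg_per_litre):
    """CO2 emissions in tonnes: fuel consumed times an emission factor."""
    return fuel_used_litres * ef_kg_per_litre / 1000.0

fuel = 2_000_000    # litres of gasoline consumed (hypothetical)
ef_default = 2.40   # tier-1 style default factor, kg CO2/litre (illustrative)
ef_local = 2.25     # tier-2 style locally measured factor (illustrative)

print(round(emissions_tco2(fuel, ef_default), 1))  # tier-1 estimate
print(round(emissions_tco2(fuel, ef_local), 1))    # tier-2 estimate
```

With the same fuel data, the local factor shifts the inventory total; tier 3 goes further by replacing the aggregate fuel figure itself with modelled or directly measured consumption per vehicle class.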

  13. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    NASA Astrophysics Data System (ADS)

    Benjamin, D.; Caballero, J.; Ernst, M.; Guan, W.; Hover, J.; Lesny, D.; Maeno, T.; Nilsson, P.; Tsulaia, V.; van Gemmeren, P.; Vaniachine, A.; Wang, F.; Wenaus, T.; ATLAS Collaboration

    2016-10-01

    Continued growth in public cloud and HPC resources is on track to exceed the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, Edison Cray XC30 supercomputer, backfill at Tier 2 and Tier 3 sites, opportunistic resources at the Open Science Grid (OSG), and ATLAS High Level Trigger farm between the data taking periods. Because of specific aspects of opportunistic resources such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  14. Tier 2 Interventions in Positive Behavior Support: A Survey of School Implementation

    ERIC Educational Resources Information Center

    Rodriguez, Billie Jo; Loman, Sheldon L.; Borgmeier, Christopher

    2016-01-01

    As increasing numbers of schools implement Multi-Tiered Systems of Support (MTSS), schools are looking for and implementing evidence-based practices for students whose needs are not fully met by Tier 1 supports. Although there is relative consistency and clarity in what constitutes Tier 1 behavior support within MTSS, Tier 2 supports may be more…

  15. Critical thinking traits of top-tier experts and implications for computer science education

    NASA Astrophysics Data System (ADS)

    Bushey, Dean E.

    A documented shortage of technical leadership and top-tier performers in computer science jeopardizes the technological edge, security, and economic well-being of the nation. The 2005 President's Information and Technology Advisory Committee (PITAC) Report on competitiveness in computational sciences highlights the major impact of science, technology, and innovation in keeping America competitive in the global marketplace. It stresses the fact that the supply of science, technology, and engineering experts is at the core of America's technological edge, national competitiveness and security. However, recent data shows that both undergraduate and postgraduate production of computer scientists is falling. The decline is "a quiet crisis building in the United States," a crisis that, if allowed to continue unchecked, could endanger America's well-being and preeminence among the world's nations. Past research on expert performance has shown that the cognitive traits of critical thinking, creativity, and problem solving possessed by top-tier performers can be identified, observed and measured. The studies show that the identified attributes are applicable across many domains and disciplines. Companies have begun to realize that cognitive skills are important for high-level performance and are reevaluating the traditional academic standards they have used to predict success for their top-tier performers in computer science. Previous research in the computer science field has focused either on programming skills of its experts or has attempted to predict the academic success of students at the undergraduate level. This study, on the other hand, examines the critical-thinking skills found among experts in the computer science field in order to explore the questions, "What cognitive skills do outstanding performers possess that make them successful?" and "How do currently used measures of academic performance correlate to critical-thinking skills among students?" 
The results of this study suggest a need to examine how critical-thinking abilities are learned in the undergraduate computer science curriculum, and a need to foster these abilities in order to produce the high-level, critical-thinking professionals required to fill the growing demand for such experts. Because current measures of academic performance do not adequately depict students' cognitive abilities, assessment of these skills must be incorporated into existing curricula.

  16. Numerical modeling of flow and sediment transport in Lake Pontchartrain due to flood release from Bonnet Carré Spillway

    USDA-ARS?s Scientific Manuscript database

    In this study, the flow fields and sediment transport in Lake Pontchartrain during a flood release from Bonnet Carré Spillway (BCS) was simulated using the computational model CCHE2D developed at the National Center for Computational Hydroscience and Engineering (NCCHE), the University of Mississipp...

  17. Building Tier 3 Intervention for Long-Term Slow Growers in Grades 3-4: A Pilot Study

    ERIC Educational Resources Information Center

    Sanchez, Victoria; O'Connor, Rollanda E.

    2015-01-01

    Tier 3 interventions are necessary for students who fail to respond adequately to Tier 1 general education instruction and Tier 2 supplemental reading intervention instruction. We identified 8 students in 3rd and 4th grade who had demonstrated a slow response to Tier 2 reading interventions for three years. Students participated in a…

  18. Assessment of the concordance among 2-tier, 3-tier, and 5-tier fetal heart rate classification systems.

    PubMed

    Gyamfi Bannerman, Cynthia; Grobman, William A; Antoniewicz, Leah; Hutchinson, Maria; Blackwell, Sean

    2011-09-01

    In 2008, a National Institute of Child Health and Human Development/Society for Maternal-Fetal Medicine-sponsored workshop on electronic fetal monitoring recommended a new fetal heart rate tracing interpretation system. Comparison of this 3-tier system with other systems is lacking. Our purpose was to determine the relationships between fetal heart rate categories for the 3 existing systems. Three Maternal-Fetal Medicine specialists reviewed 120 fetal heart rate tracings. All tracings were from term, singleton pregnancies with known umbilical artery pH. The fetal heart rates were classified by a 2-tier, a 3-tier, and a 5-tier system. Each Maternal-Fetal Medicine examiner reviewed 120 fetal heart rate segments. When compared with the 2-tier system, 0%, 54%, and 100% of tracings in categories 1, 2, and 3, respectively, were "nonreassuring." There was strong concordance between category 1 and "green" tracings, as well as between category 3 and "red" tracings. The 3-tier and 5-tier systems were similar in fetal heart rate interpretations for tracings that were either very normal or very abnormal. Whether one system is superior to the others in predicting fetal acidemia remains unknown. Copyright © 2011 Mosby, Inc. All rights reserved.

  19. 12 CFR 567.0 - Scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...)(2) and 567.8 with tier 1 capital, as computed under sections 11 and 12 of Appendix C of this part... applies to all savings associations, except as described in paragraph (b) of this section. (b)(1) A... § 567.11, which supplement the reservations of authority at section 1 of Appendix C of this part. [72 FR...

  20. A Responsive Tier 2 Process for a Middle School Student with Behavior Problems

    ERIC Educational Resources Information Center

    McDaniel, Sara C.; Bruhn, Allison L.; Mitchell, Barbara S.

    2017-01-01

    Students requiring Tier 2 behavioral supports frequently display behavioral deficits in multiple domains (e.g., emotional symptoms and peer problems). The Tier 2 framework developed by McDaniel, Bruhn, & Mitchell (2015a) is a responsive structure for identifying and intervening at Tier 2. This process is described with a practical case example…

  1. Earth Science Computational Architecture for Multi-disciplinary Investigations

    NASA Astrophysics Data System (ADS)

    Parker, J. W.; Blom, R.; Gurrola, E.; Katz, D.; Lyzenga, G.; Norton, C.

    2005-12-01

    Understanding the processes underlying Earth's deformation and mass transport requires a non-traditional, integrated, interdisciplinary approach dependent on multiple space and ground based data sets, modeling, and computational tools. Currently, details of geophysical data acquisition, analysis, and modeling largely limit research to discipline domain experts. Interdisciplinary research requires a new computational architecture that is optimized to perform complex data processing of multiple solid Earth science data types in a user-friendly environment. A web-based computational framework is being developed and integrated with applications for automatic interferometric radar processing, and models for high-resolution deformation & gravity, forward models of viscoelastic mass loading over short wavelengths & complex time histories, forward-inverse codes for characterizing surface loading-response over time scales of days to tens of thousands of years, and inversion of combined space magnetic & gravity fields to constrain deep crustal and mantle properties. This framework combines an adaptation of the QuakeSim distributed services methodology with the Pyre framework for multiphysics development. The system uses a three-tier architecture, with a middle tier server that manages user projects, available resources, and security. This ensures scalability to very large networks of collaborators. Users log into a web page and have a personal project area, persistently maintained between connections, for each application. Upon selection of an application and host from a list of available entities, inputs may be uploaded or constructed from web forms and available data archives, including gravity, GPS and imaging radar data. The user is notified of job completion and directed to results posted via URLs. 
Interdisciplinary work is supported through easy availability of all applications via common browsers, application tutorials and reference guides, and worked examples with visual response. At the platform level, multi-physics application development and workflow are available in the enriched environment of the Pyre framework. Advantages of combining separate expert domains include: multiple application components interact efficiently through Python shared libraries, investigators may nimbly swap models and try new parameter values, and a rich array of common tools is inherent in the Pyre system. The first four specific investigations to use this framework are: Gulf Coast subsidence: understanding of partitioning between compaction, subsidence and growth faulting; gravity & deformation of a layered spherical earth model due to large earthquakes; the rift setting of Lake Vostok, Antarctica; and global ice mass changes.

  2. A four-tier classification system of pulmonary artery metrics on computed tomography for the diagnosis and prognosis of pulmonary hypertension.

    PubMed

    Truong, Quynh A; Bhatia, Harpreet Singh; Szymonifka, Jackie; Zhou, Qing; Lavender, Zachary; Waxman, Aaron B; Semigran, Marc J; Malhotra, Rajeev

    We aimed to develop a severity classification system of the main pulmonary artery diameter (mPA) and its ratio to the ascending aorta diameter (ratio PA) for the diagnosis and prognosis of pulmonary hypertension (PH) on computed tomography (CT) scans. In 228 patients (136 with PH) undergoing right heart catheterization (RHC) and CT for dyspnea, we measured mPA and ratio PA. In a derivation cohort (n = 114), we determined cutpoints for a four-tier severity grading system that would maximize sensitivity and specificity, and validated it in a separate cohort (n = 114). Cutpoints for mPA were defined with ≤27 mm (F) and ≤29 mm (M) as the normal reference range; mild as >27 to <31 mm (F) and >29 to <31 mm (M); moderate as ≥31 to ≤34 mm; and severe as >34 mm. Cutpoints for ratio PA were defined as normal ≤0.9; mild >0.9 to 1.0; moderate >1.0 to 1.1; and severe >1.1. Sensitivities for the normal tier were 99% for mPA and 93% for ratio PA, while specificities for the severe tier were 98% for mPA >34 mm and 100% for ratio PA >1.1. C-statistics for four-tier mPA and ratio PA were both 0.90 (derivation) and both 0.85 (validation). Severity of mPA and ratio PA corresponded to hemodynamics by RHC and echocardiography (both p < 0.001). Moderate-severe mPA values of ≥31 mm and ratio PA >1.1 had worse survival than normal values (all p ≤ 0.01). A CT-based four-tier severity classification system of PA diameter and its ratio to the aortic diameter has high accuracy for PH diagnosis, with increased mortality in patients with moderate-severe severity grades. These results may support clinical utilization in chest and cardiac CT reports. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
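The cutpoints quoted in the abstract are explicit enough to transcribe as a small classifier. The sketch below simply restates those thresholds in code and is in no way a clinical tool:

```python
def grade_mpa(mpa_mm, sex):
    """Four-tier severity grade of main pulmonary artery diameter,
    using the sex-specific cutpoints reported in the abstract."""
    normal_max = 27 if sex == "F" else 29  # mm; female vs male cutpoint
    if mpa_mm <= normal_max:
        return "normal"
    if mpa_mm < 31:
        return "mild"
    if mpa_mm <= 34:
        return "moderate"
    return "severe"

def grade_ratio_pa(ratio):
    """Four-tier grade of the PA-to-ascending-aorta diameter ratio."""
    if ratio <= 0.9:
        return "normal"
    if ratio <= 1.0:
        return "mild"
    if ratio <= 1.1:
        return "moderate"
    return "severe"

print(grade_mpa(28, "F"), grade_mpa(28, "M"))  # mild normal
print(grade_ratio_pa(1.05))                    # moderate
```

Note how the same 28 mm diameter grades differently by sex, which is exactly the feature that distinguishes this system from a single fixed threshold.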

  3. Estimating implementation and operational costs of an integrated tiered CD4 service including laboratory and point of care testing in a remote health district in South Africa.

    PubMed

    Cassim, Naseem; Coetzee, Lindi M; Schnippel, Kathryn; Glencross, Deborah K

    2014-01-01

    An integrated tiered service delivery model (ITSDM) has been proposed to provide 'full-coverage' of CD4 services throughout South Africa. Five tiers are described, defined by testing volumes and the number of referring health-facilities. These include: (1) Tier-1, a decentralized point-of-care (POC) service in a single site; (2) Tier-2, a POC-hub processing 30-40 samples from 8-10 health-clinics; (3) Tier-3, community laboratories servicing ∼50 health-clinics and processing up to 150 samples/day; and high-volume centralized laboratories (Tier-4 and Tier-5) processing up to 300 or more than 600 samples/day and serving >100 or >200 health-clinics, respectively. The objective of this study was to establish the costs of existing services and of ITSDM Tiers 1, 2, and 3 in a remote, under-serviced district in South Africa. Historical health-facility workload volumes from the Pixley-ka-Seme district, the total volumes of CD4 tests performed by the adjacent district referral CD4 laboratories, the locations of all referring clinics, and related laboratory-to-result turn-around time (LTR-TAT) data were extracted from the NHLS Corporate-Data-Warehouse for the period April-2012 to March-2013. Tiers were costed separately (as a cost-per-result) including equipment, staffing, reagent, and test consumable costs. A one-way sensitivity analysis provided for changes in reagent price, test volumes, and personnel time. The lowest cost-per-result was noted for the existing laboratory-based Tiers 4 and 5 ($6.24 and $5.37 respectively), but with a related increased LTR-TAT of >24-48 hours. Full service coverage with TAT <6 hours could be achieved with placement of twenty-seven Tier-1/POC sites or eight Tier-2/POC-hubs, at a cost-per-result of $32.32 and $15.88 respectively. A single district Tier-3 laboratory also ensured 'full service coverage' and a <24-hour LTR-TAT for the district at $7.42 per test. Implementing a single Tier-3/community laboratory to extend and improve delivery of services in Pixley-ka-Seme, with an estimated local ∼12-24-hour LTR-TAT, costs ∼$2 more per test than existing referred services, but is 2-4 fold cheaper than implementing eight Tier-2/POC-hubs or twenty-seven Tier-1/POC CD4 services.
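The cost-per-result comparison across tiers follows from amortizing fixed costs over annual test volume and adding per-test variable costs. A sketch with hypothetical component costs (the study names these cost categories, but the numbers below are purely illustrative, not the study's inputs):

```python
def cost_per_result(annual_equipment, annual_staff, reagent_per_test,
                    consumable_per_test, annual_tests):
    """Cost per CD4 result: annual fixed costs (equipment, staffing)
    spread over test volume, plus variable per-test costs."""
    fixed = (annual_equipment + annual_staff) / annual_tests
    variable = reagent_per_test + consumable_per_test
    return fixed + variable

# The same fixed costs spread over more tests: higher-volume tiers
# are cheaper per result, as the study found for Tiers 4 and 5.
low_volume = cost_per_result(20_000, 40_000, 4.0, 1.0, 7_500)
high_volume = cost_per_result(20_000, 40_000, 4.0, 1.0, 150_000)
print(round(low_volume, 2), round(high_volume, 2))  # 13.0 5.4
```

The one-way sensitivity analysis mentioned in the abstract amounts to varying one argument of such a function at a time (reagent price, volume, personnel time) and observing the effect on the result.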

  4. Statewide water-quality network for Massachusetts

    USGS Publications Warehouse

    Desimone, Leslie A.; Steeves, Peter A.; Zimmerman, Marc James

    2001-01-01

    A water-quality monitoring program is proposed that would provide data to meet multiple information needs of Massachusetts agencies and other users concerned with the condition of the State's water resources. The program was designed by the U.S. Geological Survey and the Massachusetts Department of Environmental Protection, Division of Watershed Management, with input from many organizations involved in water-quality monitoring in the State, and focuses on inland surface waters (streams and lakes). The proposed monitoring program consists of several components, or tiers, which are defined in terms of specific monitoring objectives, and is intended to complement the Massachusetts Watershed Initiative (MWI) basin assessments. Several components were developed using the Neponset River Basin in eastern Massachusetts as a pilot area, or otherwise make use of data from and sampling approaches used in that basin as part of a MWI pilot assessment in 1994. To guide development of the monitoring program, reviews were conducted of general principles of network design, including monitoring objectives and approaches, and of ongoing monitoring activities of Massachusetts State agencies. Network tiers described in this report are primarily (1) a statewide, basin-based assessment of existing surface-water-quality conditions, and (2) a fixed-station network for determining contaminant loads carried by major rivers. Other components, including (3) targeted programs for hot-spot monitoring and other objectives, and (4) compliance monitoring, also are discussed. Monitoring programs for the development of Total Maximum Daily Loads for specific water bodies, which would constitute another tier of the network, are being developed separately and are not described in this report. 
The basin-based assessment of existing conditions is designed to provide information on the status of surface waters with respect to State water-quality standards and designated uses in accordance with the reporting requirements [Section 305(b)] of the Clean Water Act (CWA). Geographic Information System (GIS)-based procedures were developed to inventory streams and lakes in a basin for these purposes. Several monitoring approaches for this tier and their associated resource requirements were investigated. Analysis of the Neponset Basin for this purpose demonstrated that the large number of sites needed in order for all the small streams in a basin to be sampled (about half of the stream miles in the basin were headwater or first-order streams) poses substantial resource-based problems for a comprehensive assessment of existing conditions. The many lakes pose similar problems. Thus, a design is presented in which probabilistic monitoring of small streams is combined with deterministic or targeted monitoring of large streams and lakes to meet CWA requirements and to provide data for other information needs of Massachusetts regulatory agencies and MWI teams. The fixed-station network is designed to permit the determination of contaminant loads carried by the State's major rivers to sensitive inland and coastal receiving waters and across State boundaries. Sampling at 19 proposed sites in 17 of the 27 major basins in Massachusetts would provide information on contaminant loads from 67 percent of the total land area of the State; unsampled areas are primarily coastal areas drained by many small streams that would be impossible to sample within realistic resource limitations. Strategies for hot-spot monitoring, a targeted monitoring program focused on identifying contaminant sources, are described with reference to an analysis of the bacteria sampling program of the 1994 Neponset Basin assessment. 
Finally, major discharge sites permitted under the National Pollutant Discharge Elimination System (NPDES) were evaluated as a basis for ambient water-quality monitoring. The discharge sites are well distributed geographically among basins, but are primarily on large rivers (two-thirds or more

  5. How do quality information and cost affect patient choice of provider in a tiered network setting? Results from a survey.

    PubMed

    Sinaiko, Anna D

    2011-04-01

To assess how quality information from multiple sources and financial incentives affect consumer choice of physicians in tiered physician networks. Survey of a stratified random sample of Massachusetts state employees. Respondents were assigned a hypothetical structure with differential copayments for "Tier 1" (preferred) and "Tier 2" (nonpreferred) physicians. Half of respondents were told they needed to select a cardiologist, and half were told they needed to select a dermatologist. Respondents were asked whether they would choose a Tier 1 doctor, a Tier 2 doctor, or had no preference in a case where they had no further quality information, a case where a family member or friend recommended a Tier 2 doctor, and a case where their personal physician recommended a Tier 2 doctor. The effects of copayments, recommendations, physician specialty, and patient characteristics on the reported probability of selecting a Tier 1 doctor are analyzed using multinomial logit and logistic regression. Relative to a case where there is no copayment differential between tiers, copayment differences of U.S.$10-U.S.$35 increase the number of respondents indicating they would select a Tier 1 physician by 3.5-11.7 percent. Simulations suggest copayments must exceed U.S.$300 to counteract the recommendation for a lower tiered physician from friends, family, or a referring physician. Sensitivity to the copayments varied with physician specialty. Tiered provider networks with these copayment levels appear to have limited influence on physician choice when contradicted by other trusted sources. Consumers' response likely varies with physician specialty. © Health Research and Educational Trust.
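As a toy companion to the logit analysis described above, a one-variable logit shows the mechanics of how a copayment differential shifts the predicted probability of choosing a Tier 1 physician. The coefficients are invented for illustration and are not the study's estimates:

```python
import math

def p_tier1(copay_diff, beta0=0.4, beta_copay=0.012):
    """Illustrative binary logit: probability a respondent picks a Tier 1
    doctor as the Tier 2 copayment penalty (in dollars) grows.
    Coefficients are made up, not estimated from the survey data."""
    z = beta0 + beta_copay * copay_diff
    return 1.0 / (1.0 + math.exp(-z))

# Larger copayment differentials push predicted choice toward Tier 1,
# mirroring the direction (not the magnitude) of the survey findings.
assert p_tier1(35) > p_tier1(10) > p_tier1(0)
```

In the study's framing, a recommendation for a Tier 2 doctor would enter as an offsetting term in `z`, which is why very large copayments are needed to counteract it.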

  6. Snake River Sockeye Salmon Habitat and Limnological Research; 2002 Annual Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kohler, Andre E.; Taki, Doug; Griswold, Robert G.

    2004-08-01

In March 1990, the Shoshone-Bannock Tribes petitioned the National Marine Fisheries Service (NMFS) to list the Snake River sockeye salmon (Oncorhynchus nerka) as endangered. As a result of that petition, the Snake River sockeye salmon was officially listed as endangered in November 1991 under the Endangered Species Act (56 FR 58619). In 1991, the Snake River Sockeye Salmon Habitat and Limnological Research Program was implemented (Project Number 91-71, Intergovernmental Contract Number DE-BI79-91bp22548). This project is part of an interagency effort to prevent the extinction of the Redfish Lake stock of O. nerka. The Shoshone-Bannock Tribal goal for this project is two-tiered: the immediate goal is to increase the population of Snake River sockeye salmon while preserving the unique genetic characteristics of the Evolutionarily Significant Unit (ESU); the Tribes' long-term goal is to maintain a viable population that warrants delisting and provides Tribal harvest opportunities. The Bonneville Power Administration (BPA) provides funding for this interagency recovery program through the Northwest Power Planning Council Fish and Wildlife Program (NPPCFWP). Collaborators in the recovery effort include the National Marine Fisheries Service (NMFS), the Idaho Department of Fish and Game (IDFG), the University of Idaho (UI), the U.S. Forest Service (USFS), and the Shoshone-Bannock Tribes (SBT). This report summarizes activities conducted by Shoshone-Bannock Tribal Fisheries Department personnel during the 2002 calendar year. Project objectives include: (1) monitor over-winter survival and emigration of juvenile anadromous O. nerka stocked from the captive rearing program; (2) fertilize Redfish Lake; (3) conduct kokanee salmon (non-anadromous O. nerka) population surveys; (4) monitor spawning kokanee escapement and estimate fry recruitment on Fishhook, Alturas Lake, and Stanley Lake creeks; (5) evaluate potential competition and predation between stocked juvenile O. nerka and a variety of fish species in Redfish, Pettit, and Alturas lakes; and (6) monitor limnological parameters of Sawtooth Valley lakes to assess lake productivity.

  7. The ATLAS Tier-0: Overview and operational experience

    NASA Astrophysics Data System (ADS)

    Elsing, Markus; Goossens, Luc; Nairz, Armin; Negri, Guido

    2010-04-01

Within the ATLAS hierarchical, multi-tier computing infrastructure, the Tier-0 centre at CERN is responsible for prompt processing of the raw data coming from the online DAQ system, for archiving the raw and derived data on tape, for registering the data with the relevant catalogues, and for distributing them to the associated Tier-1 centres. The Tier-0 is already fully functional. It has successfully participated in all cosmic and commissioning data taking since May 2007, and was ramped up to its foreseen full size, performance and throughput for the cosmic (and short single-beam) run periods between July and October 2008. Data and work flows for collision data taking were exercised in several "Full Dress Rehearsals" (FDRs) in the course of 2008. The transition from an expert-based to a shifter-based operations model was successfully completed in July 2008. This article gives an overview of the Tier-0 system, its data and work flows, and its operations model. It reviews the operational experience gained in cosmic, commissioning, and FDR exercises during the past year, and gives an outlook on planned developments and the evolution of the system towards first collision data taking, expected in late Autumn 2009.

  8. Building Tier 3 Intervention for Long-Term Slow Growers in Grades 3-4: A Pilot Study

    ERIC Educational Resources Information Center

    Sanchez, Victoria M.; O'Connor, Rollanda E.

    2015-01-01

    Tier 3 interventions are necessary for improving the reading performance of students who fail to respond adequately to Tier 1 general education instruction and Tier 2 supplemental reading intervention. In this pilot study, we identified 8 students in 3rd and 4th grade who had demonstrated slow response to Tier 2 reading interventions for three…

  9. 78 FR 38074 - Announcement Regarding a Change in Eligibility for Unemployment Insurance (UI) Claimants in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-25

    ...Announcement regarding a change in eligibility for Unemployment Insurance (UI) claimants in Alabama, Alaska, Delaware, Illinois, Louisiana, Michigan, Mississippi, Ohio, the Virgin Islands and Wisconsin in the Emergency Unemployment Compensation (EUC08) program, and the Federal-State Extended Benefits (EB) program. The U.S. Department of Labor (Department) produces trigger notices indicating which states qualify for both EB and EUC08 benefits, and provides the beginning and ending dates of payable periods for each qualifying state. The trigger notices covering state eligibility for these programs can be found at: http://ows.doleta.gov/unemploy/claims-- arch.asp. The following changes have occurred since the publication of the last notice regarding states EUC08 and EB trigger status: Alabama's trigger value had fallen below the 7.0% threshold and has triggered ``off'' Tier 3 of EUC08. Based on data released by the Bureau of Labor Statistics on March 18, 2013, the three month average, seasonally adjusted total unemployment rate (TUR) in Alabama was 6.9%, falling below the 7.0% trigger threshold necessary to remain ``on'' Tier 3 of EUC08. The week ending April 13, 2013, was the last week in which EUC08 claimants in Alabama could exhaust Tier 2 and establish Tier 3 eligibility. Under the phase-out provisions, claimants could receive any remaining entitlement they had for Tier 3 after April 13, 2013. Alaska's insured unemployment rate (IUR) has fallen below the 6.0% trigger threshold and has triggered ``off'' of EB. Based on data from Alaska for the week ending April 13, 2013, the 13 week IUR in Alaska fell below the 6.0% trigger threshold necessary to remain ``on'' EB. The payable period in EB for Alaska ended May 4, 2013. Alaska's IUR has fallen below the 6.0% trigger threshold and has triggered ``off'' Tier 4 of EUC08. 
Based on data from Alaska for the week ending April 13, 2013, the 13 week IUR in Alaska fell below the 6.0% trigger rate threshold to remain ``on'' Tier 4 of EUC08. The week ending May 4, 2013, was the last week in which EUC08 claimants in Alaska could exhaust Tier 3, and establish Tier 4 eligibility. Under the phase-out provisions, claimants could receive any remaining entitlement they had for Tier 4 after May 4, 2013. Delaware's trigger value exceeds the 7.0% trigger threshold and has triggered ``on'' Tier 3 of EUC08. Based on data released by the Bureau of Labor Statistics on March 18, 2013, the three month average, seasonally adjusted TUR in Delaware was 7.1%, exceeding the 7.0% threshold necessary to trigger ``on'' Tier 3 of EUC08. The week beginning April 7, 2013, was the first week in which EUC08 claimants in Delaware who had exhausted Tier 2, and are otherwise eligible, could establish Tier 3 eligibility. Illinois' trigger value met the 9.0% trigger threshold and has triggered ``on'' Tier 4 of EUC08. Based on data released by the Bureau of Labor Statistics on March 29, 2013, the three month average, seasonally adjusted TUR in Illinois met the 9.0% trigger threshold to trigger ``on'' Tier 4 of EUC08. The week beginning April 14, 2013, was the first week in which EUC08 claimants in Illinois who had exhausted Tier 3, and were otherwise eligible, could establish Tier 4 eligibility. Louisiana's trigger value has fallen below the 6.0% trigger threshold and has triggered ``off'' Tier 2 of EUC08. Based on data released by the Bureau of Labor Statistics on March 18, 2013, the three month average, seasonally adjusted TUR in Louisiana was 5.8%, falling below the 6.0% trigger threshold to remain ``on'' Tier 2 of EUC08. The week ending April 13, 2013, was the last week in which EUC08 claimants in Louisiana could exhaust Tier 1, and establish Tier 2 eligibility. 
Under the phase-out provisions, claimants could receive any remaining entitlement they had in Tier 2 after April 13, 2013. Michigan's trigger value has fallen below the 9.0% trigger threshold and has triggered ``off'' Tier 4 of EUC08. Based on data released by the Bureau of Labor Statistics on March 18, 2013, the three month average, seasonally adjusted TUR for Michigan was 8.9%, falling below the 9.0% trigger threshold to remain ``on'' Tier 4 of EUC08. The week ending April 13, 2013, was the last week in which EUC08 claimants in Michigan could exhaust Tier 3, and establish Tier 4 eligibility. Under the phase-out provisions, claimants could receive any remaining entitlement they had in Tier 4 after April 13, 2013. Mississippi's trigger value exceeds the 9.0% trigger threshold and has triggered ``on'' Tier 4 of EUC08. Based on data released by the Bureau of Labor Statistics on March 29, 2013, the three month average, seasonally adjusted TUR in Mississippi was 9.3%, exceeding the 9.0% trigger threshold to trigger ``on'' Tier 4 of EUC08. The week beginning April 14, 2013, was the first week in which EUC08 claimants in Mississippi who had exhausted Tier 3, and are otherwise eligible, could establish Tier 4 eligibility. Ohio's trigger value met the 7.0% trigger threshold and has triggered ``on'' Tier 3 of EUC08. Based on data released by the Bureau of Labor Statistics on April 19, 2013, the three month average, seasonally adjusted total unemployment rate in Ohio had met the 7.0% trigger threshold to trigger ``on'' Tier 3 of EUC08. The week beginning May 5, 2013, was the first week in which EUC08 claimants in Ohio who had exhausted Tier 2, and were otherwise eligible, could establish Tier 3 eligibility. The Virgin Islands' estimated trigger rate fell below the 6.0% threshold and has triggered ``off'' both Tier 2 and Tier 3 of EUC08. 
Based on data released by the Bureau of Labor Statistics on March 8, 2013, the estimated three month average, seasonally adjusted TUR in the Virgin Islands fell below the 6.0% trigger threshold necessary to remain ``on'' both Tier 2 and Tier 3 of EUC08. That triggered the Virgin Islands off both Tier 2 and Tier 3 of EUC08. The week ending March 30, 2013, was the last week in which EUC08 claimants in the Virgin Islands could exhaust Tier 1 and establish Tier 2 eligibility, or exhaust Tier 2 and establish Tier 3 eligibility. Wisconsin's trigger value met the 7.0% threshold and has triggered ``on'' Tier 3 of EUC08; however, a mandatory 13-week ``off'' period delayed the effective date. Based on data released by the Bureau of Labor Statistics on April 19, 2013, the three month average, seasonally adjusted TUR for Wisconsin met the 7.0% trigger threshold to trigger ``on'' Tier 3 of EUC08. However, Wisconsin was in a 13-week mandatory ``off'' period that started February 9, 2013, and did not conclude until May 11, 2013. As a result, Wisconsin remained in an ``off'' period for Tier 3 of EUC08 through May 11, 2013, and triggered ``on'' Tier 3 of EUC08 effective May 12, 2013. The week beginning May 12, 2013, was the first week in which EUC08 claimants in Wisconsin who have exhausted Tier 2, and are otherwise eligible, can establish Tier 3 eligibility.
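The TUR thresholds cited throughout the notice (6.0% for Tier 2, 7.0% for Tier 3, 9.0% for Tier 4 of EUC08) can be collected into a small illustrative lookup. This sketch deliberately ignores the IUR-based EB triggers, the phase-out provisions, and the mandatory 13-week ``off'' periods discussed above:

```python
def euc08_tier(tur):
    """Map a state's three-month average seasonally adjusted TUR (percent)
    to the highest EUC08 tier it can trigger 'on', using the 6.0/7.0/9.0
    thresholds cited in the notice. Simplified: ignores IUR-based
    triggers, phase-out rules, and mandatory 13-week 'off' periods."""
    if tur >= 9.0:
        return 4
    if tur >= 7.0:
        return 3
    if tur >= 6.0:
        return 2
    return 1

# Checks against the figures in the notice:
assert euc08_tier(6.9) == 2   # Alabama at 6.9%: off Tier 3
assert euc08_tier(7.1) == 3   # Delaware at 7.1%: on Tier 3
assert euc08_tier(9.3) == 4   # Mississippi at 9.3%: on Tier 4
```

Michigan at 8.9% (below the 9.0% threshold, so off Tier 4 but still on Tier 3) and Louisiana at 5.8% (off Tier 2) follow the same pattern.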

  10. 40 CFR 86.1861-04 - How do the Tier 2 and interim non-Tier 2 NOX averaging, banking and trading programs work?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 2 NOX averaging, banking and trading programs work? 86.1861-04 Section 86.1861-04 Protection of... work? (a) General provisions for Tier 2 credits and debits. (1) A manufacturer whose Tier 2 fleet... to a full useful life of 100,000 miles, provided that the credits are prorated by a multiplicative...
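The snippet above describes fleet-average NOX credit and debit accounting. A loosely sketched credit calculation following the general averaging-banking-trading pattern; this is not the exact § 86.1861-04 formula (the regulation's proration factors and useful-life provisions are elided in the snippet), and all figures are hypothetical:

```python
def nox_credits_mg(fleet_avg_std, fleet_avg_actual, vehicles,
                   useful_life_miles=120000):
    """Illustrative NOX averaging sketch: a fleet averaging below the
    applicable standard banks credits; above it, the result is negative
    (debits). Units: g/mi x vehicles x miles / 1e6 -> megagrams.
    Not the exact 40 CFR 86.1861-04 calculation."""
    return (fleet_avg_std - fleet_avg_actual) * vehicles * useful_life_miles / 1e6

# Hypothetical fleet: 100,000 vehicles averaging 0.05 g/mi against a
# 0.07 g/mi fleet-average standard.
credits = nox_credits_mg(0.07, 0.05, vehicles=100000)
```

A fleet averaging above the standard would produce a negative value, i.e. debits that must be offset by banked or traded credits.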

  11. Cross-species extrapolation of toxicity information using the ...

    EPA Pesticide Factsheets

In the United States, the Endocrine Disruptor Screening Program (EDSP) was established to identify chemicals that may lead to adverse effects via perturbation of the endocrine system (i.e., estrogen, androgen, and thyroid hormone systems). In the mid-1990s the EDSP adopted a two-tiered approach for screening chemicals that applied standardized in vitro and in vivo toxicity tests. The Tier 1 screening assays were designed to identify substances that have the potential of interacting with the endocrine system, and Tier 2 testing was developed to identify adverse effects caused by the chemical, with documentation of dose-response relationships. While this tiered approach was effective in identifying possible endocrine disrupting chemicals, the cost and time to screen a single chemical were significant. Therefore, in 2012 the EDSP proposed a transition to make greater use of computational approaches (in silico) and high-throughput screening (HTS; in vitro) assays to more rapidly and cost-efficiently screen chemicals for endocrine activity. This transition from resource-intensive, primarily in vivo, screening methods to more pathway-based approaches aligns with the simultaneously occurring transformation in toxicity testing termed “Toxicity Testing in the 21st Century”, which shifts the focus to the disturbance of the biological pathway predictive of the observable toxic effects. Examples of such screening tools include the US Environmental Protection Agency's

  12. The diverse use of clouds by CMS

    DOE PAGES

    Andronis, Anastasios; Bauer, Daniela; Chaze, Olivier; ...

    2015-12-23

The resources CMS is using are increasingly being offered as clouds. In Run 2 of the LHC the majority of CMS CERN resources, both in Meyrin and at the Wigner Computing Centre, will be presented as cloud resources on which CMS will have to build its own infrastructure. This infrastructure will need to run all of the CMS workflows including: Tier 0, production and user analysis. In addition, the CMS High Level Trigger will provide a compute resource comparable in scale to the total offered by the CMS Tier 1 sites, when it is not running as part of the trigger system. During these periods a cloud infrastructure will be overlaid on this resource, making it accessible for general CMS use. Finally, CMS is starting to utilise cloud resources offered by individual institutes and is gaining experience to facilitate the use of opportunistically available cloud resources. We also present a snapshot of this infrastructure and its operation at the time of the CHEP2015 conference.

  13. Combining Tier 2 and Tier 3 Supports for Students with Disabilities in General Education Settings

    ERIC Educational Resources Information Center

    MacLeod, K. Sandra; Hawken, Leanne S.; O'Neill, Robert E.; Bundock, Kaitlin

    2016-01-01

    Secondary level or Tier 2 interventions such as the Check-in Check-out (CICO) intervention effectively reduce problem behaviors of students who are non-responsive to school-wide interventions. However, some students will not be successful with Tier 2 interventions. This study investigated the effects of adding individualized function-based support…

  14. Identifying Students for Secondary and Tertiary Prevention Efforts: How Do We Determine Which Students Have Tier 2 and Tier 3 Needs?

    ERIC Educational Resources Information Center

    Lane, Kathleen Lynne; Oakes, Wendy Peia; Ennis, Robin Parks; Hirsch, Shanna Eisner

    2014-01-01

    In comprehensive, integrated, three-tiered models, it is essential to have a systematic method for identifying students who need supports at Tier 2 or Tier 3. This article provides explicit information on how to use multiple sources of data to determine which students might benefit from these supports. First, the authors provide an overview of how…

  15. Tiered Human Integrated Sequence Search Databases for Shotgun Proteomics.

    PubMed

    Deutsch, Eric W; Sun, Zhi; Campbell, David S; Binz, Pierre-Alain; Farrah, Terry; Shteynberg, David; Mendoza, Luis; Omenn, Gilbert S; Moritz, Robert L

    2016-11-04

The results of analysis of shotgun proteomics mass spectrometry data can be greatly affected by the selection of the reference protein sequence database against which the spectra are matched. For many species there are multiple sources from which somewhat different sequence sets can be obtained. This can lead to confusion about which database is best in which circumstances, a problem especially acute in human sample analysis. All sequence databases are genome-based, with sequences for the predicted gene and their protein translation products compiled. Our goal is to create a set of primary sequence databases that comprise the union of sequences from many of the different available sources and make the result easily available to the community. We have compiled a set of four sequence databases of varying sizes, from a small database consisting of only the ∼20,000 primary isoforms plus contaminants to a very large database that includes almost all nonredundant protein sequences from several sources. This set of tiered, increasingly complete human protein sequence databases suitable for mass spectrometry proteomics sequence database searching is called the Tiered Human Integrated Search Proteome set. In order to evaluate the utility of these databases, we have analyzed two different data sets, one from the HeLa cell line and the other from normal human liver tissue, with each of the four tiers of database complexity. The result is that approximately 0.8%, 1.1%, and 1.5% additional peptides can be identified for Tiers 2, 3, and 4, respectively, as compared with the Tier 1 database, at substantially increasing computational cost. This increase in computational cost may be worth bearing if the identification of sequence variants or the discovery of sequences that are not present in the reviewed knowledge base entries is an important goal of the study. 
We find that it is useful to search a data set against a simpler database, and then check the uniqueness of the discovered peptides against a more complex database. We have set up an automated system that downloads all the source databases on the first of each month and automatically generates a new set of search databases and makes them available for download at http://www.peptideatlas.org/thisp/.
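The two-pass workflow in the closing sentences (search against a simpler database, then check peptide uniqueness against a more complex one) can be sketched with toy in-memory sequence sets. The sequences and accession names below are invented; a real implementation would read the THISP FASTA files:

```python
def check_uniqueness(peptides, simple_db, complex_db):
    """Two-pass sketch: peptides identified against a small database are
    re-checked for uniqueness in a larger one. Databases are dicts of
    accession -> protein sequence (toy data, not real THISP tiers)."""
    report = {}
    for pep in peptides:
        hits = sum(pep in seq for seq in complex_db.values())
        report[pep] = {
            "in_simple": any(pep in seq for seq in simple_db.values()),
            "complex_hits": hits,     # number of proteins containing it
            "unique": hits == 1,      # maps to exactly one protein
        }
    return report

# Hypothetical Tier-1-like and Tier-4-like databases.
simple_db = {"P1": "MKWVTFISLLFLFSSAYS"}
complex_db = {"P1": "MKWVTFISLLFLFSSAYS",
              "P1-iso": "MKWVTFISLL",        # hypothetical extra isoform
              "P2": "GLSDGEWQLVLNVWGK"}
r = check_uniqueness(["MKWVTFISLL", "GLSDGEWQ"], simple_db, complex_db)
```

A peptide that looked unique against the small database can map to several entries in the larger one, which is exactly the ambiguity the second pass is meant to expose.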

  16. Tiered Human Integrated Sequence Search Databases for Shotgun Proteomics

    PubMed Central

    Deutsch, Eric W.; Sun, Zhi; Campbell, David S.; Binz, Pierre-Alain; Farrah, Terry; Shteynberg, David; Mendoza, Luis; Omenn, Gilbert S.; Moritz, Robert L.

    2016-01-01

    The results of analysis of shotgun proteomics mass spectrometry data can be greatly affected by the selection of the reference protein sequence database against which the spectra are matched. For many species there are multiple sources from which somewhat different sequence sets can be obtained. This can lead to confusion about which database is best in which circumstances – a problem especially acute in human sample analysis. All sequence databases are genome-based, with sequences for the predicted gene and their protein translation products compiled. Our goal is to create a set of primary sequence databases that comprise the union of sequences from many of the different available sources and make the result easily available to the community. We have compiled a set of four sequence databases of varying sizes, from a small database consisting of only the ~20,000 primary isoforms plus contaminants to a very large database that includes almost all non-redundant protein sequences from several sources. This set of tiered, increasingly complete human protein sequence databases suitable for mass spectrometry proteomics sequence database searching is called the Tiered Human Integrated Search Proteome set. In order to evaluate the utility of these databases, we have analyzed two different data sets, one from the HeLa cell line and the other from normal human liver tissue, with each of the four tiers of database complexity. The result is that approximately 0.8%, 1.1%, and 1.5% additional peptides can be identified for Tiers 2, 3, and 4, respectively, as compared with the Tier 1 database, at substantially increasing computational cost. This increase in computational cost may be worth bearing if the identification of sequence variants or the discovery of sequences that are not present in the reviewed knowledge base entries is an important goal of the study. 
We find that it is useful to search a data set against a simpler database, and then check the uniqueness of the discovered peptides against a more complex database. We have set up an automated system that downloads all the source databases on the first of each month and automatically generates a new set of search databases and makes them available for download at http://www.peptideatlas.org/thisp/. PMID:27577934

  17. Comparison of Three Biomass Sampling Techniques on Submersed Aquatic Plants in a Northern Tier Lake

    DTIC Science & Technology

    2010-07-01

    distribution in 3 out of 14 species when comparing the box-core sampler and the rake method. These included forked duckweed (Lemna trisulca L, p...each site did not exhibit differences. These included coontail (p=0.2949), muskgrass (p=0.2746), American elodea (p=0.7622), forked duckweed (p...collected by the PVC-core sampler. These included coontail (p=0.000), chara (p=0.0219), American elodea (p=0.0061), forked duckweed (p=0.0000), najas (p

  18. Systematic Implementation of a Tier 2 Behavior Intervention

    ERIC Educational Resources Information Center

    Carter, Deborah Russell; Carter, Gabriel M.; Johnson, Evelyn S.; Pool, Juli L.

    2013-01-01

    Schools are increasingly adopting tiered models of prevention to meet the needs of diverse populations of students. This article outlines the steps involved in designing and implementing a systematic Tier 2 behavior intervention within a tiered service delivery model. An elementary school example is provided to outline the identification,…

  19. Estimating Implementation and Operational Costs of an Integrated Tiered CD4 Service including Laboratory and Point of Care Testing in a Remote Health District in South Africa

    PubMed Central

    Cassim, Naseem; Coetzee, Lindi M.; Schnippel, Kathryn; Glencross, Deborah K.

    2014-01-01

Background An integrated tiered service delivery model (ITSDM) has been proposed to provide ‘full-coverage’ of CD4 services throughout South Africa. Five tiers are described, defined by testing volumes and number of referring health-facilities. These include: (1) Tier-1/decentralized point-of-care service (POC) in a single site; Tier-2/POC-hub servicing 8–10 health-clinics, processing 30–40 samples/day; Tier-3/Community laboratories servicing ∼50 health-clinics, processing <150 samples/day; high-volume centralized laboratories (Tier-4 and Tier-5) processing <300 or >600 samples/day and serving >100 or >200 health-clinics, respectively. The objective of this study was to establish costs of existing and ITSDM-tiers 1, 2 and 3 in a remote, under-serviced district in South Africa. Methods Historical health-facility workload volumes from the Pixley-ka-Seme district, and the total volumes of CD4 tests performed by the adjacent district referral CD4 laboratories, linked to locations of all referring clinics and related laboratory-to-result turn-around time (LTR-TAT) data, were extracted from the NHLS Corporate-Data-Warehouse for the period April-2012 to March-2013. Tiers were costed separately (as a cost-per-result) including equipment, staffing, reagents and test consumable costs. A one-way sensitivity analysis provided for changes in reagent price, test volumes and personnel time. Results The lowest cost-per-result was noted for the existing laboratory-based Tiers 4 and 5 ($6.24 and $5.37 respectively), but with related increased LTR-TAT of >24–48 hours. Full service coverage with TAT <6-hours could be achieved with placement of twenty-seven Tier-1/POC or eight Tier-2/POC-hubs, at a cost-per-result of $32.32 and $15.88 respectively. A single district Tier-3 laboratory also ensured ‘full service coverage’ and <24 hour LTR-TAT for the district at $7.42 per-test. 
Conclusion Implementing a single Tier-3/community laboratory to extend and improve delivery of services in Pixley-ka-Seme, with an estimated local ∼12–24-hour LTR-TAT, is ∼$2 more than existing referred services per-test, but 2–4 fold cheaper than implementing eight Tier-2/POC-hubs or providing twenty-seven Tier-1/POCT CD4 services. PMID:25517412
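The cost-per-result construction used in the Methods (fixed equipment and staffing costs spread over annual volume, plus per-test reagents and consumables) can be sketched as follows. The input figures and the straight-line equipment life are hypothetical, not the NHLS values behind the $5.37-$32.32 range reported above:

```python
def cost_per_result(equipment, staffing, reagent_per_test,
                    consumable_per_test, annual_volume,
                    equipment_life_years=5):
    """Cost per CD4 result: annualized equipment plus annual staffing,
    spread over test volume, plus variable per-test costs. All inputs
    are illustrative placeholders."""
    fixed_annual = equipment / equipment_life_years + staffing
    return fixed_annual / annual_volume + reagent_per_test + consumable_per_test

# Hypothetical Tier-3-like laboratory.
c = cost_per_result(equipment=50000, staffing=30000, reagent_per_test=3.0,
                    consumable_per_test=0.5, annual_volume=20000)
```

The structure makes the paper's result intuitive: low-volume POC tiers carry the same kind of fixed costs over far fewer tests, which is why their cost-per-result is several-fold higher than the centralized laboratories'.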

  20. DSCOVR_EPIC_L2_AER_01

    Atmospheric Science Data Center

    2018-04-23

    DSCOVR_EPIC_L2_AER_01 The Aerosol UV product provides aerosol and UV products in three tiers. Tier 1 products include Absorbing Aerosol Index (AAI) and above-cloud-aerosol optical depth (ACAOD). Tier 2 ...

  1. EDSP Tier 2 test (T2T) guidances and protocols are delivered, including web-based guidance for diagnosing and scoring, and evaluating EDC-induced pathology in fish and amphibian

    EPA Science Inventory

    The Agency’s Endocrine Disruptor Screening Program (EDSP) consists of two tiers. The first tier provides information regarding whether a chemical may have endocrine disruption properties. Tier 2 tests provide confirmation of ED effects and dose-response information to be us...

  2. Scaling up a CMS tier-3 site with campus resources and a 100 Gb/s network connection: what could go wrong?

    NASA Astrophysics Data System (ADS)

    Wolf, Matthias; Woodard, Anna; Li, Wenzhao; Hurtado Anampa, Kenyi; Tovar, Benjamin; Brenner, Paul; Lannon, Kevin; Hildreth, Mike; Thain, Douglas

    2017-10-01

The University of Notre Dame (ND) CMS group operates a modest-sized Tier-3 site suitable for local, final-stage analysis of CMS data. However, through the ND Center for Research Computing (CRC), Notre Dame researchers have opportunistic access to roughly 25k CPU cores of computing and a 100 Gb/s WAN network link. To understand the limits of what might be possible in this scenario, we undertook to use these resources for a wide range of CMS computing tasks, from user analysis through large-scale Monte Carlo production (including both detector simulation and data reconstruction). We will discuss the challenges inherent in effectively utilizing CRC resources for these tasks and the solutions deployed to overcome them.

  3. Mathematics Intervention for First- and Second-Grade Students with Mathematics Difficulties: The Effects of Tier 2 Intervention Delivered as Booster Lessons

    ERIC Educational Resources Information Center

    Bryant, Diane Pedrotty; Bryant, Brian R.; Gersten, Russell; Scammacca, Nancy; Chavez, Melissa M.

    2008-01-01

    This study sought to examine the effects of Tier 2 intervention in a multitiered model on the performance of first- and second-grade students who were identified as having mathematics difficulties. A regression discontinuity design was utilized. Participants included 126 (Tier 2, n = 26) first graders and 140 (Tier 2, n = 25) second graders. Tier…

  4. Modeling individual exposures to ambient PM2.5 in the diabetes and the environment panel study (DEPS).

    PubMed

    Breen, Michael; Xu, Yadong; Schneider, Alexandra; Williams, Ronald; Devlin, Robert

    2018-06-01

Air pollution epidemiology studies of ambient fine particulate matter (PM2.5) often use outdoor concentrations as exposure surrogates, which can induce exposure error. The goal of this study was to improve ambient PM2.5 exposure assessments for a repeated measurements study with 22 diabetic individuals in central North Carolina called the Diabetes and Environment Panel Study (DEPS) by applying the Exposure Model for Individuals (EMI), which predicts five tiers of individual-level exposure metrics for ambient PM2.5 using outdoor concentrations, questionnaires, weather, and time-location information. Using EMI, we linked a mechanistic air exchange rate (AER) model to a mass-balance PM2.5 infiltration model to predict residential AER (Tier 1), infiltration factors (Finf_home, Tier 2), indoor concentrations (Cin, Tier 3), personal exposure factors (Fpex, Tier 4), and personal exposures (E, Tier 5) for ambient PM2.5. We applied EMI to predict daily PM2.5 exposure metrics (Tiers 1-5) for 174 participant-days across the 13 months of DEPS. Individual model predictions were compared to a subset of daily measurements of Fpex and E (Tiers 4-5) from the DEPS participants. Model-predicted Fpex and E corresponded well to daily measurements, with a median difference of 14% and 23%, respectively. Daily model predictions for all 174 days showed considerable temporal and house-to-house variability of AER, Finf_home, and Cin (Tiers 1-3), and person-to-person variability of Fpex and E (Tiers 4-5). Our study demonstrates the capability of predicting individual-level ambient PM2.5 exposure metrics for an epidemiological study, in support of improving risk estimation. Copyright © 2018. Published by Elsevier B.V.
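A minimal sketch of how the five tiers chain together, using a generic steady-state mass-balance infiltration model of the kind EMI links to an AER model. The deposition rate, penetration efficiency, and time-indoors fraction below are illustrative defaults, not the DEPS estimates:

```python
def tiered_pm25(c_out, aer, k_dep=0.39, penetration=1.0, f_time_indoors=0.87):
    """Chain Tier 1 (AER, supplied) through Tier 5 (personal exposure)
    with a steady-state mass balance: F_inf = P*AER / (AER + k_dep).
    All parameter defaults are generic illustrative values (1/h for
    rates, ug/m^3 for concentrations), not fitted DEPS parameters."""
    f_inf = penetration * aer / (aer + k_dep)            # Tier 2
    c_in = f_inf * c_out                                 # Tier 3
    # Time-weighted exposure factor: indoors at F_inf, outdoors at 1.
    f_pex = f_time_indoors * f_inf + (1 - f_time_indoors)  # Tier 4
    exposure = f_pex * c_out                             # Tier 5
    return {"F_inf": f_inf, "C_in": c_in, "F_pex": f_pex, "E": exposure}

# Outdoor concentration of 12 ug/m^3 and an AER of 0.5 1/h.
m = tiered_pm25(c_out=12.0, aer=0.5)
```

Because F_inf < 1, the chained personal exposure E is always below the outdoor surrogate, which is the exposure error the study quantifies.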

  5. Proposed Tier 2 Screening Criteria and Tier 3 Field Procedures for Evaluation of Vapor Intrusion (ESTCP Cost and Performance Report)

    DTIC Science & Technology

    2012-08-01

    Interstate Technology & Regulatory Council, Washington, DC, Copyright 2007. McHugh T.E., D.E. Hammond, T. Nickels , and B. Hartman. 2008. Use of...based corrective action have realized significant cost savings for their corrective action programs (Connor and McHugh , 2002). As described above...Groundwater (Tier 2) VOCs USEPA 8260B 40 mL VOA vial HCl 14 days Vapor (Tier 2 and Tier 3) Radon McHugh et al., 2008 500 mL Tedlar bag None 14

  6. 40 CFR Appendix I to Part 1042 - Summary of Previous Emission Standards

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...: (a) Engines below 37 kW. Tier 1 and Tier 2 standards for engines below 37 kW apply as specified in 40... Engines Below 37 kW (g/kW-hr), columns: Rated power (kW), Tier, Model year, NMHC + NOX, CO, PM. kW<8: Tier 1, 2000, 10.5, 8.0, 1.0; Tier 2, 2005, 7.5, 8.0, 0.80. 8≤kW<19: Tier 1, 2000, 9.5, 6.6, 0.80; Tier 2, 2005, 7.5, 6.6, 0.80. 19≤kW<...

  7. 24 CFR 203.605 - Loss mitigation performance.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... performance. (1) HUD will measure and advise mortgagees of their loss mitigation performance through the Tier... mitigation attempts, defaults, and claims. Based on the ratios, HUD will group mortgagees in four tiers (Tiers 1, 2, 3, and 4), with Tier 1 representing the highest or best ranking mortgagees and Tier 4...

  8. Structure of eukaryotic CMG helicase at a replication fork and implications to replisome architecture and origin initiation

    PubMed Central

    Georgescu, Roxana; Yuan, Zuanning; Bai, Lin; de Luna Almeida Santos, Ruda; Sun, Jingchuan; Zhang, Dan; Yurieva, Olga; Li, Huilin; O’Donnell, Michael E.

    2017-01-01

    The eukaryotic CMG (Cdc45, Mcm2–7, GINS) helicase consists of the Mcm2–7 hexameric ring along with five accessory factors. The Mcm2–7 heterohexamer, like other hexameric helicases, is shaped like a ring with two tiers, an N-tier ring composed of the N-terminal domains, and a C-tier of C-terminal domains; the C-tier contains the motor. In principle, either tier could translocate ahead of the other during movement on DNA. We have used cryo-EM single-particle 3D reconstruction to solve the structure of CMG in complex with a DNA fork. The duplex stem penetrates into the central channel of the N-tier and the unwound leading single-strand DNA traverses the channel through the N-tier into the C-tier motor, 5′-3′ through CMG. Therefore, the N-tier ring is pushed ahead by the C-tier ring during CMG translocation, opposite the currently accepted polarity. The polarity of the N-tier ahead of the C-tier places the leading Pol ε below CMG and Pol α-primase at the top of CMG at the replication fork. Surprisingly, the new N-tier to C-tier polarity of translocation reveals an unforeseen quality-control mechanism at the origin. Thus, upon assembly of head-to-head CMGs that encircle double-stranded DNA at the origin, the two CMGs must pass one another to leave the origin and both must remodel onto opposite strands of single-stranded DNA to do so. We propose that head-to-head motors may generate energy that underlies initial melting at the origin. PMID:28096349

  9. A Systematic Review of the Empirical Support for Check-in Check-Out

    ERIC Educational Resources Information Center

    Wolfe, Katie; Pyle, Daniel; Charlton, Cade T.; Sabey, Christian V.; Lund, Emily M.; Ross, Scott W.

    2016-01-01

    Tier 2 interventions play an important role within the Positive Behavioral Interventions and Supports framework, bridging the gap between schoolwide Tier 1 interventions and individualized Tier 3 supports. Check-in Check-out (CICO) is a promising Tier 2 intervention for addressing mild problem behavior and potentially preventing the need for more…

  10. Tier 1 and Tier 2 Early Intervention for Handwriting and Composing

    ERIC Educational Resources Information Center

    Berninger, Virginia W.; Rutberg, Judith E.; Abbott, Robert D.; Garcia, Noelia; Anderson-Youngstrom, Marci; Brooks, Allison; Fulton, Cynthia

    2006-01-01

    Three studies evaluated Tier 1 early intervention for handwriting at a critical period for literacy development in first grade and one study evaluated Tier 2 early intervention in the critical period between third and fourth grades for composing on high stakes tests. The results contribute to knowledge of research-supported handwriting and…

  11. 30 CFR 57.5067 - Engines.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) light duty truck 0.1 g/mile. 40 CFR 86.094-11(a)(1)(iv)(B) heavy duty highway engine 0.1 g/bhp-hr. 40... g/bhp-hr). tier 1 8≤kWbhp-hr). tier 1 19≤kWbhp-hr). tier 2 37≤kWbhp-hr). tier 2 75≤kW<130...

  12. Computation of Flow Through Water-Control Structures Using Program DAMFLO.2

    USGS Publications Warehouse

    Sanders, Curtis L.; Feaster, Toby D.

    2004-01-01

    As part of its mission to collect, analyze, and store streamflow data, the U.S. Geological Survey computes flow through several dam structures throughout the country. Flows are computed using hydraulic equations that describe flow through sluice and Tainter gates, crest gates, lock gates, spillways, locks, pumps, and siphons, which are calibrated using flow measurements. The program DAMFLO.2 was written to compute, tabulate, and plot flow through dam structures using data that describe the physical properties of dams and various hydraulic parameters and ratings that use time-varying data, such as lake elevations or gate openings. The program uses electronic computer files of time-varying data, such as lake elevation or gate openings, retrieved from the U.S. Geological Survey Automated Data Processing System. Computed time-varying flow data from DAMFLO.2 are output in flat files, which can be entered into the Automated Data Processing System database. All computations are made in units of feet and seconds. DAMFLO.2 uses the procedures and language developed by the SAS Institute Inc.
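
    As a rough illustration of one rating such programs evaluate, free flow through a sluice gate is commonly modeled as Q = Cd·A·sqrt(2gh). The discharge coefficient and gate geometry below are invented for the sketch, not values from DAMFLO.2, which calibrates its ratings against flow measurements.

```python
import math

G = 32.174  # ft/s^2; DAMFLO.2 computes in units of feet and seconds

def sluice_gate_flow(gate_opening_ft, gate_width_ft, head_ft, cd=0.6):
    """Flow (ft^3/s) through a sluice gate under free-flow conditions."""
    area = gate_opening_ft * gate_width_ft
    return cd * area * math.sqrt(2.0 * G * head_ft)

# e.g. a 10 ft wide gate opened 2 ft under 15 ft of head:
q = sluice_gate_flow(2.0, 10.0, 15.0)
```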

  13. Single-tier testing with the C6 peptide ELISA kit compared with two-tier testing for Lyme disease.

    PubMed

    Wormser, Gary P; Schriefer, Martin; Aguero-Rosenfeld, Maria E; Levin, Andrew; Steere, Allen C; Nadelman, Robert B; Nowakowski, John; Marques, Adriana; Johnson, Barbara J B; Dumler, J Stephen

    2013-01-01

    For the diagnosis of Lyme disease, the 2-tier serologic testing protocol has a number of shortcomings, including low sensitivity in early disease; increased cost, time, and labor; and subjectivity in the interpretation of immunoblots. In this study, the diagnostic accuracy of a single-tier commercial C6 ELISA kit was compared with 2-tier testing. The results showed that the C6 ELISA was significantly more sensitive than 2-tier testing, with sensitivities of 66.5% (95% confidence interval [CI] 61.7-71.1) and 35.2% (95% CI 30.6-40.1), respectively (P < 0.001) in 403 sera from patients with erythema migrans. The C6 ELISA had sensitivity statistically comparable to 2-tier testing in sera from Lyme disease patients with early neurologic manifestations (88.6% versus 77.3%, P = 0.13) or arthritis (98.3% versus 95.6%, P = 0.38). The specificities of C6 ELISA and 2-tier testing in over 2200 blood donors, patients with other conditions, and Lyme disease vaccine recipients were found to be 98.9% and 99.5%, respectively (P < 0.05; 95% CI for the 0.6 percentage point difference: 0.04 to 1.15). In conclusion, using a reference standard of 2-tier testing, the C6 ELISA as a single-step serodiagnostic test provided increased sensitivity in early Lyme disease with comparable sensitivity in later manifestations of Lyme disease. The C6 ELISA had slightly decreased specificity. Future studies should evaluate the performance of the C6 ELISA compared with 2-tier testing in routine clinical practice. Copyright © 2013 Elsevier Inc. All rights reserved.
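
    The reported sensitivity intervals can be sanity-checked with a Wilson score interval, assuming the integer counts implied by the percentages (268/403 for the C6 ELISA in erythema migrans sera). This illustrates the interval arithmetic only; it is not necessarily the authors' exact method.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# C6 ELISA: 268/403 ~ 66.5%; the interval lands close to the reported 61.7-71.1%
lo, hi = wilson_ci(268, 403)
```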

  14. Lessons learned from the ATLAS performance studies of the Iberian Cloud for the first LHC running period

    NASA Astrophysics Data System (ADS)

    Sánchez-Martínez, V.; Borges, G.; Borrego, C.; del Peso, J.; Delfino, M.; Gomes, J.; González de la Hoz, S.; Pacheco Pages, A.; Salt, J.; Sedov, A.; Villaplana, M.; Wolters, H.

    2014-06-01

    In this contribution we describe the performance of the Iberian (Spain and Portugal) ATLAS cloud during the first LHC running period (March 2010-January 2013) in the context of the GRID Computing and Data Distribution Model. The evolution of the resources for CPU, disk and tape in the Iberian Tier-1 and Tier-2s is summarized. The data distribution over all ATLAS destinations is shown, focusing on the number of files transferred and the size of the data. The status and distribution of simulation and analysis jobs within the cloud are discussed. The Distributed Analysis tools used to perform physics analysis are explained as well. Cloud performance in terms of the availability and reliability of its sites is discussed. The effect of the changes in the ATLAS Computing Model on the cloud is analyzed. Finally, the readiness of the Iberian Cloud towards the first Long Shutdown (LS1) is evaluated and an outline of the foreseen actions to take in the coming years is given. The shutdown will be a good opportunity to improve and evolve the ATLAS Distributed Computing system to prepare for the future challenges of the LHC operation.

  15. 50 CFR 660.211 - Fixed gear fishery-definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... vessel registered to a limited entry fixed gear permit(s) with a Tier 1, Tier 2, and/or Tier 3... fishery or sablefish tier limit fishery means, for the limited entry fixed gear sablefish fishery north of... tier limit and when they are not eligible to fish in the DTL fishery. Sablefish primary season means...

  16. Operationally Responsive Space: Creating Responsive Space for America

    DTIC Science & Technology

    2008-06-20

    programs. Items identified for improvement include: 1) Fractured management and accounting, 2) Satellite availability, 3) Facilities (Processing...enveloped by our three output tiers. The three tiers: • Tier 1 is an immediate response taking minutes to days. • Tier 2 is a mid-term response taking...needs to improve. 2. Develop a philosophy. 3. Set a direction with specific goals (Fuchs 1-7). The Department of Space should follow these

  17. Examining the Predictive Validity of a Dynamic Assessment of Decoding to Forecast Response to Tier 2 Intervention

    ERIC Educational Resources Information Center

    Cho, Eunsoo; Compton, Donald L.; Fuchs, Douglas; Fuchs, Lynn S.; Bouton, Bobette

    2014-01-01

    The purpose of this study was to examine the role of a dynamic assessment (DA) of decoding in predicting responsiveness to Tier 2 small-group tutoring in a response-to-intervention model. First grade students (n = 134) who did not show adequate progress in Tier 1 based on 6 weeks of progress monitoring received Tier 2 small-group tutoring in…

  18. Examining the Predictive Validity of a Dynamic Assessment of Decoding to Forecast Response Tier 2 to Intervention

    PubMed Central

    Cho, Eunsoo; Compton, Donald L.; Fuchs, Doug; Fuchs, Lynn S.; Bouton, Bobette

    2013-01-01

    The purpose of this study was to examine the role of a dynamic assessment (DA) of decoding in predicting responsiveness to Tier 2 small group tutoring in a response-to-intervention model. First-grade students (n=134) who did not show adequate progress in Tier 1 based on 6 weeks of progress monitoring received Tier 2 small-group tutoring in reading for 14 weeks. Student responsiveness to Tier 2 was assessed weekly with word identification fluency (WIF). A series of conditional individual growth curve analyses were completed that modeled the correlates of WIF growth (final level of performance and growth). Its purpose was to examine the predictive validity of DA in the presence of 3 sets of variables: static decoding measures, Tier 1 responsiveness indicators, and pre-reading variables (phonemic awareness, rapid letter naming, oral vocabulary, and IQ). DA was a significant predictor of final level and growth, uniquely explaining 3% – 13% of the variance in Tier 2 responsiveness depending on the competing predictors in the model and WIF outcome (final level of performance or growth). Although the additional variances explained uniquely by DA were relatively small, results indicate the potential of DA in identifying Tier 2 nonresponders. PMID:23213050

  19. Examining the predictive validity of a dynamic assessment of decoding to forecast response to tier 2 intervention.

    PubMed

    Cho, Eunsoo; Compton, Donald L; Fuchs, Douglas; Fuchs, Lynn S; Bouton, Bobette

    2014-01-01

    The purpose of this study was to examine the role of a dynamic assessment (DA) of decoding in predicting responsiveness to Tier 2 small-group tutoring in a response-to-intervention model. First grade students (n = 134) who did not show adequate progress in Tier 1 based on 6 weeks of progress monitoring received Tier 2 small-group tutoring in reading for 14 weeks. Student responsiveness to Tier 2 was assessed weekly with word identification fluency (WIF). A series of conditional individual growth curve analyses were completed that modeled the correlates of WIF growth (final level of performance and growth). Its purpose was to examine the predictive validity of DA in the presence of three sets of variables: static decoding measures, Tier 1 responsiveness indicators, and prereading variables (phonemic awareness, rapid letter naming, oral vocabulary, and IQ). DA was a significant predictor of final level and growth, uniquely explaining 3% to 13% of the variance in Tier 2 responsiveness depending on the competing predictors in the model and WIF outcome (final level of performance or growth). Although the additional variances explained uniquely by DA were relatively small, results indicate the potential of DA in identifying Tier 2 nonresponders. © Hammill Institute on Disabilities 2012.
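
    The growth-curve outcomes in the study above (final level of performance and growth) can be illustrated with a minimal per-student least-squares fit of weekly WIF scores. The data below are synthetic, for illustration only; the study used conditional individual growth curve models, not a plain line fit.

```python
import numpy as np

def wif_growth(weeks, scores):
    """Least-squares line; returns (final level at last week, weekly growth)."""
    slope, intercept = np.polyfit(weeks, scores, 1)  # highest degree first
    return intercept + slope * weeks[-1], slope

weeks = np.arange(14)                      # 14 weeks of Tier 2 tutoring
scores = 10 + 1.5 * weeks                  # a hypothetical steady responder
final, growth = wif_growth(weeks, scores)
```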

  20. Deployment and Operational Experiences with CernVM-FS at the GridKa Tier-1 Center

    NASA Astrophysics Data System (ADS)

    Alef, Manfred; Jäger, Axel; Petzold, Andreas; Verstege, Bernhard

    2012-12-01

    In 2012 the GridKa Tier-1 computing center hosts 130 kHS06 of computing resources and 14 PB of disk and 17 PB of tape space. These resources are shared between the four LHC VOs and a number of national and international VOs from high energy physics and other sciences. CernVM-FS has been deployed at GridKa to supplement the existing NFS-based system used to access VO software on the worker nodes. It provides a solution tailored to the requirements of the LHC VOs. We will focus on the first operational experiences and the monitoring of CernVM-FS on the worker nodes and the squid caches.

  1. Effects of a Tier 3 Self-Management Intervention Implemented with and without Treatment Integrity

    ERIC Educational Resources Information Center

    Lower, Ashley; Young, K. Richard; Christensen, Lynnette; Caldarella, Paul; Williams, Leslie; Wills, Howard

    2016-01-01

    This study investigated the effects of a Tier 3 peer-matching self-management intervention on two elementary school students who had previously been less responsive to Tier 1 and Tier 2 interventions. The Tier 3 self-management intervention, which was implemented in the general education classrooms, included daily electronic communication between…

  2. 26 CFR 1.1248-7 - Taxpayer to establish earnings and profits and foreign taxes.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... shall also show for the first tier corporation, and for each lower tier corporation as to which...) of § 1.1248-2, and (iv) If the amount of earnings and profits of a lower tier corporation... lower tier corporation which the taxpayer owns within the meaning of section 958(a)(2)(b) the total...

  3. A Data-Driven Preschool PD Model for Literacy and Oral Language Instruction

    ERIC Educational Resources Information Center

    Abbott, Mary; Atwater, Jane; Lee, Younwoo; Edwards, Liesl

    2011-01-01

    The purpose of this article is to describe the professional development (PD) model for preschool literacy and language instruction that took place in a 3-year, 2-tiered Early Reading First project in 9 Head Start and community-based school classrooms. In our tiered model, the Tier 1 level was classroom instruction and Tier 2 was intervention…

  4. Tonle Sap Lake Water Storage Change Over 24 Years From Satellite Observation and Its Link With Mekong River Discharge and Climate Events

    NASA Astrophysics Data System (ADS)

    Biancamaria, S.; Frappart, F.; Normandin, C.; Blarel, F.; Bourrel, L.; Aumont, M.; Azema, P.; Vu, P. L.; Lubac, B.; Darrozes, J.

    2017-12-01

    The Tonle Sap lake is the largest freshwater lake in Southeast Asia and is located within the Mekong basin (mainly in Cambodia). It is one of the most productive ecosystems in the world and provides two thirds of Cambodia's fish catch. It also plays a unique role in the Mekong basin hydrological cycle: during the monsoon period, the Mekong river partially flows into the lake, whereas during the dry season, the lake drains to the Mekong delta. It is therefore crucial to monitor and account for this lake when estimating Mekong discharge to the ocean. However, in situ measurements of lake level and river discharge are very sparse (especially during the last decades), and computing lake storage variation from in situ data alone is difficult because of the huge annual variation of lake area. For this reason, satellite data (nadir radar altimetry and visible imagery) have been used to study its volume variation and its relationship with climate events and Mekong river discharge. Multi-mission altimetry data have been extracted (TOPEX, ERS-2, ENVISAT, Jason-1, Jason-2, SARAL, and Jason-3, using the CTOH data extraction tools) to derive lake water levels from 1993 to 2016, which vary from 3 m to 12 m. Lake area has been computed from MODIS data from 2000 to 2016 and varies from 3,400 km2 to 11,800 km2. These datasets clearly show a relationship between lake water level and area, which has been used to estimate lake water volume change from 1995 to 2016, with a minimum in 2015 and a maximum in 2011. The lake's droughts and floods can be observed during moderate and strong El Niño/La Niña events, enhanced by the Pacific Decadal Oscillation. Besides, comparison with in situ discharge at the outlet of the Mekong basin (over the 1995-2000 time period) shows that lake water level lags Mekong discharge at the outlet by about 20 days, rising and falling after it. This time lag results from the Mekong river partially flowing into the lake.
    Finally, the high correlation between lake level and outlet discharge makes it possible to use lake water level to derive Mekong discharge at the outlet after 2000, when in situ time series are no longer available to the international scientific community. In the future, to improve the time sampling, Sentinel-2 images and data from the Sentinel-3 altimeter will be used.
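
    The level-area approach described above can be sketched as a hypsometric integration: volume change between two lake stands is the integral of area over level, approximated by the trapezoidal rule. The level-area pairs below are rough illustrative points spanning the reported ranges (3-12 m level, 3,400-11,800 km2 area), not the actual satellite data.

```python
def storage_change(levels_m, areas_km2):
    """Volume change (km3) between first and last level via trapezoids."""
    dv = 0.0
    for i in range(len(levels_m) - 1):
        dh_km = (levels_m[i + 1] - levels_m[i]) / 1000.0  # m -> km
        dv += 0.5 * (areas_km2[i] + areas_km2[i + 1]) * dh_km
    return dv

levels = [3.0, 6.0, 9.0, 12.0]             # illustrative lake stands (m)
areas = [3400.0, 6200.0, 9000.0, 11800.0]  # illustrative matching areas (km2)
dv = storage_change(levels, areas)         # storage gained from low to high stand
```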

  5. Role of ion-pair states in the predissociation dynamics of Rydberg states of molecular iodine.

    PubMed

    von Vangerow, J; Bogomolov, A S; Dozmorov, N V; Schomas, D; Stienkemeier, F; Baklanov, A V; Mudrich, M

    2016-07-28

    Using femtosecond pump-probe ion imaging spectroscopy, we establish the key role of I(+) + I(-) ion-pair (IP) states in the predissociation dynamics of molecular iodine I2 excited to Rydberg states. Two-photon excitation of Rydberg states lying above the lowest IP state dissociation threshold (1st tier) is found to be followed by direct parallel transitions into IP states of the 1st tier asymptotically correlating to a pair of I ions in their lowest states I(+)((3)P2) + I(-)((1)S0), of the 2nd tier correlating to I(+)((3)P0) + I(-)((1)S0), and of the 3rd tier correlating to I(+)((1)D2) + I(-)((1)S0). Predissociation via the 1st tier presumably proceeds with a delay of 1.6-1.7 ps, which is close to the vibrational period in the 3rd tier state (a 3rd tier-mediated process). The 2nd tier IP state is concluded to be the main precursor for predissociation via lower lying Rydberg states, proceeding with a characteristic time of 7-8 ps and giving rise to Rydberg atoms I(5s(2)5p(4)6s(1)). The channel generating I((2)P3/2) + I((2)P1/2) atoms with total kinetic energy corresponding to one-photon excitation is found to proceed via a pump-dump mechanism, with a dramatic change of the angular anisotropy of this channel as compared with earlier nanosecond experiments.

  6. Benchmarking multimedia performance

    NASA Astrophysics Data System (ADS)

    Zandi, Ahmad; Sudharsanan, Subramania I.

    1998-03-01

    With the introduction of faster processors and special instruction sets tailored to multimedia, a number of exciting applications are now feasible on the desktop. Among these is DVD playback, consisting, among other things, of MPEG-2 video and Dolby Digital or MPEG-2 audio. Other multimedia applications such as video conferencing and speech recognition are also becoming popular on computer systems. In view of this tremendous interest in multimedia, a group of major computer companies has formed the Multimedia Benchmarks Committee as part of the Standard Performance Evaluation Corp. to address the performance issues of multimedia applications. The approach is multi-tiered, with three tiers of fidelity from minimal to fully compliant. In each case the fidelity of the bitstream reconstruction as well as the quality of the video or audio output are measured, and the system is classified accordingly. At the next step the performance of the system is measured. Many multimedia applications, such as DVD playback, need to run at a specific rate; in this case the measurement of excess processing power makes all the difference. All of this makes a system-level, application-based multimedia benchmark very challenging. Several ideas and methodologies for each aspect of the problem will be presented and analyzed.

  7. 78 FR 68895 - Self-Regulatory Organizations; NASDAQ OMX PHLX LLC; Notice of Filing and Immediate Effectiveness...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-15

    ... Options Classes, Category A Category B excluding SPY Options (Monthly) Tier 1 0.00%-0.75% 0.00 0.00 Tier 2 Above 0.75%-1.60% 0.12 0.17 Tier 3 Above 1.60%-2.60% 0.14 0.17 Tier 4 Above 2.60% 0.15 0.17 The Exchange... Securities Exchange Act of 1934 (``Act'') \\1\\ and Rule 19b-4 thereunder,\\2\\ notice is hereby given that, on...

  8. The Influence Of Team Rating On Running Performance In Elite Gaelic Football.

    PubMed

    Mangan, Shane; Malone, Shane; Ryan, Martin; Gahan, Jason Mc; Warne, Joe; Martin, Denise; O'Neill, Cian; Burns, Con; Collins, Kieran

    2017-11-06

    It is currently unknown how team rating influences running performance in Gaelic football. GPS technologies were used to quantify match-running performance within 5 elite Gaelic football teams over a period of 5 years (2012-2016). In total 780 player data sets were collected over 95 matches. Running performance variables included total distance, high-speed distance (≥17 km/h) and the percentage of high-speed distance. Team ratings were determined objectively using the Elo Ratings System for Gaelic football. Reference team rating had trivial effects on total distance (p = 0.011, partial η2 = 0.008) and high-speed distance (p = 0.011, partial η2 = 0.008). Opposition team rating had small effects on total distance (p = 0.005, partial η2 = 0.016) and high-speed distance (p = 0.001, partial η2 = 0.020). Top tier teams cover greater total distances and high-speed distance than lower tier teams. Players cover considerably less total distance and high-speed distance against tier 3 and tier 4 teams. Tier 1 players ran a significantly higher percentage of distance at high speed than players who played for tier 2 teams (p = 0.020). The competitive advantage of top tier Gaelic football teams is closely linked with their ability to demonstrate a higher physical intensity than lower tier teams.
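
    For reference, the standard Elo update underlying such rating systems can be sketched as below. The K-factor and ratings are illustrative; the Gaelic-football Elo implementation used in the study may differ in its parameters.

```python
def expected_score(rating_a, rating_b):
    """Expected match score for team A under the logistic Elo curve."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

def elo_update(rating_a, rating_b, score_a, k=32.0):
    """score_a: 1 win, 0.5 draw, 0 loss for team A. Returns updated ratings."""
    delta = k * (score_a - expected_score(rating_a, rating_b))
    return rating_a + delta, rating_b - delta

# a 1600-rated team beats a 1500-rated team; ratings shift toward the result
new_a, new_b = elo_update(1600.0, 1500.0, 1.0)
```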

  9. Regulatory Compliance in Multi-Tier Supplier Networks

    NASA Technical Reports Server (NTRS)

    Goossen, Emray R.; Buster, Duke A.

    2014-01-01

    Over the years, avionics systems have increased in complexity to the point where 1st tier suppliers to an aircraft OEM find it financially beneficial to outsource designs of subsystems to 2nd tier and at times to 3rd tier suppliers. Combined with challenging schedule and budgetary pressures, the environment in which safety-critical systems are being developed introduces new hurdles for regulatory agencies and industry. This new environment of both complex systems and tiered development has raised concerns in the ability of the designers to ensure safety considerations are fully addressed throughout the tier levels. This has also raised questions about the sufficiency of current regulatory guidance to ensure: proper flow down of safety awareness, avionics application understanding at the lower tiers, OEM and 1st tier oversight practices, and capabilities of lower tier suppliers. Therefore, NASA established a research project to address Regulatory Compliance in a Multi-tier Supplier Network. This research was divided into three major study efforts: 1. Describe Modern Multi-tier Avionics Development 2. Identify Current Issues in Achieving Safety and Regulatory Compliance 3. Short-term/Long-term Recommendations Toward Higher Assurance Confidence This report presents our findings of the risks, weaknesses, and our recommendations. It also includes a collection of industry-identified risks, an assessment of guideline weaknesses related to multi-tier development of complex avionics systems, and a postulation of potential modifications to guidelines to close the identified risks and weaknesses.

  10. Task Selection, Task Switching and Multitasking during Computer-Based Independent Study

    ERIC Educational Resources Information Center

    Judd, Terry

    2015-01-01

    Detailed logs of students' computer use, during independent study sessions, were captured in an open-access computer laboratory. Each log consisted of a chronological sequence of tasks representing either the application or the Internet domain displayed in the workstation's active window. Each task was classified using a three-tier schema…

  11. 20 CFR 225.23 - Combined Earnings PIA used in survivor annuities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... section 215 of the Social Security Act as in effect on December 31, 1974. It is computed using the... RAILROAD RETIREMENT ACT PRIMARY INSURANCE AMOUNT DETERMINATIONS PIA's Used in Computing Survivor Annuities... annuities. The Combined Earnings PIA used in survivor annuities may be used in computing the tier II...

  12. 20 CFR 225.24 - SS Earnings PIA used in survivor annuities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Security Earnings PIA (SS Earnings PIA) used in survivor annuities may be used in computing the tier II... the Social Security Act as in effect on December 31, 1974. It is computed using the deceased employee... RETIREMENT ACT PRIMARY INSURANCE AMOUNT DETERMINATIONS PIA's Used in Computing Survivor Annuities and the...

  13. Wireless Testbed Bonsai

    DTIC Science & Technology

    2006-02-01

    wireless sensor device network, and a higher-tier multi-hop peer-to-peer 802.11b wireless network of about 200 Stargate nodes. Leading up to the full ExScal...deployment, we conducted spatial scaling tests on our higher-tier protocols on a 7 × 7 grid of Stargate nodes with 45 m and 90 m separations, respectively...on W and its scaled version W̃. III. EXPERIMENTAL SETUP Description of Kansei testbed. A Stargate is a single-board Linux-based computer [7]. It uses a

  14. Screening Chemicals for Estrogen Receptor Bioactivity Using a Computational Model.

    PubMed

    Browne, Patience; Judson, Richard S; Casey, Warren M; Kleinstreuer, Nicole C; Thomas, Russell S

    2015-07-21

    The U.S. Environmental Protection Agency (EPA) is considering high-throughput and computational methods to evaluate the endocrine bioactivity of environmental chemicals. Here we describe a multistep, performance-based validation of new methods and demonstrate that these new tools are sufficiently robust to be used in the Endocrine Disruptor Screening Program (EDSP). Results from 18 estrogen receptor (ER) ToxCast high-throughput screening assays were integrated into a computational model that can discriminate bioactivity from assay-specific interference and cytotoxicity. Model scores range from 0 (no activity) to 1 (bioactivity of 17β-estradiol). ToxCast ER model performance was evaluated for reference chemicals, as well as results of EDSP Tier 1 screening assays in current practice. The ToxCast ER model accuracy was 86% to 93% when compared to reference chemicals and predicted results of EDSP Tier 1 guideline and other uterotrophic studies with 84% to 100% accuracy. The performance of high-throughput assays and ToxCast ER model predictions demonstrates that these methods correctly identify active and inactive reference chemicals, provide a measure of relative ER bioactivity, and rapidly identify chemicals with potential endocrine bioactivities for additional screening and testing. EPA is accepting ToxCast ER model data for 1812 chemicals as alternatives for EDSP Tier 1 ER binding, ER transactivation, and uterotrophic assays.
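
    A toy sketch of the integrate-and-cutoff idea described above: combine assay responses into a 0-1 score and call a chemical ER-active above a cutoff. The weighted mean and the cutoff value here are invented for illustration; the actual ToxCast ER model discriminates agonist, antagonist, and interference patterns across the 18 assays rather than taking a simple average.

```python
def er_model_score(responses, weights):
    """Toy integration: weighted mean of normalized responses, clipped to [0, 1]."""
    s = sum(w * r for w, r in zip(weights, responses)) / sum(weights)
    return min(max(s, 0.0), 1.0)

def accuracy(scores, reference_active, cutoff=0.1):
    """Fraction of chemicals where (score >= cutoff) matches the reference call."""
    hits = sum((s >= cutoff) == ref for s, ref in zip(scores, reference_active))
    return hits / len(scores)

# four hypothetical chemicals, three assays with invented weights:
scores = [er_model_score(r, [1.0, 1.0, 2.0]) for r in
          [(0.9, 0.8, 1.0), (0.0, 0.1, 0.0), (0.3, 0.2, 0.4), (0.05, 0.0, 0.0)]]
acc = accuracy(scores, [True, False, True, False])
```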

  15. 40 CFR 1033.102 - Transition to the standards of this part.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Tier 0 and Tier 1 standards of § 1033.101 apply for new locomotives beginning January 1, 2010, except as specified in § 1033.150(a). The Tier 0 and Tier 1 standards of 40 CFR part 92 apply for earlier... locomotives beginning January 1, 2013. The Tier 2 standards of 40 CFR part 92 apply for earlier model years...

  16. ATLAS user analysis on private cloud resources at GoeGrid

    NASA Astrophysics Data System (ADS)

    Glaser, F.; Nadal Serrano, J.; Grabowski, J.; Quadt, A.

    2015-12-01

    User analysis job demands can exceed available computing resources, especially before major conferences. ATLAS physics results can potentially be slowed down due to the lack of resources. For these reasons, cloud research and development activities are now included in the skeleton of the ATLAS computing model, which has been extended by using resources from commercial and private cloud providers to satisfy the demands. However, most of these activities are focused on Monte-Carlo production jobs, extending the resources at Tier-2. To evaluate the suitability of the cloud-computing model for user analysis jobs, we developed a framework to launch an ATLAS user analysis cluster in a cloud infrastructure on demand and evaluated two solutions. The first solution is entirely integrated in the Grid infrastructure by using the same mechanism, which is already in use at Tier-2: A designated Panda-Queue is monitored and additional worker nodes are launched in a cloud environment and assigned to a corresponding HTCondor queue according to the demand. Thereby, the use of cloud resources is completely transparent to the user. However, using this approach, submitted user analysis jobs can still suffer from a certain delay introduced by waiting time in the queue and the deployed infrastructure lacks customizability. Therefore, our second solution offers the possibility to easily deploy a totally private, customizable analysis cluster on private cloud resources belonging to the university.
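
    The first solution's elastic-scaling loop can be sketched roughly as follows. The `launch_vm` callable, the jobs-per-worker ratio, and the worker cap are hypothetical stand-ins for the PanDA-queue monitor and the cloud-API calls the framework would actually use.

```python
def workers_needed(idle_jobs, jobs_per_worker=8, max_workers=20):
    """How many extra worker VMs to boot for the current backlog (ceil division)."""
    return min(max_workers, -(-idle_jobs // jobs_per_worker))

def scale_up(idle_jobs, launch_vm):
    """Launch one VM per needed worker; launch_vm stands in for a cloud API call."""
    for _ in range(workers_needed(idle_jobs)):
        launch_vm()

launched = []
scale_up(19, lambda: launched.append("worker"))  # 19 idle jobs -> 3 new workers
```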

  17. HIV Neutralizing Antibodies Induced by Native-like Envelope Trimers

    PubMed Central

    Sanders, Rogier W.; van Gils, Marit J.; Derking, Ronald; Sok, Devin; Ketas, Thomas J.; Burger, Judith A.; Ozorowski, Gabriel; Cupo, Albert; Simonich, Cassandra; Goo, Leslie; Arendt, Heather; Kim, Helen J.; Lee, Jeong Hyun; Pugach, Pavel; Williams, Melissa; Debnath, Gargi; Moldt, Brian; van Breemen, Mariëlle J.; Isik, Gözde; Medina-Ramírez, Max; Back, Jaap Willem; Koff, Wayne; Julien, Jean-Philippe; Rakasz, Eva G.; Seaman, Michael S.; Guttman, Miklos; Lee, Kelly K.; Klasse, Per Johan; LaBranche, Celia; Schief, William R.; Wilson, Ian A.; Overbaugh, Julie; Burton, Dennis R.; Ward, Andrew B.; Montefiori, David C.; Dean, Hansi; Moore, John P.

    2015-01-01

    A challenge for HIV-1 immunogen design is inducing neutralizing antibodies (NAbs) against neutralization-resistant (Tier-2) viruses that dominate human transmissions. We show that a soluble recombinant HIV-1 envelope glycoprotein trimer that adopts a native conformation (BG505 SOSIP.664) induced NAbs potently against the sequence-matched Tier-2 virus in rabbits and similar but weaker responses in macaques. The trimer also consistently induced cross-reactive NAbs against more sensitive (Tier-1) viruses. Tier-2 NAbs recognized conformational epitopes that differed between animals and in some cases overlapped with those recognized by broadly neutralizing antibodies (bNAbs), whereas Tier-1 responses targeted linear V3 epitopes. A second trimer, B41 SOSIP.664, also induced a strong autologous Tier-2 NAb response in rabbits. Thus, native-like trimers represent a promising starting point for developing HIV-1 vaccines aimed at inducing bNAbs. PMID:26089353

  18. How willing are landowners to supply land for bioenergy crops in the Northern Great Lakes Region?

    DOE PAGES

    Swinton, Scott M.; Tanner, Sophia; Barham, Bradford L.; ...

    2016-04-30

    Land to produce biomass is essential if the United States is to expand bioenergy supply. Use of agriculturally marginal land avoids the food vs. fuel problems of food price rises and carbon debt that are associated with crop and forestland. Recent remote sensing studies have identified large areas of US marginal land deemed suitable for bioenergy crops. Yet the sustainability benefits of growing bioenergy crops on marginal land only pertain if land is economically available. Scant attention has been paid to the willingness of landowners to supply land for bioenergy crops. Focusing on the northern tier of the Great Lakes, where grassland transitions to forest and land prices are low, this contingent valuation study reports on the willingness of a representative sample of 1124 private, noncorporate landowners to rent land for three bioenergy crops: corn, switchgrass, and poplar. Of the 11% of land that was agriculturally marginal, they were willing to make available no more than 21% for any bioenergy crop (switchgrass preferred on marginal land) at double the prevailing land rental rate in the region. At the same generous rental rate, of the 28% that is cropland, they would rent up to 23% for bioenergy crops (corn preferred), while of the 55% that is forestland, they would rent up to 15% for bioenergy crops (poplar preferred). Regression results identified deterrents to land rental for bioenergy purposes, including appreciation of environmental amenities and concern about rental disamenities. In sum, like landowners in the southern Great Lakes region, landowners in the Northern Tier are reluctant to supply marginal land for bioenergy crops. If rental markets existed, they would rent more crop and forestland for bioenergy crops than they would marginal land, which would generate carbon debt and opportunity costs in wood product and food markets.

  19. How willing are landowners to supply land for bioenergy crops in the Northern Great Lakes Region?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swinton, Scott M.; Tanner, Sophia; Barham, Bradford L.

    Land to produce biomass is essential if the United States is to expand bioenergy supply. Use of agriculturally marginal land avoids the food vs. fuel problems of food price rises and carbon debt that are associated with crop and forestland. Recent remote sensing studies have identified large areas of US marginal land deemed suitable for bioenergy crops. Yet the sustainability benefits of growing bioenergy crops on marginal land only pertain if land is economically available. Scant attention has been paid to the willingness of landowners to supply land for bioenergy crops. Focusing on the northern tier of the Great Lakes, where grassland transitions to forest and land prices are low, this contingent valuation study reports on the willingness of a representative sample of 1124 private, noncorporate landowners to rent land for three bioenergy crops: corn, switchgrass, and poplar. Of the 11% of land that was agriculturally marginal, they were willing to make available no more than 21% for any bioenergy crop (switchgrass preferred on marginal land) at double the prevailing land rental rate in the region. At the same generous rental rate, of the 28% that is cropland, they would rent up to 23% for bioenergy crops (corn preferred), while of the 55% that is forestland, they would rent up to 15% for bioenergy crops (poplar preferred). Regression results identified deterrents to land rental for bioenergy purposes, including appreciation of environmental amenities and concern about rental disamenities. In sum, like landowners in the southern Great Lakes region, landowners in the Northern Tier are reluctant to supply marginal land for bioenergy crops. If rental markets existed, they would rent more crop and forestland for bioenergy crops than they would marginal land, which would generate carbon debt and opportunity costs in wood product and food markets.

  20. Eurogrid: a new glideinWMS based portal for CDF data analysis

    NASA Astrophysics Data System (ADS)

    Amerio, S.; Benjamin, D.; Dost, J.; Compostella, G.; Lucchesi, D.; Sfiligoi, I.

    2012-12-01

    The CDF experiment at Fermilab ended its Run-II phase in September 2011, after 11 years of operations and 10 fb-1 of collected data. The CDF computing model is based on a Central Analysis Farm (CAF) consisting of local computing and storage resources, supported by OSG and LCG resources accessed through dedicated portals. At the beginning of 2011 a new portal, Eurogrid, was developed to effectively exploit computing and disk resources in Europe: a dedicated farm and storage area at the TIER-1 CNAF computing center in Italy, and additional LCG computing resources at different TIER-2 sites in Italy, Spain, Germany and France, are accessed through a common interface. The goal of this project is a portal that is easy to integrate into the existing CDF computing model, is completely transparent to the user, and requires a minimum amount of maintenance support from the CDF collaboration. In this paper we review the implementation of this new portal and its performance in the first months of usage. Eurogrid is based on the glideinWMS software, a glidein-based Workload Management System (WMS) that works on top of Condor. As the CDF CAF is based on Condor, the choice of glideinWMS was natural and the implementation seamless. Thanks to the pilot jobs, user-specific requirements and site resources are matched in a very efficient way, completely transparent to the users. In official use since June 2011, Eurogrid effectively complements and supports CDF computing resources, offering an optimal solution for the future in terms of the manpower required for administration, support and development.
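
    The pilot-job idea underlying glideinWMS can be illustrated with a minimal pull model: a pilot starts on a remote site first, then fetches a user job whose requirements its resources satisfy, hiding site heterogeneity from the user. This is a schematic sketch with invented job names and a single memory attribute, not the Condor ClassAd matchmaking implementation.

```python
from collections import deque

class JobQueue:
    """Central queue of user jobs; each job advertises its requirements."""

    def __init__(self):
        self.jobs = deque()

    def submit(self, name, needs_memory_gb):
        self.jobs.append({"name": name, "mem": needs_memory_gb})

    def match(self, pilot_memory_gb):
        # A pilot pulls the first queued job its resources can satisfy.
        for job in list(self.jobs):
            if job["mem"] <= pilot_memory_gb:
                self.jobs.remove(job)
                return job
        return None

queue = JobQueue()
queue.submit("skim_data", needs_memory_gb=2)
queue.submit("fit_mass_peak", needs_memory_gb=16)

# A small pilot (4 GB) picks up only the job it can actually run.
print(queue.match(pilot_memory_gb=4))  # the 2 GB "skim_data" job
print(queue.match(pilot_memory_gb=4))  # None: the 16 GB job stays queued
```

    The late binding is the point: the job/resource match happens only when a pilot has already proven the slot is alive and usable.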

  1. Cryosat-2 and Sentinel-3 tropospheric corrections: their evaluation over rivers and lakes

    NASA Astrophysics Data System (ADS)

    Fernandes, Joana; Lázaro, Clara; Vieira, Telmo; Restano, Marco; Ambrózio, Américo; Benveniste, Jérôme

    2017-04-01

    In the scope of the Sentinel-3 Hydrologic Altimetry PrototypE (SHAPE) project, the errors that presently affect the tropospheric corrections, i.e. the dry and wet tropospheric corrections (DTC and WTC, respectively), given in satellite altimetry products are evaluated over inland water regions. These errors arise because both corrections, which are functions of altitude, are usually computed with respect to an incorrect altitude reference. Several regions of interest (ROI) where CryoSat-2 (CS-2) is operating in SAR/SAR-In modes were selected for this evaluation. In this study, results for the Danube River, the Amazon Basin, Lakes Vanern and Titicaca, and the Caspian Sea, using Level 1B CS-2 data, are shown. The DTC and WTC have been compared with those derived from the ECMWF operational model and computed at different altitude references: i) the ECMWF orography; ii) the ACE2 (Altimeter Corrected Elevations 2) and GWD-LR (Global Width Database for Large Rivers) global digital elevation models; iii) the mean lake level, derived from Envisat mission data, or the river profile derived in the scope of the SHAPE project by AlongTrack (ATK) using Jason-2 data. Whenever GNSS data are available in the ROI, a GNSS-derived WTC was also generated and used for comparison. Overall, results show that the tropospheric corrections present in CS-2 L1B products are provided at the level of the ECMWF orography, which can depart from the mean lake level or river profile by hundreds of metres. The use of the model orography therefore introduces errors in the corrections. To mitigate these errors, both the DTC and WTC should be provided at the mean river profile/lake level. For example, for the Caspian Sea, with a mean level of -27 m, the tropospheric corrections provided in CS-2 products were computed at mean sea level (zero level), leading to a systematic error in the corrections. In case a mean lake level is not available, it can easily be determined from satellite altimetry.
In the absence of a mean river profile, either of the aforementioned DEMs, which are better altimetric surfaces than the ECMWF orography, can be used. When the model orography is used, systematic errors of up to 3-5 cm are found in the DTC for most of the selected regions, which can induce significant errors in e.g. the determination of mean river profiles or lake level time series. For the Danube River, larger DTC errors of up to 10 cm, due to terrain characteristics, can appear. For the WTC, which has higher spatial variability, model errors of magnitude 1-3 cm are expected over inland waters. In the Danube region, the comparison of GNSS- and ECMWF-derived WTC has shown that the error in the WTC computed at the orography level can be up to 3 cm. WTC errors of this magnitude have been found for all ROI. Although globally small, these errors are systematic and must be corrected prior to the generation of CS-2 Level 2 products. Once the corrections are computed at the mean profile and mean lake level, the results show an accuracy better than 1 cm. This analysis is currently being extended to S3 data, and the first results are shown.
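
    The size of the DTC error caused by a wrong altitude reference can be estimated with textbook relations: a Saastamoinen-style hydrostatic delay of 0.0022768 m per hPa of surface pressure, and an exponential pressure profile. Both constants below are standard approximations chosen for illustration, not values taken from the CS-2 products.

```python
import math

def surface_pressure_hpa(h_m, p0_hpa=1013.25, scale_height_m=8434.0):
    """Approximate pressure at height h via an exponential atmosphere."""
    return p0_hpa * math.exp(-h_m / scale_height_m)

def dtc_m(h_m):
    """Dry (hydrostatic) tropospheric correction magnitude, Saastamoinen-style."""
    return 0.0022768 * surface_pressure_hpa(h_m)

# Error made by computing the DTC at the model orography (say 150 m)
# instead of the true water level (0 m):
error_cm = (dtc_m(0.0) - dtc_m(150.0)) * 100.0
print(round(error_cm, 1))  # about 4 cm
```

    A ~150 m altitude offset thus maps to a DTC error of roughly 4 cm, consistent with the 3-5 cm figures quoted above for orography departures of this order.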

  2. 40 CFR 92.305 - Credit generation and use calculation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and Tier 1 PM line-haul credits; Std=0.59 g/kW-hr, for Tier 0 and Tier 1 PM switch credits; and Std.... For Tier 1 and Tier 2 engine families, the FEL may not exceed the limit established in § 92.304(k) for...). Consistent units are to be used throughout the calculation. (1) When useful life is expressed in terms of...

  3. Examining the Effects and Feasibility of a Teacher-Implemented Tier 1 and Tier 2 Intervention in Word Reading, Fluency, and Comprehension

    ERIC Educational Resources Information Center

    Solari, Emily J.; Denton, Carolyn A.; Petscher, Yaacov; Haring, Christa

    2018-01-01

    This study investigates the effects and feasibility of an intervention for first-grade students at risk for reading difficulties or disabilities (RD). The intervention was provided by general education classroom teachers and consisted of 15 min whole-class comprehension lessons (Tier 1) and 30 min Tier 2 intervention sessions in word reading,…

  4. To Wait in Tier 1 or Intervene Immediately: A Randomized Experiment Examining First Grade Response to Intervention (RTI) in Reading

    ERIC Educational Resources Information Center

    Al Otaiba, Stephanie; Connor, Carol M.; Folsom, Jessica S.; Wanzek, Jeanne; Greulich, Luana; Schatschneider, Christopher; Wagner, Richard K.

    2015-01-01

    This randomized control study compares the efficacy of two response-to-intervention (RTI) models: (1) Dynamic RTI, which immediately refers grade 1 students with the weakest skills to the most intensive intervention supports (Tier 2 or Tier 3); and (2) Typical RTI, which starts all students in Tier 1 and after 8 weeks, decides whether students who…

  5. Developing the Capacity to Implement Tier 2 and Tier 3 Supports: How Do We Support Our Faculty and Staff in Preparing for Sustainability?

    ERIC Educational Resources Information Center

    Oakes, Wendy Peia; Lane, Kathleen Lynne; Germer, Kathryn A.

    2014-01-01

    School-site and district-level leadership teams rely on the existing knowledge base to select, implement, and evaluate evidence-based practices meeting students' multiple needs within the context of multitiered systems of support. The authors focus on the stages of implementation science as applied to Tier 2 and Tier 3 supports; the…

  6. OSiRIS: a distributed Ceph deployment using software defined networking for multi-institutional research

    NASA Astrophysics Data System (ADS)

    McKee, Shawn; Kissel, Ezra; Meekhof, Benjeman; Swany, Martin; Miller, Charles; Gregorowicz, Michael

    2017-10-01

    We report on the first year of the OSiRIS project (NSF Award #1541335; UM, IU, MSU and WSU), which is targeting the creation of a distributed Ceph storage infrastructure coupled with software-defined networking to provide high-performance access for well-connected locations on any participating campus. The project's goal is to provide a single scalable, distributed storage infrastructure that allows researchers at each campus to read, write, manage and share data directly from their own computing locations. The NSF CC*DNI DIBBS program, which funded OSiRIS, is seeking solutions to the challenges of multi-institutional collaborations involving large amounts of data, and we are exploring the creative use of Ceph and networking to address those challenges. While OSiRIS will eventually serve a broad range of science domains, its first adopter is the LHC ATLAS detector project via the ATLAS Great Lakes Tier-2 (AGLT2), jointly located at the University of Michigan and Michigan State University. Part of our presentation will cover how ATLAS is using the OSiRIS infrastructure and our experiences integrating our first user community. The presentation will also review the motivations for and goals of the project, the technical details of the OSiRIS infrastructure, the challenges in providing such an infrastructure, and the technical choices made to address those challenges. We will conclude with our plans for the remaining four years of the project and our vision for what we hope to deliver by the project's end.
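
    A key reason Ceph scales across sites is that clients compute an object's location from a hash rather than consulting a central directory. The sketch below uses rendezvous (highest-random-weight) hashing as a simplified illustration of that idea; it is not the actual CRUSH algorithm, and the node names are invented.

```python
import hashlib

def node_score(obj_name, node):
    """Deterministic pseudo-random score for an (object, node) pair."""
    digest = hashlib.sha256(f"{obj_name}:{node}".encode()).hexdigest()
    return int(digest, 16)

def place(obj_name, nodes, replicas=2):
    """Pick `replicas` storage nodes by highest-random-weight hashing."""
    ranked = sorted(nodes, key=lambda n: node_score(obj_name, n), reverse=True)
    return ranked[:replicas]

nodes = ["um-osd1", "msu-osd1", "wsu-osd1", "iu-osd1"]  # hypothetical names
placement = place("atlas/dataset42/file007", nodes)
print(placement)

# Any client recomputes the same placement with no central lookup service:
assert placement == place("atlas/dataset42/file007", nodes)
```

    Adding or removing one node only remaps the objects that scored that node highest, which is what makes hash-based placement attractive for a multi-institution deployment.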

  7. Sea/Lake Water Air Conditioning at Naval Facilities.

    DTIC Science & Technology

    1980-05-01

    ECONOMICS AT TWO FACILITIES: Facilities; Computer Models ... of an operational test at Naval Security Group Activity (NSGA) Winter Harbor, Me., and the economics of Navywide application. In FY76 an assessment of ... economics of Navywide application of sea/lake water AC indicated that cost and energy savings at the sites of some Naval facilities are possible, depending

  8. A comparison of Tier 1 and Tier 3 medical homes under Oklahoma Medicaid program.

    PubMed

    Kumar, Jay I; Anthony, Melody; Crawford, Steven A; Arky, Ronald A; Bitton, Asaf; Splinter, Garth L

    2014-04-01

    The patient-centered medical home (PCMH) is a team-based model of care that seeks to improve quality of care and control costs. The Oklahoma Health Care Authority (OHCA) directs Oklahoma's Medicaid program and contracts with 861 medical home practices across the state in one of three tiers of operational capacity: Tier 1 (Basic), Tier 2 (Advanced) and Tier 3 (Optimal). Only 13.5% (n = 116) of homes are at the optimal level; the majority (59%, n = 508) are at the basic level. In this study, we sought to determine the barriers that prevented Tier 1 homes from advancing to the Tier 3 level and the incentives that would motivate providers to advance from Tier 1 to 3. Our hypotheses were that Tier 1 medical homes were located in smaller practices with limited resources and that their providers were not convinced that the expense of advancing from Tier 1 status to Tier 3 status was worth the added value. We analyzed OHCA records to compare the 508 Tier 1 (entry-level) with the 116 Tier 3 (optimal) medical homes for demographic differences with regard to location (urban or rural), duration as a medical home, percentage of contracts that were group contracts, number of providers per group contract, panel age range, panel size, and member-provider ratio. We surveyed all 508 Tier 1 homes with a mail-in survey, with focused follow-up visits, to identify the barriers to, and incentives for, upgrading from Tier 1 to Tier 2 or 3. We found that Tier 1 homes were more likely to be in rural areas, run by solo practitioners, serve exclusively adult panels, have smaller panel sizes, and have higher member-to-provider ratios in comparison with Tier 3 homes. Our survey had a 35% response rate. Results showed that the most difficult changes for Tier 1 homes to implement were providing 4 hours of after-hours care and a dedicated program for mental illness and substance abuse.
The results also showed that the most compelling incentives for encouraging Tier 1 homes to upgrade their tier status were less "red tape" with prior authorizations, higher pay, and help with panel member follow-up. Multiple interventions may help medical homes in Oklahoma advance from the basic to the optimal level, such as sharing of resources among nearby practices, expansion of OHCA online resources to help with preauthorizations and patient follow-up, and the generation and transmission of data on the benefits of medical homes.

  9. Storing files in a parallel computing system based on user or application specification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faibish, Sorin; Bent, John M.; Nick, Jeffrey M.

    2016-03-29

    Techniques are provided for storing files in a parallel computing system based on a user specification. A plurality of files generated by a distributed application in a parallel computing system are stored by obtaining a specification from the distributed application indicating how the plurality of files should be stored, and storing one or more of the plurality of files in one or more storage nodes of a multi-tier storage system based on the specification. The plurality of files comprise a plurality of complete files and/or a plurality of sub-files. The specification can optionally be processed by a daemon executing on one or more nodes in the multi-tier storage system. The specification indicates how the plurality of files should be stored, for example, identifying one or more storage nodes where the plurality of files should be stored.
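
    The abstract describes the mechanism only at a high level, but the spec-driven placement it sketches can be illustrated as a routing rule a storage daemon might apply. The file classes, tier names, and spec format below are invented for illustration; the patent does not specify them.

```python
import fnmatch

# A distributed application hands the storage daemon a specification
# mapping file patterns to storage tiers (a hypothetical format).
SPEC = {
    "*.ckpt": "flash-burst-buffer",    # small, latency-sensitive checkpoints
    "*.h5":   "parallel-file-system",  # large analysis outputs
    "*":      "capacity-archive",      # default tier for everything else
}

def route(filename, spec=SPEC):
    """Return the storage tier for a file per the first matching pattern."""
    for pattern, tier in spec.items():  # dicts keep insertion order (3.7+)
        if fnmatch.fnmatch(filename, pattern):
            return tier
    raise ValueError(f"no rule for {filename}")

print(route("rank0042.ckpt"))  # flash-burst-buffer
print(route("fields.h5"))      # parallel-file-system
print(route("run.log"))        # capacity-archive
```

    Rule order matters here: the catch-all `"*"` pattern must come last, or it would shadow the more specific rules.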

  10. LHCb experience with LFC replication

    NASA Astrophysics Data System (ADS)

    Bonifazi, F.; Carbone, A.; Perez, E. D.; D'Apice, A.; dell'Agnello, L.; Duellmann, D.; Girone, M.; Re, G. L.; Martelli, B.; Peco, G.; Ricci, P. P.; Sapunenko, V.; Vagnoni, V.; Vitlacil, D.

    2008-07-01

    Database replication is a key topic in the framework of the LHC Computing Grid to allow processing of data in a distributed environment. In particular, the LHCb computing model relies on the LHC File Catalog, i.e. a database which stores information about files spread across the GRID, their logical names and the physical locations of all the replicas. The LHCb computing model requires the LFC to be replicated at Tier-1s. The LCG 3D project deals with the database replication issue and provides a replication service based on Oracle Streams technology. This paper describes the deployment of the LHC File Catalog replication to the INFN National Center for Telematics and Informatics (CNAF) and to other LHCb Tier-1 sites. We performed stress tests designed to evaluate any delay in the propagation of the streams and the scalability of the system. The tests show the robustness of the replica implementation with performance going much beyond the LHCb requirements.
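
    A propagation-delay stress test like the one described reduces to a simple pattern: insert a marker row at the master, then poll the replica and time how long it takes to appear. The real tests ran against Oracle Streams; the sketch below substitutes an in-memory stand-in for the replica so the pattern is self-contained.

```python
import time

class MockReplica:
    """Stand-in replica that applies master writes after a fixed lag."""

    def __init__(self, lag_s):
        self.lag_s = lag_s
        self.pending = []  # (visible_at, key) pairs

    def replicate(self, key):
        self.pending.append((time.monotonic() + self.lag_s, key))

    def contains(self, key):
        now = time.monotonic()
        return any(k == key and t <= now for t, k in self.pending)

def measure_propagation(replica, key, poll_s=0.01, timeout_s=5.0):
    """Insert a marker at the 'master' and time until the replica sees it."""
    start = time.monotonic()
    replica.replicate(key)            # the master-side insert
    while not replica.contains(key):  # poll the replica
        if time.monotonic() - start > timeout_s:
            raise TimeoutError(key)
        time.sleep(poll_s)
    return time.monotonic() - start

delay = measure_propagation(MockReplica(lag_s=0.05), "marker-001")
print(f"propagation delay: {delay:.3f} s")
```

    Running many such markers concurrently, with increasing insert rates, gives both the delay distribution and the scalability behaviour the paper evaluates.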

  11. 76 FR 11381 - Magnuson-Stevens Act Provisions; Fisheries Off West Coast States; Pacific Coast Groundfish...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... calculate the fixed gear primary sablefish fishery tier limits for 2011 at a level that will reduce concerns..., 2011, NMFS is implementing the following decrease in the annual tier limits for sablefish for 2011 and beyond: From Tier 1 at 56,081-lb (25,437 kg), Tier 2 at 25,492-lb (11,562 kg), and Tier 3 at 14,567-lb (6...

  12. A Two-Tier Test-Based Approach to Improving Students' Computer-Programming Skills in a Web-Based Learning Environment

    ERIC Educational Resources Information Center

    Yang, Tzu-Chi; Hwang, Gwo-Jen; Yang, Stephen J. H.; Hwang, Gwo-Haur

    2015-01-01

    Computer programming is an important skill for engineering and computer science students. However, teaching and learning programming concepts and skills has been recognized as a great challenge to both teachers and students. Therefore, the development of effective learning strategies and environments for programming courses has become an important…

  13. 26 CFR 1.1503-2 - Dual consolidated loss.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... tiers of separate units. If a separate unit of a domestic corporation is owned indirectly through... upper-tier separate unit were a subsidiary of the domestic corporation and the lower-tier separate unit were a lower-tier subsidiary. (4) Examples. The following examples illustrate the application of this...

  14. 38 CFR 36.4318 - Servicer tier ranking-temporary procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 2 2010-07-01 2010-07-01 false Servicer tier ranking... § 36.4318 Servicer tier ranking—temporary procedures. (a) The Secretary shall assign to each servicer a “Tier Ranking” based upon the servicer's performance in servicing guaranteed loans. There shall be four...

  15. Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support

    PubMed Central

    Camargo, João; Rochol, Juergen; Gerla, Mario

    2018-01-01

    A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of cloud computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be triggered by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. In addition, we evaluate such a service migration for video services. Finally, we present potential research challenges and trends. PMID:29364172
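
    The request-pattern-driven migration trigger mentioned above can be sketched as a threshold rule over a sliding window of edge request rates. The window size, the 100 req/s threshold, and the node labels are illustrative assumptions, not values from the article.

```python
from collections import deque

class MigrationController:
    """Move a video service cloud -> fog when local demand is sustained."""

    def __init__(self, threshold_rps=100, window=5):
        self.samples = deque(maxlen=window)  # recent requests/sec at the edge
        self.threshold_rps = threshold_rps
        self.location = "cloud"

    def observe(self, requests_per_s):
        self.samples.append(requests_per_s)
        window_full = len(self.samples) == self.samples.maxlen
        if window_full and min(self.samples) >= self.threshold_rps:
            self.location = "fog-node"  # sustained demand: serve at the edge
        elif window_full and max(self.samples) < self.threshold_rps:
            self.location = "cloud"     # demand subsided: consolidate back
        return self.location

ctl = MigrationController()
for rps in [20, 150, 150, 150, 150, 150]:
    loc = ctl.observe(rps)
print(loc)  # fog-node: five consecutive samples >= 100 req/s
```

    Requiring the whole window to cross the threshold, in both directions, is a simple hysteresis that prevents the service from ping-ponging between tiers on bursty traffic.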

  16. Service Migration from Cloud to Multi-tier Fog Nodes for Multimedia Dissemination with QoE Support.

    PubMed

    Rosário, Denis; Schimuneck, Matias; Camargo, João; Nobre, Jéferson; Both, Cristiano; Rochol, Juergen; Gerla, Mario

    2018-01-24

    A wide range of multimedia services is expected to be offered to mobile users via various wireless access networks. Even the integration of cloud computing in such networks does not support an adequate Quality of Experience (QoE) in areas with high demand for multimedia content. Fog computing has been conceptualized to facilitate the deployment of new services that cloud computing cannot provide, particularly those demanding QoE guarantees. These services are provided using fog nodes located at the network edge, which are capable of virtualizing their functions/applications. Service migration from the cloud to fog nodes can be triggered by request patterns and timing issues. To the best of our knowledge, existing works on fog computing focus on architecture and fog node deployment issues. In this article, we describe the operational impacts and benefits associated with service migration from the cloud to multi-tier fog computing for video distribution with QoE support. In addition, we evaluate such a service migration for video services. Finally, we present potential research challenges and trends.

  17. Neutralization tiers of HIV-1

    PubMed Central

    Montefiori, David C.; Roederer, Mario; Morris, Lynn; Seaman, Michael S.

    2018-01-01

    Purpose of review: HIV-1 isolates are often classified on the basis of neutralization 'tier' phenotype. Tier classification has important implications for the monitoring and interpretation of vaccine-elicited neutralizing antibody responses. The molecular basis that distinguishes the multiple neutralization phenotypes of HIV-1 has been unclear. We present a model based on the dynamic nature of the HIV-1 envelope glycoproteins and its impact on epitope exposure. We also describe a new approach for ranking HIV-1 vaccine-elicited neutralizing antibody responses. Recent findings: The unliganded trimeric HIV-1 envelope glycoprotein spike spontaneously transitions through at least three conformations. Neutralization tier phenotypes correspond to the frequency by which the trimer exists in a closed (tiers 2 and 3), open (tier 1A), or intermediate (tier 1B) conformation. An increasing number of epitopes become exposed as the trimer opens, making the virus more sensitive to neutralization by certain antibodies. The closed conformation is stabilized by many broadly neutralizing antibodies. Summary: The tier 2 neutralization phenotype is typical of most circulating strains and is associated with a predominantly closed Env trimer configuration that is a high priority to target with vaccines. Assays with tier 1A viruses should be interpreted with caution and with the understanding that they detect many antibody specificities that do not neutralize tier 2 viruses and do not protect against HIV-1 infection. PMID:29266013
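
    The conformational model in the review maps naturally onto a simple classification rule: assign the tier label of whichever conformation the trimer occupies most often. The occupancy values below are illustrative only; the review does not give numeric frequencies.

```python
def neutralization_tier(freq_closed, freq_intermediate, freq_open):
    """Assign a tier label from the trimer's conformational occupancies.

    Mirrors the qualitative model above by picking the dominant
    conformation: mostly closed -> tier 2/3, mostly open -> tier 1A,
    intermediate-dominated -> tier 1B. Input frequencies are assumed
    to sum to 1.
    """
    dominant = max(
        [("tier 2/3", freq_closed),
         ("tier 1A", freq_open),
         ("tier 1B", freq_intermediate)],
        key=lambda pair: pair[1],
    )
    return dominant[0]

print(neutralization_tier(0.8, 0.15, 0.05))  # tier 2/3: typical circulating strain
print(neutralization_tier(0.1, 0.2, 0.7))    # tier 1A: open, many epitopes exposed
```
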

  18. Transporting Motivational Interviewing to School Settings to Improve the Engagement and Fidelity of Tier 2 Interventions

    ERIC Educational Resources Information Center

    Frey, Andy J.; Lee, Jon; Small, Jason W.; Seeley, John R.; Walker, Hill M.; Feil, Edward G.

    2013-01-01

    The majority of Tier 2 interventions are facilitated by specialized instructional support personnel, such as a school psychologists, school social workers, school counselors, or behavior consultants. Many professionals struggle to involve parents and teachers in Tier 2 behavior interventions. However, attention to the motivational issues for…

  19. 78 FR 71039 - Publication of the Tier 2 Tax Rates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-27

    ...Publication of the tier 2 tax rates for calendar year 2014 as required by section 3241(d) of the Internal Revenue Code (26 U.S.C. section 3241). Tier 2 taxes on railroad employees, employers, and employee representatives are one source of funding for benefits under the Railroad Retirement Act.

  20. 77 FR 71481 - Publication of the Tier 2 Tax Rates

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-30

    ... DEPARTMENT OF THE TREASURY Internal Revenue Service Publication of the Tier 2 Tax Rates AGENCY... tax rates for calendar year 2013 as required by section 3241(d) of the Internal Revenue Code (26 U.S.C. 3241). Tier 2 taxes on railroad employees, employers, and employee representatives are one source of...

  1. Essential Features of Tier 2 Social-Behavioral Interventions

    ERIC Educational Resources Information Center

    Yong, Minglee; Cheney, Douglas A.

    2013-01-01

    The purpose of this study is to identify the essential features of Tier 2 interventions conducted within multitier systems of behavior support in schools. A systematic literature search identified 12 empirical studies that were coded and scored according to a list of Tier 2 specific RE-AIM criteria, related to the Reach, Effectiveness, Adoption,…

  2. Source apportionment of emissions from light-duty gasoline vehicles and other sources in the United States for ozone and particulate matter.

    PubMed

    Vijayaraghavan, Krish; Lindhjem, Chris; Koo, Bonyoung; DenBleyker, Allison; Tai, Edward; Shah, Tejas; Alvarez, Yesica; Yarwood, Greg

    2016-02-01

    Federal Tier 3 motor vehicle emission and fuel sulfur standards have been promulgated in the United States to help attain air quality standards for ozone and PM2.5 (particulate matter with an aerodynamic diameter <2.5 μm). The authors modeled a standard similar to Tier 3 (a hypothetical nationwide implementation of the California Low Emission Vehicle [LEV] III standards) and prior Tier 2 standards for on-road gasoline-fueled light-duty vehicles (gLDVs) to assess incremental air quality benefits in the United States (U.S.) and the relative contributions of gLDVs and other major source categories to ozone and PM2.5 in 2030. Strengthening Tier 2 to a Tier 3-like (LEV III) standard reduces the summertime monthly mean of daily maximum 8-hr average (MDA8) ozone in the eastern U.S. by up to 1.5 ppb (or 2%) and the maximum MDA8 ozone by up to 3.4 ppb (or 3%). Reducing gasoline sulfur content from 30 to 10 ppm is responsible for up to 0.3 ppb of the improvement in the monthly mean ozone and up to 0.8 ppb of the improvement in maximum ozone. Across four major urban areas (Atlanta, Detroit, Philadelphia, and St. Louis), gLDV contributions range from 5% to 9% and 3% to 6% of the summertime mean MDA8 ozone under Tier 2 and Tier 3, respectively, and from 7% to 11% and 3% to 7% of the maximum MDA8 ozone under Tier 2 and Tier 3, respectively. Monthly mean 24-hr PM2.5 decreases by up to 0.5 μg/m(3) (or 3%) in the eastern U.S. from Tier 2 to Tier 3, with about 0.1 μg/m(3) of the reduction due to the lower gasoline sulfur content. At the four urban areas under the Tier 3 program, gLDV emissions contribute 3.4-5.0% and 1.7-2.4% of the winter and summer mean 24-hr PM2.5, respectively, and 3.8-4.6% and 1.5-2.0% of the mean 24-hr PM2.5 on days with elevated PM2.5 in winter and summer, respectively. Following U.S.
Tier 3 emissions and fuel sulfur standards for gasoline-fueled passenger cars and light trucks, these vehicles are expected to contribute less than 6% of the summertime mean daily maximum 8-hr ozone and less than 7% and 4% of the winter and summer mean 24-hr PM2.5 in the eastern U.S. in 2030. On days with elevated ozone or PM2.5 at four major urban areas, these vehicles contribute less than 7% of ozone and less than 5% of PM2.5, with sources outside North America and U.S. area source emissions constituting some of the main contributors to ozone and PM2.5, respectively.

  3. 47 CFR 76.980 - Charges for customer changes.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ....980 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES... charge for customer changes in service tiers effected solely by coded entry on a computer terminal or by... involve more than coded entry on a computer or other similarly simple method shall be based on actual cost...

  4. 47 CFR 76.980 - Charges for customer changes.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....980 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES... charge for customer changes in service tiers effected solely by coded entry on a computer terminal or by... involve more than coded entry on a computer or other similarly simple method shall be based on actual cost...

  5. 47 CFR 76.980 - Charges for customer changes.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....980 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES... charge for customer changes in service tiers effected solely by coded entry on a computer terminal or by... involve more than coded entry on a computer or other similarly simple method shall be based on actual cost...

  6. 47 CFR 76.980 - Charges for customer changes.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ....980 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES... charge for customer changes in service tiers effected solely by coded entry on a computer terminal or by... involve more than coded entry on a computer or other similarly simple method shall be based on actual cost...

  7. 47 CFR 76.980 - Charges for customer changes.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ....980 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES... charge for customer changes in service tiers effected solely by coded entry on a computer terminal or by... involve more than coded entry on a computer or other similarly simple method shall be based on actual cost...

  8. INFN-Pisa scientific computation environment (GRID, HPC and Interactive Analysis)

    NASA Astrophysics Data System (ADS)

    Arezzini, S.; Carboni, A.; Caruso, G.; Ciampa, A.; Coscetti, S.; Mazzoni, E.; Piras, S.

    2014-06-01

The INFN-Pisa Tier2 infrastructure is described, optimized not only for GRID CPU and storage access, but also for more interactive use of the resources, in order to provide good solutions for the final data analysis step. The data center, equipped with about 6700 production cores, permits the use of modern analysis techniques realized via advanced statistical tools (like RooFit and RooStat) implemented on multicore systems. In particular, POSIX file storage access integrated with standard SRM access is provided. The unified storage infrastructure, based on GPFS and Xrootd and used both as the SRM data repository and for interactive POSIX access, is therefore described. This common infrastructure gives users transparent access to the Tier2 data for their interactive analysis. The organization of a specialized many-core CPU facility devoted to interactive analysis is also described, along with a login mechanism integrated with the INFN-AAI (National INFN Infrastructure) that extends site access and use to a geographically distributed community. The same infrastructure also serves as a national computing facility for the INFN theoretical community, enabling a synergetic use of computing and storage resources. Our center, initially developed for the HEP community, is now growing and also includes fully integrated HPC resources. In recent years a cluster facility (1000 cores, used in parallel via InfiniBand interconnect) has been installed and managed, and we are now upgrading it to provide resources for all the intermediate-level HPC computing needs of the INFN theoretical national community.

  9. SWPBIS Tiered Fidelity Inventory. Version 2.1

    ERIC Educational Resources Information Center

    Algozzine, B.; Barrett, S.; Eber, L.; George, H.; Horner, R.; Lewis, T.; Putnam, B.; Swain-Bradway, J.; McIntosh, K.; Sugai, G.

    2014-01-01

    The purpose of the SWPBIS Tiered Fidelity Inventory (TFI) is to provide a valid, reliable, and efficient measure of the extent to which school personnel are applying the core features of school-wide positive behavioral interventions and supports (SWPBIS). The TFI is divided into three sections (Tier I: Universal SWPBIS Features; Tier II: Targeted…

  10. 40 CFR 1033.135 - Labeling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Tier 1 and later locomotives. The label on the engine is replaced each time the locomotive is... 0 and Tier 1 locomotives, the label may be made up of more than one piece, as long as all pieces are... to Tier 1+ locomotives.” (4) “This locomotive conforms to U.S. EPA regulations applicable to Tier 2...

  11. Do Students Know What They Know and What They Don't Know? Using a Four-Tier Diagnostic Test to Assess the Nature of Students' Alternative Conceptions

    ERIC Educational Resources Information Center

    Caleon, Imelda S.; Subramaniam, R.

    2010-01-01

    This study reports on the development and application of a four-tier multiple-choice (4TMC) diagnostic instrument, which has not been reported in the literature. It is an enhanced version of the two-tier multiple-choice (2TMC) test. As in 2TMC tests, its answer and reason tiers measure students' content knowledge and explanatory knowledge,…

  12. Development, deployment and operations of ATLAS databases

    NASA Astrophysics Data System (ADS)

    Vaniachine, A. V.; Schmitt, J. G. v. d.

    2008-07-01

    In preparation for ATLAS data taking, a coordinated shift from development towards operations has occurred in ATLAS database activities. In addition to development and commissioning activities in databases, ATLAS is active in the development and deployment (in collaboration with the WLCG 3D project) of the tools that allow the worldwide distribution and installation of databases and related datasets, as well as the actual operation of this system on ATLAS multi-grid infrastructure. We describe development and commissioning of major ATLAS database applications for online and offline. We present the first scalability test results and ramp-up schedule over the initial LHC years of operations towards the nominal year of ATLAS running, when the database storage volumes are expected to reach 6.1 TB for the Tag DB and 1.0 TB for the Conditions DB. ATLAS database applications require robust operational infrastructure for data replication between online and offline at Tier-0, and for the distribution of the offline data to Tier-1 and Tier-2 computing centers. We describe ATLAS experience with Oracle Streams and other technologies for coordinated replication of databases in the framework of the WLCG 3D services.

  13. Bayesian Computational Sensor Networks for Aircraft Structural Health Monitoring

    DTIC Science & Technology

    2016-02-02

LAKE CITY Final Report 02/02/2016 DISTRIBUTION A: Distribution approved for public release. AF Office of Scientific Research (AFOSR)/RTA2 Arlington...Adams Grant Number: FA9550-12-1-0291 AFOSR PI: Dr. Frederica Darema 25 January 2016 University of Utah, Salt Lake City, UT 84112 Executive Summary...Boonsirisumpun, Kyle Luthy and Edward Grant, University of Utah Technical Report, UUCS-13-003, Salt Lake City, UT, May 2013. [5] ``Robot Cognition using

  14. An Examination of the Efficacy of a Multitiered Intervention on Early Reading Outcomes for First Grade Students at Risk for Reading Difficulties.

    PubMed

    Fien, Hank; Smith, Jean Louise M; Smolkowski, Keith; Baker, Scott K; Nelson, Nancy J; Chaparro, Erin

    2015-01-01

    This article presents findings of an efficacy trial examining the effect of a multitiered instruction and intervention model on first grade at-risk students' reading outcomes. Schools (N = 16) were randomly assigned to the treatment or control condition. In the fall of Grade 1, students were assigned to an instructional tier on the basis of Stanford Achievement Test-10th Edition scores (31st percentile and above = Tier 1; from the 10th to the 30th percentile = Tier 2). In both conditions, students identified as at risk (i.e., Tier 2; n = 267) received 90 min of whole group instruction (Tier 1) and an additional 30 min of daily small group intervention (Tier 2). In the treatment condition, teachers were trained to enhance core reading instruction by making instruction more explicit and increasing practice opportunities for students in Tier 1. In addition, at-risk readers were provided an additional 30-min daily small group intervention with content that was highly aligned with the Tier 1 core reading program. Results indicate significant, positive effects of the intervention on students' decoding and first semester fluent reading and potentially positive effects on reading comprehension and total reading achievement. © Hammill Institute on Disabilities 2014.

  15. The Impact of Tier 2 Mathematics Instruction on Second Graders with Mathematics Difficulties

    ERIC Educational Resources Information Center

    Dennis, Minyi Shih; Bryant, Brian R.; Drogan, Robin

    2015-01-01

    Although research on Tier 2 interventions for early mathematics is accumulating, such efforts remain far behind those for reading, especially regarding specific features such as the ideal time to begin an intervention. The present study investigated the effectiveness of a Tier 2 intervention using a single subject multiple baseline, across-groups…

  16. A Step-by-Step Guide to Tier 2 Behavioral Progress Monitoring

    ERIC Educational Resources Information Center

    Bruhn, Allison L.; McDaniel, Sara C.; Rila, Ashley; Estrapala, Sara

    2018-01-01

    Students who are at risk for or show low-intensity behavioral problems may need targeted, Tier 2 interventions. Often, Tier 2 problem-solving teams are charged with monitoring student responsiveness to intervention. This process may be difficult for those who are not trained in data collection and analysis procedures. To aid practitioners in these…

  17. 12 CFR 34.81 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Real Estate Owned § 34.81 Definitions. (a) Capital and surplus means: (1) A bank's Tier 1 and Tier 2... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Definitions. 34.81 Section 34.81 Banks and... (2) The balance of a bank's allowance for loan and lease losses not included in the bank's Tier 2...

  18. Using Regression Discontinuity to Test the Impact of a Tier 2 Reading Intervention in First Grade

    ERIC Educational Resources Information Center

    Baker, Scott K.; Smolkowski, Keith; Chaparro, Erin A.; Smith, Jean L. M.; Fien, Hank

    2015-01-01

    Multitiered systems of reading instruction and intervention, including response to intervention, are widely used in early reading by schools to provide more intense services to students who need them. Research using randomized controlled trials has compared innovative Tier 2 interventions to business-as-usual Tier 2 approaches and established a…

  19. Effect of Advection on Evaporative Fluxes and Vapor Isotopic Ratios: The Lake Size Effect

    NASA Astrophysics Data System (ADS)

    Feng, X.; Lauder, A. M.; Kopec, B. G.; Posmentier, E. S.

    2015-12-01

It has been reported that advection of air from land can be identified hundreds of kilometers offshore. With advection, moisture builds up downwind, and the evaporative flux decreases and isotopic flux ratios increase with distance. If a lake is small relative to the equilibration distance, the fluxes of all water isotopologues averaged over the lake are different from those calculated using models without advection. The magnitude of the discrepancy depends on the lake size; we refer to this as the "lake size effect". In Kangerlussuaq, Greenland, we observed significant horizontal gradients in concentration, δD, and δ18O of vapor up to 5 km along the wind direction. Over a 0.5 km long lake, the observed average gradients were 1380 ppm/km for vapor content, 21‰/km for δD, 2.4‰/km for δ18O, and 5‰/km for d-excess. These gradients decreased with distance from the upwind shore. Over a stretch of another, much larger lake 4-5 km from the upwind shore, we observed gradients of 354 ppm/km, 1.5‰/km, 0.22‰/km, and 0.3‰/km for vapor concentration, δD, δ18O, and d-excess, respectively. These observations were modeled successfully using a two-dimensional (2-D, horizontal and vertical) steady-state advection-diffusion model. This model also computes evaporative fluxes. Using the model results, we assess the magnitude of the lake size effect and its impact on water balance calculations. Under the conditions of our field observations and for lakes less than 500 m along the wind direction, the mean flux δ18O and δD were at least 2‰ lower than the corresponding values from a 1-D model (vertical only). If biased isotopic flux values are used for water balance calculations, the lake size effect would lead to an underestimation of the lake I/E (input to evaporation) ratio. For example, if the lake size effect is 1‰, the corresponding underestimation of the I/E ratio is about 10% if using δ18O, and less than 2% if using δD for the computation.
This argues for advantageous use of δD over δ18O in water balance and paleoclimate studies when the lake size is small or changes significantly over time. Still greater accuracy in water balance assessment can be achieved by using the 2-D model to correct for the lake size effect under the environmental conditions at the location of interest.

  20. Application of Computational Toxicological Approaches in Supporting Human Health Risk Assessment, Project Summary

    EPA Science Inventory

    Summary

    This project has three parts. The first part focuses on developing a tiered strategy and applying computational toxicological approaches to support human health risk assessment by deriving a surrogate point-of-departure (e.g., NOAEL, LOAEL, etc.) using a test c...

  1. Improved Serodiagnostic Performance for Lyme Disease by Use of Two Recombinant Proteins in Enzyme-Linked Immunosorbent Assay Compared to Standardized Two-Tier Testing.

    PubMed

    Bradshaw, Gary L; Thueson, R Kelley; Uriona, Todd J

    2017-10-01

The most reliable test method for the serological confirmation of Lyme disease (LD) is a 2-tier method recommended by the CDC in 1995. The first-tier test is a low-specificity enzyme-linked immunosorbent assay (ELISA), and the second-tier tests are higher-specificity IgG and IgM Western blots. This study describes the selection of two Borrelia burgdorferi recombinant proteins and evaluation of their performance in a simple 1-tier test for the serological confirmation of LD. These two proteins were generated from (i) the full-length dbpA gene combined with the invariable region 6 of the vlsE gene (DbpA/C6) and (ii) the full-length ospC gene (OspC). The expressed DbpA/C6 and OspC proteins were useful in detecting anti-Borrelia IgG and IgM antibodies, respectively. A blind study was conducted on a well-characterized panel of 279 human sera from the CDC, comparing ELISAs using these two recombinant antigens with the 2-tier test method. The two methods (DbpA/C6-OspC versus 2-tier test) were equivalent in identifying sera from negative-control subjects (99% and 100% specificity, respectively) and in detecting stage II and III LD patient sera (100% and 100% sensitivity). However, the DbpA/C6-OspC ELISA was markedly better (80% versus 63%) than the 2-tier test method in detecting anti-Borrelia antibodies in stage I LD patients. The findings suggest that these antigens could be used in a simple 1-tier ELISA that is faster to perform, easier to interpret, and less expensive than the 2-tier test method and which is better at detecting Borrelia-specific antibodies in sera from patients with stage I LD. Copyright © 2017 Bradshaw et al.

  2. Self-Regulated Strategy Development as a Tier 2 Writing Intervention

    ERIC Educational Resources Information Center

    Johnson, Evelyn S.; Hancock, Christine; Carter, Deborah R.; Pool, Juli L.

    2013-01-01

    In a response to intervention framework, the implication of limited writing instruction suggests an immediate need for Tier 2 interventions to support struggling writers while at the same time addressing instructional gaps in Tier 1. Many schools struggle with implementing writing intervention, partly because of the limited number of…

  3. 7 CFR 1416.302 - Eligible crops and producers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... determine payment rates are as follows: Tier I—75 percent or greater crop loss and associated tree damage. Tier II—50 to 74 percent crop loss and associated tree damage/loss. Tier III—35 to 49 percent crop loss and associated tree damage/loss. Tier IV —15 percent and greater associated tree damage only. (2...

  4. 25 CFR 542.30 - What is a Tier B gaming operation?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false What is a Tier B gaming operation? 542.30 Section 542.30 Indians NATIONAL INDIAN GAMING COMMISSION, DEPARTMENT OF THE INTERIOR HUMAN SERVICES MINIMUM INTERNAL CONTROL STANDARDS § 542.30 What is a Tier B gaming operation? A Tier B gaming operation is one with gross...

  5. 25 CFR 542.40 - What is a Tier C gaming operation?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false What is a Tier C gaming operation? 542.40 Section 542.40 Indians NATIONAL INDIAN GAMING COMMISSION, DEPARTMENT OF THE INTERIOR HUMAN SERVICES MINIMUM INTERNAL CONTROL STANDARDS § 542.40 What is a Tier C gaming operation? A Tier C gaming operation is one with annual...

  6. 25 CFR 542.20 - What is a Tier A gaming operation?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false What is a Tier A gaming operation? 542.20 Section 542.20 Indians NATIONAL INDIAN GAMING COMMISSION, DEPARTMENT OF THE INTERIOR HUMAN SERVICES MINIMUM INTERNAL CONTROL STANDARDS § 542.20 What is a Tier A gaming operation? A Tier A gaming operation is one with annual...

  7. 76 FR 3184 - Self-Regulatory Organizations; Chicago Board Options Exchange, Incorporated; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-19

    ...); (ii) establish a VIX Tier Appointment; (iii) amend the monthly fee for Floor Broker Trading Permits... demutualization, CBOE amended its Fees Schedule to establish Trading Permit, tier appointment and bandwidth packet... permit ($) 1 permit 10 permits 6,000 Tier 1 11 permits 20 permits 4,800 Tier 2 21 or more permits...

  8. Effects of a Tier 3 Phonological Awareness Intervention on Preschoolers' Emergent Literacy

    ERIC Educational Resources Information Center

    Noe, Sean; Spencer, Trina D.; Kruse, Lydia; Goldstein, Howard

    2014-01-01

    This multiple baseline design study examined the effects of a Tier 3 early literacy intervention on low-income preschool children's phonological awareness (PA). Seven preschool children who did not make progress on identifying first sounds in words during a previous Tier 2 intervention participated in a more intensive Tier 3 intervention. Children…

  9. 12 CFR 325.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Report.” (m) Leverage ratio means the ratio of Tier 1 capital to total assets, as calculated under this... assets may be included in calculating the bank's Tier 1 capital. (v) Tier 1 capital or core capital means... in excess of the limit set forth in § 325.5(g), minus identified losses (to the extent that Tier 1...

  10. Water Level Prediction of Lake Cascade Mahakam Using Adaptive Neural Network Backpropagation (ANNBP)

    NASA Astrophysics Data System (ADS)

    Mislan; Gaffar, A. F. O.; Haviluddin; Puspitasari, N.

    2018-04-01

Information on natural hazards and flood events is indispensable for prevention and mitigation. One cause of flooding lies in the areas around a lake; forecasting the lake surface water level is therefore required to anticipate floods. The purpose of this paper is to implement a computational intelligence method, namely Adaptive Neural Network Backpropagation (ANNBP), to forecast the water level of Lake Cascade Mahakam. Experiments indicate that the ANNBP lake water level predictions are accurate, as measured by mean square error (MSE) and mean absolute percentage error (MAPE). In other words, the computational intelligence method can produce good accuracy. Hybrid and optimized computational intelligence methods are the focus of future work.
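
The MSE and MAPE metrics used to evaluate the forecasts are standard; a minimal sketch follows, with hypothetical lake-level readings chosen only for illustration (not data from the study):

```python
def mse(y_true, y_pred):
    """Mean squared error of a forecast."""
    n = len(y_true)
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n

def mape(y_true, y_pred):
    """Mean absolute percentage error (in %); requires nonzero observations."""
    n = len(y_true)
    return sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / n * 100.0

# Hypothetical daily lake-level readings (m) and model forecasts --
# illustrative numbers only.
observed = [12.1, 12.3, 12.0, 11.8, 12.4]
forecast = [12.0, 12.4, 12.1, 11.9, 12.2]
print("MSE :", round(mse(observed, forecast), 4))   # squared-metre units
print("MAPE:", round(mape(observed, forecast), 2))  # percent
```

Lower values of both metrics indicate a better fit; MAPE is scale-free, which makes it convenient for comparing forecasts across lakes with different mean levels.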

  11. Computational Ecology and Open Science: Tools to Help Manage Lakes for Cyanobacteria in Lakes

    EPA Science Inventory

    Computational ecology is an interdisciplinary field that takes advantage of modern computation abilities to expand our ecological understanding. As computational ecologists, we use large data sets, which often cover large spatial extents, and advanced statistical/mathematical co...

  12. The JINR Tier1 Site Simulation for Research and Development Purposes

    NASA Astrophysics Data System (ADS)

    Korenkov, V.; Nechaevskiy, A.; Ososkov, G.; Pryahina, D.; Trofimov, V.; Uzhinskiy, A.; Voytishin, N.

    2016-02-01

Distributed complex computing systems for data storage and processing are in common use in the majority of modern scientific centers. The design of such systems is usually based on recommendations obtained via a preliminary simulation model, used and executed only once. However, big experiments last for years and decades, and their computing systems keep developing, not only quantitatively but also qualitatively. Even with the substantial efforts invested in the design phase to understand the system's configuration, it would be hard to develop a system without additional research into its future evolution. Developers and operators face the problem of predicting the system's behaviour after planned modifications. A system for grid and cloud service simulation is being developed at LIT (JINR, Dubna). This simulation system is focused on improving the efficiency of grid/cloud structure development by using the work-quality indicators of a real system. The development of this kind of software is very important for building new grid/cloud infrastructures for big scientific experiments such as the JINR Tier1 site for WLCG. The simulation of some processes of the Tier1 site is considered as an example of our approach.

  13. Positive Behavior Supports: Tier 2 Interventions in Middle Schools

    ERIC Educational Resources Information Center

    Hoyle, Carol G.; Marshall, Kathleen J.; Yell, Mitchell L.

    2011-01-01

    School personnel are using Schoolwide Positive Behavior Supports in public schools throughout the United States. A number of studies have evaluated the universal level, or Tier 1, of Schoolwide Positive Behavior Supports. In this study, the authors describe and analyze the interventions offered as options for use for Tier 2 in middle schools…

  14. Response to Intervention: Evaluation Report and Executive Summary

    ERIC Educational Resources Information Center

    Gorard, Stephen; Siddiqui, Nadia; See, Beng Huat

    2014-01-01

    Response to Intervention (RTI) is a targeted programme that uses a tiered approach to identify the needs of low achieving pupils. The approach begins with whole class teaching (Tier 1), followed by small group tuition (Tier 2) for those who need more attention, and one to one tutoring (Tier 3) for those who do not respond to the small group…

  15. Response to Instruction in Preschool: Results of Two Randomized Studies with Children At Significant Risk of Reading Difficulties

    PubMed Central

    Lonigan, Christopher J.; Phillips, Beth M.

    2015-01-01

    Although response-to-instruction (RTI) approaches have received increased attention, few studies have evaluated the potential impacts of RTI approaches with preschool populations. This manuscript presents results of two studies examining impacts of Tier II instruction with preschool children. Participating children were identified as substantially delayed in the acquisition of early literacy skills despite exposure to high-quality, evidence-based classroom instruction. Study 1 included 93 children (M age = 58.2 months; SD = 3.62) attending 12 Title I preschools. Study 2 included 184 children (M age = 58.2 months; SD = 3.38) attending 19 Title I preschools. The majority of children were Black/African American, and about 60% were male. In both studies, eligible children were randomized to receive either 11 weeks of need-aligned, small-group instruction or just Tier I. Tier II instruction in Study 1 included variations of activities for code- and language-focused domains with prior evidence of efficacy in non-RTI contexts. Tier II instruction in Study 2 included instructional activities narrower in scope, more intensive, and delivered to smaller groups of children. Impacts of Tier II instruction in Study 1 were minimal; however, there were significant and moderate-to-large impacts in Study 2. These results identify effective Tier II instruction but indicate that the context in which children are identified may alter the nature of Tier II instruction that is required. Children identified as eligible for Tier II in an RTI framework likely require more intensive and more narrowly focused instruction than do children at general risk of later academic difficulties. PMID:26869730

  16. Development of a tier 1 R5 clade C simian-human immunodeficiency virus as a tool to test neutralizing antibody-based immunoprophylaxis.

    PubMed

    Siddappa, Nagadenahalli B; Hemashettar, Girish; Wong, Yin Ling; Lakhashe, Samir; Rasmussen, Robert A; Watkins, Jennifer D; Novembre, Francis J; Villinger, François; Else, James G; Montefiori, David C; Ruprecht, Ruth M

    2011-04-01

    While some recently transmitted HIV clade C (HIV-C) strains exhibited tier 1 neutralization phenotypes, most were tier 2 strains (J Virol 2010; 84:1439). Because induction of neutralizing antibodies (nAbs) through vaccination against tier 2 viruses has proven difficult, we have generated a tier 1, clade C simian-human immunodeficiency virus (SHIV-C) to permit efficacy testing of candidate AIDS vaccines against tier 1 viruses. SHIV-1157ipEL was created by swapping env of a late-stage virus with that of a tier 1, early form. After adaptation to rhesus macaques (RM), passaged SHIV-1157ipEL-p replicated vigorously in vitro and in vivo while maintaining R5 tropism. The virus was reproducibly transmissible intrarectally. Phylogenetically, SHIV-1157ipEL-p Env clustered with HIV-C sequences. All RM chronically infected with SHIV-1157ipEL-p developed high nAb titers against autologous as well as heterologous tier 1 strains. SHIV-1157ipEL-p was reproducibly transmitted in RM, induced cross-clade nAbs, and represents a tool to evaluate anti-HIV-C nAb responses in primates. © 2010 John Wiley & Sons A/S.

  17. Research and development of web oriented remote sensing image publication system based on Servlet technique

    NASA Astrophysics Data System (ADS)

    Juanle, Wang; Shuang, Li; Yunqiang, Zhu

    2005-10-01

According to the requirements of the China National Scientific Data Sharing Program (NSDSP), a web-oriented RS Image Publication System (RSIPS) has been researched and developed based on the Java Servlet technique. The RSIPS framework is composed of three tiers: a Presentation Tier, an Application Service Tier and a Data Resource Tier. The Presentation Tier provides the user interface for data query, review and download; for the convenience of users, a visual spatial query interface is included. Serving as the middle tier, the Application Service Tier controls all actions between users and databases. The Data Resource Tier stores RS images in files and relational databases. RSIPS is developed with cross-platform programming based on Java Servlet tools, one of the advanced techniques of the J2EE architecture. An RSIPS prototype has been developed and applied in geosciences clearinghouse practice, which is among the experimental units of the NSDSP in China.
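
The three-tier separation described above (presentation / application service / data resource) can be sketched generically. This is an illustrative Python sketch of the pattern, not the Java Servlet implementation of RSIPS; all class names, method names, and sample records are invented:

```python
class DataResourceTier:
    """Data tier: stand-in for the image files and relational databases."""
    def __init__(self):
        # Hypothetical image-metadata records, for illustration only.
        self._images = {
            "scene-001": {"region": "North China Plain", "sensor": "TM"},
            "scene-002": {"region": "Yangtze Delta", "sensor": "ETM+"},
        }

    def query(self, region):
        """Return all records whose metadata matches the requested region."""
        return {k: v for k, v in self._images.items() if v["region"] == region}

class ApplicationServiceTier:
    """Middle tier: mediates every action between users and the data tier."""
    def __init__(self, data):
        self._data = data

    def search(self, region):
        if not region:
            raise ValueError("region is required")  # validation lives here
        return self._data.query(region)

class PresentationTier:
    """Presentation tier: formats query results for the user interface."""
    def __init__(self, service):
        self._service = service

    def render_search(self, region):
        hits = self._service.search(region)
        return [f"{sid}: {meta['sensor']}" for sid, meta in sorted(hits.items())]

ui = PresentationTier(ApplicationServiceTier(DataResourceTier()))
print(ui.render_search("Yangtze Delta"))
```

The design point is the same as in RSIPS: the presentation layer never touches storage directly, so the data backend (files, RDBMS) can change without affecting the user interface.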

  18. Effect of E85 on Tailpipe Emissions from Light-Duty Vehicles.

    PubMed

    Yanowitz, Janet; McCormick, Robert L

    2009-02-01

E85, which consists of nominally 85% fuel grade ethanol and 15% gasoline, must be used in flexible-fuel (or "flex-fuel") vehicles (FFVs) that can operate on fuel with an ethanol content of 0-85%. Published studies include measurements of the effect of E85 on tailpipe emissions for Tier 1 and older vehicles. Car manufacturers have also supplied a large body of FFV certification data to the U.S. Environmental Protection Agency, primarily on Tier 2 vehicles. These studies and certification data reveal wide variability in the effects of E85 on emissions from different vehicles. Comparing Tier 1 FFVs running on E85 to similar non-FFVs running on gasoline showed, on average, significant reductions in emissions of oxides of nitrogen (NOx; 54%), non-methane hydrocarbons (NMHCs; 27%), and carbon monoxide (CO; 18%) for E85. Comparing Tier 2 FFVs running on E85 and comparable non-FFVs running on gasoline shows, for E85 on average, a significant reduction in emissions of CO (20%), and no significant effect on emissions of non-methane organic gases (NMOGs). NOx emissions from Tier 2 FFVs averaged approximately 28% less than comparable non-FFVs. However, perhaps because of the wide range of Tier 2 NOx standards, the absolute difference in NOx emissions between Tier 2 FFVs and non-FFVs is not significant (P = 0.28). It is interesting that Tier 2 FFVs operating on gasoline produced approximately 13% less NMOGs than non-FFVs operating on gasoline. The data for Tier 1 vehicles show that E85 will cause significant reductions in emissions of benzene and butadiene, and significant increases in emissions of formaldehyde and acetaldehyde, in comparison to emissions from gasoline in both FFVs and non-FFVs. The compound that makes up the largest proportion of organic emissions from E85-fueled FFVs is ethanol.
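
The percentage comparisons in the abstract follow the usual percent-reduction convention (reduction relative to the baseline fleet). A minimal sketch; the emission values below are hypothetical, chosen only to reproduce a 20% reduction:

```python
def percent_reduction(baseline, test):
    """Percent reduction of `test` relative to `baseline` (positive = lower)."""
    if baseline == 0:
        raise ValueError("baseline must be nonzero")
    return (baseline - test) / baseline * 100.0

# Hypothetical fleet-average CO emissions (g/mi): non-FFV on gasoline
# vs. FFV on E85 -- illustrative numbers, not the study's data.
co_gasoline = 1.00
co_e85 = 0.80
print(round(percent_reduction(co_gasoline, co_e85), 1))
```

Note that a percent reduction compares means between fleets; as the abstract's P = 0.28 result shows, a sizable mean difference can still fail a significance test when the spread across vehicles is wide.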

  19. ERTS computer compatible tape data processing and analysis. Appendix 1: The utility of imaging radars for the study of lake ice

    NASA Technical Reports Server (NTRS)

    Polcyn, F. C.; Thomson, F. J.; Porcello, L. J.; Sattinger, I. J.; Malila, W. A.; Wezernak, C. T.; Horvath, R.; Vincent, R. K. (Principal Investigator); Bryan, M. L.

    1972-01-01

    There are no author-identified significant results in this report. Remotely sensed multispectral scanner and return beam vidicon imagery from ERTS-1 is being used for: (1) water depth measurements in the Virgin Islands and Upper Lake Michigan areas; (2) mapping of the Yellowstone National Park; (3) assessment of atmospheric effects in Colorado; (4) lake ice surveillance in Canada and Great Lakes areas; (5) recreational land use in Southeast Michigan; (6) International Field Year on the Great Lakes investigations of Lake Ontario; (7) image enhancement of multispectral scanner data using existing techniques; (8) water quality monitoring of the New York Bight, Tampa Bay, Lake Michigan, Santa Barbara Channel, and Lake Erie; (9) oil pollution detection in the Chesapeake Bay, Gulf of Mexico southwest of New Orleans, and Santa Barbara Channel; and (10) mapping iron compounds in the Wind River Mountains.

  20. 20 CFR 228.2 - Tier I and tier II annuity components.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Social Security Act if all of the employee's earnings after 1936 under both the railroad retirement system and the social security system had been creditable under the Social Security Act. (b) Tier II...

  1. Three-tier rough superhydrophobic surfaces

    NASA Astrophysics Data System (ADS)

    Cao, Yuanzhi; Yuan, Longyan; Hu, Bin; Zhou, Jun

    2015-08-01

A three-tier rough superhydrophobic surface was fabricated by growing hydrophobic modified (fluorinated silane) zinc oxide (ZnO)/copper oxide (CuO) hetero-hierarchical structures on silicon (Si) micro-pillar arrays. Compared with the other three control samples with a less rough tier, the three-tier surface exhibits the best water repellency, with the largest contact angle 161° and the lowest sliding angle 0.5°. It also shows a robust Cassie state which enables the water to flow with a speed over 2 m s⁻¹. In addition, it could prevent itself from being wetted by a droplet with low surface tension (water and ethanol mixed 1:1 in volume) at a flow speed of 0.6 m s⁻¹ (dropped from a height of 2 cm). All these features prove that adding another rough tier on a two-tier rough surface could further improve its water-repellent properties.

  2. Three-tier rough superhydrophobic surfaces.

    PubMed

    Cao, Yuanzhi; Yuan, Longyan; Hu, Bin; Zhou, Jun

    2015-08-07

A three-tier rough superhydrophobic surface was fabricated by growing hydrophobic modified (fluorinated silane) zinc oxide (ZnO)/copper oxide (CuO) hetero-hierarchical structures on silicon (Si) micro-pillar arrays. Compared with the other three control samples with a less rough tier, the three-tier surface exhibits the best water repellency, with the largest contact angle 161° and the lowest sliding angle 0.5°. It also shows a robust Cassie state which enables the water to flow with a speed over 2 m s⁻¹. In addition, it could prevent itself from being wetted by a droplet with low surface tension (water and ethanol mixed 1:1 in volume) at a flow speed of 0.6 m s⁻¹ (dropped from a height of 2 cm). All these features prove that adding another rough tier on a two-tier rough surface could further improve its water-repellent properties.

  3. Building a Prototype of LHC Analysis Oriented Computing Centers

    NASA Astrophysics Data System (ADS)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium of four LHC Computing Centres (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease analysis of the huge data set collected at the LHC collider. While "Tier2" Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized for chaotic end-user analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make analysis tasks easier for a physics user who is not a computing expert. On the storage side, we are experimenting with techniques that allow remote data access and with storage optimizations for typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, including virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools that give users exhaustive monitoring of their processes at the site, and an efficient support system in case of problems. We report the results of tests executed on the different subsystems and describe the layout of the infrastructure in place at the sites participating in the consortium.

  4. The biogeochemical vertical structure renders a meromictic volcanic lake a trap for geogenic CO2 (Lake Averno, Italy)

    PubMed Central

    Fazi, Stefano; Rossetti, Simona; Pratesi, Paolo; Ceccotti, Marco; Cabassi, Jacopo; Capecchiacci, Francesco; Venturi, Stefania; Vaselli, Orlando

    2018-01-01

    Volcanic lakes are characterized by physicochemical conditions favorable to the development of reservoirs of C-bearing greenhouse gases that can be dispersed to the air during occasional rollover events. By combining microbiological and geochemical approaches, we showed that the chemistry of the CO2- and CH4-rich gas reservoir hosted within the meromictic Lake Averno (Campi Flegrei, southern Italy) is related to microbial niche differentiation along the vertical water column. The simultaneous occurrence of diverse functional groups of microbes operating under different conditions suggests that these habitats harbor complex microbial consortia that affect the production and consumption of greenhouse gases. In the epilimnion, the activity of aerobic methanotrophic bacteria and photosynthetic biota, together with CO2 dissolution at relatively high pH, enhanced CO2 and CH4 consumption, which also occurred in the hypolimnion. Moreover, computations carried out to evaluate the dependence of lake stability on the CO2/CH4 ratios suggested that the vertical water-density gradient was mainly controlled by salinity and temperature, whereas the effect of dissolved gases was minor, except when extremely large increases in CH4 are assumed. Therefore, the biological processes controlling the composition of CO2 and CH4 contributed to stabilizing the stratification of the lake. Overall, Lake Averno, and presumably the numerous volcanic lakes worldwide with similar features (namely bio-activity lakes), acts as a sink for the CO2 supplied by the hydrothermal/magmatic system, significantly influencing the local carbon budget. PMID:29509779

  5. 20 CFR 228.16 - Adjustments in the age reduction factor (ARF).

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Adjustments in the age reduction factor (ARF... RETIREMENT ACT COMPUTATION OF SURVIVOR ANNUITIES The Tier I Annuity Component § 228.16 Adjustments in the age reduction factor (ARF). Upon the attainment of retirement age, the previously-computed age reduction factor...

  6. 20 CFR 228.16 - Adjustments in the age reduction factor (ARF).

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 1 2014-04-01 2012-04-01 true Adjustments in the age reduction factor (ARF... RETIREMENT ACT COMPUTATION OF SURVIVOR ANNUITIES The Tier I Annuity Component § 228.16 Adjustments in the age reduction factor (ARF). Upon the attainment of retirement age, the previously-computed age reduction factor...

  7. 20 CFR 228.16 - Adjustments in the age reduction factor (ARF).

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 1 2013-04-01 2012-04-01 true Adjustments in the age reduction factor (ARF... RETIREMENT ACT COMPUTATION OF SURVIVOR ANNUITIES The Tier I Annuity Component § 228.16 Adjustments in the age reduction factor (ARF). Upon the attainment of retirement age, the previously-computed age reduction factor...

  8. 20 CFR 228.16 - Adjustments in the age reduction factor (ARF).

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 1 2011-04-01 2011-04-01 false Adjustments in the age reduction factor (ARF... RETIREMENT ACT COMPUTATION OF SURVIVOR ANNUITIES The Tier I Annuity Component § 228.16 Adjustments in the age reduction factor (ARF). Upon the attainment of retirement age, the previously-computed age reduction factor...

  9. 20 CFR 228.16 - Adjustments in the age reduction factor (ARF).

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 1 2012-04-01 2012-04-01 false Adjustments in the age reduction factor (ARF... RETIREMENT ACT COMPUTATION OF SURVIVOR ANNUITIES The Tier I Annuity Component § 228.16 Adjustments in the age reduction factor (ARF). Upon the attainment of retirement age, the previously-computed age reduction factor...

  10. 20 CFR 226.14 - Employee regular annuity rate.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Employee regular annuity rate. 226.14 Section... COMPUTING EMPLOYEE, SPOUSE, AND DIVORCED SPOUSE ANNUITIES Computing an Employee Annuity § 226.14 Employee regular annuity rate. The regular annuity rate payable to the employee is the total of the employee tier I...

  11. Exploiting analytics techniques in CMS computing monitoring

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Kuznetsov, V.; Magini, N.; Repečka, A.; Vaandering, E.

    2017-10-01

    The CMS experiment has collected an enormous volume of metadata about its computing operations in its monitoring systems, describing its experience in operating all of the CMS workflows on all of the Worldwide LHC Computing Grid Tiers. Data mining of all this information has rarely been done, but it is of crucial importance for a better understanding of how CMS operated successfully, and for reaching an adequate and adaptive model of CMS operations that allows detailed optimizations and eventually a prediction of system behaviour. These data are now streamed into the CERN Hadoop data cluster for further analysis. Specific sets of information (e.g. data on how many replicas of datasets CMS wrote on disks at WLCG Tiers, data on which datasets were primarily requested for analysis, etc.) were collected on Hadoop and processed with MapReduce applications profiting from the parallelization on the Hadoop cluster. We present the implementation of new monitoring applications on Hadoop, and discuss the new possibilities in CMS computing monitoring introduced by the ability to quickly process big data sets from multiple sources, looking forward to a predictive modeling of the system.
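
    The map/reduce aggregation described above can be sketched in miniature as follows; the record fields and dataset paths are invented for illustration and are not the actual CMS monitoring schema:

    ```python
    from collections import Counter
    from functools import reduce

    # Toy monitoring records; "dataset" and "site" are hypothetical field names.
    records = [
        {"dataset": "/ZMM/Run2012", "site": "T2_IT_Legnaro"},
        {"dataset": "/ZMM/Run2012", "site": "T2_US_MIT"},
        {"dataset": "/TTJets/Summer12", "site": "T2_IT_Legnaro"},
    ]

    # Map phase: emit a (dataset, 1) count for each access record.
    mapped = (Counter({r["dataset"]: 1}) for r in records)

    # Reduce phase: sum the per-record counts into totals per dataset.
    access_counts = reduce(lambda a, b: a + b, mapped, Counter())
    print(access_counts.most_common())  # [('/ZMM/Run2012', 2), ('/TTJets/Summer12', 1)]
    ```

    On Hadoop the same map and reduce functions would run in parallel across many nodes, which is the point of moving the monitoring data into the cluster.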

  12. Storage element performance optimization for CMS analysis jobs

    NASA Astrophysics Data System (ADS)

    Behrmann, G.; Dahlblom, J.; Guldmyr, J.; Happonen, K.; Lindén, T.

    2012-12-01

    Tier-2 computing sites in the Worldwide LHC Computing Grid (WLCG) host CPU resources (Compute Element, CE) and storage resources (Storage Element, SE). The vast amount of data from the Large Hadron Collider (LHC) experiments that needs to be processed requires good and efficient use of the available resources. Achieving good CPU efficiency for end users' analysis jobs requires a storage system whose performance can scale with the I/O requests from hundreds or even thousands of simultaneous jobs. In this presentation we report on work to improve the SE performance at the Helsinki Institute of Physics (HIP) Tier-2 used for the Compact Muon Solenoid (CMS) experiment at the LHC. Statistics from CMS grid jobs are collected and stored in the CMS Dashboard for further analysis, which allows easy performance monitoring by the sites and by the CMS collaboration. As part of the monitoring framework, CMS used the JobRobot, which sent 100 analysis jobs to each site every four hours. CMS also uses the HammerCloud tool for site monitoring and stress testing, and it has replaced the JobRobot. The performance of the analysis workflow submitted with JobRobot or HammerCloud can be used to track the effects of site configuration changes, since the analysis workflow is kept the same for all sites and for months at a time. The CPU efficiency of the JobRobot jobs at HIP was increased by approximately 50% to more than 90%, by tuning the SE and through improvements in the CMSSW and dCache software. The performance of the CMS analysis jobs improved significantly too. Similar work has been done at other CMS Tier sites, and on average the CPU efficiency of CMSSW jobs increased during 2011. Better monitoring of the SE allows faster detection of problems, so that the performance level can be kept high. The next storage upgrade at HIP consists of SAS disk enclosures, which can be stress tested on demand with HammerCloud workflows to make sure that the I/O performance is good.
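
    The CPU efficiency discussed above is simply the ratio of CPU time to wall-clock time; I/O stalls against the SE show up as wall time with no CPU use. A sketch with hypothetical job timings, not actual HIP measurements:

    ```python
    def cpu_efficiency(cpu_time_s, wall_time_s):
        """CPU efficiency of a batch job: the fraction of wall-clock time
        actually spent executing on the CPU. I/O waits lower it."""
        return cpu_time_s / wall_time_s

    # Hypothetical 10-hour analysis job, before and after SE tuning.
    before = cpu_efficiency(cpu_time_s=6.0 * 3600, wall_time_s=10.0 * 3600)
    after = cpu_efficiency(cpu_time_s=9.2 * 3600, wall_time_s=10.0 * 3600)
    print(f"before: {before:.0%}, after: {after:.0%}")  # before: 60%, after: 92%
    ```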

  13. National Dam Inspection Program. Big Elk Lake Dam (NDI I.D. PA-0056 DER I.D. 058-019) Susquehanna River Basin, Elk Lake Stream, Susquehanna County, Pennsylvania. Phase I Inspection Report.

    DTIC Science & Technology

    1981-03-19

    Drainage Area: 2.88 square miles. b. Discharge at Dam Site (cfs): maximum known flood at dam site unknown; outlet conduit at maximum pool not... the spillway was determined to be 164 cfs, based on the available 2.7-foot freeboard relative to the crest of the embankment. The Big Elk Lake watershed... computer analysis are presented in Appendix D. The 100-year flood, determined according to the recommended procedure, was found to have a peak of 2290 cfs

  14. A Review of Tier 2 Interventions Conducted within Multitiered Models of Behavioral Prevention

    ERIC Educational Resources Information Center

    Bruhn, Allison Leigh; Lane, Kathleen Lynne; Hirsch, Shanna Eisner

    2014-01-01

    To support students' academic, behavioral, and social needs, many schools have adopted multitiered models of prevention. Because Tier 3 interventions are costly in terms of time and resources, schools must find efficient and effective Tier 2 interventions prior to providing such intense supports. In this article, we review the literature base on…

  15. Expanding clarity or confusion? Volatility of the 5-tier ratings assessing quality of transplant centers in the United States.

    PubMed

    Schold, Jesse D; Andreoni, Kenneth A; Chandraker, Anil K; Gaston, Robert S; Locke, Jayme E; Mathur, Amit K; Pruett, Timothy L; Rana, Abbas; Ratner, Lloyd E; Buccini, Laura D

    2018-06-01

    Outcomes of patients receiving solid organ transplants in the United States are systematically aggregated into bi-annual Program-Specific Reports (PSRs) detailing risk-adjusted survival by transplant center. Recently, the Scientific Registry of Transplant Recipients (SRTR) issued 5-tier ratings evaluating centers based on risk-adjusted 1-year graft survival. Our primary aim was to examine the reliability of 5-tier ratings over time. Using 10 consecutive PSRs for adult kidney transplant centers from June 2012 to December 2016 (n = 208), we applied 5-tier ratings to center outcomes and evaluated ratings over time. From the baseline period (June 2012), 47% of centers had at least a 1-unit tier change within 6 months, 66% by 1 year, and 94% by 3 years. Similarly, 46% of centers had at least a 2-unit tier change by 3 years. In comparison, 15% of centers had a change in the traditional 3-tier rating at 3 years. The 5-tier ratings at 4 years had minimal association with baseline rating (Kappa 0.07, 95% confidence interval [CI] -0.002 to 0.158). Centers had a median of 3 different 5-tier ratings over the period (q1 = 2, q3 = 4). Findings were consistent for center volume, transplant rate, and baseline 5-tier rating. Cumulatively, results suggest that 5-tier ratings are highly volatile, limiting their utility for informing potential stakeholders, particularly transplant candidates given expected waiting times between wait listing and transplantation. © 2018 The American Society of Transplantation and the American Society of Transplant Surgeons.
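
    The Kappa statistic quoted above measures agreement between the baseline and later ratings, corrected for chance agreement. A minimal unweighted Cohen's kappa sketch; the ten centers' ratings below are made up for illustration, not SRTR data:

    ```python
    from collections import Counter

    def cohens_kappa(a, b):
        """Unweighted Cohen's kappa between two rating sequences:
        (observed agreement - chance agreement) / (1 - chance agreement)."""
        n = len(a)
        p_obs = sum(x == y for x, y in zip(a, b)) / n
        ca, cb = Counter(a), Counter(b)
        p_exp = sum(ca[k] * cb[k] for k in ca) / n**2  # chance agreement
        return (p_obs - p_exp) / (1 - p_exp)

    # Hypothetical baseline vs. 4-year 5-tier ratings for ten centers.
    baseline = [1, 2, 3, 4, 5, 3, 2, 4, 1, 5]
    year4    = [3, 2, 5, 1, 4, 3, 4, 2, 2, 5]
    print(f"kappa = {cohens_kappa(baseline, year4):.3f}")
    ```

    A kappa near zero, as the study reports (0.07), means the 4-year rating carries almost no information about the baseline rating beyond chance.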

  16. Toward a Fault Tolerant Architecture for Vital Medical-Based Wearable Computing.

    PubMed

    Abdali-Mohammadi, Fardin; Bajalan, Vahid; Fathi, Abdolhossein

    2015-12-01

    Advances in computing and electronic technologies have led to the emergence of a new generation of efficient, small, intelligent systems, including smartphones and wearable devices, which have attracted attention for medical applications. These products are used less in critical medical applications because of their resource constraints and sensitivity to failures: without safety considerations, small integrated hardware can endanger patients' lives. Principles are therefore required for constructing wearable healthcare systems that address these concerns. Accordingly, this paper proposes an architecture for constructing wearable systems for critical medical applications. The proposed architecture is a three-tier one, supporting data flow from body sensors to the cloud; its tiers are wearable computers, mobile computing, and mobile cloud computing. One feature of this architecture is the high fault tolerance made possible by the nature of its components. The protocols required to coordinate the components of the architecture are also presented. Finally, the reliability of the architecture is assessed by simulating it and its components, and other aspects of the proposed architecture are discussed.
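
    One simple way a middle tier can contribute the fault tolerance described above is store-and-forward buffering: readings are held on the mobile device until the cloud acknowledges them. The sketch below is an assumed illustration of that idea, not the paper's actual protocol; all class and field names are hypothetical:

    ```python
    import queue

    class MobileTier:
        """Middle tier of a sensor -> mobile -> cloud pipeline. Readings are
        buffered locally, so a cloud outage loses no data."""
        def __init__(self, cloud):
            self.cloud = cloud
            self.buffer = queue.Queue()

        def on_reading(self, reading):
            self.buffer.put(reading)
            self.flush()

        def flush(self):
            # Forward buffered readings in order; stop at the first failure
            # and retry on the next reading.
            while not self.buffer.empty():
                reading = self.buffer.queue[0]
                if not self.cloud.store(reading):
                    return
                self.buffer.get()

    class FlakyCloud:
        """Stand-in cloud tier whose availability can be toggled."""
        def __init__(self):
            self.up = False
            self.stored = []
        def store(self, reading):
            if self.up:
                self.stored.append(reading)
            return self.up

    cloud = FlakyCloud()
    tier = MobileTier(cloud)
    tier.on_reading({"hr": 72})   # cloud down: reading is buffered
    cloud.up = True
    tier.on_reading({"hr": 75})   # cloud back: both readings delivered
    print(len(cloud.stored))      # 2
    ```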

  17. The Use of Scale-Dependent Precision to Increase Forecast Accuracy in Earth System Modelling

    NASA Astrophysics Data System (ADS)

    Thornes, Tobias; Duben, Peter; Palmer, Tim

    2016-04-01

    At the current pace of development, it may be decades before the 'exa-scale' computers needed to resolve individual convective clouds in weather and climate models become available to forecasters, and such machines will incur very high power demands. But the resolution could be improved today by switching to more efficient, 'inexact' hardware with which variables can be represented in 'reduced precision'. Currently, all numbers in our models are represented as double-precision floating points - each requiring 64 bits of memory - to minimise rounding errors, regardless of spatial scale. Yet observational and modelling constraints mean that values of atmospheric variables are inevitably known less precisely on smaller scales, suggesting that this may be a waste of computer resources. More accurate forecasts might therefore be obtained by taking a scale-selective approach whereby the precision of variables is gradually decreased at smaller spatial scales to optimise the overall efficiency of the model. To study the effect of reducing precision to different levels on multiple spatial scales, we here introduce a new model atmosphere developed by extending the Lorenz '96 idealised system to encompass three tiers of variables - which represent large-, medium- and small-scale features - for the first time. In this chaotic but computationally tractable system, the 'true' state can be defined by explicitly resolving all three tiers. The abilities of low resolution (single-tier) double-precision models and similar-cost high resolution (two-tier) models in mixed-precision to produce accurate forecasts of this 'truth' are compared. The high resolution models outperform the low resolution ones even when small-scale variables are resolved in half-precision (16 bits). This suggests that using scale-dependent levels of precision in more complicated real-world Earth System models could allow forecasts to be made at higher resolution and with improved accuracy. 
If adopted, this new paradigm would represent a revolution in numerical modelling that could be of great benefit to the world.
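
    The reduced-precision idea can be demonstrated on the standard one-tier Lorenz '96 system by truncating the model state to half precision (16-bit float) after every time step. This is a sketch under assumed parameters (forward Euler, F = 8, 40 variables), not the paper's three-tier extension or its integration scheme:

    ```python
    import numpy as np

    def l96_tendency(x, forcing=8.0):
        """One-tier Lorenz '96 tendency:
        dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F."""
        return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

    def integrate(x0, steps, dt=0.005, dtype=np.float64):
        """Forward-Euler integration, truncating the state to `dtype` each
        step to mimic reduced-precision hardware (float16 ~ half precision)."""
        x = x0.astype(dtype)
        for _ in range(steps):
            x = (x + dt * l96_tendency(x)).astype(dtype)
        return x.astype(np.float64)

    rng = np.random.default_rng(0)
    x0 = 8.0 + rng.standard_normal(40)           # 40 large-scale variables
    full = integrate(x0, 200)                    # double-precision reference
    half = integrate(x0, 200, dtype=np.float16)  # reduced-precision run
    print("RMS divergence:", np.sqrt(np.mean((full - half) ** 2)))
    ```

    In the paper's mixed-precision setting, the saved memory and compute from such truncation is reinvested in resolving an extra tier of small-scale variables.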

  18. Assessing the nutritional quality of diets of Canadian children and adolescents using the 2014 Health Canada Surveillance Tool Tier System.

    PubMed

    Jessri, Mahsa; Nishi, Stephanie K; L'Abbe, Mary R

    2016-05-10

    Health Canada's Surveillance Tool (HCST) Tier System was developed in 2014 to assess the adherence of dietary intakes to Eating Well with Canada's Food Guide (EWCFG). The HCST categorizes all foods into one of four Tiers based on thresholds for total fat, saturated fat, sodium, and sugar, with Tier 4 reflecting the unhealthiest and Tier 1 the healthiest foods. This study presents the first application of the HCST to examine (i) the dietary patterns of Canadian children, and (ii) the applicability and relevance of the HCST as a measure of diet quality. Data were from the nationally representative, cross-sectional Canadian Community Health Survey 2.2. A total of 13,749 participants aged 2-18 years with complete lifestyle and 24-hour dietary recall data were examined. The dietary patterns of Canadian children and adolescents showed a high prevalence of Tier 4 foods within the sub-groups of processed meats and potatoes. On average, 23-31% of daily calories were derived from "other" foods and beverages not recommended in EWCFG. However, the majority of food choices fell within the Tier 2 and 3 classifications, owing to the lenient criteria the HCST uses to classify foods. Adherence to the recommendations in the HCST was associated with closer compliance with nutrient Dietary Reference Intake recommendations; however, it was not related to reduced obesity as assessed by body mass index (p > 0.05). EWCFG recommendations are currently not being met by most children and adolescents. Future nutrient profiling systems should incorporate both positive and negative nutrients and an overall score. In addition, a wider range of nutrient thresholds should be considered for the HCST to better capture product differences, prevent most foods from being categorized as Tiers 2-3, and provide incentives for product reformulation.
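
    A threshold-based Tier assignment of the kind described above can be sketched as follows. The cut-offs here are invented for illustration; the real HCST thresholds vary by food category:

    ```python
    # Illustrative per-100-g thresholds: (tier-2 limit, tier-3 limit, tier-4 limit).
    # These numbers are made up, not Health Canada's actual criteria.
    THRESHOLDS = {
        "sat_fat_g": (1.0, 3.0, 6.0),
        "sodium_mg": (120, 360, 720),
        "sugar_g":   (5.0, 12.0, 22.0),
    }

    def tier(food):
        """Assign the worst (highest) Tier triggered by any single nutrient."""
        worst = 1
        for nutrient, (t2, t3, t4) in THRESHOLDS.items():
            v = food.get(nutrient, 0.0)
            if v > t4:
                worst = max(worst, 4)
            elif v > t3:
                worst = max(worst, 3)
            elif v > t2:
                worst = max(worst, 2)
        return worst

    print(tier({"sat_fat_g": 0.5, "sodium_mg": 80, "sugar_g": 3}))    # 1 (healthiest)
    print(tier({"sat_fat_g": 8.0, "sodium_mg": 900, "sugar_g": 30}))  # 4 (unhealthiest)
    ```

    With lenient (high) cut-offs, most foods land in Tiers 2-3, which is exactly the clustering the study observed.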

  19. 2 CFR 1326.332 - What methods must I use to pass requirements down to participants at lower tiers with whom I...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false What methods must I use to pass requirements down to participants at lower tiers with whom I intend to do business? 1326.332 Section 1326.332 Grants...-tier participants to comply with subpart C of the OMB guidance in 2 CFR Part 180, as supplemented by...

  20. 2 CFR 1200.332 - What methods must I use to pass requirements down to participants at lower tiers with whom I...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false What methods must I use to pass requirements down to participants at lower tiers with whom I intend to do business? 1200.332 Section 1200.332 Grants...-tier participants to comply with subpart C of the OMB guidance in 2 CFR part 180, as supplemented by...

  1. 2 CFR 1400.332 - What methods must I use to pass requirements down to participants at lower tiers with whom I...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false What methods must I use to pass requirements down to participants at lower tiers with whom I intend to do business? 1400.332 Section 1400.332 Grants...-tier participants to comply with subpart C of the OMB guidance in 2 CFR part 180. ...

  2. 2 CFR 901.332 - What methods must I use to pass requirements down to participants at lower tiers with whom I...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false What methods must I use to pass requirements down to participants at lower tiers with whom I intend to do business? 901.332 Section 901.332 Grants... lower-tier participants to comply with subpart C of the OMB guidance in 2 CFR part 180, as supplemented...

  3. Named Data Networking in Climate Research and HEP Applications

    NASA Astrophysics Data System (ADS)

    Shannigrahi, Susmit; Papadopoulos, Christos; Yeh, Edmund; Newman, Harvey; Jerzy Barczyk, Artur; Liu, Ran; Sim, Alex; Mughal, Azher; Monga, Inder; Vlimant, Jean-Roch; Wu, John

    2015-12-01

    The Computing Models of the LHC experiments continue to evolve from the simple hierarchical MONARC[2] model towards more agile models where data is exchanged among many Tier2 and Tier3 sites, relying on both large scale file transfers with strategic data placement, and an increased use of remote access to object collections with caching through CMS's AAA, ATLAS' FAX and ALICE's AliEn projects, for example. The challenges presented by expanding needs for CPU, storage and network capacity as well as rapid handling of large datasets of file and object collections have pointed the way towards future more agile pervasive models that make best use of highly distributed heterogeneous resources. In this paper, we explore the use of Named Data Networking (NDN), a new Internet architecture focusing on content rather than the location of the data collections. As NDN has shown considerable promise in another data intensive field, Climate Science, we discuss the similarities and differences between the Climate and HEP use cases, along with specific issues HEP faces and will face during LHC Run2 and beyond, which NDN could address.

  4. Genetic and economic analyses of female replacement rates in the dam-daughter pathway of a hierarchical swine breeding structure.

    PubMed

    Faust, M A; Robison, O W; Tess, M W

    1992-07-01

    A stochastic life-cycle swine production model was used to study the effect of female replacement rates in the dam-daughter pathway for a tiered breeding structure on genetic change and returns to the breeder. Genetic, environmental, and economic parameters were used to simulate characteristics of individual pigs in a system producing F1 female replacements. Evaluated were maximum culling ages for nucleus and multiplier tier sows. System combinations included one- and five-parity alternatives for both levels and 10-parity options for the multiplier tier. Yearly changes and average phenotypic levels were computed for performance and economic measures. Generally, at the nucleus level, responses to 10 yr of selection for sow and pig performance in five-parity herds were 70 to 85% of response in one-parity herds. Similarly, the highest selection responses in multiplier herds were from systems with one-parity nucleus tiers. Responses in these were typically greater than 115% of the response for systems with the smallest yearly change, namely, the five-parity nucleus and five- and 10-parity multiplier levels. In contrast, the most profitable multiplier tiers (10-parity) had the lowest replacement costs. Within a multiplier culling strategy, rapid genetic change was desirable. Differences between systems that culled after five or 10 parities were smaller than differences between five- and one-parity multiplier options. To recover production costs, systems with the lowest returns required 140% of market hog value for gilts available to commercial tiers, whereas more economically efficient systems required no premium.

  5. Assessing the Nutritional Quality of Diets of Canadian Adults Using the 2014 Health Canada Surveillance Tool Tier System.

    PubMed

    Jessri, Mahsa; Nishi, Stephanie K; L'Abbé, Mary R

    2015-12-12

    The 2014 Health Canada Surveillance Tool (HCST) was developed to assess adherence of dietary intakes with Canada's Food Guide. HCST classifies foods into one of four Tiers based on thresholds for sodium, total fat, saturated fat and sugar, with Tier 1 representing the healthiest and Tier 4 foods being the unhealthiest. This study presents the first application of HCST to assess (a) dietary patterns of Canadians; and (b) applicability of this tool as a measure of diet quality among 19,912 adult participants of Canadian Community Health Survey 2.2. Findings indicated that even though most of processed meats and potatoes were Tier 4, the majority of reported foods in general were categorized as Tiers 2 and 3 due to the adjustable lenient criteria used in HCST. Moving from the 1st to the 4th quartile of Tier 4 and "other" foods/beverages, there was a significant trend towards increased calories (1876 kcal vs. 2290 kcal) and "harmful" nutrients (e.g., sodium) as well as decreased "beneficial" nutrients. Compliance with the HCST was not associated with lower body mass index. Future nutrient profiling systems need to incorporate both "positive" and "negative" nutrients, an overall score and a wider range of nutrient thresholds to better capture food product differences.

  6. Bathymetric Contour Maps of Lakes Surveyed in Iowa in 2005

    USGS Publications Warehouse

    Linhart, S.M.; Lund, K.D.

    2008-01-01

    The U.S. Geological Survey, in cooperation with the Iowa Department of Natural Resources, conducted bathymetric surveys on seven lakes in Iowa during 2005 (Arrowhead Pond, Central Park Lake, Lake Keomah, Manteno Park Pond, Lake Miami, Springbrook Lake, and Yellow Smoke Lake). The surveys were conducted to provide the Iowa Department of Natural Resources with information for the development of total maximum daily load limits, particularly for estimating sediment load and deposition rates. The bathymetric surveys provide a baseline for future work on sediment loads and deposition rates for these lakes. All of the lakes surveyed in 2005 are man-made lakes with fixed spillways. Bathymetric data were collected using boat-mounted, differential global positioning system, echo depth-sounding equipment, and computer software. Data were processed with commercial hydrographic software and exported into a geographic information system for mapping and calculating area and volume. Lake volume estimates ranged from 47,784,000 cubic feet (1,100 acre-feet) at Lake Miami to 2,595,000 cubic feet (60 acre-feet) at Manteno Park Pond. Surface area estimates ranged from 5,454,000 square feet (125 acres) at Lake Miami to 558,000 square feet (13 acres) at Springbrook Lake.
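
    The paired volume and area figures above use the standard conversions 1 acre-foot = 43,560 cubic feet and 1 acre = 43,560 square feet. A quick check of the Lake Miami numbers (the published values are rounded):

    ```python
    # Standard US survey conversion factors.
    CUBIC_FT_PER_ACRE_FT = 43_560.0
    SQ_FT_PER_ACRE = 43_560.0

    def to_acre_feet(cubic_feet):
        return cubic_feet / CUBIC_FT_PER_ACRE_FT

    def to_acres(square_feet):
        return square_feet / SQ_FT_PER_ACRE

    print(round(to_acre_feet(47_784_000)))  # 1097, i.e. the published ~1,100 acre-feet
    print(round(to_acres(5_454_000)))       # 125 acres, matching the published figure
    ```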

  7. Hydrogeologic controls on the groundwater interactions with an acidic lake in karst terrain, Lake Barco, Florida

    USGS Publications Warehouse

    Lee, T.M.

    1996-01-01

    Transient groundwater interactions and lake stage were simulated for Lake Barco, an acidic seepage lake in the mantled karst of north central Florida. Karst subsidence features affected groundwater flow patterns in the basin and groundwater fluxes to and from the lake. Subsidence features peripheral to the lake intercepted potential groundwater inflow and increased leakage from the shallow perimeter of the lake bed. Simulated groundwater fluxes were checked against net groundwater flow derived from a detailed lake hydrologic budget with short-term lake evaporation computed by the energy budget method. Discrepancies between modeled and budget-derived net groundwater flows indicated that the model underestimated groundwater inflow, possibly contributed to by transient water table mounding near the lake. Recharge from rainfall reduced lake leakage by 10 to 15 times more than it increased groundwater inflow. As a result of the karst setting, the contributing groundwater basin to the lake was 2.4 ha for simulated average rainfall conditions, compared to the topographically derived drainage basin area of 81 ha. Short groundwater inflow path lines and rapid travel times limit the contribution of acid-neutralizing solutes from the basin, making Lake Barco susceptible to increased acidification by acid rain.

  8. A Randomised Control Trial of a Tier-2 Small-Group Intervention ("MiniLit") for Young Struggling Readers

    ERIC Educational Resources Information Center

    Buckingham, Jennifer; Wheldall, Kevin; Beaman, Robyn

    2012-01-01

    The response-to-intervention model is predicated upon increasingly intensive tiers of instruction. The aim of the present study was to examine the efficacy of a Tier-2 small-group literacy intervention ("MiniLit") designed for young readers who are still struggling after experiencing whole-class initial instruction. A total of 22…

  9. R5 clade C SHIV strains with tier 1 or 2 neutralization sensitivity: tools to dissect env evolution and to develop AIDS vaccines in primate models.

    PubMed

    Siddappa, Nagadenahalli B; Watkins, Jennifer D; Wassermann, Klemens J; Song, Ruijiang; Wang, Wendy; Kramer, Victor G; Lakhashe, Samir; Santosuosso, Michael; Poznansky, Mark C; Novembre, Francis J; Villinger, François; Else, James G; Montefiori, David C; Rasmussen, Robert A; Ruprecht, Ruth M

    2010-07-21

    HIV-1 clade C (HIV-C) predominates worldwide, and anti-HIV-C vaccines are urgently needed. Neutralizing antibody (nAb) responses are considered important but have proved difficult to elicit. Although some current immunogens elicit antibodies that neutralize highly neutralization-sensitive (tier 1) HIV strains, most circulating HIVs exhibiting a less sensitive (tier 2) phenotype are not neutralized. Thus, both tier 1 and 2 viruses are needed for vaccine discovery in nonhuman primate models. We constructed a tier 1 simian-human immunodeficiency virus, SHIV-1157ipEL, by inserting an "early," recently transmitted HIV-C env into the SHIV-1157ipd3N4 backbone [1] encoding a "late" form of the same env, which had evolved in a SHIV-infected rhesus monkey (RM) with AIDS. SHIV-1157ipEL was rapidly passaged to yield SHIV-1157ipEL-p, which remained exclusively R5-tropic and had a tier 1 phenotype, in contrast to "late" SHIV-1157ipd3N4 (tier 2). After 5 weekly low-dose intrarectal exposures, SHIV-1157ipEL-p systemically infected 16 out of 17 RM with high peak viral RNA loads and depleted gut CD4+ T cells. SHIV-1157ipEL-p and SHIV-1157ipd3N4 env genes diverge mostly in V1/V2. Molecular modeling revealed a possible mechanism for the increased neutralization resistance of SHIV-1157ipd3N4 Env: V2 loops hindering access to the CD4 binding site, shown experimentally with nAb b12. Similar mutations have been linked to decreased neutralization sensitivity in HIV-C strains isolated from humans over time, indicating parallel HIV-C Env evolution in humans and RM. SHIV-1157ipEL-p, the first tier 1 R5 clade C SHIV, and SHIV-1157ipd3N4, its tier 2 counterpart, represent biologically relevant tools for anti-HIV-C vaccine development in primates.

  10. Simulated Effects of Ground-Water Augmentation on the Hydrology of Round and Halfmoon Lakes in Northwestern Hillsborough County, Florida

    USGS Publications Warehouse

    Yager, Richard M.; Metz, P.A.

    2004-01-01

Pumpage from the Upper Floridan aquifer in northwest Hillsborough County near Tampa, Florida, has induced downward leakage from the overlying surficial aquifer and lowered the water table in many areas. Leakage is highest where the confining layer separating the aquifers is breached, which is common beneath many of the lakes in the study area. Leakage of water to the Upper Floridan aquifer has lowered the water level in many lakes and drained many wetlands. Ground water from the Upper Floridan aquifer has been added (augmented) to some lakes in an effort to maintain lake levels, but the resulting lake-water chemistry and lake leakage patterns are substantially different from those of natural lakes. Changes in lake-water chemistry can cause changes in lake flora, fauna, and lake sediment composition, and large volumes of lake leakage are suspected to enhance the formation of sinkholes near the shoreline of augmented lakes. The leakage rate of lake water through the surficial aquifer to the Upper Floridan aquifer was estimated in this study using ground-water-flow models developed for an augmented lake (Round Lake) and a non-augmented lake (Halfmoon Lake). Flow models developed with MODFLOW were calibrated through nonlinear regression with UCODE to measured water levels and monthly net ground-water-flow rates from the lakes estimated from lake-water budgets. Monthly estimates of ground-water recharge were computed using an unsaturated flow model (LEACHM) that simulated daily changes in storage of water in the soil profile, thus estimating recharge as drainage to the water table. Aquifer properties in the Round Lake model were estimated through transient-state simulations using two sets of monthly recharge rates computed during July 1996 to February 1999, which spanned both average conditions (July 1996 through October 1997) and an El Niño event (November 1997 through September 1998) when the recharge rate doubled. 
Aquifer properties in the Halfmoon Lake model were estimated through steady-state simulations of average conditions in July 1996. Simulated hydrographs computed by the Round and Halfmoon Lake models closely matched measured water-level fluctuations, except during El Niño, when the Halfmoon Lake model was unable to accurately reproduce water levels. Possibly, potential recharge during El Niño was diverted through ground-water-flow outlets that were not represented in the Halfmoon Lake model, or a large part of the rainfall was diverted into runoff before it could become recharge. Solute transport simulations with MT3D indicate that leakage of lake water extended 250 to 400 feet into the surficial aquifer around Round Lake, and from 75 to 150 feet around Halfmoon Lake, before flowing to the underlying Upper Floridan aquifer. These results are in agreement with concentrations of the stable isotopes oxygen-18 (δ18O) and deuterium (δD) in the surficial aquifer. Schedules of monthly augmentation rates to maintain constant stages in Round and Halfmoon Lakes were computed using an equation that accounted for changes in the Upper Floridan aquifer head and the deviation from the mean recharge rate. Resulting lake stages were nearly constant during the first half of the study, but increased above target lake stages during El Niño; modifying the computation of augmentation rates to account for the higher recharge rate during El Niño resulted in lake stages that were closer to the target lake stage. Substantially more lake leakage flows to the Upper Floridan aquifer from Round Lake than from Halfmoon Lake, because the estimated vertical hydraulic conductivities of lake and confining layer sediments and breaches in the confining layer beneath Round Lake are much greater. 
Augmentation rates required to maintain the low guidance stages in Round Lake (53 feet) and Halfmoon Lake (42 feet) under average Upper Floridan aquifer heads are estimated as 33,850 cubic feet per day and 1,330 to 10,000 cubic feet per day, respectively.
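The augmentation schedule above adjusts pumpage for aquifer-head changes and for deviation of recharge from its mean. A minimal sketch of that kind of rule, with illustrative coefficients (the calibrated USGS values are not given in the abstract), might look like:

```python
# Hypothetical sketch of a monthly augmentation-rate rule: pumpage needed to
# hold a target lake stage rises as the Upper Floridan head declines (more
# leakage) and falls when recharge exceeds its long-term mean. The base rate
# and leakage coefficient below are illustrative, not the report's values.

def augmentation_rate(base_rate, leakage_coeff, head_decline_ft,
                      recharge_deviation_cfd):
    """Return required augmentation in cubic feet per day (floored at zero)."""
    rate = base_rate + leakage_coeff * head_decline_ft - recharge_deviation_cfd
    return max(rate, 0.0)

# Average conditions at Round Lake (the report's ~33,850 ft3/d under mean head):
q_avg = augmentation_rate(base_rate=33_850, leakage_coeff=2_000,
                          head_decline_ft=0.0, recharge_deviation_cfd=0.0)

# An El Niño-like month: recharge far above the mean removes the need to pump.
q_wet = augmentation_rate(base_rate=33_850, leakage_coeff=2_000,
                          head_decline_ft=0.0, recharge_deviation_cfd=40_000)
```

This mirrors the report's finding that the fixed-schedule rule overfilled the lakes during El Niño until the recharge term was adjusted.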

  11. EMERGING TECHNOLOGIES FOR THE MANAGEMENT AND UTILIZATION OF LANDFILL GAS

    EPA Science Inventory

    The report gives information on emerging technologies that are considered to be commercially available (Tier 1), currently undergoing research and development (Tier 2), or considered as potentially applicable (Tier 3) for the management of landfill gas (LFG) emissions or for the ...

  12. A single, continuous metric to define tiered serum neutralization potency against HIV

    DOE PAGES

    Hraber, Peter Thomas; Korber, Bette Tina Marie; Wagh, Kshitij; ...

    2018-01-19

HIV-1 Envelope (Env) variants are grouped into tiers by their neutralization-sensitivity phenotype. This helped to recognize that tier 1 neutralization responses can be elicited readily, but do not protect against new infections. Tier 3 viruses are the least sensitive to neutralization. Because most circulating viruses are tier 2, vaccines that elicit neutralization responses against them are needed. While tier classification is widely used for viruses, a way to rate serum or antibody neutralization responses in comparable terms is needed. Logistic regression of neutralization outcomes summarizes serum or antibody potency on a continuous, tier-like scale. It also tests significance of the neutralization score, to indicate cases where serum response does not depend on virus tiers. The method can standardize results from different virus panels, and could lead to high-throughput assays, which evaluate a single serum dilution, rather than a dilution series, for more efficient use of limited resources to screen samples from vaccinees.
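The core idea, regressing per-virus neutralization outcomes against virus tier to place a serum on a continuous tier-like scale, can be sketched as follows. This is an illustration of the technique, not the authors' exact model; the data, learning rate, and the use of the 50%-neutralization point as the score are all assumptions:

```python
import math

# Illustrative sketch: score a serum by logistic regression of per-virus
# neutralization outcomes (1 = neutralized, 0 = not) against each virus's
# tier. The fitted curve's 50%-neutralization tier serves as a continuous,
# tier-like potency score for the serum.

def fit_logistic(tiers, outcomes, lr=0.5, epochs=2000):
    """Fit p(neutralized) = sigmoid(b0 + b1 * tier) by gradient descent."""
    b0 = b1 = 0.0
    n = len(tiers)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(tiers, outcomes):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += p - y
            g1 += (p - y) * x
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# A serum that neutralizes most tier-1 viruses but few tier-2/3 viruses:
tiers    = [1, 1, 1, 1, 2, 2, 2, 3, 3]
outcomes = [1, 1, 1, 0, 1, 0, 0, 0, 0]
b0, b1 = fit_logistic(tiers, outcomes)
potency = -b0 / b1  # tier at which predicted neutralization falls to 50%
```

A negative slope confirms that neutralization drops with increasing tier; a serum whose outcomes were independent of tier would yield a slope near zero, the case the authors' significance test is designed to flag.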

  13. A single, continuous metric to define tiered serum neutralization potency against HIV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hraber, Peter Thomas; Korber, Bette Tina Marie; Wagh, Kshitij

HIV-1 Envelope (Env) variants are grouped into tiers by their neutralization-sensitivity phenotype. This helped to recognize that tier 1 neutralization responses can be elicited readily, but do not protect against new infections. Tier 3 viruses are the least sensitive to neutralization. Because most circulating viruses are tier 2, vaccines that elicit neutralization responses against them are needed. While tier classification is widely used for viruses, a way to rate serum or antibody neutralization responses in comparable terms is needed. Logistic regression of neutralization outcomes summarizes serum or antibody potency on a continuous, tier-like scale. It also tests significance of the neutralization score, to indicate cases where serum response does not depend on virus tiers. The method can standardize results from different virus panels, and could lead to high-throughput assays, which evaluate a single serum dilution, rather than a dilution series, for more efficient use of limited resources to screen samples from vaccinees.

  14. Advanced Broadband Links for TIER III UAV Data Communication

    NASA Astrophysics Data System (ADS)

    Griethe, Wolfgang; Gregory, Mark; Heine, Frank; Kampfner, Hartmut

    2011-08-01

Unmanned Aerial Vehicles (UAVs) are gaining ever more importance because of their prominent role as national reconnaissance systems and in disaster monitoring and environmental mapping. However, reliable and robust data links are indispensable for Unmanned Aircraft System (UAS) missions. In particular, for Beyond Line-Of-Sight (BLOS) operations of Tier III UAVs, satellite data links are a key element, since extensive sensor data have to be transmitted preferably in real time or near real time. The paper demonstrates that the continuously increasing number of UAS and the intensified use of high-resolution sensors will reveal RF bandwidth as a limiting factor in the communication chain of Tier III UAVs. The RF bandwidth gap can be partly closed by use of high-order modulation, but much more progress in terms of bandwidth allocation can be achieved by using optical transmission technology. Consequently, the paper underlines that this technology has meanwhile been sufficiently verified in space, and shows that optical links are also well suited for broadband communications of Tier III UAVs. Moreover, the advantages of laser communication (LaserCom) in UAV scenarios and its importance for Network Centric Warfare (NCW) as well as for Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) are emphasized. Numerous practical topics and design requirements relevant for the establishment of optical links on board Tier III UAVs are discussed.

  15. Screening Analysis for the Environmental Risk Evaluation System Fiscal Year 2011 Report Environmental Effects of Offshore Wind Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Copping, Andrea E.; Hanna, Luke A.

    2011-11-01

Potential environmental effects of offshore wind (OSW) energy development are not well understood, and yet regulatory agencies are required to make decisions in spite of substantial uncertainty about environmental impacts and their long-term consequences. An understanding of risks associated with interactions between OSW installations and avian and aquatic receptors, including animals, habitats, and ecosystems, can help define key uncertainties and focus regulatory actions and scientific studies on interactions of most concern. During FY 2011, Pacific Northwest National Laboratory (PNNL) scientists adapted and applied the Environmental Risk Evaluation System (ERES), first developed to examine the effects of marine and hydrokinetic energy devices on aquatic environments, to offshore wind development. PNNL scientists conducted a risk screening analysis on two initial OSW cases: a wind project in Lake Erie and a wind project off the Atlantic coast of the United States near Atlantic City, New Jersey. The screening analysis revealed that top-tier stressors in the two OSW cases were the dynamic effects of the device (e.g., strike), accidents/disasters, and effects of the static physical presence of the device, such as alterations in bottom habitats. Receptor interactions with these stressors at the highest tiers of risk were dominated by threatened and endangered animals. Risk to the physical environment from changes in flow regime also ranked high. Peer review of this process and results will be conducted during FY 2012. The ERES screening analysis provides an assessment of the vulnerability of environmental receptors to stressors associated with OSW installations; a probability analysis is needed to determine specific risk levels to receptors. As more data become available that document effects of offshore wind farms on specific receptors in U.S. coastal and Great Lakes waters, probability analyses will be performed.

  16. Impact of multi-tiered pharmacy benefits on attitudes of plan members with chronic disease states.

    PubMed

    Nair, Kavita V; Ganther, Julie M; Valuck, Robert J; McCollum, Marianne M; Lewis, Sonya J

    2002-01-01

    To evaluate the effects of 2- and 3-tiered pharmacy benefit plans on member attitudes regarding their pharmacy benefits. We performed a mail survey and cross-sectional comparison of the outcome variables in a large managed care population in the western United States. Participants were persons with chronic disease states who were in 2- or 3-tier copay drug plans. A random sample of 10,662 was selected from a total of 25,008 members who had received 2 or more prescriptions for a drug commonly used to treat one of 5 conditions: hypertension, diabetes, dyslipidemia, gastroesophageal reflux disease (GERD), or arthritis. Statistical analysis included bivariate comparisons and regression analysis of the factors affecting member attitudes, including satisfaction, loyalty, health plan choices, and willingness to pay a higher out-of-pocket cost for medications. A response rate of 35.8% was obtained from continuously enrolled plan members. Respondents were older, sicker, and consumed more prescriptions than nonrespondents. There were significant differences in age and health plan characteristics between 2- and 3-tier plan members: respondents aged 65 or older represented 11.7% of 2-tier plan members and 54.7% of 3-tier plan members, and 10.0% of 2-tier plan members were in Medicare+Choice plans versus 61.4% in Medicare+Choice plans for 3-tier plan members (P<0.05). Controlling for demographic characteristics, number of comorbidities, and the cost of health care, 2-tier plan members were more satisfied with their plan, more likely to recommend their plan to others, and less likely to switch their current plans to obtain better prescription drug coverage than 3-tier plan members. While members were willing to purchase higher cost nonformulary and brand-name medications, in general, they were not willing to pay more than 10 dollars (in addition to their copayment amount) for these medications. 
Older respondents and sicker individuals (those with higher scores on the Chronic Disease Indicator) appeared to have more positive attitudes toward their pharmacy benefit plans in general. Higher reported incomes by respondents were also associated with greater satisfaction with prescription drug coverage and increased loyalty toward the pharmacy benefit plan. Conversely, the more individuals spent for either their health care or prescription medications, the less satisfied they were with their prescription drug coverage and less loyalty they appeared to have for their health plans. An inverse relationship also appeared to exist between the out-of-pocket costs for prescription medications and members' willingness to pay for nonformulary medications. Three-tier members had lower reported satisfaction with their plans compared to members in 2-tier plans. The financial resources available to members (which may be a function of being older and having more education and higher incomes), the number of chronic disease states that members have, and other factors may influence their attitudes toward their prescription drug coverage.

  17. 12 CFR 390.74 - Civil money penalties.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... 1467a(i)(3) Holding Company Act Violation 32,500 12 U.S.C. 1467a(r)(1) Late/Inaccurate Reports—1st Tier 2,200 12 U.S.C. 1467a(r)(2) Late/Inaccurate Reports—2nd Tier 32,500 12 U.S.C. 1467a(r)(3) Late...) Violation of Law or Unsafe or Unsound Practice—3rd Tier 1,375,000 12 U.S.C. 1820(k)(6)(A)(ii) Violation of...

  18. 12 CFR 109.103 - Civil money penalties.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    .... 1467a(r)(1) Late/Inaccurate Reports—1st Tier 2,200 12 U.S.C. 1467a(r)(2) Late/Inaccurate Reports—2nd Tier 32,500 12 U.S.C. 1467a(r)(3) Late/Inaccurate Reports—3rd Tier 1,375,000 12 U.S.C. 1817(j)(16)(A...,375,000 12 U.S.C. 1820(k)(6)(A)(ii) Violation of Post Employment Restrictions 275,000 12 U.S.C. 1884...

  19. 2-tiered antibody testing for early and late Lyme disease using only an immunoglobulin G blot with the addition of a VlsE band as the second-tier test.

    PubMed

    Branda, John A; Aguero-Rosenfeld, Maria E; Ferraro, Mary Jane; Johnson, Barbara J B; Wormser, Gary P; Steere, Allen C

    2010-01-01

    Standard 2-tiered immunoglobulin G (IgG) testing has performed well in late Lyme disease (LD), but IgM testing early in the illness has been problematic. IgG VlsE antibody testing, by itself, improves early sensitivity, but may lower specificity. We studied whether elements of the 2 approaches could be combined to produce a second-tier IgG blot that performs well throughout the infection. Separate serum sets from LD patients and control subjects were tested independently at 2 medical centers using whole-cell enzyme immunoassays and IgM and IgG immunoblots, with recombinant VlsE added to the IgG blots. The results from both centers were combined, and a new second-tier IgG algorithm was developed. With standard 2-tiered IgM and IgG testing, 31% of patients with active erythema migrans (stage 1), 63% of those with acute neuroborreliosis or carditis (stage 2), and 100% of those with arthritis or late neurologic involvement (stage 3) had positive results. Using new IgG criteria, in which only the VlsE band was scored as a second-tier test among patients with early LD (stage 1 or 2) and 5 of 11 IgG bands were required in those with stage 3 LD, 34% of patients with stage 1, 96% of those with stage 2, and 100% of those with stage 3 infection had positive responses. Both new and standard testing achieved 100% specificity. Compared with standard IgM and IgG testing, the new IgG algorithm (with VlsE band) eliminates the need for IgM testing; it provides comparable or better sensitivity, and it maintains high specificity.
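The modified second-tier rule described above is a simple stage-dependent decision: early disease scores only the VlsE band, late disease requires 5 of 11 conventional IgG bands, and either way a positive first-tier EIA is a precondition. A hypothetical encoding of that rule (the function and parameter names are illustrative):

```python
# Hypothetical encoding of the new second-tier IgG algorithm described above.
# Stage 1/2 (early Lyme disease): only the recombinant VlsE band is scored.
# Stage 3 (late disease): 5 of 11 conventional IgG immunoblot bands required.
# A positive first-tier whole-cell EIA is assumed as a precondition.

def second_tier_igg_positive(stage, vlse_band_present, igg_bands_present):
    if stage in (1, 2):                  # early LD: VlsE band alone decides
        return vlse_band_present
    return igg_bands_present >= 5        # late LD: 5 of 11 IgG bands

def two_tier_result(eia_positive, stage, vlse_band_present, igg_bands_present):
    """Overall 2-tier call: first-tier EIA, then the stage-specific IgG rule."""
    return eia_positive and second_tier_igg_positive(
        stage, vlse_band_present, igg_bands_present)
```

The point of the design is visible in the code: IgM testing, the main source of false positives in the standard algorithm, never enters the decision.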

  20. 76 FR 45742 - Fisheries of the Northeastern United States; Atlantic Mackerel, Squid, and Butterfish Fisheries...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-01

    ... vessel must have landed at least 400,000 lb (181.44 mt) in any one year 1997-2005 to qualify for a Tier 1... Tier 2 permit; or at least 1,000 lb (0.45 mt) in any one year March 1, 1994--December 31, 2005, to qualify for a Tier 3 permit, with Tier 3 allocated up to 7 percent of the commercial quota, through the...

  1. Framework for a clinical information system.

    PubMed

    Van De Velde, R; Lansiers, R; Antonissen, G

    2002-01-01

The design and implementation of a Clinical Information System architecture is presented. This architecture has been developed and implemented based on components, following a strong underlying conceptual and technological model. Common Object Request Broker Architecture (CORBA) and n-tier technology, featuring centralised and departmental clinical information systems as the back-end store for all clinical data, are used. Servers located in the "middle" tier apply the clinical (business) model and application rules. The main characteristics are the focus on modelling and reuse of both data and business logic. Scalability, as well as adaptability to constantly changing requirements via component-driven computing, are the main reasons for that approach.

  2. Tier 2 Supports to Improve Motivation and Performance of Elementary Students with Behavioral Challenges and Poor Work Completion

    ERIC Educational Resources Information Center

    Oakes, Wendy Peia; Lane, Kathleen Lynne; Cox, Meredith; Magrane, Ashley; Jenkins, Abbie; Hankins, Katy

    2012-01-01

    We offer a methodological illustration for researchers and practitioners of how to conduct a development study consistent with the parameters delineated by the Institute of Education Sciences (IES; U.S. Department of Education [USDE], 2010) to explore the utility of an existing Tier 1 intervention applied as a Tier 2 support within a three-tiered…

  3. SURFACE WATER AND GROUND WATER QUALITY MONITORING FOR RESTORATION OF URBAN LAKES IN GREATER HYDERABAD, INDIA

    NASA Astrophysics Data System (ADS)

    Mohanty, A. K.

    2009-12-01

SURFACE WATER AND GROUND WATER QUALITY MONITORING FOR RESTORATION OF URBAN LAKES IN GREATER HYDERABAD, INDIA A.K. Mohanty, K. Mahesh Kumar, B. A. Prakash and V.V.S. Gurunadha Rao, Ecology and Environment Group, National Geophysical Research Institute (CSIR), Hyderabad - 500 606, India. E-mail: atulyakumarmohanty@yahoo.com Abstract: Hyderabad Metropolitan Development Authority has taken up restoration of urban lakes around Hyderabad city under the Green Hyderabad Environment Program. Restoration of the Mir Alam Tank, Durgamcheruvu, Patelcheruvu, Peddacheruvu and Nallacheruvu lakes has been taken up under the second phase. There are six lakes, viz., RKPuramcheruvu, Nadimicheruvu (Safilguda), Bandacheruvu, Patelcheruvu, Peddacheruvu and Nallacheruvu, in the North East Musi Basin covering 38 sq km. Bimonthly monitoring of lake water quality for BOD, COD, total nitrogen and total phosphorus has been carried out for two hydrological cycles during October 2002 - October 2004 in all five lakes at inlet channels and outlets. The sediments in the lakes have also been assessed for nutrient status. The nutrient parameters have been used to assess eutrophic condition through computation of a Trophic Status Index, which has indicated that all the lakes under study are in a hyper-eutrophic condition. The hydrogeological, geophysical, water quality and groundwater database collected in two watersheds covering 4 lakes has been used to construct groundwater flow and mass transport models. The interaction of lake water with groundwater has been computed for assessing the lake water budget, combined with inflow and outflow measurements on streams entering and leaving the lakes. Individual lake water budgets have been used for the design of appropriate capacities of Sewage Treatment Plants (STPs) on the inlet channels of the lakes for maintaining Full Tank Level (FTL) in each lake. STPs are designed for tertiary treatment, i.e. removal of nutrient load, viz., phosphates and nitrates. 
Phosphates are removed through addition of alum to the influent stream to the STPs, whereas nitrate reduction is achieved by sending the treated wastewater from the STP through a wetland before it enters the lake. STP capacities ranging from 2 to 10 MLD have been recommended, depending on the water budget of the individual lake and the surrounding urbanization. Sediment nutrient data have helped in deciding the need for dredging of the lake bed for removal of phosphates. Key Words: Lake water budget, Eutrophication, Trophic Status Index, Urban Lakes Restoration
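The abstract does not give its Trophic Status Index formula, but Carlson's (1977) index is the standard choice; a sketch under that assumption, using the total-phosphorus form, shows how nutrient data map to the hyper-eutrophic classification reported above:

```python
import math

# Carlson-style Trophic State Index sketch (assumed form; the study's exact
# index is not stated). Total-phosphorus variant: TSI = 14.42*ln(TP) + 4.15,
# with TP in micrograms per litre; TSI > 70 is conventionally hypereutrophic.

def tsi_total_phosphorus(tp_ug_per_l):
    return 14.42 * math.log(tp_ug_per_l) + 4.15

def trophic_class(tsi):
    if tsi < 40:
        return "oligotrophic"
    if tsi < 50:
        return "mesotrophic"
    if tsi < 70:
        return "eutrophic"
    return "hypereutrophic"

# A nutrient-rich urban lake, e.g. 300 ug/L total phosphorus (illustrative):
tsi = tsi_total_phosphorus(300.0)
```

With sewage-fed urban lakes routinely carrying phosphorus concentrations in the hundreds of micrograms per litre, the index saturates well into the hypereutrophic range, consistent with the finding for all five lakes.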

  4. South Lake Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

Nancy Nichols, principal of South Lake Elementary School, Titusville, Fla., joins students in teacher Michelle Butler's sixth grade class who are unwrapping computer equipment donated by Kennedy Space Center. South Lake is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  5. 18 CFR 707.9 - Tiering.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 2 2010-04-01 2010-04-01 false Tiering. 707.9 Section 707.9 Conservation of Power and Water Resources WATER RESOURCES COUNCIL COMPLIANCE WITH THE NATIONAL ENVIRONMENTAL POLICY ACT (NEPA) Water Resources Council Implementing Procedures § 707.9 Tiering. In accordance...

  6. 18 CFR 707.9 - Tiering.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 2 2012-04-01 2012-04-01 false Tiering. 707.9 Section 707.9 Conservation of Power and Water Resources WATER RESOURCES COUNCIL COMPLIANCE WITH THE NATIONAL ENVIRONMENTAL POLICY ACT (NEPA) Water Resources Council Implementing Procedures § 707.9 Tiering. In accordance...

  7. 18 CFR 707.9 - Tiering.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 2 2011-04-01 2011-04-01 false Tiering. 707.9 Section 707.9 Conservation of Power and Water Resources WATER RESOURCES COUNCIL COMPLIANCE WITH THE NATIONAL ENVIRONMENTAL POLICY ACT (NEPA) Water Resources Council Implementing Procedures § 707.9 Tiering. In accordance...

  8. 18 CFR 707.9 - Tiering.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 2 2013-04-01 2012-04-01 true Tiering. 707.9 Section 707.9 Conservation of Power and Water Resources WATER RESOURCES COUNCIL COMPLIANCE WITH THE NATIONAL ENVIRONMENTAL POLICY ACT (NEPA) Water Resources Council Implementing Procedures § 707.9 Tiering. In accordance...

  9. 18 CFR 707.9 - Tiering.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 2 2014-04-01 2014-04-01 false Tiering. 707.9 Section 707.9 Conservation of Power and Water Resources WATER RESOURCES COUNCIL COMPLIANCE WITH THE NATIONAL ENVIRONMENTAL POLICY ACT (NEPA) Water Resources Council Implementing Procedures § 707.9 Tiering. In accordance...

  10. 12 CFR 3.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., deferred tax assets, and credit-enhancing interest-only strips, that are deducted from Tier 1 capital, and minus nonfinancial equity investments for which a Tier 1 capital deduction is required pursuant to... carry out the purposes of this part. (b) Bank means a national banking association. (c) Tier 1 capital...

  11. The effect of cigarette prices on brand-switching in China: a longitudinal analysis of data from the ITC China Survey

    PubMed Central

    White, Justin S; Li, Jing; Hu, Teh-wei; Fong, Geoffrey T; Jiang, Yuan

    2014-01-01

    Background Recent studies have found that Chinese smokers are relatively unresponsive to cigarette prices. As the Chinese government contemplates higher tobacco taxes, it is important to understand the reasons for this low response. One possible explanation is that smokers buffer themselves from rising cigarette prices by switching to cheaper cigarette brands. Objective This study examines how cigarette prices influence consumers’ choices of cigarette brands in China. Methods This study uses panel data from the first three waves of the International Tobacco Control China Survey, drawn from six large cities in China and collected between 2006 and 2009. The study sample includes 3477 smokers who are present in at least two waves (8552 person-years). Cigarette brands are sorted by price into four tiers, using excise tax categories to determine the cut-off for each tier. The analysis relies on a conditional logit model to identify the relationship between price and brand choice. Findings Overall, 38% of smokers switched price tiers from one wave to the next. A ¥1 change in the price of cigarettes alters the tier choice of 4–7% of smokers. Restricting the sample to those who chose each given tier at baseline, a ¥1 increase in price in a given tier would decrease the share choosing that tier by 4% for Tier 1 and 1–2% for Tiers 2 and 3. Conclusions China's large price spread across cigarette brands appears to alter the brand selection of some consumers, especially smokers of cheaper brands. Tobacco pricing and tax policy can influence consumers’ incentives to switch brands. In particular, whereas ad valorem taxes in a tiered pricing system like China's encourage trading down, specific excise taxes discourage the practice. PMID:23697645
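The conditional logit setup described in the Methods, where each smoker chooses among price tiers and a price rise in one tier shifts choice shares toward the others, can be sketched with a multinomial-logit choice rule. All coefficients and prices below are illustrative, not the paper's estimates:

```python
import math

# Minimal conditional-logit sketch of tier choice: a smoker's utility for
# price tier j is u_j = alpha_j - beta * price_j, and choice probabilities
# follow a softmax over tiers. Values are invented for illustration.

def choice_probs(prices, alphas, beta):
    """Multinomial-logit choice probabilities over price tiers."""
    utils = [a - beta * p for a, p in zip(alphas, prices)]
    m = max(utils)                        # stabilize the exponentials
    exps = [math.exp(u - m) for u in utils]
    total = sum(exps)
    return [e / total for e in exps]

prices = [3.0, 6.0, 10.0, 16.0]   # yuan/pack for four tiers, cheapest first
alphas = [0.0, 0.8, 1.2, 1.5]     # tier-specific intercepts (brand appeal)
base = choice_probs(prices, alphas, beta=0.15)

# A 1-yuan increase in the cheapest tier's price shifts share to other tiers:
raised = choice_probs([4.0, 6.0, 10.0, 16.0], alphas, beta=0.15)
```

This is the mechanism behind the paper's finding: in a tiered system with a wide price spread, a price increase in one tier moves some smokers to adjacent tiers rather than out of the market.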

  12. 12 CFR 509.103 - Civil money penalties.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Company Act Violation 32,500 12 U.S.C. 1467a(i)(3) Holding Company Act Violation 32,500 12 U.S.C. 1467a(r)(1) Late/Inaccurate Reports—1st Tier 2,200 12 U.S.C. 1467a(r)(2) Late/Inaccurate Reports—2nd Tier 32,500 12 U.S.C. 1467a(r)(3) Late/Inaccurate Reports—3rd Tier 1,375,000 12 U.S.C. 1817(j)(16)(A) Change...

  13. 12 CFR 509.103 - Civil money penalties.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Company Act Violation 32,500 12 U.S.C. 1467a(i)(3) Holding Company Act Violation 32,500 12 U.S.C. 1467a(r)(1) Late/Inaccurate Reports—1st Tier 2,200 12 U.S.C. 1467a(r)(2) Late/Inaccurate Reports—2nd Tier 32,500 12 U.S.C. 1467a(r)(3) Late/Inaccurate Reports—3rd Tier 1,375,000 12 U.S.C. 1817(j)(16)(A) Change...

  14. 12 CFR 509.103 - Civil money penalties.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Company Act Violation 32,500 12 U.S.C. 1467a(i)(3) Holding Company Act Violation 32,500 12 U.S.C. 1467a(r)(1) Late/Inaccurate Reports—1st Tier 2,200 12 U.S.C. 1467a(r)(2) Late/Inaccurate Reports—2nd Tier 32,500 12 U.S.C. 1467a(r)(3) Late/Inaccurate Reports—3rd Tier 1,375,000 12 U.S.C. 1817(j)(16)(A) Change...

  15. 12 CFR 509.103 - Civil money penalties.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Company Act Violation 32,500 12 U.S.C. 1467a(i)(3) Holding Company Act Violation 32,500 12 U.S.C. 1467a(r)(1) Late/Inaccurate Reports—1st Tier 2,200 12 U.S.C. 1467a(r)(2) Late/Inaccurate Reports—2nd Tier 32,500 12 U.S.C. 1467a(r)(3) Late/Inaccurate Reports—3rd Tier 1,375,000 12 U.S.C. 1817(j)(16)(A) Change...

  16. 12 CFR 509.103 - Civil money penalties.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...,500 12 U.S.C. 1467a(r)(3) Late/Inaccurate Reports—3rd Tier 1,375,000 12 U.S.C. 1817(j)(16)(A) Change in Control—1st Tier 7,500 12 U.S.C. 1817(j)(16)(B) Change in Control—2nd Tier 37,500 12 U.S.C. 1817(j.... 4012a(f) Flood Insurance 1 385 2 135,000 1 Per day. 2 Per year. [56 FR 38306, Aug. 12, 1991, as amended...

  17. Transforming Big Data into cancer-relevant insight: An initial, multi-tier approach to assess reproducibility and relevance* | Office of Cancer Genomics

    Cancer.gov

    The Cancer Target Discovery and Development (CTD^2) Network was established to accelerate the transformation of "Big Data" into novel pharmacological targets, lead compounds, and biomarkers for rapid translation into improved patient outcomes. It rapidly became clear in this collaborative network that a key central issue was to define what constitutes sufficient computational or experimental evidence to support a biologically or clinically relevant finding.

  18. Bathymetry and capacity of Chambers Lake, Chester County, Pennsylvania

    USGS Publications Warehouse

    Gyves, Matthew C.

    2015-10-26

    This report describes the methods used to create a bathymetric map of Chambers Lake for the computation of reservoir storage capacity as of September 2014. The product is a bathymetric map and a table showing the storage capacity of the reservoir at 2-foot increments from minimum usable elevation up to full capacity at the crest of the auxiliary spillway.

  19. Assessing the Nutritional Quality of Diets of Canadian Adults Using the 2014 Health Canada Surveillance Tool Tier System

    PubMed Central

    Jessri, Mahsa; Nishi, Stephanie K.; L’Abbé, Mary R.

    2015-01-01

The 2014 Health Canada Surveillance Tool (HCST) was developed to assess adherence of dietary intakes with Canada’s Food Guide. HCST classifies foods into one of four Tiers based on thresholds for sodium, total fat, saturated fat and sugar, with Tier 1 representing the healthiest and Tier 4 foods being the unhealthiest. This study presents the first application of HCST to assess (a) dietary patterns of Canadians; and (b) applicability of this tool as a measure of diet quality among 19,912 adult participants of the Canadian Community Health Survey 2.2. Findings indicated that even though most processed meats and potatoes were Tier 4, the majority of reported foods in general were categorized as Tiers 2 and 3 due to the lenient criteria used in the HCST. Moving from the 1st to the 4th quartile of Tier 4 and “other” foods/beverages, there was a significant trend towards increased calories (1876 kcal vs. 2290 kcal) and “harmful” nutrients (e.g., sodium) as well as decreased “beneficial” nutrients. Compliance with the HCST was not associated with lower body mass index. Future nutrient profiling systems need to incorporate both “positive” and “negative” nutrients, an overall score and a wider range of nutrient thresholds to better capture food product differences. PMID:26703721
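A threshold-based tier classifier of the kind the HCST applies, where a food is assigned the worst tier triggered by any of its four nutrients, can be sketched as below. The cut-off values are invented for illustration; the real HCST thresholds differ and vary by food category:

```python
# Hypothetical sketch of a threshold-based tier classifier in the spirit of
# the HCST: a food gets the worst (highest) tier triggered by any of sodium,
# total fat, saturated fat, or sugar. Cut-offs are illustrative only.

# Per-100 g cut-offs: (tier1_max, tier2_max, tier3_max); above tier3_max -> Tier 4
THRESHOLDS = {
    "sodium_mg":   (50, 140, 400),
    "total_fat_g": (1.5, 3.0, 10.0),
    "sat_fat_g":   (0.5, 1.0, 3.0),
    "sugar_g":     (2.5, 5.0, 15.0),
}

def hcst_tier(food):
    """Worst tier over all four nutrients (1 = healthiest, 4 = unhealthiest)."""
    worst = 1
    for nutrient, cutoffs in THRESHOLDS.items():
        tier = 4
        for i, limit in enumerate(cutoffs, start=1):
            if food[nutrient] <= limit:
                tier = i
                break
        worst = max(worst, tier)
    return worst

steamed_fish   = {"sodium_mg": 40,  "total_fat_g": 1.0,  "sat_fat_g": 0.3, "sugar_g": 0.0}
processed_meat = {"sodium_mg": 900, "total_fat_g": 25.0, "sat_fat_g": 9.0, "sugar_g": 1.0}
```

The worst-nutrient rule explains the study's critique: a single lenient threshold lets many mixed-profile foods settle into the broad middle tiers, which is why the authors argue for an overall score rather than a worst-case cut-off.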

  20. Hydrologic characterization for Spring Creek and hydrologic budget and model scenarios for Sheridan Lake, South Dakota, 1962-2007

    USGS Publications Warehouse

    Driscoll, Daniel G.; Norton, Parker A.

    2009-01-01

    The U.S. Geological Survey cooperated with South Dakota Game, Fish and Parks to characterize hydrologic information relevant to management of water resources associated with Sheridan Lake, which is formed by a dam on Spring Creek. This effort consisted primarily of characterization of hydrologic data for a base period of 1962 through 2006, development of a hydrologic budget for Sheridan Lake for this timeframe, and development of an associated model for simulation of storage deficits and drawdown in Sheridan Lake for hypothetical release scenarios from the lake. Historically, the dam has been operated primarily as a 'pass-through' system, in which unregulated outflows pass over the spillway; however, the dam recently was retrofitted with an improved control valve system that would allow controlled releases of about 7 cubic feet per second (ft3/s) or less from a fixed depth of about 60 feet (ft). Development of a hydrologic budget for Sheridan Lake involved compilation, estimation, and characterization of data sets for streamflow, precipitation, and evaporation. The most critical data need was for extrapolation of available short-term streamflow records for Spring Creek to be used as the long-term inflow to Sheridan Lake. Available short-term records for water years (WY) 1991-2004 for a gaging station upstream from Sheridan Lake were extrapolated to WY 1962-2006 on the basis of correlations with streamflow records for a downstream station and for stations located along two adjacent streams. Comparisons of data for the two streamflow-gaging stations along Spring Creek indicated that tributary inflow is approximately proportional to the intervening drainage area, which was used as a means of estimating tributary inflow for the hydrologic budget. Analysis of evaporation data shows that sustained daily rates may exceed maximum monthly rates by a factor of about two. 
A long-term (1962-2006) hydrologic budget was developed for computation of reservoir outflow from Sheridan Lake for the historical pass-through operating system. Two inflow components (stream inflow and precipitation) and one outflow component (evaporation) were considered. The hydrologic budget uses monthly time steps within a computational year that includes two 6-month periods - May through October, for which evaporation is accounted for, and November through April, when evaporation is considered negligible. Results indicate that monthly evaporation rates can substantially exceed inflow during low-flow periods, and that outflows may begin approaching zero-flow conditions well before the onset of zero-inflow conditions, especially when daily inflow and evaporation are considered. Results also indicate that September may be the month with the greatest potential for enhancing fish habitat and other ecosystem values in downstream reaches of Spring Creek through managed releases of cool water. Computed monthly outflows from Sheridan Lake for September are less than 1.0 ft3/s for 8 of the 44 years (18 percent) and are less than 2.0 ft3/s for 14 of the 44 years (32 percent). Conversely, none of the computed outflows for May are less than 2.0 ft3/s. A short-term (July through September 2007) data set was used to calculate daily evaporation from Sheridan Lake and to evaluate the applicability of published pan coefficients. Computed values of pan coefficients of approximately 1.0 and 1.1 for two low-flow periods are larger than the mean annual pan coefficient of 0.74 for the area that is reported in the literature; however, the computed values are consistent with pan coefficients reported elsewhere for similar late summer and early fall periods. Thus, these results supported the use of variable monthly pan coefficients for the long-term hydrologic budget. 
A hydrologic model was developed using the primary components of the hydrologic budget and was used to simulate monthly storage deficits and drawdown for Sheridan Lake using hypothetical
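The monthly pass-through budget the abstract describes can be sketched as a simple storage update. The function and the values in the comments are an illustrative reconstruction under stated assumptions, not the USGS model itself:

```python
# Sketch of a monthly pass-through reservoir budget: storage is updated with
# stream inflow and precipitation, evaporation is applied only May-October
# (November-April evaporation is treated as negligible, as in the abstract),
# and anything above capacity spills over as outflow. Units are illustrative.
def monthly_step(storage, capacity, inflow, precip, evap, month):
    """Return (new_storage, outflow) for one monthly time step."""
    evap_applied = evap if 5 <= month <= 10 else 0.0   # Nov-Apr: negligible
    storage = storage + inflow + precip - evap_applied
    outflow = max(0.0, storage - capacity)             # pass-through spill
    return min(storage, capacity), outflow
```

With such a scheme, a month whose evaporation exceeds inflow plus precipitation draws storage down and can drive outflow to zero well before inflow itself reaches zero, which is the behaviour the abstract reports.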

  1. A multi-tiered architecture for content retrieval in mobile peer-to-peer networks.

    DOT National Transportation Integrated Search

    2012-01-01

In this paper, we address content retrieval in Mobile Peer-to-Peer (P2P) Networks. We design a multi-tiered architecture for content retrieval, where at Tier 1, we design a protocol for content similarity governed by a parameter that trades accu...

  2. XENOENDOCRINE DISRUPTERS-TIERED SCREENING AND TESTING: FILLING KEY DATA GAPS

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) is developing a screening and testing program for endocrine disrupting chemicals (EDCs). High priority chemicals would be evaluated in the Tier 1 Screening (T1S) battery. Chemicals positive in T1S would then be tested (Tier 2). T1S...

  3. Response to Intervention with Secondary School Students with Reading Difficulties

    ERIC Educational Resources Information Center

    Vaughn, Sharon; Fletcher, Jack M.

    2012-01-01

    The authors summarize evidence from a multiyear study with secondary students with reading difficulties on (a) the potential efficacy of primary-level (Tier 1), secondary-level (Tier 2), and tertiary-level (Tier 3) interventions in remediating reading difficulties with middle school students, (b) the likelihood of resolving reading disabilities…

  4. Methods for Tier 2 Modeling Within the Training Range Environmental Evaluation and Characterization System

    DTIC Science & Technology

    2011-03-01

acre-yr, compared with 54 tons/acre-yr as computed with the Universal Soil Loss Equation (USLE). Thus, it appears that the Einstein and Brown equations... USLE that is already needed for soil erosion that exports aqueous phase (adsorbed and dissolved) MC. This will mean that solid phase MC will not affect... phase MC mass to soil mass; b = soil dry bulk density, g/m³; A = AOI site area, m²; E = soil erosion rate as determined from the USLE, m/yr. It is

  5. Immunologic response among HIV-infected patients enrolled in a graduated cost-recovery programme of antiretroviral therapy delivery in Chennai, India.

    PubMed

    Solomon, Sunil Suhas; Ganesh, Aylur K; Mehta, Shruti H; Yepthomi, Tokugha; Balaji, Kavitha; Anand, Santhanam; Gallant, Joel E; Solomon, Suniti

    2013-06-01

Sustainability of free antiretroviral therapy (ART) roll-out programmes in resource-limited settings is challenging given the need for lifelong therapy and the lack of an effective vaccine. This study was undertaken to compare treatment outcomes among HIV-infected patients enrolled in a graduated cost-recovery programme of ART delivery in Chennai, India. The financial status of patients accessing care at a tertiary care centre, YRGCARE, Chennai, was assessed using an economic survey; patients were distributed into tiers 1-4 requiring them to pay 0, 50, 75 or 100 per cent of their medication costs, respectively. A total of 1754 participants (ART naïve = 244) were enrolled from February 2005-January 2008 with the following distribution: tier 1=371; tier 2=338; tier 3=693; tier 4=352. Linear regression models with generalized estimating equations were used to examine immunological response among patients across the four tiers. Median age was 34; 73 per cent were male, and the majority were on nevirapine-based regimens. Median follow up was 11.1 months. The mean increase in CD4 cell count within the first three months of HAART was 50.3 cells/μl per month in tier 1. Compared to those in tier 1, persons in tiers 2, 3 and 4 had comparable increases (49.7, 57.0, and 50.9 cells/μl per month, respectively). Increases in subsequent periods (3-18 and >18 months) were also comparable across tiers. No differential CD4 gains across tiers were observed when the analysis was restricted to patients initiating ART under the GCR programme. This ART delivery model was associated with significant CD4 gains with no observable difference by how much patients paid. Importantly, gains were comparable to those in other free rollout programmes. Additional cost-effectiveness analyses and mathematical modelling would be needed to determine whether such a delivery programme is a sustainable alternative to free ART programmes.

  6. Energy Frontier Research With ATLAS: Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, John; Black, Kevin; Ahlen, Steve

    2016-06-14

The Boston University (BU) group is playing key roles across the ATLAS experiment: in detector operations, the online trigger, the upgrade, computing, and physics analysis. Our team has been critical to the maintenance and operations of the muon system since its installation. During Run 1 we led the muon trigger group and that responsibility continues into Run 2. BU maintains and operates the ATLAS Northeast Tier 2 computing center. We are actively engaged in the analysis of ATLAS data from Run 1 and Run 2. Physics analyses we have contributed to include Standard Model measurements (W and Z cross sections, tt̄ differential cross sections, WWW* production), evidence for the Higgs boson decaying to τ+τ−, and searches for new phenomena (technicolor, Z′ and W′ bosons, vector-like quarks, dark matter).

  7. Evaluation of a Two-Phase Implementation of a Tier-2 (Small Group) Reading Intervention for Young Low-Progress Readers

    ERIC Educational Resources Information Center

    Buckingham, Jennifer; Wheldall, Kevin; Beaman-Wheldall, Robyn

    2014-01-01

    In a response to intervention (RtI) model, reading is taught in increasingly intensive tiers of instruction. The aim of the study was to examine the efficacy of a Tier-2 (small group) literacy intervention for young struggling readers. This article focuses on the second phase of a randomised control trial involving 14 students in kindergarten as…

  8. 78 FR 77178 - Self-Regulatory Organizations; International Securities Exchange, LLC; Notice of Filing and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-20

    ... ADV (excluding SPY) symbols SPY Tier 1; 0-39,999 ($0.33) ($0.66) ($0.36) Tier 2; 40,000-74,999 ($0.37... complex contracts. For Select Symbols (excluding SPY) this rebate is $0.33 per contract for Members with a... with a Priority Customer Complex ADV of 40,000-74,999 contracts (i.e., Tier 2), $0.37 per contract for...

  9. 2 CFR 180.440 - What action may I take if a primary tier participant knowingly does business with an excluded or...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 2 Grants and Agreements 1 2011-01-01 2011-01-01 false What action may I take if a primary tier... Transactions § 180.440 What action may I take if a primary tier participant knowingly does business with an... person, you as a Federal agency official may refer the matter for suspension and debarment consideration...

  10. Prospective Environmental Risk Assessment for Sediment-Bound Organic Chemicals: A Proposal for Tiered Effect Assessment.

    PubMed

    Diepens, Noël J; Koelmans, Albert A; Baveco, Hans; van den Brink, Paul J; van den Heuvel-Greve, Martine J; Brock, Theo C M

A broadly accepted framework for prospective environmental risk assessment (ERA) of sediment-bound organic chemicals is currently lacking. Such a framework requires clear protection goals, evidence-based concepts that link exposure to effects and a transparent tiered-effect assessment. In this paper, we provide a tiered prospective sediment ERA procedure for organic chemicals in sediment, with a focus on the applicable European regulations and the underlying data requirements. Using the ecosystem services concept, we derived specific protection goals for ecosystem service providing units: microorganisms, benthic algae, sediment-rooted macrophytes, benthic invertebrates and benthic vertebrates. Triggers for sediment toxicity testing are discussed. We recommend a tiered approach (Tier 0 through Tier 3). Tier 0 is a cost-effective screening based on chronic water-exposure toxicity data for pelagic species and equilibrium partitioning. Tier 1 is based on spiked sediment laboratory toxicity tests with standard benthic test species and standardised test methods. If comparable chronic toxicity data for both standard and additional benthic test species are available, the Species Sensitivity Distribution (SSD) approach is a more viable Tier-2 option than the geometric mean approach. This paper includes criteria for accepting results of sediment-spiked single species toxicity tests in prospective ERA, and for the application of the SSD approach. We propose micro/mesocosm experiments with spiked sediment, to study colonisation success by benthic organisms, as a Tier-3 option. Ecological effect models can be used to supplement the experimental tiers. A strategy for unifying information from various tiers by experimental work and exposure and effect modelling is provided.

  11. Managing competing elastic Grid and Cloud scientific computing applications using OpenNebula

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Lusso, S.; Masera, M.; Vallero, S.

    2015-12-01

Elastic cloud computing applications, i.e. applications that automatically scale according to computing needs, work on the ideal assumption of infinite resources. While large public cloud infrastructures may be a reasonable approximation of this condition, scientific computing centres like WLCG Grid sites usually work in a saturated regime, in which applications compete for scarce resources through queues, priorities and scheduling policies, and keeping a fraction of the computing cores idle to allow for headroom is usually not an option. In our particular environment one of the applications (a WLCG Tier-2 Grid site) is much larger than all the others and cannot autoscale easily. Nevertheless, other smaller applications can benefit from automatic elasticity; the implementation of this property in our infrastructure, based on the OpenNebula cloud stack, will be described and the very first operational experiences with a small number of strategies for timely allocation and release of resources will be discussed.
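The allocation-and-release trade-off described above can be sketched as a toy scaling policy. The function and its inputs are invented for illustration and do not reflect OpenNebula's actual scheduler logic:

```python
# Toy scaling policy for an elastic application in a saturated environment:
# grow while jobs are queued and quota remains, release idle VMs promptly
# (keeping idle headroom is not an option, per the abstract). Illustrative.
def scaling_decision(queued_jobs, idle_vms, running_vms, max_vms):
    """Return number of VMs to add (positive) or release (negative)."""
    if queued_jobs > 0 and running_vms < max_vms:
        return min(queued_jobs, max_vms - running_vms)   # scale up to quota
    if queued_jobs == 0 and idle_vms > 0:
        return -idle_vms    # no headroom to spare: free idle resources now
    return 0
```

The timeliness of the release branch is the interesting design choice: in a saturated regime, idle capacity held by one application is capacity denied to the others.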

  12. BridgeUP: STEM. Creating Opportunities for Women through Tiered Mentorship

    NASA Astrophysics Data System (ADS)

    Secunda, Amy; Cornelis, Juliette; Ferreira, Denelis; Gomez, Anay; Khan, Ariba; Li, Anna; Soo, Audrey; Mac Low, Mordecai

    2018-01-01

BridgeUP: STEM is an ambitious and exciting initiative responding to the extensive gender and opportunity gaps that exist in the STEM pipeline for women, girls, and under-resourced youth. BridgeUP: STEM has developed a distinct identity in the landscape of computer science education by embedding programming in the context of scientific research. One of the ways in which this is accomplished is through a tiered mentorship program. Five Helen Fellows are chosen from a pool of female, postbaccalaureate applicants to be mentored by researchers at the American Museum of Natural History in a computational research project. The Helen Fellows then act as mentors to six high school women (Brown Scholars), guiding them through a computational project aligned with their own research. This year, three of the Helen Fellows, and by extension, eighteen Brown Scholars, are performing computational astrophysics research. This poster presents one example of a tiered mentorship working on modeling the migration of stellar mass black holes (BH) in active galactic nucleus (AGN) disks. Making an analogy from the well-studied migration and formation of planets in protoplanetary disks to the newer field of migration and formation of binary BH in AGN disks, the Helen Fellow is working with her mentors to make the necessary adaptations of an N-body code incorporating migration torques from the protoplanetary disk case to the AGN disk case to model how binary BH form. The goal is to better understand and make predictions for gravitational-wave observations from the Laser Interferometer Gravitational-Wave Observatory (LIGO). The Brown Scholars then implement the Helen Fellow’s code for a variety of different distributions of initial stellar mass BH populations that they generate using Python, and produce visualizations of the output to be used in a published paper. 
Over the course of the project, students will develop a basic understanding of the physics related to their project and develop their practical computational skills.

  13. Bathymetric Contour Maps for Lakes Surveyed in Iowa in 2006

    USGS Publications Warehouse

    Linhart, S.M.; Lund, K.D.

    2008-01-01

The U.S. Geological Survey, in cooperation with the Iowa Department of Natural Resources, conducted bathymetric surveys on two lakes in Iowa during 2006 (Little Storm Lake and Silver Lake). The surveys were conducted to provide the Iowa Department of Natural Resources with information for the development of total maximum daily load limits, particularly for estimating sediment load and deposition rates. The bathymetric surveys can provide a baseline for future work on sediment loads and deposition rates for these lakes. Both of the lakes surveyed in 2006 are natural lakes. For Silver Lake, bathymetric data were collected using a boat-mounted differential global positioning system, echo depth-sounding equipment, and computer software. For Little Storm Lake, because of its shallow nature, bathymetric data were collected using manual depth measurements. Data were processed with commercial hydrographic software and exported into a geographic information system for mapping and calculating area and volume. Lake volumes were estimated to be 7,547,000 cubic feet (173 acre-feet) at Little Storm Lake and 126,724,000 cubic feet (2,910 acre-feet) at Silver Lake. Surface areas were estimated to be 4,110,000 square feet (94 acres) at Little Storm Lake and 27,957,000 square feet (640 acres) at Silver Lake.
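The paired cubic-foot/acre-foot and square-foot/acre figures above follow from the exact definition 1 acre = 43,560 square feet (hence 1 acre-foot = 43,560 cubic feet):

```python
# 1 acre = 43,560 square feet exactly, so 1 acre-foot = 43,560 cubic feet.
SQFT_PER_ACRE = 43_560.0

def cubic_feet_to_acre_feet(volume_cuft):
    return volume_cuft / SQFT_PER_ACRE

def square_feet_to_acres(area_sqft):
    return area_sqft / SQFT_PER_ACRE

# Little Storm Lake: 7,547,000 ft^3 -> ~173 acre-ft; 4,110,000 ft^2 -> ~94 acres
```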

  14. Development and implementation of an Integrated Water Resources Management System (IWRMS)

    NASA Astrophysics Data System (ADS)

    Flügel, W.-A.; Busch, C.

    2011-04-01

One of the innovative objectives in the EC project BRAHMATWINN was the development of a stakeholder oriented Integrated Water Resources Management System (IWRMS). The toolset integrates the findings of the project and presents them in a user friendly way for decision support in sustainable integrated water resources management (IWRM) in river basins. IWRMS is a framework that integrates different types of basin information and supports the development of IWRM options for climate change mitigation. It is based on the River Basin Information System (RBIS) data models and delivers a graphical user interface for stakeholders. A special interface was developed for the integration of the enhanced DANUBIA model input and the NetSyMod model with its Mulino decision support system (mulino mDss) component. The web based IWRMS contains and combines different types of data and methods to provide river basin data and information for decision support. IWRMS is based on a three tier software framework which uses (i) html/javascript at the client tier, (ii) the PHP programming language to realize the application tier, and (iii) a postgresql/postgis database tier to manage and store all data, except the DANUBIA modelling raw data, which are file based and registered in the database tier. All three tiers can reside on one or different computers and are adapted to the local hardware infrastructure. IWRMS as well as RBIS are based on Open Source Software (OSS) components, and flexible, time-saving access to the database is provided by web-based interfaces for data visualization and retrieval. The IWRMS is accessible via the BRAHMATWINN homepage: http://www.brahmatwinn.uni-jena.de and a user manual for the RBIS is available for download as well.
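The three-tier separation described above can be schematized as follows. The real IWRMS uses html/javascript, PHP and PostgreSQL/PostGIS; the Python classes and station names below are invented purely to illustrate the layering, not the actual implementation:

```python
# Schematic three-tier layering (client / application / database), with each
# tier talking only to the one directly below it. All names are invented.
class DatabaseTier:                       # storage and management of data
    def __init__(self):
        self._levels = {"gauge_1": [1.2, 1.4, 1.1]}   # fake water levels, m
    def query(self, station):
        return self._levels.get(station, [])

class ApplicationTier:                    # business logic
    def __init__(self, db):
        self.db = db
    def mean_level(self, station):
        rows = self.db.query(station)
        return sum(rows) / len(rows) if rows else None

class ClientTier:                         # presentation
    def __init__(self, app):
        self.app = app
    def render(self, station):
        return f"{station}: mean water level {self.app.mean_level(station):.2f} m"
```

Because each tier depends only on the interface of the tier below, the three layers can be deployed on one machine or three, which is exactly the flexibility the abstract claims for IWRMS.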

  15. Is Computer-Aided Instruction an Effective Tier-One Intervention for Kindergarten Students at Risk for Reading Failure in an Applied Setting?

    ERIC Educational Resources Information Center

    Kreskey, Donna DeVaughn; Truscott, Stephen D.

    2016-01-01

    This study investigated the use of computer-aided instruction (CAI) as an intervention for kindergarten students at risk for reading failure. Headsprout Early Reading (Headsprout 2005), a type of CAI, provides internet-based, reading instruction incorporating the critical components of reading instruction cited by the National Reading Panel (NRP…

  16. Towards Wearable Cognitive Assistance

    DTIC Science & Technology

    2013-12-01

Keywords: mobile computing, cloud... It presents a multi-tiered mobile system architecture that offers tight end-to-end latency bounds on compute-intensive cognitive assistance... to an entire neighborhood or an entire city is extremely expensive and time-consuming. Physical infrastructure in public spaces tends to evolve very

  17. Application of LANDSAT to the surveillance and control of lake eutrophication in the Great Lakes basin. [Madison and Spooner, Wisconsin

    NASA Technical Reports Server (NTRS)

    Rogers, R. H. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. By use of distilled water samples in the laboratory, and very clear lakes in the field, a technique was developed where the atmosphere and surface noise effects on LANDSAT signals from water bodies can be removed. The residual signal dependent only on the material in water was used as a basis for computer categorization of lakes by type and concentration of suspended material. Several hundred lakes in the Madison and Spooner, Wisconsin area were categorized by computer techniques for tannin or nontannin waters and for the degree of algae, silt, weeds, and bottom effects present. When the lakes are categorized as having living algae or weeds, their concentration is related to the enrichment or eutrophication of the lake.

  18. Exploiting Analytics Techniques in CMS Computing Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonacorsi, D.; Kuznetsov, V.; Magini, N.

The CMS experiment has collected an enormous volume of metadata about its computing operations in its monitoring systems, describing its experience in operating all of the CMS workflows on all of the Worldwide LHC Computing Grid Tiers. Data mining efforts into all this information have rarely been undertaken, but are of crucial importance for a better understanding of how CMS achieved successful operations, and for reaching an adequate and adaptive model of CMS operations that allows detailed optimizations and eventually a prediction of system behaviour. These data are now streamed into the CERN Hadoop data cluster for further analysis. Specific sets of information (e.g. data on how many replicas of datasets CMS wrote on disks at WLCG Tiers, data on which datasets were primarily requested for analysis, etc.) were collected on Hadoop and processed with MapReduce applications, taking advantage of the parallelization on the Hadoop cluster. We present the implementation of new monitoring applications on Hadoop, and discuss the new possibilities in CMS computing monitoring introduced by the ability to quickly process big data sets from multiple sources, looking forward to a predictive modeling of the system.
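The kind of aggregation mentioned above (e.g. counting how many replicas of each dataset exist across Tiers) maps naturally onto MapReduce. Here is a toy, plain-Python analogue with invented dataset and site names, not CMS's actual Hadoop jobs:

```python
from collections import Counter

# Toy map/reduce-style aggregation: emit (dataset, 1) per replica record
# in the "map" phase, sum per key in the "reduce" phase. Names are invented.
records = [
    ("datasetA", "T2_IT_LegnaroPadova"),
    ("datasetA", "T1_IT_CNAF"),
    ("datasetB", "T2_US_MIT"),
]

mapped = ((dataset, 1) for dataset, _site in records)   # "map" phase
replica_counts = Counter()
for dataset, one in mapped:                             # "reduce" phase
    replica_counts[dataset] += one
```

On Hadoop the same shape runs in parallel: mappers emit key/value pairs, the framework groups by key, and reducers sum each group.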

  19. Integrating Model-Based Transmission Reduction into a multi-tier architecture

    NASA Astrophysics Data System (ADS)

    Straub, J.

A multi-tier architecture consists of numerous craft organized into orbital, aerial, and surface tiers. Each tier is able to collect progressively greater levels of information. Generally, craft from lower-level tiers are deployed to a target of interest based on its identification by a higher-level craft. While the architecture promotes significant amounts of science being performed in parallel, this may overwhelm the computational and transmission capabilities of higher-tier craft and links (particularly the deep space link back to Earth). Because of this, a new paradigm in in-situ data processing is required. Model-based transmission reduction (MBTR) is such a paradigm. Under MBTR, each node (whether a single spacecraft in orbit of the Earth or another planet, or a member of a multi-tier network) is given an a priori model of the phenomenon that it is assigned to study. It performs activities to validate this model. If the model is found to be erroneous, corrective changes are identified, assessed to ensure they are significant enough to pass on, and prioritized for transmission. A limited amount of verification data is sent with each MBTR assertion message to allow those that might rely on the data to validate the correct operation of the spacecraft and MBTR engine onboard. Integrating MBTR with a multi-tier framework creates an MBTR hierarchy. Higher levels of the MBTR hierarchy task lower levels with the data collection and assessment required to validate or correct elements of their models. A model of the expected conditions is sent to the lower-level craft, which then engages its own MBTR engine to validate or correct the model. This may include tasking a yet lower level of craft to perform activities. 
When the MBTR engine at a given level receives all of its component data (whether directly collected or from delegation), it randomly chooses some to validate (by reprocessing the validation data), performs analysis and sends its own results (validation and/or changes of model elements and supporting validation data) to its upstream node. This constrains data transmission to only significant information (either because it includes a change or is validation data critical for assessing overall performance) and reduces the processing requirements at higher-level nodes (by not having to process insignificant data). This paper presents a framework for multi-tier MBTR and two demonstration mission concepts: an Earth sensornet and a mission to Mars. These multi-tier MBTR concepts are compared to a traditional mission approach.
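The core MBTR idea of transmitting only significant deviations from an a priori model can be sketched in a few lines. The model elements, values, and threshold below are hypothetical:

```python
# Sketch of model-based transmission reduction: a node holds an a priori
# model, compares observations against it, and transmits only elements whose
# deviation is significant. All keys, values, and the threshold are invented.
def mbtr_report(model, observed, threshold):
    """Return {element: corrected_value} for model elements contradicted by data."""
    return {k: v for k, v in observed.items()
            if abs(v - model.get(k, 0.0)) > threshold}

model    = {"albedo": 0.30, "slope_deg": 12.0, "roughness": 0.05}
observed = {"albedo": 0.31, "slope_deg": 17.5, "roughness": 0.05}
```

In this toy case only the slope correction would be transmitted upstream; in a full MBTR message it would travel with a small sample of validation data, as the abstract describes.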

  20. Single-channel kinetics of BK (Slo1) channels

    PubMed Central

    Geng, Yanyan; Magleby, Karl L.

    2014-01-01

Single-channel kinetics has proven a powerful tool to reveal information about the gating mechanisms that control the opening and closing of ion channels. This introductory review focuses on the gating of large conductance Ca2+- and voltage-activated K+ (BK or Slo1) channels at the single-channel level. It starts with single-channel current records and progresses to presentation and analysis of single-channel data and the development of gating mechanisms in terms of discrete state Markov (DSM) models. The DSM models are formulated in terms of the tetrameric modular structure of BK channels, consisting of a central transmembrane pore-gate domain (PGD) attached to four surrounding transmembrane voltage sensing domains (VSD) and a large intracellular cytosolic domain (CTD), also referred to as the gating ring. The modular structure and data analysis show that the Ca2+ and voltage dependent gating considered separately can each be approximated by 10-state two-tiered models with five closed states on the upper tier and five open states on the lower tier. The modular structure and joint Ca2+ and voltage dependent gating are consistent with a 50-state two-tiered model with 25 closed states on the upper tier and 25 open states on the lower tier. Adding an additional tier of brief closed (flicker) states to the 10-state or 50-state models improved the description of the gating. For fixed experimental conditions a channel would gate in only a subset of the potential number of states. The detected number of states and the correlations between adjacent interval durations are consistent with the tiered models. The examined models can account for the single-channel kinetics and the bursting behavior of gating. Ca2+ and voltage activate BK channels by predominantly increasing the effective opening rate of the channel with a smaller decrease in the effective closing rate. Ca2+ and depolarization thus activate by mainly destabilizing the closed states. PMID:25653620
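The concluding point, that Ca2+ and depolarization act mainly by increasing the effective opening rate, can be illustrated with the simplest possible two-state Markov scheme. This is a didactic reduction with invented rate values, not one of the 10- or 50-state models the review discusses:

```python
# Two-state C <-> O gating scheme with opening rate k_open and closing rate
# k_close. At equilibrium the open probability is Po = k_open/(k_open+k_close),
# so predominantly raising k_open (as the abstract concludes Ca2+ and
# depolarization do) raises Po. Rate values below are invented.
def open_probability(k_open, k_close):
    return k_open / (k_open + k_close)
```

For example, with invented rates, raising k_open from 10 to 90 (per unit time) against a fixed k_close of 90 moves the equilibrium open probability from 0.1 to 0.5.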

  1. Estimating ground-water inflow to lakes in central Florida using the isotope mass-balance approach

    USGS Publications Warehouse

    Sacks, Laura A.

    2002-01-01

    The isotope mass-balance approach was used to estimate ground-water inflow to 81 lakes in the central highlands and coastal lowlands of central Florida. The study area is characterized by a subtropical climate and numerous lakes in a mantled karst terrain. Ground-water inflow was computed using both steady-state and transient formulations of the isotope mass-balance equation. More detailed data were collected from two study lakes, including climatic, hydrologic, and isotopic (hydrogen and oxygen isotope ratio) data. For one of these lakes (Lake Starr), ground-water inflow was independently computed from a water-budget study. Climatic and isotopic data collected from the two lakes were similar even though they were in different physiographic settings about 60 miles apart. Isotopic data from all of the study lakes plotted on an evaporation trend line, which had a very similar slope to the theoretical slope computed for Lake Starr. These similarities suggest that data collected from the detailed study lakes can be extrapolated to the rest of the study area. Ground-water inflow computed using the isotope mass-balance approach ranged from 0 to more than 260 inches per year (or 0 to more than 80 percent of total inflows). Steady-state and transient estimates of ground-water inflow were very similar. Computed ground-water inflow was most sensitive to uncertainty in variables used to calculate the isotopic composition of lake evaporate (isotopic compositions of lake water and atmospheric moisture and climatic variables). Transient results were particularly sensitive to changes in the isotopic composition of lake water. Uncertainty in ground-water inflow results is considerably less for lakes with higher ground-water inflow than for lakes with lower ground-water inflow. 
Because of these uncertainties, the isotope mass-balance approach is better used to distinguish whether ground-water inflow quantities fall within certain ranges of values, rather than for precise quantification. The lakes fit into three categories based on their range of ground-water inflow: low (less than 25 percent of total inflows), medium (25-50 percent of inflows), and high (greater than 50 percent of inflows). The majority of lakes in the coastal lowlands had low ground-water inflow, whereas the majority of lakes in the central highlands had medium to high ground-water inflow. Multiple linear regression models were used to predict ground-water inflow to lakes. These models help identify basin characteristics that are important in controlling ground-water inflow to Florida lakes. Significant explanatory variables include: ratio of basin area to lake surface area, depth to the Upper Floridan aquifer, maximum lake depth, and fraction of wetlands in the basin. Models were improved when lake water-quality data (nitrate, sodium, and iron concentrations) were included, illustrating the link between ground-water geochemistry and lake chemistry. Regression models that considered lakes within specific geographic areas were generally poorer than models for the entire study area. Regression results illustrate how more simplified models based on basin and lake characteristics can be used to estimate ground-water inflow. Although the uncertainty in the amount of ground-water inflow to individual lakes is high, the isotope mass-balance approach was useful in comparing the range of ground-water inflow for numerous Florida lakes. Results were also helpful in understanding differences in the geographic distribution of ground-water inflow between the coastal lowlands and central highlands. 
In order to use the isotope mass-balance approach to estimate inflow for multiple lakes, it is essential that all the lakes are sampled during the same time period and that detailed isotopic, hydrologic, and climatic data are collected over this same period of time. Isotopic data for Florida lakes can change over time, both seasonally and interannually, primarily because of differ
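    The steady-state isotope mass-balance formulation described above can be sketched as a small calculation. This is an illustrative sketch only: it assumes a seepage lake with no stream inflow or outflow, and the symbols and numeric values below are hypothetical, not taken from the Florida study.

```python
def groundwater_inflow(P, E, dP, dE, dG, dL):
    """Steady-state isotope mass balance for a seepage lake (illustrative).

    Water balance:   P + G_in = E + G_out
    Isotope balance: P*dP + G_in*dG = E*dE + G_out*dL
    Eliminating G_out and solving for G_in gives the expression below.
    All fluxes share one unit (e.g. inches/year); the d* values are
    per-mil isotope ratios of precipitation (dP), lake evaporate (dE),
    ground water (dG), and lake water (dL).
    """
    return (E * (dE - dL) + P * (dL - dP)) / (dG - dL)

# Hypothetical inputs: precipitation 50 in/yr, evaporation 55 in/yr,
# delta-18O of rain and ground water -5 permil, lake water 0 permil,
# evaporate -12 permil.
G_in = groundwater_inflow(P=50.0, E=55.0, dP=-5.0, dE=-12.0, dG=-5.0, dL=0.0)
print(round(G_in, 1))  # 82.0
```

    Note that as dG approaches dL (ground water isotopically similar to lake water) the denominator shrinks, which is consistent with the report's observation that uncertainty is largest for lakes with low ground-water inflow.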

  2. 12 CFR 3.63 - Disclosures by national banks or Federal savings associations described in § 3.61.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... tier 1 capital, tier 2 capital, tier 1 and total capital ratios, including the regulatory capital elements and all the regulatory adjustments and deductions needed to calculate the numerator of such ratios... to calculate total risk-weighted assets; (3) Regulatory capital ratios during any transition periods...

  3. 33 CFR 154.1135 - Response plan development and evaluation criteria.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Operating in Prince William Sound, Alaska § 154.1135 Response plan development and evaluation criteria. The following response times must be used in determining the on scene arrival time in Prince William Sound for the response resources required by § 154.1045: Tier 1 (hrs.) Tier 2 (hrs.) Tier 3 (hrs.) Prince...

  4. To What Interventions Are Students Responding?

    ERIC Educational Resources Information Center

    Lipson, Marjorie Y.; Wixson, Karen K.

    2012-01-01

    Intervention is a central tenet of the various (multitiered) approaches used to implement Response to Intervention (RTI). It appears in Tier 1 core instruction in the form of differentiation, in Tier 2 in the form of supplemental small groups, and in Tier 3 and 4 instruction in the form of more intensive, often individualized support from…

  5. Using Brief Experimental Analysis to Intensify Tier 3 Reading Interventions

    ERIC Educational Resources Information Center

    Coolong-Chaffin, Melissa; Wagner, Dana

    2015-01-01

    As implementation of multi-tiered systems of support becomes common practice across the nation, practitioners continue to need strategies for intensifying interventions and supports for the subset of students who fail to make adequate progress despite strong programs at Tiers 1 and 2. Experts recommend making several changes to the structure and…

  6. Examining Proportional Representation of Ethnic Groups within the SWPBIS Model

    ERIC Educational Resources Information Center

    Jewell, Kelly

    2012-01-01

    The quantitative study seeks to analyze if School-wide Positive Behavior Intervention and Support (SWPBIS) model reduces the likelihood that minority students will receive more individualized supports due to behavior problems. In theory, the SWPBIS model should reflect a 3-tier system with tier 1 representing approximately 80%, tier 2 representing…

  7. A self-configuring control system for storage and computing departments at INFN-CNAF Tierl

    NASA Astrophysics Data System (ADS)

    Gregori, Daniele; Dal Pra, Stefano; Ricci, Pier Paolo; Pezzi, Michele; Prosperini, Andrea; Sapunenko, Vladimir

    2015-05-01

    The storage and farming departments at the INFN-CNAF Tier1[1] manage thousands of computing nodes and several hundred servers that provide access to the disk and tape storage. In particular, the storage servers must provide the following services: efficient access to about 15 petabytes of disk space organized in several GPFS file system clusters, data transfers between LHC Tier sites (Tier0, Tier1 and Tier2) via a GridFTP cluster and the Xrootd protocol, and read and write operations on the magnetic tape backend. An essential requirement for a reliable service is a control system that warns when problems arise and can perform automatic recovery operations in case of service interruptions or major failures. Moreover, configurations change during daily operations: the roles of GPFS cluster nodes can be modified, so obsolete nodes must be removed from the production control system and new servers added to those already present. Managing all these changes by hand is difficult when the changes are numerous; it is also time-consuming and prone to human error and misconfiguration. For these reasons we have developed a control system that reconfigures itself whenever such a change occurs. This system has been in production for about a year at the INFN-CNAF Tier1 with good results and hardly any major drawbacks. The system rests on three key elements. The first is a software configuration service (e.g. Quattor or Puppet) for the server machines to be monitored; this service must ensure the presence of the appropriate sensors and custom scripts on the monitored nodes and must be able to install and update software packages on them. 
The second key element is a database holding, in a suitable format, information on all the machines in production and able to provide the principal attributes of each: the type of hardware, the network switch to which the machine is connected, whether the machine is physical or virtual, the hypervisor it belongs to if virtual, and so on. The last key element is the control system software itself (in our implementation, Nagios), capable of assessing the status of servers and services, attempting to restore the working state, restarting or inhibiting software services, and sending suitable alarm messages to the site administrators. These three elements are integrated through custom scripts that implement the self-configuration logic; the complete design is discussed in depth in this paper.
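    The core of the self-configuration loop is mechanical: read the machine database and regenerate monitoring configuration for whatever is currently in production. A minimal sketch, with a hypothetical inventory schema, role-to-check mapping, and host names (none of this is the actual CNAF/DOCET implementation):

```python
# Minimal sketch of the self-configuration step: read the machine database
# (a plain list of dicts stands in for the real inventory) and emit Nagios
# object definitions for every host currently in production. Obsolete nodes
# simply disappear from the generated config on the next run.
inventory = [
    {"host": "gpfs-nsd-01", "role": "gpfs", "in_production": True},
    {"host": "gridftp-02",  "role": "gridftp", "in_production": True},
    {"host": "old-node-99", "role": "gpfs", "in_production": False},
]

# Hypothetical mapping from machine role to the Nagios check to run.
ROLE_CHECKS = {"gpfs": "check_gpfs_mount", "gridftp": "check_gridftp"}

def render_nagios_config(machines):
    blocks = []
    for m in machines:
        if not m["in_production"]:
            continue  # decommissioned nodes drop out automatically
        blocks.append(
            "define host {\n"
            f"    host_name {m['host']}\n"
            "    use       generic-host\n"
            "}\n"
            "define service {\n"
            f"    host_name           {m['host']}\n"
            f"    check_command       {ROLE_CHECKS[m['role']]}\n"
            "    use                 generic-service\n"
            "}\n"
        )
    return "\n".join(blocks)

print(render_nagios_config(inventory))
```

    In the real system the inventory query, sensor deployment (via the configurator service), and a Nagios reload would be chained together; the sketch shows only the decisional step of mapping inventory state to monitoring configuration.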

  8. Ligand accessibility to the HIV-1 Env co-receptor binding site can occur prior to CD4 engagement and is independent of viral tier category.

    PubMed

    Boliar, Saikat; Patil, Shilpa; Shukla, Brihaspati N; Ghobbeh, Ali; Deshpande, Suprit; Chen, Weizao; Guenaga, Javier; Dimitrov, Dimiter S; Wyatt, Richard T; Chakrabarti, Bimal K

    2018-06-01

    HIV-1 virus entry into target cells requires the envelope glycoprotein (Env) to first bind the primary receptor, CD4, and subsequently the co-receptor. Antibody access to the co-receptor binding site (CoRbs) in the pre-receptor-engaged state, prior to cell attachment, remains poorly understood. Here, we have demonstrated that for tier-1 Envs, the CoRbs is directly accessible to full-length CD4-induced (CD4i) antibodies even before primary receptor engagement, indicating that on these Envs the CoRbs is either preformed or can conformationally sample the post-CD4-bound state. Tier-2 and tier-3 Envs, which are resistant to full-length CD4i antibodies, are neutralized by m36.4, a lower-molecular-mass CD4i-directed domain antibody. In some tier-2 and tier-3 Envs, the CoRbs is accessible to m36.4 even prior to cellular attachment, in an Env-specific manner independent of tier category. These data suggest differential structural arrangements of the CoRbs and varied masking of ligand access to the CoRbs in different Env isolates. Copyright © 2018 Elsevier Inc. All rights reserved.

  9. Oklahoma Center for High Energy Physics (OCHEP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nandi, S; Strauss, M J; Snow, J

    2012-02-29

    The DOE EPSCoR implementation grant, with support from the State of Oklahoma and from three universities, Oklahoma State University, University of Oklahoma and Langston University, resulted in the establishment of the Oklahoma Center for High Energy Physics (OCHEP) in 2004. Currently, OCHEP continues to flourish as a vibrant hub for research in experimental and theoretical particle physics and an educational center in the State of Oklahoma. All goals of the original proposal were successfully accomplished. These include the foundation of a new experimental particle physics group at OSU, the establishment of a Tier 2 computing facility at OU for Large Hadron Collider (LHC) and Tevatron data analysis, and the organization of a vital particle physics research center in Oklahoma based on the resources of the three universities. OSU has hired two tenure-track faculty members with initial support from the grant funds; both positions are now supported through the OSU budget. This new HEP experimental group at OSU has established itself as a full member of the Fermilab D0 Collaboration and the LHC ATLAS experiment and has secured external funds from the DOE and the NSF. These funds currently support 2 graduate students, 1 postdoctoral fellow, and 1 part-time engineer. The grant initiated the creation of a Tier 2 computing facility at OU as part of the Southwest Tier 2 facility, and a permanent Research Scientist was hired at OU to maintain and run the facility. Permanent support for this position has now been provided through the OU university budget. OCHEP represents a successful model of cooperation among several universities, establishing a critical mass of manpower, computing, and hardware resources. This has increased Oklahoma's impact in all areas of HEP: theory, experiment, and computation. 
The Center personnel are involved in cutting-edge research in experimental, theoretical, and computational aspects of High Energy Physics, with research areas ranging from the search for new phenomena at the Fermilab Tevatron and the CERN Large Hadron Collider to theoretical modeling, computer simulation, detector development and testing, and physics analysis. OCHEP faculty members participating in the D0 collaboration at the Fermilab Tevatron and the ATLAS collaboration at the CERN LHC have had a major impact on the Standard Model (SM) Higgs boson search, top quark studies, B physics studies, and measurements of Quantum Chromodynamics (QCD) phenomena. The OCHEP Grid computing facility consists of a large computer cluster which plays a major role in data analysis and Monte Carlo production for both the D0 and ATLAS experiments. Theoretical efforts are devoted to new ideas in Higgs boson physics, extra dimensions, neutrino masses and oscillations, Grand Unified Theories, supersymmetric models, dark matter, and nonperturbative quantum field theory. Theory members are making major contributions to the understanding of phenomena being explored at the Tevatron and the LHC. They have proposed new models for Higgs bosons and have suggested new signals for extra dimensions and for the search for supersymmetric particles. During the seven-year period when OCHEP was partially funded through the DOE EPSCoR implementation grant, OCHEP members published over 500 refereed journal articles and made over 200 invited presentations at major conferences. The Center is also involved in education and outreach activities, offering summer research programs for high school teachers and college students and organizing summer workshops for high school teachers, sometimes coordinating with the Quarknet programs at OSU and OU. Details of the Center can be found at http://ochep.phy.okstate.edu.

  10. Impacts of nutrients and pesticides from small- and large-scale agriculture on the water quality of Lake Ziway, Ethiopia.

    PubMed

    Teklu, Berhan M; Hailu, Amare; Wiegant, Daniel A; Scholten, Bernice S; Van den Brink, Paul J

    2018-05-01

    The area around Lake Ziway in Ethiopia is going through a major agricultural transformation with both small-scale farmers and large horticultural companies using pesticides and fertilisers at an increased rate. To be able to understand how this influences the water quality of Lake Ziway, water quality data was gathered to study the dynamics of pesticide concentrations and physicochemical parameters for the years from 2009 to 2015. Results indicate that for some physicochemical parameters, including pH, potassium and iron, over 50 % of the values were above the maximum permissible limit of the Ethiopian standard for drinking water. The fungicide spiroxamine poses a high chronic risk when the water is used for drinking water, while the estimated intake of diazinon was approximately 50 % of the acceptable daily intake. Higher-tier risk assessment indicated that the fungicide spiroxamine poses a high acute risk to aquatic organisms, while possible acute risks were indicated for the insecticides deltamethrin and endosulfan. Longer-term monitoring needs to be established to show the water quality changes across time and space, and the current study can be used as a baseline measurement for further research in the area as well as an example for other surface water systems in Ethiopia and Africa.

  11. The 2016 ACCP Pharmacotherapy Didactic Curriculum Toolkit.

    PubMed

    Schwinghammer, Terry L; Crannage, Andrew J; Boyce, Eric G; Bradley, Bridget; Christensen, Alyssa; Dunnenberger, Henry M; Fravel, Michelle; Gurgle, Holly; Hammond, Drayton A; Kwon, Jennifer; Slain, Douglas; Wargo, Kurt A

    2016-11-01

    The 2016 American College of Clinical Pharmacy (ACCP) Educational Affairs Committee was charged with updating and contemporizing ACCP's 2009 Pharmacotherapy Didactic Curriculum Toolkit. The toolkit has been designed to guide schools and colleges of pharmacy in developing, maintaining, and modifying their curricula. The 2016 committee reviewed the recent medical literature and other documents to identify disease states that are responsive to drug therapy. Diseases and content topics were organized by organ system, when feasible, and grouped into tiers as defined by practice competency. Tier 1 topics should be taught in a manner that prepares all students to provide collaborative, patient-centered care upon graduation and licensure. Tier 2 topics are generally taught in the professional curriculum, but students may require additional knowledge or skills after graduation (e.g., residency training) to achieve competency in providing direct patient care. Tier 3 topics may not be taught in the professional curriculum; thus, graduates will be required to obtain the necessary knowledge and skills on their own to provide direct patient care, if required in their practice. The 2016 toolkit contains 276 diseases and content topics, of which 87 (32%) are categorized as tier 1, 133 (48%) as tier 2, and 56 (20%) as tier 3. The large number of tier 1 topics will require schools and colleges to use creative pedagogical strategies to achieve the necessary practice competencies. Almost half of the topics (48%) are tier 2, highlighting the importance of postgraduate residency training or equivalent practice experience to competently care for patients with these disorders. The Pharmacotherapy Didactic Curriculum Toolkit will continue to be updated to provide guidance to faculty at schools and colleges of pharmacy as these academic pharmacy institutions regularly evaluate and modify their curricula to keep abreast of scientific advances and associated practice changes. 
Access the current Pharmacotherapy Didactic Curriculum Toolkit at http://www.accp.com/docs/positions/misc/Toolkit_final.pdf. © 2016 Pharmacotherapy Publications, Inc.

  12. Querying and Computing with BioCyc Databases

    PubMed Central

    Krummenacker, Markus; Paley, Suzanne; Mueller, Lukas; Yan, Thomas; Karp, Peter D.

    2006-01-01

    Summary: We describe multiple methods for accessing and querying the complex and integrated cellular data in the BioCyc family of databases: access through multiple file formats, access through Application Program Interfaces (APIs) for LISP, Perl and Java, and SQL access through the BioWarehouse relational database. Availability: The Pathway Tools software and 20 BioCyc DBs in Tiers 1 and 2 are freely available to academic users; fees apply to some types of commercial use. For download instructions see http://BioCyc.org/download.shtml PMID:15961440

  13. Implications of limiting mechanical thrombectomy to patients with emergent large vessel occlusion meeting top tier evidence criteria.

    PubMed

    Bhole, Rohini; Goyal, Nitin; Nearing, Katherine; Belayev, Andrey; Doss, Vinodh T; Elijovich, Lucas; Hoit, Daniel A; Tsivgoulis, Georgios; Alexandrov, Andrei V; Arthur, Adam S; Alexandrov, Anne W

    2017-03-01

    Recent guidelines for endovascular management of emergent large vessel occlusion (ELVO) award top-tier evidence to the selective enrollment criteria used in recent trials. We aimed to understand how guideline adherence would have impacted treatment numbers and outcomes in a cohort of patients from a comprehensive stroke center. A retrospective observational study was conducted using consecutive emergent endovascular patients. Mechanical thrombectomy (MT) was performed with stent retrievers or large bore clot aspiration catheters. Procedural outcomes were compared between patients meeting, and those failing to meet, top tier evidence criteria. 126 patients receiving MT from January 2012 to June 2015 were included (age 31-89 years, National Institutes of Health Stroke Scale (NIHSS) score 2-38); 62 (49%) patients would have been excluded if top tier criteria were upheld: pretreatment NIHSS score <6 (10%), Alberta Stroke Program Early CT score <6 (6.5%), premorbid modified Rankin Scale (mRS) score ≥2 (27%), M2 occlusion (10%), posterior circulation (32%), symptom to groin puncture >360 min (58%). 26 (42%) subjects had more than one top tier exclusion. Symptomatic intracerebral hemorrhage (sICH) and systemic hemorrhage rates were similar between the groups. 3 month mortality was 45% in those lacking top tier evidence compared with 26% (p=0.044), and 3 month mRS score 0-2 was 33% versus 46%, respectively (NS). After adjusting for potential confounders, top tier treatment was not associated with neurological improvement during hospitalization (β -8.2; 95% CI -24.6 to -8.2; p=0.321), 3 month mortality (OR=0.38; 95% CI 0.08 to 1.41), or 3 month favorable mRS (OR=0.97; 95% CI 0.28 to 3.35). Our study showed that with strict adherence to top tier evidence criteria, half of patients may not be considered for MT. 
Our data indicate no increased risk of sICH and a potentially higher mortality that is largely due to treatment of patients with basilar occlusions and those treated at an extended time window. Despite this, good functional recovery is possible, and consideration of MT in patients not meeting top tier evidence criteria may be warranted. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  14. Computational Fluid Dynamics simulations of the Late Pleistocene Lake Bonneville Flood

    NASA Astrophysics Data System (ADS)

    Abril-Hernández, José M.; Periáñez, Raúl; O'Connor, Jim E.; Garcia-Castellanos, Daniel

    2018-06-01

    At approximately 18.0 ka, pluvial Lake Bonneville reached its maximum level. At its northeastern extent it was impounded by alluvium of the Marsh Creek Fan, which breached at some point north of Red Rock Pass (Idaho), leading to one of the largest floods on Earth. About 5320 km³ of water was discharged into the Snake River drainage and ultimately into the Columbia River. We use a 0D model and a 2D non-linear depth-averaged hydrodynamic model to aid understanding of outflow dynamics, specifically evaluating controls on the amount of water exiting the Lake Bonneville basin exerted by the Red Rock Pass outlet lithology and geometry as well as those imposed by the internal lake geometry of the Bonneville basin. These models are based on field evidence of prominent lake levels, hypsometry and terrain elevations corrected for post-flood isostatic deformation of the lake basin, as well as reconstructions of the topography at the outlet for both the initial and final stages of the flood. Internal flow dynamics in the northern Lake Bonneville basin during the flood were affected by the narrow passages separating the Cache Valley from the main body of Lake Bonneville. This constriction imposed a water-level drop of up to 2.7 m at the time of peak-flow conditions and likely reduced the peak discharge at the lake outlet by about 6%. The modeled peak outlet flow is 0.85 × 10⁶ m³ s⁻¹. Energy balance calculations give an estimate for the erodibility coefficient for the alluvial Marsh Creek divide of ∼0.005 m yr⁻¹ Pa⁻¹·⁵, at least two orders of magnitude greater than for the underlying bedrock at the outlet. 
Computing quasi steady-state water flows, water elevations, water currents and shear stresses as a function of the water-level drop in the lake and for the sequential stages of erosion in the outlet gives estimates of the incision rates and an estimate of the outflow hydrograph during the Bonneville Flood: about 18 days would have been required for the outflow to grow from 10% to 100% of its peak value. At the time of peak flow, about 10% of the lake volume would have already exited, eroding about 1 km³ of alluvium from the outlet, and the lake level would have dropped by about 10.6 m.
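    The units of the erodibility coefficient (m yr⁻¹ Pa⁻¹·⁵) imply a shear-stress incision law of the form E = k_d · τ^1.5. As a worked example of how such a coefficient converts a modeled bed shear stress into an incision rate, here is a minimal sketch; the shear-stress value is an assumption for illustration, not a result from the paper:

```python
def incision_rate(k_d, tau, exponent=1.5):
    """Shear-stress incision law E = k_d * tau**exponent.

    k_d in m yr^-1 Pa^-exponent, tau (bed shear stress) in Pa;
    returns an incision rate in m/yr. Illustrative only, chosen to be
    dimensionally consistent with the erodibility coefficient above.
    """
    return k_d * tau ** exponent

# With the alluvium erodibility ~0.005 m yr^-1 Pa^-1.5 and an assumed
# bed shear stress of 100 Pa:
print(round(incision_rate(0.005, 100.0), 6))  # 5.0
```

    With the bedrock coefficient two orders of magnitude smaller, the same shear stress would incise at roughly 0.05 m/yr, which is why outlet erosion effectively stopped once the flood cut down to bedrock.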

  15. Alternate assembly sequence databook for the Tier 2 Bus-1 option of the International Space Station

    NASA Technical Reports Server (NTRS)

    Brewer, L. M.; Cirillo, W. M.; Cruz, J. N.; Hall, J. B.; Troutman, P. A.; Monell, D. W.; Garn, M. A.; Heck, M. L.; Kumar, R. R.; Llewellyn, C. P.

    1995-01-01

    The JSC International Space Station program office requested that SSB prepare a databook documenting the alternate space station assembly sequence known as Tier 2, which assumes that Russian participation has been eliminated and that the functions that were to be supplied by the Russians (propulsion, resupply, initial attitude control, communications, etc.) are now supplied by the U.S. Tier 2 utilizes the Lockheed Bus-1 to replace much of the missing Russian functionality. The space station at each stage of its buildup during the Tier 2 assembly sequence is characterized in terms of properties, functionality, resource balances, operations, logistics, attitude control, microgravity environment and propellant usage. The assembly sequence as analyzed was defined by JSC as a first iteration, with subsequent iterations required to address some of the issues that the analysis in this databook identified. Several significant issues were identified, including: less than desirable orbit lifetimes, shortage of EVA, large flight attitudes, poor microgravity environments, and reboost propellant shortages. Many of these issues can be resolved, but at the cost of possible baseline modifications and revisions in the proposed Tier 2 assembly sequence.

  16. Data sharing as a national quality improvement program: reporting on BRCA1 and BRCA2 variant-interpretation comparisons through the Canadian Open Genetics Repository (COGR).

    PubMed

    Lebo, Matthew S; Zakoor, Kathleen-Rose; Chun, Kathy; Speevak, Marsha D; Waye, John S; McCready, Elizabeth; Parboosingh, Jillian S; Lamont, Ryan E; Feilotter, Harriet; Bosdet, Ian; Tucker, Tracy; Young, Sean; Karsan, Aly; Charames, George S; Agatep, Ronald; Spriggs, Elizabeth L; Chisholm, Caitlin; Vasli, Nasim; Daoud, Hussein; Jarinova, Olga; Tomaszewski, Robert; Hume, Stacey; Taylor, Sherryl; Akbari, Mohammad R; Lerner-Ellis, Jordan

    2018-03-01

    Purpose: The purpose of this study was to develop a national program for Canadian diagnostic laboratories to compare DNA-variant interpretations and resolve discordant-variant classifications using the BRCA1 and BRCA2 genes as a case study. Methods: BRCA1 and BRCA2 variant data were uploaded and shared through the Canadian Open Genetics Repository (COGR; http://www.opengenetics.ca). A total of 5,554 variant observations were submitted; classification differences were identified and comparison reports were sent to participating laboratories. Each site had the opportunity to reclassify variants. The data were analyzed before and after the comparison report process to track concordant- or discordant-variant classifications by three different models. Results: Variant-discordance rates varied by classification model: 38.9% of variants were discordant when using a five-tier model, 26.7% with a three-tier model, and 5.0% with a two-tier model. After the comparison report process, the proportion of discordant variants dropped to 30.7% with the five-tier model, to 14.2% with the three-tier model, and to 0.9% using the two-tier model. Conclusion: We present a Canadian interinstitutional quality improvement program for DNA-variant interpretations. Sharing of variant knowledge by clinical diagnostic laboratories will allow clinicians and patients to make more informed decisions and lead to better patient outcomes.
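    The dependence of the discordance rate on the classification model can be illustrated with a toy computation: collapse a five-tier ACMG-style scale into coarser scales and count the pairs of laboratory calls that still disagree. The category names, mappings, and sample pairs below are illustrative assumptions, not the COGR definitions or data:

```python
# Collapse maps: five-tier -> three-tier -> two-tier (illustrative).
FIVE_TO_THREE = {
    "benign": "benign", "likely benign": "benign",
    "vus": "vus",
    "likely pathogenic": "pathogenic", "pathogenic": "pathogenic",
}
THREE_TO_TWO = {"benign": "not actionable", "vus": "not actionable",
                "pathogenic": "actionable"}

def discordance(pairs, mapping=None):
    """Fraction of (lab1, lab2) classification pairs that disagree,
    optionally after collapsing categories through `mapping`."""
    remap = (lambda c: mapping[c]) if mapping else (lambda c: c)
    disagreements = sum(remap(a) != remap(b) for a, b in pairs)
    return disagreements / len(pairs)

# Hypothetical calls for four variants from two labs.
pairs = [("pathogenic", "likely pathogenic"),
         ("vus", "likely benign"),
         ("benign", "benign"),
         ("likely benign", "benign")]

print(discordance(pairs))                 # five-tier: 0.75
print(discordance(pairs, FIVE_TO_THREE))  # three-tier: 0.25
FIVE_TO_TWO = {k: THREE_TO_TWO[v] for k, v in FIVE_TO_THREE.items()}
print(discordance(pairs, FIVE_TO_TWO))    # two-tier: 0.0
```

    As in the study, most five-tier disagreements are between adjacent categories, so they vanish as the scale is collapsed; only pathogenic-versus-benign splits survive into the two-tier model.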

  17. Stability and Scalability of the CMS Global Pool: Pushing HTCondor and GlideinWMS to New Limits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balcas, J.; Bockelman, B.; Hufnagel, D.

    The CMS Global Pool, based on HTCondor and glideinWMS, is the main computing resource provisioning system for all CMS workflows, including analysis, Monte Carlo production, and detector data reprocessing activities. The total resources at Tier-1 and Tier-2 grid sites pledged to CMS exceed 100,000 CPU cores, while another 50,000 to 100,000 CPU cores are available opportunistically, pushing the needs of the Global Pool to higher scales each year. These resources are becoming more diverse in their accessibility and configuration over time. Furthermore, the challenge of stably running at higher and higher scales while introducing new modes of operation such as multi-core pilots, as well as the chaotic nature of physics analysis workflows, places huge strains on the submission infrastructure. This paper details some of the most important challenges to scalability and stability that the CMS Global Pool has faced since the beginning of the LHC Run II and how they were overcome.

  18. Stability and scalability of the CMS Global Pool: Pushing HTCondor and glideinWMS to new limits

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Bockelman, B.; Hufnagel, D.; Hurtado Anampa, K.; Aftab Khan, F.; Larson, K.; Letts, J.; Marra da Silva, J.; Mascheroni, M.; Mason, D.; Perez-Calero Yzquierdo, A.; Tiradani, A.

    2017-10-01

    The CMS Global Pool, based on HTCondor and glideinWMS, is the main computing resource provisioning system for all CMS workflows, including analysis, Monte Carlo production, and detector data reprocessing activities. The total resources at Tier-1 and Tier-2 grid sites pledged to CMS exceed 100,000 CPU cores, while another 50,000 to 100,000 CPU cores are available opportunistically, pushing the needs of the Global Pool to higher scales each year. These resources are becoming more diverse in their accessibility and configuration over time. Furthermore, the challenge of stably running at higher and higher scales while introducing new modes of operation such as multi-core pilots, as well as the chaotic nature of physics analysis workflows, places huge strains on the submission infrastructure. This paper details some of the most important challenges to scalability and stability that the CMS Global Pool has faced since the beginning of the LHC Run II and how they were overcome.

  19. Transforming Big Data into cancer-relevant insight: An initial, multi-tier approach to assess reproducibility and relevance

    PubMed Central

    2016-01-01

    The Cancer Target Discovery and Development (CTD2) Network was established to accelerate the transformation of “Big Data” into novel pharmacological targets, lead compounds, and biomarkers for rapid translation into improved patient outcomes. It rapidly became clear in this collaborative network that a key central issue was to define what constitutes sufficient computational or experimental evidence to support a biologically or clinically relevant finding. This manuscript represents a first attempt to delineate the challenges of supporting and confirming discoveries arising from the systematic analysis of large-scale data resources in a collaborative work environment and to provide a framework that would begin a community discussion to resolve these challenges. The Network implemented a multi-Tier framework designed to substantiate the biological and biomedical relevance as well as the reproducibility of data and insights resulting from its collaborative activities. The same approach can be used by the broad scientific community to drive development of novel therapeutic and biomarker strategies for cancer. PMID:27401613

  20. The successively temporal error concealment algorithm using error-adaptive block matching principle

    NASA Astrophysics Data System (ADS)

    Lee, Yu-Hsuan; Wu, Tsai-Hsing; Chen, Chao-Chyun

    2014-09-01

    Generally, temporal error concealment (TEC) adopts the blocks around a corrupted block (CB) as the search pattern to find the best-match block in the previous frame. Once a CB is recovered, it is referred to as a recovered block (RB). Although an RB can serve as part of the search pattern for finding the best-match block of another CB, an RB is not identical to its original block (OB). The error between an RB and its OB limits the performance of TEC. The successively temporal error concealment (STEC) algorithm is proposed to alleviate this error. The STEC procedure consists of tier-1 and tier-2. Tier-1 divides a corrupted macroblock into four corrupted 8 × 8 blocks and generates a recovering order for them. The corrupted 8 × 8 block in first place of the recovering order is recovered in tier-1; the remaining 8 × 8 CBs are recovered in tier-2 along the recovering order. In tier-2, the error-adaptive block matching principle (EA-BMP) is proposed to use RBs as the search pattern for recovering the remaining corrupted 8 × 8 blocks. The proposed STEC outperforms sophisticated TEC algorithms by at least 0.3 dB in average PSNR at a packet error rate of 20%.
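    The basic operation underlying all of these TEC variants is block matching: slide a search pattern over a window in the previous frame and keep the offset with the lowest matching cost. A minimal sketch using a sum-of-absolute-differences (SAD) cost follows; the frame sizes, search radius, and full-search strategy are illustrative assumptions, not the specific EA-BMP weighting:

```python
import numpy as np

def best_match(prev_frame, pattern, top, left, radius=4):
    """Full search for the (dy, dx) offset in prev_frame whose window
    minimizes the SAD against `pattern` (anchored at (top, left))."""
    h, w = pattern.shape
    best, best_sad = (0, 0), float("inf")
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > prev_frame.shape[0] \
                    or x + w > prev_frame.shape[1]:
                continue  # candidate window falls outside the frame
            window = prev_frame[y:y + h, x:x + w].astype(int)
            sad = np.abs(window - pattern.astype(int)).sum()
            if sad < best_sad:
                best_sad, best = sad, (dy, dx)
    return best, best_sad

rng = np.random.default_rng(0)
prev = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
# Take the 8x8 block at (10, 12) as the pattern; the search should
# recover the zero offset with zero residual.
pattern = prev[10:18, 12:20]
offset, sad = best_match(prev, pattern, top=10, left=12)
print(offset, sad)  # (0, 0) 0
```

    In an STEC-like scheme, `pattern` would be assembled from neighboring intact blocks and previously recovered RBs, with EA-BMP down-weighting the less reliable RB pixels in the cost.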

  1. Tier 2 Team Processes and Decision-Making in a Comprehensive Three-Tiered Model

    ERIC Educational Resources Information Center

    Pool, Juli L.; Carter, Deborah Russell; Johnson, Evelyn S.

    2013-01-01

    Three-tiered models of academic and behavioral support are being increasingly adopted across the nation, and with that adoption has come an increasing message that designing and implementing effective practices alone is not enough. Systems are needed to help staff to collectively implement best practices. These systems, as well as effective…

  2. Differentiating by Readiness: Strategies and Lesson Plans for Tiered Instruction, Grades K-8

    ERIC Educational Resources Information Center

    Turville, Joni; Allen, Linda; Nickelsen, LeAnn

    2010-01-01

    This book provides a comprehensive introduction to tiering plus step-by-step instructions for using it in your classroom. Also included are 23 ready-to-apply blackline masters, which provide helpful ideas for activities and classroom management. Contents include: (1) Building the foundation: What is tiering in differentiated instruction?; (2) The…

  3. 76 FR 68642 - Fisheries of the Northeastern United States; Atlantic Mackerel, Squid, and Butterfish Fisheries...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-11-07

    ...: Copies of supporting documents used by the Mid-Atlantic Fishery Management Council (Council), including..., Tier 1 and Tier 2 vessel owners are required to obtain a fish hold capacity measurement from a certified marine surveyor. The hold capacity measurement submitted at the time of application for a Tier 1...

  4. Exploring the Relationship between Cognitive Characteristics and Responsiveness to a Tier 3 Reading Fluency Intervention

    ERIC Educational Resources Information Center

    Field, Stacey Allyson

    2015-01-01

    Current research suggests that certain cognitive functions predict the likelihood of intervention response for students who receive Tier 2 instruction through an RTI-framework. However, less is known about cognitive predictors of responder status at a theoretically more critical point of divergence within the RTI model: Tier 3. Moreover, no…

  5. 2 CFR 180.425 - When do I check to see if a person is excluded or disqualified?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... excluded or disqualified? 180.425 Section 180.425 Grants and Agreements OFFICE OF MANAGEMENT AND BUDGET... tier covered transaction; (b) Approve a principal in a primary tier covered transaction; (c) Approve a...) Approve a principal in connection with a lower tier transaction if your agency's approval of the principal...

  6. Intensive Reading Interventions for Inadequate Responders in Grades K-3: A Synthesis

    ERIC Educational Resources Information Center

    Austin, Christy R.; Vaughn, Sharon; McClelland, Amanda M.

    2017-01-01

    A subset of students fail to respond adequately to reading interventions. This synthesis systematically reviews studies in which students in grades K-3 responded inadequately to a Tier 2 reading intervention and were provided with a Tier 3 intervention. Descriptions of the Tier 3 reading interventions and effects are provided. To meet inclusion…

  7. 40 CFR 89.204 - Averaging.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Provisions § 89.204 Averaging. (a) Requirements for Tier 1 engines rated at or above 37 kW. A manufacturer... credits obtained through trading. (b) Requirements for Tier 2 and later engines rated at or above 37 kW and Tier 1 and later engines rated under 37 kW. A manufacturer may use averaging to offset an emission...

  8. The effect of incentive-based formularies on prescription-drug utilization and spending.

    PubMed

    Huskamp, Haiden A; Deverka, Patricia A; Epstein, Arnold M; Epstein, Robert S; McGuigan, Kimberly A; Frank, Richard G

    2003-12-04

    Many employers and health plans have adopted incentive-based formularies in an attempt to control prescription-drug costs. We used claims data to compare the utilization of and spending on drugs in two employer-sponsored health plans that implemented changes in formulary administration with those in comparison groups of enrollees covered by the same insurers. One plan simultaneously switched from a one-tier to a three-tier formulary and increased all enrollee copayments for medications. The second switched from a two-tier to a three-tier formulary, changing only the copayments for tier-3 drugs. We examined the utilization of angiotensin-converting-enzyme (ACE) inhibitors, proton-pump inhibitors, and 3-hydroxy-3-methylglutaryl coenzyme A reductase inhibitors (statins). Enrollees covered by the employer that implemented more dramatic changes experienced slower growth than the comparison group in the probability of the use of a drug and a major shift in spending from the plan to the enrollee. Among the enrollees who were initially taking tier-3 statins, more enrollees in the intervention group than in the comparison group switched to tier-1 or tier-2 medications (49 percent vs. 17 percent, P<0.001) or stopped taking statins entirely (21 percent vs. 11 percent, P=0.04). Patterns were similar for ACE inhibitors and proton-pump inhibitors. The enrollees covered by the employer that implemented more moderate changes were more likely than the comparison enrollees to switch to tier-1 or tier-2 medications but not to stop taking a given class of medications altogether. Different changes in formulary administration may have dramatically different effects on utilization and spending and may in some instances lead enrollees to discontinue therapy. The associated changes in copayments can substantially alter out-of-pocket spending by enrollees, the continuation of the use of medications, and possibly the quality of care. Copyright 2003 Massachusetts Medical Society

  9. Estimating the volume and age of water stored in global lakes using a geo-statistical approach

    PubMed Central

    Messager, Mathis Loïc; Lehner, Bernhard; Grill, Günther; Nedeva, Irena; Schmitt, Oliver

    2016-01-01

    Lakes are key components of biogeochemical and ecological processes, thus knowledge about their distribution, volume and residence time is crucial in understanding their properties and interactions within the Earth system. However, global information is scarce and inconsistent across spatial scales and regions. Here we develop a geo-statistical model to estimate the volume of global lakes with a surface area of at least 10 ha based on the surrounding terrain information. Our spatially resolved database shows 1.42 million individual polygons of natural lakes with a total surface area of 2.67 × 106 km2 (1.8% of global land area), a total shoreline length of 7.2 × 106 km (about four times longer than the world's ocean coastline) and a total volume of 181.9 × 103 km3 (0.8% of total global non-frozen terrestrial water stocks). We also compute mean and median hydraulic residence times for all lakes to be 1,834 days and 456 days, respectively. PMID:27976671
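    The reported global aggregates imply a couple of derived quantities worth a back-of-envelope check (my own arithmetic, not computations from the paper):

```python
# Back-of-envelope arithmetic on the reported global lake aggregates
# (my own consistency check, not results from the paper).
volume_km3 = 181.9e3     # total lake volume reported
area_km2 = 2.67e6        # total lake surface area reported
shoreline_km = 7.2e6     # total shoreline length reported

mean_depth_m = volume_km3 / area_km2 * 1000   # km -> m; implied global mean depth
implied_coastline_km = shoreline_km / 4       # "about four times" the ocean coastline
```

    The implied mean lake depth is roughly 68 m and the implied ocean coastline roughly 1.8 million km, consistent with the "about four times longer" comparison in the abstract.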

  10. Business Use of Small Computers in the Salt Lake City, Utah Area.

    ERIC Educational Resources Information Center

    Homer, Michael M.

    In July 1981, Utah Technical College (UTC) conducted a survey of businesses in the Salt Lake City area to gather information for the development of a curriculum integrating computer applications with business course instruction. The survey sought to determine the status and usage of current micro/mini computer equipment, future data processing…

  11. Reclassification of serous ovarian carcinoma by a 2-tier system: a Gynecologic Oncology Group Study.

    PubMed

    Bodurka, Diane C; Deavers, Michael T; Tian, Chunqiao; Sun, Charlotte C; Malpica, Anais; Coleman, Robert L; Lu, Karen H; Sood, Anil K; Birrer, Michael J; Ozols, Robert; Baergen, Rebecca; Emerson, Robert E; Steinhoff, Margaret; Behmaram, Behnaz; Rasty, Golnar; Gershenson, David M

    2012-06-15

    A study was undertaken to use the 2-tier system to reclassify the grade of serous ovarian tumors previously classified using the International Federation of Gynecology and Obstetrics (FIGO) 3-tier system and determine the progression-free survival (PFS) and overall survival (OS) of patients treated on Gynecologic Oncology Group (GOG) Protocol 158. The authors retrospectively reviewed demographic, pathologic, and survival data of 290 patients with stage III serous ovarian carcinoma treated with surgery and chemotherapy on GOG Protocol 158, a cooperative multicenter group trial. A blinded pathology review was performed by a panel of 6 gynecologic pathologists to verify histology and regrade tumors using the 2-tier system. The association of tumor grade with PFS and OS was assessed. Of 241 cases, both systems demonstrated substantial agreement when combining FIGO grades 2 and 3 (overall agreement, 95%; kappa statistic, 0.68). By using the 2-tier system, patients with low-grade versus high-grade tumors had significantly longer PFS (45.0 vs 19.8 months, respectively; P = .01). By using FIGO criteria, median PFS for patients with grade 1, 2, and 3 tumors was 37.5, 19.8, and 20.1 months, respectively (P = .07). There was no difference in clinical outcome in patients with grade 2 or 3 tumors in multivariate analysis. Women with high-grade versus low-grade tumors demonstrated a significantly higher risk of death (hazard ratio, 2.43; 95% confidence interval, 1.17-5.04; P = .02). Women with high-grade and women with low-grade serous carcinoma of the ovary represent 2 distinct patient populations. Adoption of the 2-tier grading system provides a simple yet precise framework for predicting clinical outcomes. Copyright © 2011 American Cancer Society.

  12. Methods and apparatus for constructing and implementing a universal extension module for processing objects in a database

    NASA Technical Reports Server (NTRS)

    Li, Chung-Sheng (Inventor); Smith, John R. (Inventor); Chang, Yuan-Chi (Inventor); Jhingran, Anant D. (Inventor); Padmanabhan, Sriram K. (Inventor); Hsiao, Hui-I (Inventor); Choy, David Mun-Hien (Inventor); Lin, Jy-Jine James (Inventor); Fuh, Gene Y. C. (Inventor); Williams, Robin (Inventor)

    2004-01-01

    Methods and apparatus for providing a multi-tier object-relational database architecture are disclosed. In one illustrative embodiment of the present invention, a multi-tier database architecture comprises an object-relational database engine as a top tier, one or more domain-specific extension modules as a bottom tier, and one or more universal extension modules as a middle tier. The individual extension modules of the bottom tier operationally connect with the one or more universal extension modules which, themselves, operationally connect with the database engine. The domain-specific extension modules preferably provide such functions as search, index, and retrieval services of images, video, audio, time series, web pages, text, XML, spatial data, etc. The domain-specific extension modules may include one or more IBM DB2 extenders, Oracle data cartridges and/or Informix datablades, although other domain-specific extension modules may be used.
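    The three-tier idea described above can be sketched as a small dispatch layer (my own illustration of the architecture, not the patent's implementation): a universal middle-tier module exposes one generic interface upward to the engine and routes each request to the appropriate domain-specific bottom-tier module.

```python
# Rough sketch of the three-tier dispatch idea (my illustration, not the
# patent's implementation): the universal middle tier presents a uniform
# interface to the engine (top tier) and forwards to domain-specific
# extension modules (bottom tier).

class DomainExtension:
    """Bottom tier: domain-specific search/index/retrieval services."""
    def search(self, query):
        raise NotImplementedError

class ImageExtension(DomainExtension):
    def search(self, query):
        return f"image results for {query!r}"

class TextExtension(DomainExtension):
    def search(self, query):
        return f"text results for {query!r}"

class UniversalExtension:
    """Middle tier: single entry point the database engine calls."""
    def __init__(self):
        self._modules = {}

    def register(self, domain, module):
        self._modules[domain] = module

    def search(self, domain, query):
        # Route the engine's generic request to the matching bottom tier.
        return self._modules[domain].search(query)
```

    The design choice mirrors the abstract: the engine never binds to a specific datablade/cartridge/extender; only the middle tier knows which bottom-tier module serves which domain.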

  13. Experimental evaluation of atmospheric effects on radiometric measurements using the EREP of Skylab. [Salton Sea and Great Salt Lake

    NASA Technical Reports Server (NTRS)

    Chang, D. T. (Principal Investigator); Isaacs, R. G.

    1975-01-01

    The author has identified the following significant results. Test sites were located near the Great Salt Lake and the Salton Sea. Calculations were performed for a set of atmospheric models corresponding to the test sites, in addition to standard models for summer and winter midlatitude atmospheres with respective integrated water vapor amount of 2.4 g/sq cm and 0.9 g/sq cm. Each atmosphere was found to contain an average amount of continental aerosol. Computations were valid for high solar elevation angles. Atmospheric attenuation quantities were computed in addition to simulated EREP S192 radiances.

  14. Anatomy of a Security Operations Center

    NASA Technical Reports Server (NTRS)

    Wang, John

    2010-01-01

    Many agencies and corporations are either contemplating or in the process of building a cyber Security Operations Center (SOC). Those agencies that have established SOCs are most likely working on major revisions or enhancements to existing capabilities. As principal developers of the NASA SOC, the presenters' goal is to provide the GFIRST community with examples of some of the key building blocks of an Agency-scale cyber Security Operations Center. This presentation will include the inputs and outputs, the facilities or shell, as well as the internal components and the processes necessary to maintain the SOC's subsistence - in other words, the anatomy of a SOC. Details to be presented include the SOC architecture and its key components: Tier 1 Call Center, data entry, and incident triage; Tier 2 monitoring, incident handling and tracking; Tier 3 computer forensics, malware analysis, and reverse engineering; Incident Management System; Threat Management System; SOC Portal; Log Aggregation and Security Incident Management (SIM) systems; flow monitoring; IDS; etc. Specific processes and methodologies discussed include Incident States and associated Work Elements; the Incident Management Workflow Process; Cyber Threat Risk Assessment methodology; and Incident Taxonomy. The evolution of the cyber Security Operations Center will be discussed, starting from reactive and moving toward proactive. Finally, the resources necessary to establish an Agency-scale SOC, as well as the lessons learned in the process of standing up a SOC, will be presented.

  15. Documentation of a computer program to simulate lake-aquifer interaction using the MODFLOW ground water flow model and the MOC3D solute-transport model

    USGS Publications Warehouse

    Merritt, Michael L.; Konikow, Leonard F.

    2000-01-01

    Heads and flow patterns in surficial aquifers can be strongly influenced by the presence of stationary surface-water bodies (lakes) that are in direct contact, vertically and laterally, with the aquifer. Conversely, lake stages can be significantly affected by the volume of water that seeps through the lakebed that separates the lake from the aquifer. For these reasons, a set of computer subroutines called the Lake Package (LAK3) was developed to represent lake/aquifer interaction in numerical simulations using the U.S. Geological Survey three-dimensional, finite-difference, modular ground-water flow model MODFLOW and the U.S. Geological Survey three-dimensional method-of-characteristics solute-transport model MOC3D. In the Lake Package described in this report, a lake is represented as a volume of space within the model grid which consists of inactive cells extending downward from the upper surface of the grid. Active model grid cells bordering this space, representing the adjacent aquifer, exchange water with the lake at a rate determined by the relative heads and by conductances that are based on grid cell dimensions, hydraulic conductivities of the aquifer material, and user-specified leakance distributions that represent the resistance to flow through the material of the lakebed. Parts of the lake may become “dry” as upper layers of the model are dewatered, with a concomitant reduction in lake surface area, and may subsequently rewet when aquifer heads rise. An empirical approximation has been encoded to simulate the rewetting of a lake that becomes completely dry. The variations of lake stages are determined by independent water budgets computed for each lake in the model grid. This lake budget process makes the package a simulator of the response of lake stage to hydraulic stresses applied to the aquifer.
Implementation of a lake water budget requires input of parameters including those representing the rate of lake atmospheric recharge and evaporation, overland runoff, and the rate of any direct withdrawal from, or augmentation of, the lake volume. The lake/aquifer interaction may be simulated in both transient and steady-state flow conditions, and the user may specify that lake stages be computed explicitly, semi-implicitly, or fully-implicitly in transient simulations. The lakes, and all sources of water entering the lakes, may have solute concentrations associated with them for use in solute-transport simulations using MOC3D. The Stream Package of MODFLOW-2000 and MOC3D represents stream connections to lakes, either as inflows or outflows. Because lakes with irregular bathymetry can exist as separate pools of water at lower stages, that coalesce to become a single body of water at higher stages, logic was added to the Lake Package to allow the representation of this process as a user option. If this option is selected, a system of linked pools (sublakes) is identified in each time step and stages are equalized based on current relative sublake surface areas.
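    The per-lake water budget described above can be sketched as a simple explicit stage update (a deliberate simplification of the Lake Package idea; the real LAK3 code computes seepage cell by cell from conductances and handles drying, rewetting, and sublake coalescence):

```python
# Minimal sketch of an explicit lake-stage update from a water budget
# (a simplification of the Lake Package concept; the actual LAK3 code
# computes seepage per grid cell and supports semi- and fully-implicit
# stage solutions).

def update_stage(stage, area, dt, precip, evap, runoff, withdrawal,
                 aquifer_heads, conductances):
    """Advance lake stage one time step.

    stage, aquifer_heads in m; area in m^2; dt in s;
    precip/evap as rates in m/s over the lake surface;
    runoff/withdrawal in m^3/s; conductances in m^2/s,
    one per connected aquifer cell.
    """
    # Seepage is positive into the lake where aquifer head exceeds stage.
    seepage = sum(c * (h - stage) for h, c in zip(aquifer_heads, conductances))
    inflow = (precip - evap) * area + runoff - withdrawal + seepage  # m^3/s
    return stage + inflow * dt / area
```

    With balanced heads on either side of the lake the seepage terms cancel and the stage holds steady; a net head difference raises or lowers the stage in proportion to conductance, which is the budget behavior the report describes.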

  16. 76 FR 64825 - Approval and Promulgation of Air Quality Implementation Plans, Ohio and Indiana; Redesignation of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-19

    ... emissions inventories for primary PM 2.5 ,\\1\\ NO X , and sulfur dioxide (SO 2 ),\\2\\ documented in Ohio and... measures include the following. Tier 2 Emission Standards for Vehicles and Gasoline Sulfur Standards. These... Tier 2 standards included the...

  17. An analysis of potential water availability from the Atwood, Leesville, and Tappan Lakes in the Muskingum River Watershed, Ohio

    USGS Publications Warehouse

    Koltun, G.F.

    2013-01-01

    This report presents the results of a study to assess potential water availability from the Atwood, Leesville, and Tappan Lakes, located within the Muskingum River Watershed, Ohio. The assessment was based on the criterion that water withdrawals should not appreciably affect maintenance of recreation-season pool levels in current use. To facilitate and simplify the assessment, it was assumed that historical lake operations were successful in maintaining seasonal pool levels, and that any discharges from lakes constituted either water that was discharged to prevent exceeding seasonal pool levels or discharges intended to meet minimum in-stream flow targets downstream from the lakes. It further was assumed that the volume of water discharged in excess of the minimum in-stream flow target is available for use without negatively impacting seasonal pool levels or downstream water uses and that all or part of it is subject to withdrawal. Historical daily outflow data for the lakes were used to determine the quantity of water that potentially could be withdrawn and the resulting quantity of water that would flow downstream (referred to as “flow-by”) on a daily basis as a function of all combinations of three hypothetical target minimum flow-by amounts (1, 2, and 3 times current minimum in-stream flow targets) and three pumping capacities (1, 2, and 3 million gallons per day). Using both U.S. Geological Survey streamgage data and lake-outflow data provided by the U.S. Army Corps of Engineers resulted in analytical periods ranging from 51 calendar years for the Atwood Lake to 73 calendar years for the Leesville and Tappan Lakes. The observed outflow time series and the computed time series of daily flow-by amounts and potential withdrawals were analyzed to compute and report order statistics (95th, 75th, 50th, 25th, 10th, and 5th percentiles) and means for the analytical period, in aggregate, and broken down by calendar month. 
In addition, surplus-water mass curve data were tabulated for each of the lakes. Monthly order statistics of computed withdrawals indicated that, for the three pumping capacities considered, increasing the target minimum flow-by amount tended to reduce the amount of water that can be withdrawn. The reduction was greatest in the lower percentiles of withdrawal; however, increasing the flow-by amount had no impact on potential withdrawals during high flow. In addition, for a given target minimum flow-by amount, increasing the pumping rate increased the total amount of water that could be withdrawn; however, that increase was less than a direct multiple of the increase in pumping rate for most flow statistics. Potential monthly withdrawals were observed to be more variable and more limited in some calendar months than others. Monthly order statistics and means of computed daily mean flow-by amounts indicated that flow-by amounts generally tended to be lowest during June–October and February. Increasing the target minimum flow-by amount for a given pumping rate resulted in some small increases in the magnitudes of the mean and 50th percentile and lower order statistics of computed mean flow-by, but had no effect on the magnitudes of the higher percentile statistics. Increasing the pumping rate for a given target minimum flow-by amount resulted in decreases in magnitudes of higher-percentile flow-by statistics by an amount equal to the flow equivalent of the increase in pumping rate; however, some lower percentile statistics remained unchanged.
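    The daily allocation rule underlying the assessment reduces to a simple computation (my reading of the stated criterion, not the USGS analysis code): outflow above the target minimum flow-by may be withdrawn, capped by pumping capacity, and everything else passes downstream.

```python
# Sketch of the daily withdrawal/flow-by split described in the report
# (my reading of the criterion, not the USGS code). All three arguments
# share one unit, e.g. million gallons per day.

def split_daily(outflow, flowby_target, pump_capacity):
    """Return (withdrawal, flow_by) for one day's lake outflow."""
    withdrawal = min(pump_capacity, max(0.0, outflow - flowby_target))
    return withdrawal, outflow - withdrawal
```

    This rule also explains two findings in the report: raising the flow-by target only bites on low-flow days (high-flow withdrawals are already capped by the pump), and doubling the pump rate less than doubles withdrawals because many days lack enough surplus flow to use the added capacity.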

  18. Evaluation of the C6 Lyme Enzyme Immunoassay for the Diagnosis of Lyme Disease in Children and Adolescents.

    PubMed

    Lipsett, Susan C; Branda, John A; McAdam, Alexander J; Vernacchio, Louis; Gordon, Caroline D; Gordon, Catherine R; Nigrovic, Lise E

    2016-10-01

    The commercially available C6 Lyme enzyme immunoassay (EIA) has been approved to replace the standard whole-cell sonicate EIA as a first-tier test for the diagnosis of Lyme disease and has been suggested as a stand-alone diagnostic. However, the C6 EIA has not been extensively studied in pediatric patients undergoing evaluation for Lyme disease. We collected discarded serum samples from children and adolescents (aged ≤21 years) undergoing conventional 2-tiered testing for Lyme disease at a single hospital-based clinical laboratory located in an area endemic for Lyme disease. We performed a C6 EIA on all collected specimens, followed by a supplemental immunoblot if the C6 EIA result was positive but the whole-cell sonicate EIA result was negative. We defined a case of Lyme disease as either a clinician-diagnosed erythema migrans lesion or a positive standard 2-tiered serologic result in a patient with symptoms compatible with Lyme disease. We then compared the performance of the C6 EIA alone and as a first-tier test followed by immunoblot, with that of standard 2-tiered serology for the diagnosis of Lyme disease. Of the 944 specimens collected, 114 (12%) were from patients with Lyme disease. The C6 EIA alone had sensitivity similar to that of standard 2-tiered testing (79.8% vs 81.6% for standard 2-tiered testing; P = .71) with slightly lower specificity (94.2% vs 98.8%; P < .002). Addition of a supplemental immunoblot improved the specificity of the C6 EIA to 98.6%. For children and adolescents undergoing evaluation for Lyme disease, the C6 EIA could guide initial clinical decision making, although a supplemental immunoblot should still be performed. © The Author 2016. Published by Oxford University Press for the Infectious Diseases Society of America. All rights reserved. For permissions, e-mail journals.permissions@oup.com.

  19. Prioritization of constituents for national- and regional-scale ambient monitoring of water and sediment in the United States

    USGS Publications Warehouse

    Olsen, Lisa D.; Valder, Joshua F.; Carter, Janet M.; Zogorski, John S.

    2013-01-01

    A total of 2,541 constituents were evaluated and prioritized for national- and regional-scale ambient monitoring of water and sediment in the United States. This prioritization was done by the U.S. Geological Survey (USGS) in preparation for the upcoming third decade (Cycle 3; 2013–23) of the National Water-Quality Assessment (NAWQA) Program. This report provides the methods used to prioritize the constituents and the results of that prioritization. Constituents were prioritized by the NAWQA National Target Analyte Strategy (NTAS) work group on the basis of available information on physical and chemical properties, observed or predicted environmental occurrence and fate, and observed or anticipated adverse effects on human health or aquatic life. Constituents were evaluated within constituent groups that were determined on the basis of physical or chemical properties or on uses or sources. Some constituents were evaluated within more than one constituent group. Although comparable objectives were used in the prioritization of constituents within the different constituent groups, differences in the availability of information accessed for each constituent group led to the development of separate prioritization approaches adapted to each constituent group to make best use of available resources. 
Constituents were assigned to one of three prioritization tiers: Tier 1, those having the highest priority for inclusion in ambient monitoring of water or sediment on a national or regional scale (including NAWQA Cycle 3 monitoring) on the basis of their likelihood of environmental occurrence in ambient water or sediment, or likelihood of effects on human health or aquatic life; Tier 2, those having intermediate priority for monitoring on the basis of their lower likelihood of environmental occurrence or lower likelihood of effects on human health or aquatic life; and Tier 3, those having low or no priority for monitoring on the basis of evidence of nonoccurrence or lack of effects on human health or aquatic life, or of having insufficient evidence of potential occurrence or effects to justify placement into Tier 2. Of the 1,081 constituents determined to be of highest priority for ambient monitoring (Tier 1), 602 were identified for water and 686 were identified for sediment (note that some constituents were evaluated for both water and sediment). These constituents included various types of organic compounds, trace elements and other inorganic constituents, and radionuclides. Some of these constituents are difficult to analyze, whereas others are mixtures, isomers, congeners, salts, and acids of other constituents; therefore, modifications to the list of high-priority constituents for ambient monitoring could be made on the basis of the availability of suitable methods for preparation, extraction, or analysis. An additional 1,460 constituents were placed into Tiers 2 or 3 for water or sediment, including some constituents that had been placed into Tier 1 for a different matrix; 436 constituents were placed into Tier 2 for water and 246 constituents into Tier 2 for sediment; 979 constituents were placed into Tier 3 for water and 779 constituents into Tier 3 for sediment.

  20. 75 FR 1055 - Agency Information Collection Activities; Submission to OMB for Review and Approval; Comment...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-08

    ... for biodiesel, water/diesel emulsions, several atypical additives, and renewable diesel fuels. Tier 2... effects. Tier 2 data have been submitted for baseline diesel, biodiesel, and water/diesel emulsions...

  1. Improving Multi-Objective Management of Water Quality Tipping Points: Revisiting the Classical Shallow Lake Problem

    NASA Astrophysics Data System (ADS)

    Quinn, J. D.; Reed, P. M.; Keller, K.

    2015-12-01

    Recent multi-objective extensions of the classical shallow lake problem are useful for exploring the conceptual and computational challenges that emerge when managing irreversible water quality tipping points. Building on this work, we explore a four objective version of the lake problem where a hypothetical town derives economic benefits from polluting a nearby lake, but at the risk of irreversibly tipping the lake into a permanently polluted state. The trophic state of the lake exhibits non-linear threshold dynamics; below some critical phosphorus (P) threshold it is healthy and oligotrophic, but above this threshold it is irreversibly eutrophic. The town must decide how much P to discharge each year, a decision complicated by uncertainty in the natural P inflow to the lake. The shallow lake problem provides a conceptually rich set of dynamics, low computational demands, and a high level of mathematical difficulty. These properties maximize its value for benchmarking the relative merits and limitations of emerging decision support frameworks, such as Direct Policy Search (DPS). Here, we explore the use of DPS as a formal means of developing robust environmental pollution control rules that effectively account for deeply uncertain system states and conflicting objectives. The DPS reformulation of the shallow lake problem shows promise in formalizing pollution control triggers and signposts, while dramatically reducing the computational complexity of the multi-objective pollution control problem. More broadly, the insights from the DPS variant of the shallow lake problem formulated in this study bridge emerging work related to socio-ecological systems management, tipping points, robust decision making, and robust control.
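    The threshold dynamics this abstract relies on can be sketched with the standard Carpenter-style shallow lake recurrence commonly used in this literature (hedged: the formulation is the usual one for this benchmark, but the parameter values below are illustrative, not taken from the paper): next phosphorus level = current level + town release + natural inflow - outflow loss + nonlinear internal recycling.

```python
# One step of the classical shallow lake phosphorus dynamics typically used
# in this benchmark (standard Carpenter-style form; parameters b and q here
# are illustrative, not taken from the paper). x is the lake P concentration,
# a the town's release decision, eps the stochastic natural inflow.

def lake_step(x, a, b=0.42, q=2.0, eps=0.0):
    recycling = x**q / (1.0 + x**q)   # sigmoidal internal P loading
    return x + a + eps - b * x + recycling
```

    For these illustrative parameters the unstable equilibrium sits near x ≈ 0.54: states below it decay toward the healthy oligotrophic basin, while states above it are pulled irreversibly toward the eutrophic fixed point near x ≈ 1.84, which is the tipping behavior a DPS control rule must respect.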

  2. Using latent class analysis to identify academic and behavioral risk status in elementary students.

    PubMed

    King, Kathleen R; Lembke, Erica S; Reinke, Wendy M

    2016-03-01

    Identifying classes of children on the basis of academic and behavior risk may have important implications for the allocation of intervention resources within Response to Intervention (RTI) and Multi-Tiered System of Support (MTSS) models. Latent class analysis (LCA) was conducted with a sample of 517 third grade students. Fall screening scores in the areas of reading, mathematics, and behavior were used as indicators of success on an end of year statewide achievement test. Results identified 3 subclasses of children, including a class with minimal academic and behavioral concerns (Tier 1; 32% of the sample), a class at-risk for academic problems and somewhat at-risk for behavior problems (Tier 2; 37% of the sample), and a class with significant academic and behavior problems (Tier 3; 31%). Each class was predictive of end of year performance on the statewide achievement test, with the Tier 1 class performing significantly higher on the test than the Tier 2 class, which in turn scored significantly higher than the Tier 3 class. The results of this study indicated that distinct classes of children can be determined through brief screening measures and are predictive of later academic success. Further implications are discussed for prevention and intervention for students at risk for academic failure and behavior problems. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  3. Professional Development to Differentiate Kindergarten Tier 1 Instruction: Can Already Effective Teachers Improve Student Outcomes by Differentiating Tier 1 Instruction?

    ERIC Educational Resources Information Center

    Al Otaiba, Stephanie; Folsom, Jessica S.; Wanzek, Jeanne; Greulich, Luana; Waesche, Jessica; Schatschneider, Christopher; Connor, Carol M.

    2016-01-01

    Two primary purposes guided this quasi-experimental within-teacher study: (a) to examine changes from baseline through 2 years of professional development (Individualizing Student Instruction) in kindergarten teachers' differentiation of Tier 1 literacy instruction; and (b) to examine changes in reading and vocabulary of 3 cohorts of the teachers'…

  4. Lessons Learned from Implementing a Check-in/Check-out Behavioral Program in an Urban Middle School

    ERIC Educational Resources Information Center

    Myers, Diane M.; Briere, Donald E., III

    2010-01-01

    Schoolwide positive behavior support (SWPBS) is an empirically supported approach that is implemented by more than 10,000 schools in the United States to support student and staff behavior (www.pbis.org). SWPBS is based on a three-tiered prevention logic: (a) Tier 1 interventions support all students; (b) Tier 2 interventions support targeted…

  5. 26 CFR 1.960-1 - Foreign tax credit with respect to taxes paid on earnings and profits of controlled foreign...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...-, second-, or third-tier corporation's earnings and profits. Section 1.960-2 prescribes rules for applying section 902 to dividends paid by a third-, second-, or first-tier corporation from earnings and profits...) Second-tier corporation. In the case of amounts included in the gross income of the taxpayer under...

  6. 77 FR 52344 - Information Collection Request Sent to the Office of Management and Budget (OMB) for Approval...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-29

    ... responses time per burden per nonhour response hours response burden cost Tier 1 (Desktop Analysis... developers of these small-scale projects do the desktop analysis described in Tier 1 or Tier 2 using publicly... published in the Federal Register (77 FR 19683) a notice of our intent to request that OMB renew approval...

  7. 12 CFR 565.4 - Capital measures and capital category definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...-based capital ratio; (2) The Tier 1 risk-based capital ratio; and (3) The leverage ratio. (b) Capital...; and (ii) Has a Tier 1 risk-based capital ratio of 6.0 percent or greater; and (iii) Has a leverage... total risk-based capital ratio of 8.0 percent or greater; and (ii) Has a Tier 1 risk-based capital ratio...

  8. 12 CFR 208.73 - What additional provisions are applicable to state member banks with financial subsidiaries?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... subsidiaries from both the bank's Tier 1 capital and Tier 2 capital; and (ii) Deduct the entire amount of the... deducted from the bank's Tier 1 capital. (b) Financial statement disclosure of capital deduction. Any... (including the well capitalized standard of § 208.71(a)(1)): (1) The bank must not consolidate the assets and...

  9. Use of a 2-tier histologic grading system for canine cutaneous mast cell tumors on cytology specimens.

    PubMed

    Hergt, Franziska; von Bomhard, Wolf; Kent, Michael S; Hirschberger, Johannes

    2016-09-01

Mast cell tumors (MCT) represent the most common malignant skin tumor in the dog. Diagnosis of an MCT can be achieved through cytologic examination of a fine-needle aspirate. However, the grade of the tumor is an important prognostic marker and currently requires histologic assessment. Recently, a 2-tier histologic grading system based on nuclear features including number of mitoses, multinucleated cells, bizarre nuclei, and karyomegaly was proposed. The aim of this study was to assess whether the cytomorphologic criteria proposed in the 2-tier histologic grading system are applicable to cytology specimens. A total of 141 MCT specimens reported as grade I, II, or III according to the Patnaik system with both histologic specimens and fine-needle aspirates available were histologically and cytologically reevaluated in a retrospective study. According to the 2-tier grading system, 38 cases were diagnosed histologically as high-grade and 103 as low-grade MCT. Cytologic grading resulted in 36 high-grade and 105 low-grade tumors. Agreement between histologic and cytologic grading based on the 2-tier grading system was achieved in 133 cases (sensitivity 86.8%, specificity 97.1%, kappa value 0.853), but 5 high-grade tumors on histology were classified as low-grade on cytology. Cytologic grading of MCT in the dog is helpful for initial assessment. However, the reliability of cytology using the 2-tier grading system is considered inadequate at this point. Prospective studies including clinical outcome should be pursued to further determine diagnostic accuracy of cytologic mast cell grading. © 2016 American Society for Veterinary Clinical Pathology.
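The agreement statistics quoted above (sensitivity 86.8%, specificity 97.1%, kappa 0.853) can be reconstructed from the counts given in the abstract, treating histology as the reference standard; a minimal sketch:

```python
# 2x2 table reconstructed from the abstract:
#   38 histologic high-grade, of which 5 were called low-grade on cytology -> 33 concordant
#   103 histologic low-grade; cytology called 36 high-grade in total -> 36 - 33 = 3 discordant
tp, fn = 33, 5         # histologic high-grade: concordant / missed by cytology
tn, fp = 100, 3        # histologic low-grade: concordant / over-called by cytology
n = tp + fn + tn + fp  # 141 tumors in total

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

# Cohen's kappa: observed agreement corrected for chance-expected agreement
p_obs = (tp + tn) / n
p_exp = ((tp + fn) * (tp + fp) + (tn + fp) * (tn + fn)) / n**2
kappa = (p_obs - p_exp) / (1 - p_exp)

print(round(sensitivity, 3), round(specificity, 3), round(kappa, 3))
```

Running this reproduces the reported values (0.868, 0.971, 0.853), confirming the published statistics are internally consistent.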

  10. To Wait in Tier 1 or Intervene Immediately: A Randomized Experiment Examining First Grade Response to Intervention (RTI) in Reading.

    PubMed

    Al Otaiba, Stephanie; Connor, Carol M; Folsom, Jessica S; Wanzek, Jeanne; Greulich, Luana; Schatschneider, Christopher; Wagner, Richard K

    2014-10-01

This randomized controlled experiment compared the efficacy of two Response to Intervention (RTI) models - Typical RTI and Dynamic RTI - and included 34 first-grade classrooms (n = 522 students) across 10 socio-economically and culturally diverse schools. Typical RTI was designed to follow the two-stage RTI decision rules that wait to assess response to Tier 1 in many districts, whereas Dynamic RTI provided Tier 2 or Tier 3 interventions immediately according to students' initial screening results. Interventions were identical across conditions except for when intervention began. Reading assessments included letter-sound, word, and passage reading, and teacher-reported severity of reading difficulties. An intent-to-treat analysis using multi-level modeling indicated an overall effect favoring the Dynamic RTI condition (d = .36); growth curve analyses demonstrated that students in Dynamic RTI showed an immediate score advantage, and effects accumulated across the year. Analyses of standard score outcomes confirmed that students in the Dynamic condition who received Tier 2 and Tier 3 ended the study with significantly higher reading performance than students in the Typical condition. Implications for RTI implementation practice and for future research are discussed.

  11. To Wait in Tier 1 or Intervene Immediately: A Randomized Experiment Examining First Grade Response to Intervention (RTI) in Reading

    PubMed Central

    Al Otaiba, Stephanie; Connor, Carol M.; Folsom, Jessica S.; Wanzek, Jeanne; Greulich, Luana; Schatschneider, Christopher; Wagner, Richard K.

    2014-01-01

    This randomized controlled experiment compared the efficacy of two Response to Intervention (RTI) models – Typical RTI and Dynamic RTI - and included 34 first-grade classrooms (n = 522 students) across 10 socio-economically and culturally diverse schools. Typical RTI was designed to follow the two-stage RTI decision rules that wait to assess response to Tier 1 in many districts, whereas Dynamic RTI provided Tier 2 or Tier 3 interventions immediately according to students’ initial screening results. Interventions were identical across conditions except for when intervention began. Reading assessments included letter-sound, word, and passage reading, and teacher-reported severity of reading difficulties. An intent-to-treat analysis using multi-level modeling indicated an overall effect favoring the Dynamic RTI condition (d = .36); growth curve analyses demonstrated that students in Dynamic RTI showed an immediate score advantage, and effects accumulated across the year. Analyses of standard score outcomes confirmed that students in the Dynamic condition who received Tier 2 and Tier 3 ended the study with significantly higher reading performance than students in the Typical condition. Implications for RTI implementation practice and for future research are discussed. PMID:25530622

  12. Strain-Specific V3 and CD4 Binding Site Autologous HIV-1 Neutralizing Antibodies Select Neutralization-Resistant Viruses.

    PubMed

    Moody, M Anthony; Gao, Feng; Gurley, Thaddeus C; Amos, Joshua D; Kumar, Amit; Hora, Bhavna; Marshall, Dawn J; Whitesides, John F; Xia, Shi-Mao; Parks, Robert; Lloyd, Krissey E; Hwang, Kwan-Ki; Lu, Xiaozhi; Bonsignori, Mattia; Finzi, Andrés; Vandergrift, Nathan A; Alam, S Munir; Ferrari, Guido; Shen, Xiaoying; Tomaras, Georgia D; Kamanga, Gift; Cohen, Myron S; Sam, Noel E; Kapiga, Saidi; Gray, Elin S; Tumba, Nancy L; Morris, Lynn; Zolla-Pazner, Susan; Gorny, Miroslaw K; Mascola, John R; Hahn, Beatrice H; Shaw, George M; Sodroski, Joseph G; Liao, Hua-Xin; Montefiori, David C; Hraber, Peter T; Korber, Bette T; Haynes, Barton F

    2015-09-09

    The third variable (V3) loop and the CD4 binding site (CD4bs) of the HIV-1 envelope are frequently targeted by neutralizing antibodies (nAbs) in infected individuals. In chronic infection, HIV-1 escape mutants repopulate the plasma, and V3 and CD4bs nAbs emerge that can neutralize heterologous tier 1 easy-to-neutralize but not tier 2 difficult-to-neutralize HIV-1 isolates. However, neutralization sensitivity of autologous plasma viruses to this type of nAb response has not been studied. We describe the development and evolution in vivo of antibodies distinguished by their target specificity for V3 and CD4bs epitopes on autologous tier 2 viruses but not on heterologous tier 2 viruses. A surprisingly high fraction of autologous circulating viruses was sensitive to these antibodies. These findings demonstrate a role for V3 and CD4bs antibodies in constraining the native envelope trimer in vivo to a neutralization-resistant phenotype, explaining why HIV-1 transmission generally occurs by tier 2 neutralization-resistant viruses. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. High-resolution mapping of wetland vegetation biomass and distribution with L-band radar in southeastern coastal Louisiana

    NASA Astrophysics Data System (ADS)

    Thomas, N. M.; Simard, M.; Byrd, K. B.; Windham-Myers, L.; Castaneda, E.; Twilley, R.; Bevington, A. E.; Christensen, A.

    2017-12-01

Louisiana coastal wetlands account for approximately one third (37%) of the estuarine wetland vegetation in the conterminous United States, yet the spatial distribution of their extent and aboveground biomass (AGB) is not well defined. This knowledge is critical for the accurate completion of national greenhouse gas (GHG) inventories. We generated high-resolution baseline maps of wetland vegetation extent and biomass at the Atchafalaya and Terrebonne basins in coastal Louisiana using a multi-sensor approach. Optical satellite data were used within an object-oriented machine learning approach to classify the structure of wetland vegetation types, offering increased detail over currently available land cover maps that do not distinguish between wetland vegetation types or account for non-permanent seasonal changes in extent. We mapped 1871 km2 of wetlands during a period of peak biomass in September 2015, comprised of flooded forested wetlands and leaf, grass and emergent herbaceous marshes. The distribution of AGB was mapped using the JPL L-band Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR). Relationships between time-series radar imagery and field data collected in May 2015 and September 2016 were derived to estimate AGB at the Wax Lake and Atchafalaya deltas. Differences in seasonal biomass estimates reflect the increased AGB in September over May, concurrent with periods of peak biomass and the onset of the vegetation growing season, respectively. This method provides a tractable means of mapping and monitoring biomass of wetland vegetation types with L-band radar, in a region threatened with wetland loss under projections of increasing sea-level rise and terrestrial subsidence. Through this, we demonstrate a method that is able to satisfy the IPCC 2013 Wetlands Supplement requirement for Tier 2/Tier 3 reporting of coastal wetland GHG inventories.

  14. 40 CFR 86.1810-09 - General standards; increase in emissions; unsafe condition; waivers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... light-duty vehicles and light-duty trucks fueled by gasoline, diesel, methanol, ethanol, natural gas and... applicable to methanol fueled vehicles are also applicable to Tier 2 and interim non-Tier 2 ethanol fueled...

  15. 40 CFR 86.1810-09 - General standards; increase in emissions; unsafe condition; waivers.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... light-duty vehicles and light-duty trucks fueled by gasoline, diesel, methanol, ethanol, natural gas and... applicable to methanol fueled vehicles are also applicable to Tier 2 and interim non-Tier 2 ethanol fueled...

  16. An Integrated Tiered Service Delivery Model (ITSDM) Based on Local CD4 Testing Demands Can Improve Turn-Around Times and Save Costs whilst Ensuring Accessible and Scalable CD4 Services across a National Programme

    PubMed Central

    Glencross, Deborah K.; Coetzee, Lindi M.; Cassim, Naseem

    2014-01-01

Background The South African National Health Laboratory Service (NHLS) responded to HIV treatment initiatives with two-tiered CD4 laboratory services in 2004. Increasing programmatic burden, as more patients access anti-retroviral therapy (ART), has demanded extending CD4 services to meet increasing clinical needs. The aim of this study was to review existing services and develop a service model that integrated laboratory-based and point-of-care testing (POCT), to extend national coverage, improve local turn-around time (TAT) and contain programmatic costs. Methods NHLS Corporate Data Warehouse CD4 data, from 60–70 laboratories and 4756 referring health facilities, was reviewed for referral laboratory workload, respective referring facility volumes and related TAT, from 2009–2012. Results An integrated tiered service delivery model (ITSDM) is proposed. Tier-1/POCT delivers CD4 testing at single health-clinics providing ART in hard-to-reach areas (<5 samples/day). Laboratory-based testing is extended with Tier-2/POC-Hubs (processing ≤30–40 CD4 samples/day), consolidating POCT across 8–10 health-clinics with other HIV-related testing, and Tier-3/‘community’ laboratories, serving ≤40 health-clinics and processing ≤150 samples/day. Existing Tier-4/‘regional’ laboratories serve ≤100 facilities and process <350 samples/day; Tier-5 are high-volume ‘metro’/centralized laboratories (>350–1500 tests/day, serving ≥200 health-clinics). Tier-6 provides national support for standardisation, harmonisation and quality across the organization. Conclusion The ITSDM offers improved local TAT by extending CD4 services into rural/remote areas with new Tier-3 or Tier-2/POC-Hub services installed in existing community laboratories, most with developed infrastructure. The advantage of lower laboratory CD4 costs and use of existing infrastructure enables subsidization of delivery of more expensive POC services into hard-to-reach districts without reasonable access to a local CD4 laboratory.
Full ITSDM implementation across 5 service tiers (as opposed to widespread implementation of POC testing to extend service) can facilitate sustainable ‘full service coverage’ across South Africa, and save more than R125 million in HIV/AIDS programmatic costs. ITSDM hierarchical parental support also assures laboratory/POC management, equipment maintenance, quality control and on-going training between tiers. PMID:25490718
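The tiering logic above can be illustrated as a simple assignment rule keyed on daily CD4 testing volume. The cutpoints are taken from the abstract; where it gives a range (e.g. ≤30–40 samples/day) the upper bound is assumed here, so the exact boundary behaviour is an illustrative choice, not the programme's actual placement rules:

```python
# Hypothetical ITSDM tier assignment by daily CD4 sample volume, using the
# cutpoints quoted in the abstract. Boundary handling (e.g. exactly 40 or
# 350 samples/day) is an assumption for illustration.
def itsdm_tier(samples_per_day: int) -> str:
    if samples_per_day < 5:
        return "Tier-1/POCT"           # single ART clinic in a hard-to-reach area
    if samples_per_day <= 40:
        return "Tier-2/POC-Hub"        # consolidates POCT across 8-10 clinics
    if samples_per_day <= 150:
        return "Tier-3/community lab"  # serves up to ~40 health-clinics
    if samples_per_day < 350:
        return "Tier-4/regional lab"   # serves up to ~100 facilities
    return "Tier-5/metro lab"          # 350-1500 tests/day, >=200 clinics

print(itsdm_tier(3), itsdm_tier(120), itsdm_tier(800))
```

A facility's projected volume thus maps directly to the cheapest service tier able to absorb it, which is the cost logic the abstract describes.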

  17. A water-budget analysis of Medina and Diversion Lakes and the Medina/Diversion Lake system, with estimated recharge to Edwards aquifer, San Antonio area, Texas

    USGS Publications Warehouse

    Slattery, Richard N.; Miller, Lisa D.

    2004-12-22

In January 2001, the U.S. Geological Survey—in cooperation with the Edwards Aquifer Authority—began a study to refine and, if possible, extend previously derived (1995–96) relations between the stage in Medina Lake and recharge to the Edwards aquifer to include the effects of reservoir stages below 1,018 feet and greater than 1,046 feet above National Geodetic Vertical Datum of 1929. The principal objective of this present (2001–02) study was to estimate ground-water outflow (seepage) from Medina Lake, Diversion Lake, and the Medina/Diversion Lake system through the calculation of water budgets representing steady-state conditions over as wide a range as possible in the stages of Medina and Diversion Lakes. The water budgets were compiled for selected periods during which the water-budget components were inferred to be relatively stable and the influence of precipitation, stormwater runoff, and changes in storage was presumably minimal. Water budgets for the Medina/Diversion Lake system were compiled for 127 water-budget periods ranging from 8 to 78 days from daily hydrologic data collected during March 1955–September 1964, October 1995–September 1996, and February 2001–June 2002. Budgets for Medina and Diversion Lakes were compiled for 14 periods ranging from 8 to 23 days from daily hydrologic data collected only during October 1995–September 1996 and April 2001–June 2002. Linear equations were developed to relate the stage in Medina Lake to ground-water outflow from Medina Lake, Diversion Lake, and the Medina/Diversion Lake system. The computed mean rates of outflow from Medina Lake ranged from about 18 to 182 acre-feet per day between stages of 1,019 and 1,064 feet above National Geodetic Vertical Datum of 1929. The computed rates of outflow from Diversion Lake ranged from about -85 to 52 acre-feet per day. The rates of outflow from the entire lake system ranged from about 5 to 178 acre-feet per day between Medina Lake stages of 963 to 1,064 feet.
It is assumed that all outflow from the lake system enters the ground-water system as recharge to the Edwards aquifer. During the time that the stage in Medina Lake was greater than about 1,040 feet, Diversion Lake gained more water than it lost to the ground-water system, and the rate of ground-water outflow from Medina Lake increased sharply while its stage was between about 1,043 and 1,045 feet. The observed outflow from Diversion Lake during this time decreased sharply, to the extent that a net gain resulted—indicating that a substantial amount of the additional outflow from Medina Lake returned to Diversion Lake. When the stage in Medina Lake is at the spillway elevation of 1,064 feet, Diversion Lake appears to gain as much as 40 percent of the concurrent ground-water outflow from Medina Lake. An indication of water moving from the lake system into the ground-water system and back to the surface-water system was observed in the most downstream reach of the Medina River, between Diversion Lake and the Medina River near Riomedina. During conditions of no flow over Diversion Dam, this reach of the Medina River gained from about 32 to 94 acre-feet per day, with the gain increasing with increasing stage in Diversion Lake. The average of the monthly recharge to the Edwards aquifer from the Medina/Diversion Lake system—as estimated by the present study for the October 1995–September 2002 period—is 3,083 acre-feet, or about 56 percent of the recharge computed for this period with a previously used (Lowry) method. The present study’s estimates of recharge for months with rising-stage conditions are about 44 percent of those computed with the previously used method, compared to about 60 percent for months with steady or falling-stage conditions. For stages greater than 1,045 feet, the present study estimated recharge to be about 52 percent of that computed with the previously used method, compared to about 64 percent at stages below 1,045 feet.
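The functional form of such a stage–outflow relation can be sketched by passing a line through the two endpoint values quoted for Medina Lake (about 18 acre-feet per day at stage 1,019 feet and about 182 at 1,064 feet). The study developed its equations by regression over many budget periods, and the abstract notes the relation steepens between about 1,043 and 1,045 feet, so a single line through the endpoints is only an illustration of the form, not the study's published coefficients:

```python
# Illustrative linear stage-outflow relation for Medina Lake, fitted only
# to the two endpoint values quoted in the abstract. Not the study's
# published regression, which was derived from 127 water-budget periods.
def medina_outflow(stage_ft: float) -> float:
    s1, q1 = 1019.0, 18.0    # (stage in feet, outflow in acre-ft/day)
    s2, q2 = 1064.0, 182.0
    slope = (q2 - q1) / (s2 - s1)   # acre-ft/day gained per foot of stage
    return q1 + slope * (stage_ft - s1)

print(round(medina_outflow(1040), 1))  # interpolated mid-range outflow
```

The sketch reproduces the quoted endpoints exactly and interpolates monotonically between them, which is the behaviour the water-budget analysis relies on.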

  18. Tree volume and biomass equations for the Lake States.

    Treesearch

    Jerold T. Hahn

    1984-01-01

Presents species-specific equations and methods for computing tree height, cubic foot, and board foot volume, and biomass for the Lake States (Michigan, Minnesota, and Wisconsin). Height equations compute either total or merchantable height to a variable top d.o.b. from d.b.h., site index, and basal area. Volumes and biomass are computed from d.b.h. and height.
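A common functional form for volume equations of this kind is the combined-variable model V = b0 + b1·D²H, where D is d.b.h. and H is height. The coefficients below are hypothetical placeholders for illustration, not the species-specific values published in the report:

```python
# Generic combined-variable volume equation, V = b0 + b1 * D^2 * H,
# with D = d.b.h. (inches) and H = height (feet). The coefficients are
# hypothetical placeholders, not the report's species-specific values.
def cubic_foot_volume(dbh_in: float, height_ft: float,
                      b0: float = 0.5, b1: float = 0.002) -> float:
    return b0 + b1 * dbh_in**2 * height_ft

# e.g. a 10-inch d.b.h. tree with 40 feet of merchantable height
print(round(cubic_foot_volume(10.0, 40.0), 2))
```

The D²H product is roughly proportional to stem volume for a given taper, which is why this form fits well across a wide diameter range.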

  19. Numerical modeling of crater lake seepage

    NASA Astrophysics Data System (ADS)

    Todesco, M.; Rouwet, D.

    2012-04-01

The fate of crater lake waters seeping into the volcanic edifice is poorly constrained. Quantification of the seepage flux is important in volcanic surveillance, as this water loss counterbalances the inflow of hot magmatic fluids into the lake and enters the mass balance computation. Uncertainties associated with the estimate of seepage therefore transfer to the estimate of magmatic degassing and to hazard assessment. Moreover, when the often acidic lake brines disperse into the volcanic edifice, they may lead to acid attack (stress corrosion) and eventually to mechanical weakening of the volcano flanks, thereby posing an indirect volcanic risk. Understanding the features that control the underground propagation of lake waters and their interactions with the magmatic-hydrothermal system is therefore essential for volcanic hazard assessment. In this work, we use the TOUGH2 geothermal simulator to investigate crater lake water seepage in different volcanic settings. Modeling is carried out to describe the evolution of a hydrothermal system open on a hot, pressurized reservoir of dry gas and capped by a volcanic lake. Numerical simulations investigate the role of lake morphology, system geometry, rock properties, and of the conditions applied to the lake and to the gas reservoir at depth.
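The seepage term that such simulations constrain can be illustrated with a back-of-the-envelope Darcy's-law estimate of lake-bed leakage. Every parameter value below is an assumed illustration, not a result of the TOUGH2 modeling described above:

```python
# Minimal Darcy's-law sketch of crater-lake seepage, of the kind that
# enters a lake mass balance. All parameter values are hypothetical.
def seepage_rate_m3_per_s(K: float, gradient: float, area_m2: float) -> float:
    """Volumetric seepage Q = K * i * A (Darcy's law)."""
    return K * gradient * area_m2

K = 1e-6     # hydraulic conductivity of lake-bed sediments, m/s (assumed)
i = 0.5      # downward hydraulic gradient beneath the lake (assumed)
A = 2.0e5    # lake-bed area, m^2 (assumed)
Q = seepage_rate_m3_per_s(K, i, A)
print(round(Q * 86400, 1))  # water lost to seepage, m^3 per day
```

Even this crude estimate shows why seepage matters: thousands of cubic metres per day can leave the lake unseen, and any error in this term propagates directly into the inferred magmatic fluid input.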

  20. 12 CFR 24.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Definitions. 24.2 Section 24.2 Banks and... ENTITIES, COMMUNITY DEVELOPMENT PROJECTS, AND OTHER PUBLIC WELFARE INVESTMENTS § 24.2 Definitions. For... adequately capitalized in 12 CFR 6.4. (b) Capital and surplus means: (1) A bank's Tier 1 and Tier 2 capital...

  1. Pooling the resources of the CMS Tier-1 sites

    DOE PAGES

    Apyan, A.; Badillo, J.; Cruz, J. Diaz; ...

    2015-12-23

The CMS experiment at the LHC relies on 7 Tier-1 centres of the WLCG to perform the majority of its bulk processing activity, and to archive its data. During the first run of the LHC, these two functions were tightly coupled as each Tier-1 was constrained to process only the data archived on its hierarchical storage. This lack of flexibility in the assignment of processing workflows occasionally resulted in uneven resource utilisation and in an increased latency in the delivery of the results to the physics community. The long shutdown of the LHC in 2013-2014 was an opportunity to revisit this mode of operations, disentangling the processing and archive functionalities of the Tier-1 centres. The storage services at the Tier-1s were redeployed, breaking the traditional hierarchical model: each site now provides a large disk storage to host input and output data for processing, and an independent tape storage used exclusively for archiving. Movement of data between the tape and disk endpoints is not automated, but triggered externally through the WLCG transfer management systems. With this new setup, CMS operations actively controls at any time which data is available on disk for processing and which data should be sent to archive. Thanks to the high-bandwidth connectivity guaranteed by the LHCOPN, input data can be freely transferred between disk endpoints as needed to take advantage of free CPU, turning the Tier-1s into a large pool of shared resources. The output data can be validated before archiving them permanently, and temporary data formats can be produced without wasting valuable tape resources. Lastly, the data hosted on disk at Tier-1s can now be made available also for user analysis since there is no longer any risk of triggering chaotic staging from tape. In this contribution, we describe the technical solutions adopted for the new disk and tape endpoints at the sites, and we report on the commissioning and scale testing of the service.
We detail the procedures implemented by CMS computing operations to actively manage data on disk at Tier-1 sites, and we give examples of the benefits brought to CMS workflows by the additional flexibility of the new system.

  2. Pooling the resources of the CMS Tier-1 sites

    NASA Astrophysics Data System (ADS)

    Apyan, A.; Badillo, J.; Diaz Cruz, J.; Gadrat, S.; Gutsche, O.; Holzman, B.; Lahiff, A.; Magini, N.; Mason, D.; Perez, A.; Stober, F.; Taneja, S.; Taze, M.; Wissing, C.

    2015-12-01

    The CMS experiment at the LHC relies on 7 Tier-1 centres of the WLCG to perform the majority of its bulk processing activity, and to archive its data. During the first run of the LHC, these two functions were tightly coupled as each Tier-1 was constrained to process only the data archived on its hierarchical storage. This lack of flexibility in the assignment of processing workflows occasionally resulted in uneven resource utilisation and in an increased latency in the delivery of the results to the physics community. The long shutdown of the LHC in 2013-2014 was an opportunity to revisit this mode of operations, disentangling the processing and archive functionalities of the Tier-1 centres. The storage services at the Tier-1s were redeployed breaking the traditional hierarchical model: each site now provides a large disk storage to host input and output data for processing, and an independent tape storage used exclusively for archiving. Movement of data between the tape and disk endpoints is not automated, but triggered externally through the WLCG transfer management systems. With this new setup, CMS operations actively controls at any time which data is available on disk for processing and which data should be sent to archive. Thanks to the high-bandwidth connectivity guaranteed by the LHCOPN, input data can be freely transferred between disk endpoints as needed to take advantage of free CPU, turning the Tier-1s into a large pool of shared resources. The output data can be validated before archiving them permanently, and temporary data formats can be produced without wasting valuable tape resources. Finally, the data hosted on disk at Tier-1s can now be made available also for user analysis since there is no risk any longer of triggering chaotic staging from tape. In this contribution, we describe the technical solutions adopted for the new disk and tape endpoints at the sites, and we report on the commissioning and scale testing of the service. 
We detail the procedures implemented by CMS computing operations to actively manage data on disk at Tier-1 sites, and we give examples of the benefits brought to CMS workflows by the additional flexibility of the new system.

  3. Single-Tier Testing with the C6 Peptide ELISA Kit Compared with Two-Tier Testing for Lyme Disease

    PubMed Central

    Wormser, Gary P.; Schriefer, Martin; Aguero-Rosenfeld, Maria E.; Levin, Andrew; Steere, Allen C.; Nadelman, Robert B.; Nowakowski, John; Marques, Adriana; Johnson, Barbara J. B.; Dumler, J. Stephen

    2014-01-01

Background The two-tier serologic testing protocol for Lyme disease has a number of shortcomings, including low sensitivity in early disease; increased cost, time and labor; and subjectivity in the interpretation of immunoblots. Methods The diagnostic accuracy of a single-tier commercial C6 ELISA kit was compared with two-tier testing. Results The C6 ELISA was significantly more sensitive than two-tier testing, with sensitivities of 66.5% (95% C.I.: 61.7-71.1) and 35.2% (95% C.I.: 30.6-40.1), respectively (p<0.001), in 403 sera from patients with erythema migrans. The C6 ELISA had sensitivity statistically comparable to two-tier testing in sera from Lyme disease patients with early neurological manifestations (88.6% vs. 77.3%, p=0.13) or arthritis (98.3% vs. 95.6%, p=0.38). The specificities of the C6 ELISA and two-tier testing in over 2200 blood donors, patients with other conditions, and Lyme disease vaccine recipients were found to be 98.9% and 99.5%, respectively (p<0.05; 95% C.I. surrounding the 0.6 percentage point difference, 0.04 to 1.15). Conclusions Using a reference standard of two-tier testing, the C6 ELISA as a single-step serodiagnostic test provided increased sensitivity in early Lyme disease, with comparable sensitivity in later manifestations of Lyme disease. The C6 ELISA had slightly decreased specificity. Future studies should evaluate the performance of the C6 ELISA compared with two-tier testing in routine clinical practice. PMID:23062467
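Confidence intervals of the kind reported above can be approximated with a normal (Wald) interval for a proportion; 66.5% sensitivity in 403 sera corresponds to 268 positives. The paper may have used a different method (e.g. an exact binomial interval), so the agreement with the published 61.7–71.1 interval is only approximate:

```python
import math

# Approximate 95% confidence interval for a proportion (Wald interval).
# 268/403 erythema migrans sera positive by C6 ELISA (66.5%); the paper's
# reported CI (61.7-71.1) may come from a different interval method.
def wald_ci(successes: int, n: int, z: float = 1.96):
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

lo, hi = wald_ci(268, 403)
print(round(lo, 3), round(hi, 3))
```

The Wald interval is the simplest choice but is known to undercover for proportions near 0 or 1; Wilson or exact (Clopper-Pearson) intervals behave better there.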

  4. Geochemical evolution of a high arsenic, alkaline pit-lake in the Mother Lode Gold District, California

    USGS Publications Warehouse

    Savage, Kaye S.; Ashley, Roger P.; Bird, Dennis K.

    2009-01-01

    The Harvard orebody at the Jamestown gold mine, located along the Melones fault zone in the southern Mother Lode gold district, California, was mined in an open-pit operation from 1987 to 1994. Dewatering during mining produced a hydrologic cone of depression; recovery toward the premining ground-water configuration produced a monomictic pit lake with alkaline Ca-Mg-HCO3-SO4–type pit water, concentrations of As up to 1,200 μg/L, and total dissolved solids (TDS) up to 2,000 mg/L. In this study, pit-wall rocks were mapped and chemically analyzed to provide a context for evaluating observed variability in the composition of the pit-lake waters in relationship to seasonal weather patterns. An integrated hydrogeochemical model of pit-lake evolution based on observations of pit-lake volume, water composition (samples collected between 1998–2000, 2004), and processes occurring on pit walls was developed in three stages using the computer code PHREEQC. Stage 1 takes account of seasonally variable water fluxes from precipitation, evaporation, springs, and ground water, as well as lake stratification and mixing processes. Stage 2 adds CO2fluxes and wall-rock interactions, and stage 3 assesses the predictive capability of the model.Two major geologic units in fault contact comprise the pit walls. The hanging wall is composed of interlayered slate, metavolcanic and metavolcaniclastic rocks, and schists; the footwall rocks are chlorite-actinolite and talc-tremolite schists generated by metasomatism of greenschist-facies mafic and ultramafic igneous rocks. Alteration in the ore zone provides evidence for mineralizing fluids that introduced CO2, S, and K2O, and redistributed SiO2. Arsenian pyrite associated with the alteration weathers to produce goethite and jarosite on pit walls and in joints, as well as copiapite and hexahydrite efflorescences that accumulate on wall-rock faces during dry California summers. 
All of these pyrite weathering products incorporate arsenic at concentrations from <100 up to 1,200 ppm. In the pit lake, pH and TDS reach seasonal highs in the summer epilimnion; pH is lowest in the summer hypolimnion. Arsenic and bicarbonate covary in the hypolimnion, rising as stratification proceeds and declining during winter rains. The computational model suggests that water fluxes alone do not account for this seasonal variability. Loss of CO2 to the atmosphere, interaction with pit walls including washoff of efflorescent salts during the first flush and seasonal rainfall, and arsenic sorption appear to contribute to the observed pit-lake characteristics.

  5. Health monitoring display system for a complex plant

    DOEpatents

    Ridolfo, Charles F [Bloomfield, CT; Harmon, Daryl L [Enfield, CT; Colin, Dreyfuss [Enfield, CT

    2006-08-08

A single-page enterprise-wide level display provides a comprehensive, readily understood representation of the overall health status of a complex plant. Color-coded failure domains allow rapid intuitive recognition of component failure status. A three-tier hierarchy of displays provides details on the health status of the components and systems displayed on the enterprise-wide level display, in a manner that supports a logical drill-down from the health status of sub-components on Tier 1, to expected faults of the sub-components on Tier 2, to specific information on expected sub-component failures on Tier 3.

  6. Secure anonymous mutual authentication for star two-tier wireless body area networks.

    PubMed

    Ibrahim, Maged Hamada; Kumari, Saru; Das, Ashok Kumar; Wazid, Mohammad; Odelu, Vanga

    2016-10-01

Mutual authentication is an essential service that must be established between sensor nodes in a wireless body area network (WBAN) to ensure the originality and integrity of the patient's data sent by sensors distributed on different parts of the body. However, mutual authentication alone is not enough. An adversary can benefit from monitoring the traffic and learning which sensor is transmitting patient data: observing the traffic (even without disclosing its content) and knowing its origin can reveal information about the patient's medical condition. Therefore, anonymity of the communicating sensors is an important service as well. Few works have addressed mutual authentication among sensor nodes in WBANs, and none of them has considered anonymity among body sensor nodes. To the best of our knowledge, our protocol is the first attempt to provide this service in a two-tier WBAN. We propose a new secure protocol that realizes anonymous mutual authentication and confidential transmission for a star two-tier WBAN topology. The proposed protocol uses simple cryptographic primitives. We prove the security of the proposed protocol using the widely accepted Burrows-Abadi-Needham (BAN) logic, and also through rigorous informal security analysis. In addition, to demonstrate the practicality of our protocol, we evaluate it using the NS-2 simulator. BAN logic and informal security analysis show that our protocol achieves the necessary security requirements and goals of an authentication service. The simulation results show the impact on various network parameters, such as end-to-end delay and throughput. Each node needs to store only a few hundred bits and to perform only a few hash invocations, which are computationally inexpensive. The communication cost of the proposed protocol is a few hundred bits per round of communication. 
Due to the low computation cost, the energy consumed by the nodes is also low. Our proposed protocol is thus a lightweight scheme that anonymously and mutually authenticates the sensor nodes with the controller node (hub) in a star two-tier WBAN topology. The results show that our protocol is more efficient than previously proposed protocols while achieving the necessary security requirements of a secure anonymous mutual authentication scheme. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
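The abstract does not give the protocol's message flow, but the general technique it names (anonymous mutual authentication built only from keyed hash invocations and nonces, between a sensor and the hub) can be sketched as follows. All names, the message format, and the use of HMAC-SHA256 as the "simple cryptographic primitive" are illustrative assumptions, not the authors' actual scheme:

```python
import hmac
import hashlib
import os

def prf(key: bytes, *parts: bytes) -> bytes:
    """Keyed hash (HMAC-SHA256) used as the only cryptographic primitive."""
    m = hmac.new(key, digestmod=hashlib.sha256)
    for p in parts:
        m.update(p)
    return m.digest()

class Node:
    """Sensor or hub holding a shared symmetric key; real IDs never go on the wire."""
    def __init__(self, real_id: bytes, key: bytes):
        self.real_id = real_id
        self.key = key

def run_round(sensor: Node, hub: Node):
    # Sensor -> hub: a fresh pseudonym plus a nonce; the pseudonym changes
    # every round, so an eavesdropper cannot link transmissions to a sensor.
    n_s = os.urandom(16)
    pseudonym = prf(sensor.key, b"pid", sensor.real_id, n_s)
    # Hub -> sensor: its own nonce and a proof of key knowledge bound to both nonces.
    n_h = os.urandom(16)
    hub_proof = prf(hub.key, b"hub", pseudonym, n_s, n_h)
    # Sensor verifies the hub, then returns its own proof (mutual authentication).
    assert hmac.compare_digest(hub_proof, prf(sensor.key, b"hub", pseudonym, n_s, n_h))
    sensor_proof = prf(sensor.key, b"sen", pseudonym, n_s, n_h)
    assert hmac.compare_digest(sensor_proof, prf(hub.key, b"sen", pseudonym, n_s, n_h))
    # Both sides can now derive the same session key for confidential transmission.
    return prf(sensor.key, b"sk", n_s, n_h), pseudonym
```

Each round costs a handful of hash invocations and a few hundred bits of traffic, which is consistent with the storage and communication figures the abstract reports.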

  7. Impact of 3-tier formularies on drug treatment of attention-deficit/hyperactivity disorder in children.

    PubMed

    Huskamp, Haiden A; Deverka, Patricia A; Epstein, Arnold M; Epstein, Robert S; McGuigan, Kimberly A; Muriel, Anna C; Frank, Richard G

    2005-04-01

    Expenditures for medications used to treat attention-deficit/hyperactivity disorder (ADHD) in children have increased rapidly. Many employers and health plans have adopted 3-tier formularies in an attempt to control costs for these and other drugs. To assess the effect of copayment increases associated with 3-tier formulary adoption on use and spending patterns for ADHD medications for children. Observational study using quasi-experimental design to compare effects on ADHD medication use and spending for children enrolled as dependents in an employer-sponsored plan that made major changes to its pharmacy benefit design and a comparison group of children covered by the same insurer. The plan simultaneously moved from a 1-tier (same copayment required for all drugs) to a 3-tier formulary and implemented an across-the-board copayment increase. The plan later moved 3 drugs from tier 3 to tier 2. An intervention group of 20 326 and a comparison group of 15 776 children aged 18 years and younger. Monthly probability of using an ADHD medication; plan, enrollee, and total ADHD medication spending; and medication continuation. A 3-tier formulary implementation resulted in a 17% decrease in the monthly probability of using medication (P<.001), a 20% decrease in expected total medication expenditures, and a substantial shifting of costs from the plan to families (P<.001). Intervention group children using medications in the pre-period were more likely to change to a medication in a different tier after 3-tier adoption, relative to the comparison group (P = .08). The subsequent tier changes resulted in increased plan spending (P<.001) and decreased patient spending (P = .003) for users but no differences in continuation. 
The copayment increases associated with 3-tier formulary implementation by 1 employer resulted in lower total ADHD medication spending, sizeable increases in out-of-pocket expenditures for families of children with ADHD, and a significant decrease in the probability of using these medications.

  8. A systems relations model for Tier 2 early intervention child mental health services with schools: an exploratory study.

    PubMed

    van Roosmalen, Marc; Gardner-Elahi, Catherine; Day, Crispin

    2013-01-01

Over the last 15 years, policy initiatives have aimed at the provision of more comprehensive Child and Adolescent Mental Health care. These initiatives presented a series of new challenges in organising and delivering Tier 2 child mental health services, particularly in schools. This exploratory study aimed to examine and clarify the service model underpinning a Tier 2 child mental health service offering school-based mental health work. Using semi-structured interviews, clinician descriptions of operational experiences were gathered. These were analysed using grounded theory methods. Analysis was validated by respondents at two stages. A pathway for casework emerged that included a systemic consultative function, as part of an overall three-function service model, which required: (1) activity as a member of the multi-agency system; (2) activity to improve the system working around a particular child; and (3) activity to universally develop a Tier 1 workforce confident in supporting children at risk of or experiencing mental health problems. The study challenged both the perception that such a service serves solely a Tier 2 function and assumptions about the requisite workforce to deliver the service model; it could also give service providers a rationale for negotiating service models that include an explicit focus on improving children's environments.

  9. Evaluation of Modified 2-Tiered Serodiagnostic Testing Algorithms for Early Lyme Disease

    PubMed Central

    Strle, Klemen; Nigrovic, Lise E.; Lantos, Paul M.; Lepore, Timothy J.; Damle, Nitin S.; Ferraro, Mary Jane; Steere, Allen C.

    2017-01-01

Background. The conventional 2-tiered serologic testing protocol for Lyme disease (LD), an enzyme immunoassay (EIA) followed by immunoglobulin M and immunoglobulin G Western blots, performs well in late-stage LD but is insensitive in patients with erythema migrans (EM), the most common manifestation of the illness. Western blots are also complex, difficult to interpret, and relatively expensive. In an effort to improve test performance and simplify testing in early LD, we evaluated several modified 2-tiered testing (MTTT) protocols, which use 2 assays designed as first-tier tests sequentially, without the need of Western blots. Methods. The MTTT protocols included (1) a whole-cell sonicate (WCS) EIA followed by a C6 EIA; (2) a WCS EIA followed by a VlsE chemiluminescence immunoassay (CLIA); and (3) a variable major protein-like sequence, expressed (VlsE) CLIA followed by a C6 EIA. Sensitivity was determined using serum from 55 patients with erythema migrans; specificity was determined using serum from 50 patients with other illnesses and 1227 healthy subjects. Results. Sensitivity of the various MTTT protocols in patients with acute erythema migrans ranged from 36% (95% confidence interval [CI], 25%–50%) to 54% (95% CI, 42%–67%), compared with 25% (95% CI, 16%–38%) using the conventional protocol (P = .003–0.3). Among control subjects, the 3 MTTT protocols were similarly specific (99.3%–99.5%) compared with conventional 2-tiered testing (99.5% specificity; P = .6–1.0). Conclusions. Although there were minor differences in sensitivity and specificity among MTTT protocols, each provides comparable or greater sensitivity in acute EM, and similar specificity compared with conventional 2-tiered testing, obviating the need for Western blots. PMID:28329259
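The sensitivity and specificity figures above are binomial proportions with score-type confidence intervals, and an MTTT call is simply the conjunction of two sequential first-tier assays. A small sketch of both (the count of 30 positives out of 55 EM patients is a hypothetical illustration, not taken from the paper's tables):

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96):
    """Wilson score interval for a binomial proportion (95% by default)."""
    p = successes / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

def mttt_positive(first_tier: bool, second_tier: bool) -> bool:
    """An MTTT calls a specimen positive only if both sequential EIAs agree,
    with no Western blot as a second tier."""
    return first_tier and second_tier
```

For example, `wilson_ci(30, 55)` gives roughly (0.42, 0.67), the shape of interval quoted for the most sensitive protocol.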

  10. What Is the Evidence Base to Support Reading Interventions for Improving Student Outcomes in Grades 1-3? REL 2017-271

    ERIC Educational Resources Information Center

    Gersten, Russell; Newman-Gonchar, Rebecca; Haymond, Kelly S.; Dimino, Joseph

    2017-01-01

    Response to intervention (RTI) is a comprehensive early detection and prevention strategy used to identify and support struggling students before they fall behind. An RTI model usually has three tiers or levels of support. Tier 1 is generally defined as classroom instruction provided to all students, tier 2 is typically a preventive intervention…

  11. 5 CFR 919.445 - What action may I take if a primary tier participant knowingly does business with an excluded or...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

5 Administrative Personnel 2 2010-01-01 2010-01-01 false. Section 919.445: What action may I take if a primary tier participant knowingly does business with an excluded or disqualified person? ...

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Apyan, A.; Badillo, J.; Cruz, J. Diaz

The CMS experiment at the LHC relies on 7 Tier-1 centres of the WLCG to perform the majority of its bulk processing activity and to archive its data. During the first run of the LHC, these two functions were tightly coupled, as each Tier-1 was constrained to process only the data archived on its hierarchical storage. This lack of flexibility in the assignment of processing workflows occasionally resulted in uneven resource utilisation and in an increased latency in the delivery of the results to the physics community. The long shutdown of the LHC in 2013-2014 was an opportunity to revisit this mode of operations, disentangling the processing and archive functionalities of the Tier-1 centres. The storage services at the Tier-1s were redeployed, breaking the traditional hierarchical model: each site now provides a large disk storage to host input and output data for processing, and an independent tape storage used exclusively for archiving. Movement of data between the tape and disk endpoints is not automated, but triggered externally through the WLCG transfer management systems. With this new setup, CMS operations actively controls at any time which data is available on disk for processing and which data should be sent to archive. Thanks to the high-bandwidth connectivity guaranteed by the LHCOPN, input data can be freely transferred between disk endpoints as needed to take advantage of free CPU, turning the Tier-1s into a large pool of shared resources. The output data can be validated before being archived permanently, and temporary data formats can be produced without wasting valuable tape resources. Lastly, the data hosted on disk at Tier-1s can now be made available for user analysis as well, since there is no longer a risk of triggering chaotic staging from tape. In this contribution, we describe the technical solutions adopted for the new disk and tape endpoints at the sites, and we report on the commissioning and scale testing of the service. We detail the procedures implemented by CMS computing operations to actively manage data on disk at Tier-1 sites, and we give examples of the benefits brought to CMS workflows by the additional flexibility of the new system.

  13. Tiers of intervention in kindergarten through third grade.

    PubMed

    O'Connor, Rollanda E; Harty, Kristin R; Fulmer, Deborah

    2005-01-01

    This study measured the effects of increasing levels of intervention in reading for a cohort of children in Grades K through 3 to determine whether the severity of reading disability (RD) could be significantly reduced in the catchment schools. Tier 1 consisted of professional development for teachers of reading. The focus of this study is on additional instruction that was provided as early as kindergarten for children whose achievement fell below average. Tier 2 intervention consisted of small-group reading instruction 3 times per week, and Tier 3 of daily instruction delivered individually or in groups of two. A comparison of the reading achievement of third-grade children who were at risk in kindergarten showed moderate to large differences favoring children in the tiered interventions in decoding, word identification, fluency, and reading comprehension.

  14. Effects of artificial hypolimnetic oxygenation in a shallow lake. Part 2: numerical modelling.

    PubMed

    Toffolon, Marco; Serafini, Michele

    2013-01-15

A three-dimensional numerical model is used to simulate the thermal destratification caused by hypolimnetic jets releasing oxygen-rich water for lake restoration. Focussing on the case study described in the companion paper (Toffolon et al., 2013), i.e. the small, relatively shallow Lake Serraia (Trentino, Italy), a simplified sub-grid model is developed within the numerical model to reproduce jet entrainment at reduced computational cost, with the aim of simulating the whole lake dynamics over several weeks. The close agreement between numerical results and available measurements suggests that the model can be used to understand the main effects of hypolimnetic oxygenation in different scenarios. Different options can therefore be evaluated and guidelines proposed for lake management, with the aim of preserving the typical thermal stratification while providing sufficient oxygen mass to proceed with the restoration phase. Copyright © 2012 Elsevier Ltd. All rights reserved.

  15. Radiative temperature measurements at Kupaianaha lava lake, Kilauea Volcano, Hawaii

    NASA Technical Reports Server (NTRS)

    Flynn, Luke P.; Mouginis-Mark, Peter J.; Gradie, Jonathan C.; Lucey, Paul G.

    1993-01-01

    The radiative temperature of the surface of Kupaianaha lava lake is computed using field spectroradiometer data. Observations were made during periods of active overturning. The lake surface exhibits three stages of activity. Magma fountaining and overturning events characterize stage 1, which exhibits the hottest crustal temperatures and the largest fractional hot areas. Rifting events between plates of crust mark stage 2; crustal temperatures in this stage are between 100 C and 340 C, and fractional hot areas are at least an order of magnitude smaller than those in stage 1. Stage 3 is characterized by quiescent periods when the lake is covered by a thick crust. This stage dominates the activity of the lake more than 90 percent of the time. The results of this study are relevant for satellite and airborne measurement of the thermal characteristics of active volcanoes, and indicate that the thermal output of a lava lake varies on a time scale of seconds to minutes.

  16. 12 CFR 23.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

12 Banks and Banking 1 2010-01-01 2010-01-01 false. Section 23.2, Definitions: (a) Affiliate means an affiliate as described in § 23.6. (b) Capital and surplus means: (1) A bank's Tier 1 and Tier 2 capital calculated under the OCC's risk-based capital standards set forth in...

  17. Dynamic partitioning as a way to exploit new computing paradigms: the cloud use case.

    NASA Astrophysics Data System (ADS)

    Ciaschini, Vincenzo; Dal Pra, Stefano; dell'Agnello, Luca

    2015-12-01

The WLCG community and many groups in the HEP community have based their computing strategy on the Grid paradigm, which has proved successful and still meets its goals. However, Grid technology has not spread much beyond these communities; in the commercial world, the cloud paradigm is the emerging way to provide computing services. The WLCG experiments aim to integrate their existing computing model with cloud deployments and to take advantage of so-called opportunistic resources (including HPC facilities), which are usually not Grid compliant. One feature missing from the most common cloud frameworks is the concept of a job scheduler, which plays a key role in a traditional computing centre by enabling fair-share based access to the resources for the experiments in a scenario where demand greatly outstrips availability. At CNAF we are investigating the possibility of accessing the Tier-1 computing resources as an OpenStack based cloud service. The system, exploiting the dynamic partitioning mechanism already used to enable multicore computing, allowed us to avoid a static splitting of the computing resources in the Tier-1 farm while preserving a fair-share approach. The hosts in a dynamically partitioned farm may be moved to or from a partition according to suitable policies for the request and release of computing resources. Hosts requested by a partition switch their role and become available to play a different one. In the cloud use case, a host may switch from acting as a worker node in the batch system farm to a cloud compute node made available to tenants. In this paper we describe the dynamic partitioning concept, its implementation, and its integration with our current batch system, LSF.
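The role-switching idea can be reduced to a toy sketch: hosts move between a batch partition and a cloud partition on request, instead of being statically split. The class and policy below are illustrative only; the real system sits on top of LSF and OpenStack and must drain running jobs before reassigning a host:

```python
from dataclasses import dataclass, field

@dataclass
class Farm:
    """Toy dynamic partitioning: each host plays exactly one role at a time,
    either batch worker node or cloud compute node, and can switch role."""
    batch: set = field(default_factory=set)
    cloud: set = field(default_factory=set)

    def request_cloud(self, n: int) -> int:
        """Hand up to n batch hosts to the cloud partition; returns how many moved."""
        moved = 0
        while moved < n and self.batch:
            host = self.batch.pop()   # a real system would drain running jobs first
            self.cloud.add(host)
            moved += 1
        return moved

    def release_cloud(self, host: str):
        """Return a host to the batch farm when tenants no longer need it."""
        self.cloud.discard(host)
        self.batch.add(host)
```

The request/release policy is where fair-share enforcement would plug in, so that neither the batch share nor the cloud tenants can starve the other.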

  18. Final Tier 2 Environmental Impact Statement for International Space Station

    NASA Technical Reports Server (NTRS)

    1996-01-01

The Final Tier 2 Environmental Impact Statement (EIS) for the International Space Station (ISS) has been prepared by the National Aeronautics and Space Administration (NASA) and follows NASA's Record of Decision on the Final Tier 1 EIS for the Space Station Freedom. The Tier 2 EIS provides an updated evaluation of the environmental impacts associated with the alternatives considered: the Proposed Action and the No-Action alternative. The Proposed Action is to continue U.S. participation in the assembly and operation of ISS. The No-Action alternative would cancel NASA's participation in the Space Station Program. ISS is an international cooperative venture between NASA, the Canadian Space Agency, the European Space Agency, the Science and Technology Agency of Japan, the Russian Space Agency, and the Italian Space Agency. The purpose of the NASA action would be to further develop human presence in space; to meet scientific, technological, and commercial research needs; and to foster international cooperation.

  19. Draft Tier 2 Environmental Impact Statement for International Space Station

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Draft Tier 2 Environmental Impact Statement (EIS) for the International Space Station (ISS) has been prepared by the National Aeronautics and Space Administration (NASA) and follows NASA's Record of Decision on the Final Tier 1 EIS for the Space Station Freedom. The Tier 2 EIS provides an updated evaluation of the environmental impacts associated with the alternatives considered: the Proposed Action and the No-Action alternative. The Proposed Action is to continue U.S. participation in the assembly and operation of ISS. The No-Action alternative would cancel NASA's participation in the Space Station Program. ISS is an international cooperative venture between NASA, the Canadian Space Agency, the European Space Agency, the Science and Technology Agency of Japan, the Russian Space Agency, and the Italian Space Agency. The purpose of the NASA action would be to further develop a human presence in space; to meet scientific, technological, and commercial research needs; and to foster international cooperation.

  20. "The Effect of Alternative Representations of Lake ...

    EPA Pesticide Factsheets

Lakes can play a significant role in regional climate, modulating inland extremes in temperature and enhancing precipitation. Representing these effects becomes more important as regional climate modeling (RCM) efforts focus on simulating smaller scales. When using the Weather Research and Forecasting (WRF) model to downscale future global climate model (GCM) projections into RCM simulations, model users typically must rely on the GCM to represent temperatures at all water points. However, GCMs have insufficient resolution to adequately represent even large inland lakes, such as the Great Lakes. Some interpolation methods, such as setting lake surface temperatures (LSTs) equal to the nearest water point, can result in inland lake temperatures being set from sea surface temperatures (SSTs) that are hundreds of km away. In other cases, a single point is tasked with representing multiple large, heterogeneous lakes. Similar consequences can result from interpolating ice from GCMs to inland lake points, resulting in lakes as large as Lake Superior freezing completely in the space of a single timestep. The use of a computationally efficient inland lake model can improve RCM simulations where the input data are too coarse to adequately represent inland lake temperatures and ice (Gula and Peltier 2012). This study examines three scenarios under which ice and LSTs can be set within the WRF model when applied as an RCM to produce 2-year simulations at 12 km grid spacing.
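The nearest-water-point pitfall described above is easy to demonstrate: without a distance cap, an inland lake point inherits the temperature of whatever resolved water point happens to be closest, even an ocean point hundreds of km away. The sketch below is illustrative (the function, its arguments, and the fallback behaviour are assumptions, not WRF's actual interpolation code):

```python
import math

def nearest_water_temp(lake_pt, gcm_water_pts, max_km=None):
    """Assign a lake point the temperature of the nearest resolved GCM water
    point; without a distance cap this can pull in an SST hundreds of km away.
    lake_pt is (lat, lon); gcm_water_pts is a list of ((lat, lon), temp)."""
    def dist_km(a, b):
        lat1, lon1 = map(math.radians, a)
        lat2, lon2 = map(math.radians, b)
        c = (math.sin(lat1) * math.sin(lat2) +
             math.cos(lat1) * math.cos(lat2) * math.cos(lon1 - lon2))
        return 6371.0 * math.acos(min(1.0, max(-1.0, c)))  # great-circle distance
    pt, temp = min(gcm_water_pts, key=lambda p: dist_km(lake_pt, p[0]))
    if max_km is not None and dist_km(lake_pt, pt) > max_km:
        return None  # too far to be meaningful: defer to an inland lake model
    return temp
```

Returning `None` beyond a cutoff is one way to flag the points that should instead be handled by a computationally efficient inland lake model, as the study proposes.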

  1. On Statistical Approaches for Demonstrating Analytical Similarity in the Presence of Correlation.

    PubMed

    Yang, Harry; Novick, Steven; Burdick, Richard K

Analytical similarity is the foundation for demonstration of biosimilarity between a proposed product and a reference product. For this assessment, the U.S. Food and Drug Administration (FDA) currently recommends a tiered system in which quality attributes are categorized into three tiers commensurate with their risk, and approaches of varying statistical rigor are subsequently used for the three tiers of quality attributes. Key to the analyses of Tier 1 and Tier 2 quality attributes is the establishment of an equivalence acceptance criterion and a quality range. For particular licensure applications, the FDA has provided advice on statistical methods for demonstration of analytical similarity. For example, for Tier 1 assessment, an equivalence test can be used based on an equivalence margin of 1.5σ_R, where σ_R is the reference product variability estimated by the sample standard deviation S_R from a sample of reference lots. The quality range for demonstrating Tier 2 analytical similarity is of the form X̄_R ± K × σ_R, where the constant K is appropriately justified. To demonstrate Tier 2 analytical similarity, a large percentage (e.g., 90%) of test product lots must fall within the quality range. In this paper, through both theoretical derivations and simulations, we show that when the reference drug product lots are correlated, the sample standard deviation S_R underestimates the true reference product variability σ_R. As a result, substituting S_R for σ_R in the Tier 1 equivalence acceptance criterion and the Tier 2 quality range inappropriately reduces the statistical power and the ability to declare analytical similarity. Also explored is the impact of correlation among drug product lots on Type I error rate and power. Three methods based on generalized pivotal quantities are introduced, and their performance is compared against a two one-sided tests (TOST) approach. Finally, strategies to mitigate the risk of correlation among the reference product lots are discussed. 
A biosimilar is a generic version of the original biological drug product. A key component of a biosimilar development is the demonstration of analytical similarity between the biosimilar and the reference product. Such demonstration relies on application of statistical methods to establish a similarity margin and appropriate test for equivalence between the two products. This paper discusses statistical issues with demonstration of analytical similarity and provides alternate approaches to potentially mitigate these problems. © PDA, Inc. 2016.
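The Tier 1 margin (±1.5 S_R, tested by two one-sided tests) and the Tier 2 quality range (X̄_R ± K × S_R with, e.g., 90% coverage) described above are simple enough to sketch. The implementation below is illustrative only: a normal quantile stands in for the t quantile, and K = 3 with 90% coverage are assumed values, not the FDA's or the authors' exact procedure:

```python
import math
import statistics

def tier1_equivalence(test_lots, ref_lots, margin_mult=1.5, z=1.645):
    """Tier 1: two one-sided tests of the mean difference against a margin
    of margin_mult * S_R (normal quantile used in place of t for brevity)."""
    s_r = statistics.stdev(ref_lots)
    margin = margin_mult * s_r
    diff = statistics.mean(test_lots) - statistics.mean(ref_lots)
    se = math.sqrt(statistics.variance(test_lots) / len(test_lots) +
                   statistics.variance(ref_lots) / len(ref_lots))
    # Equivalence holds if the one-sided bounds both fall inside the margin.
    return (diff - z * se > -margin) and (diff + z * se < margin)

def tier2_quality_range(test_lots, ref_lots, k=3.0, coverage=0.9):
    """Tier 2: require e.g. 90% of test lots inside mean_R +/- k * S_R."""
    m, s = statistics.mean(ref_lots), statistics.stdev(ref_lots)
    lo, hi = m - k * s, m + k * s
    inside = sum(lo <= x <= hi for x in test_lots)
    return inside / len(test_lots) >= coverage
```

The paper's point is that when reference lots are correlated, `statistics.stdev(ref_lots)` is biased low, which shrinks both the Tier 1 margin and the Tier 2 range and so makes the criteria above harder to pass than intended.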

  2. Developing the greatest Blue Economy: Water productivity, fresh water depletion, and virtual water trade in the Great Lakes basin

    NASA Astrophysics Data System (ADS)

    Mayer, A. S.; Ruddell, B. L.; Mubako, S. T.

    2016-12-01

    The Great Lakes basin hosts the world's most abundant surface fresh water reserve. Historically an industrial and natural resource powerhouse, the region has suffered economic stagnation in recent decades. Meanwhile, growing water resource scarcity around the world is creating pressure on water-intensive human activities. This situation creates the potential for the Great Lakes region to sustainably utilize its relative water wealth for economic benefit. We combine economic production and trade datasets with water consumption data and models of surface water depletion in the region. We find that, on average, the current economy does not create significant impacts on surface waters, but there is some risk that unregulated large water uses can create environmental flow impacts if they are developed in the wrong locations. Water uses drawing on deep groundwater or the Great Lakes themselves are unlikely to create a significant depletion, and discharge of groundwater withdrawals to surface waters offsets most surface water depletion. This relative abundance of surface water means that science-based management of large water uses to avoid accidentally creating "hotspots" is likely to be successful in avoiding future impacts, even if water use is significantly increased. Commercial water uses are the most productive, with thermoelectric, mining, and agricultural water uses in the lowest tier of water productivity. Surprisingly for such a water-abundant economy, the region is a net importer of water-derived goods and services. This, combined with the abundance of surface water, suggests that the region's water-based economy has room to grow in the 21st century.

  3. Virtual pools for interactive analysis and software development through an integrated Cloud environment

    NASA Astrophysics Data System (ADS)

    Grandi, C.; Italiano, A.; Salomoni, D.; Calabrese Melcarne, A. K.

    2011-12-01

WNoDeS, an acronym for Worker Nodes on Demand Service, is software developed at CNAF-Tier1, the National Computing Centre of the Italian Institute for Nuclear Physics (INFN) located in Bologna. WNoDeS provides on-demand, integrated access to both Grid and Cloud resources through virtualization technologies. Besides the traditional use of computing resources in batch mode, users need interactive and local access to a number of systems. WNoDeS can dynamically allocate such systems by instantiating Virtual Machines according to user requirements (computing, storage and network resources), through either the Open Cloud Computing Interface API or a web console. Interactive use is usually limited to activities in user space, i.e. where the machine configuration is not modified. In other instances the activity concerns the development and testing of services, and thus implies modification of the system configuration (and, therefore, root access to the resource). The former use case is a simple extension of the WNoDeS approach, where the resource is provided in interactive mode. The latter implies saving the virtual image at the end of each user session so that it can be presented to the user at subsequent requests. This work describes how the LHC experiments at INFN-Bologna are testing and making use of these dynamically created ad-hoc machines via WNoDeS to support flexible, interactive analysis and software development at the INFN Tier-1 Computing Centre.

  4. Use of joint-growth directions and rock textures to infer thermal regimes during solidification of basaltic lava flows

    NASA Astrophysics Data System (ADS)

    Degraff, James M.; Long, Philip E.; Aydin, Atilla

    1989-09-01

    Thermal contraction joints form in the upper and lower solidifying crusts of basaltic lava flows and grow toward the interior as the crusts thicken. Lava flows are thus divided by vertical joints that, by changes in joint spacing and form, define horizontal intraflow layers known as tiers. Entablatures are tiers with joint spacings less than about 40 cm, whereas colonnades have larger joint spacings. We use structural and petrographic methods to infer heat-transfer processes and to constrain environmental conditions that produce these contrasting tiers. Joint-surface morphology indicates overall joint-growth direction and thus identifies the level in a flow where the upper and lower crusts met. Rock texture provides information on relative cooling rates in the tiers of a flow. Lava flows without entablature have textures that develop by relatively slow cooling, and two joint sets that usually meet near their middles, which indicate mostly conductive cooling. Entablature-bearing flows have two main joint sets that meet well below their middles, and textures that indicate fast cooling of entablatures and slow cooling of colonnades. Entablatures always occur in the upper joint sets and sometimes alternate several times with colonnades. Solidification times of entablature-bearing flows, constrained by lower joint-set thicknesses, are much less than those predicted by a purely conductive cooling model. These results are best explained by a cooling model based on conductive heat transfer near a flow base and water-steam convection in the upper part of an entablature-bearing flow. Calculated solidification rates in the upper parts of such flows exceed that of the upper crust of Kilauea Iki lava lake, where water-steam convection is documented. Use of the solidification rates in an available model of water-steam convection yields permeability values that agree with measured values for fractured crystalline rock. 
We conclude, therefore, that an entablature forms when part of a flow cools very rapidly by water-steam convection. Flooding of the flow top by surface drainage most likely induces the convection. Colonnades form under conditions of slower cooling by conductive heat transfer in the absence of water.
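The purely conductive cooling model the authors compare against follows the classic Stefan-problem scaling, crust thickness L = 2λ√(κt). A rough sketch (the diffusivity κ and Stefan constant λ below are assumed order-of-magnitude values for basalt, not the paper's fitted numbers) shows why observed solidification times that fall well short of this estimate point to convective cooling:

```python
import math

def conductive_crust_time_s(thickness_m, kappa=7e-7, lam=0.9):
    """Invert the Stefan scaling L = 2*lam*sqrt(kappa*t) to estimate how long
    conduction alone takes to grow a crust of the given thickness (seconds).
    kappa: thermal diffusivity (m^2/s); lam: dimensionless Stefan constant.
    Both are assumed, typical-of-basalt values."""
    return (thickness_m / (2.0 * lam * math.sqrt(kappa))) ** 2

# A 10 m crust implies on the order of a year of purely conductive cooling;
# water-steam convection in an entablature shortens this dramatically.
years_for_10_m = conductive_crust_time_s(10.0) / 3.156e7
```

Because the time grows as the square of the thickness, even modest convective enhancement of the upper crust's cooling rate moves the meeting level of the two joint sets well below a flow's middle, as observed in entablature-bearing flows.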

  5. CON4EI: Development of testing strategies for hazard identification and labelling for serious eye damage and eye irritation of chemicals.

    PubMed

    Adriaens, E; Verstraelen, S; Alépée, N; Kandarova, H; Drzewiecka, A; Gruszka, K; Guest, R; Willoughby, J A; Van Rompay, A R

    2018-06-01

Assessment of acute eye irritation potential is part of the international regulatory requirements for safety testing of chemicals. In recent decades, many efforts have been made in the search for alternative methods to replace the regulatory in vivo Draize rabbit eye test (OECD TG 405); complete replacement has, however, not yet been achieved. The main objective of the CEFIC-LRI-AIMT6-VITO CON4EI (CONsortium for in vitro Eye Irritation testing strategy) project was to develop tiered testing strategies for serious eye damage and eye irritation assessment that can lead to complete replacement of OECD TG 405. A set of 80 reference chemicals (e.g. balanced by important drivers of classification and physical state) was tested with seven test methods. Based on the results of this project, three different strategies are suggested: a standalone strategy (EpiOcular ET-50), a two-tiered strategy, and a three-tiered strategy, which can be used to distinguish between Cat 1 and Cat 2 chemicals and chemicals that do not require classification (No Cat). The two-tiered and three-tiered strategies use an RhCE test method (EpiOcular EIT or SkinEthic™ EIT) at the bottom (identification of No Cat) in combination with the BCOP LLBO (two-tiered strategy) or the BCOP OP-KIT and SMI (three-tiered strategy) at the top (identification of Cat 1). With the proposed strategies, 71.1%–82.9% of Cat 1, 64.2%–68.5% of Cat 2 and ≥80% of No Cat chemicals were correctly identified. Similar results were obtained for the Top-Down and Bottom-Up approaches. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. 5 CFR 919.445 - What action may I take if a primary tier participant knowingly does business with an excluded or...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

5 Administrative Personnel 2 2011-01-01 2011-01-01 false. Section 919.445: What action may I take if a primary tier participant knowingly does business with an excluded or disqualified person? ... person, you as an agency official may refer the matter for suspension and debarment consideration. You...

  7. Computational fluid dynamics simulations of the Late Pleistocene Lake Bonneville flood

    USGS Publications Warehouse

    Abril-Hernández, José M.; Periáñez, Raúl; O'Connor, Jim E.; Garcia-Castellanos, Daniel

    2018-01-01

    At approximately 18.0 ka, pluvial Lake Bonneville reached its maximum level. At its northeastern extent it was impounded by alluvium of the Marsh Creek Fan, which breached at some point north of Red Rock Pass (Idaho), leading to one of the largest floods on Earth. About 5320 km3 of water was discharged into the Snake River drainage and ultimately into the Columbia River. We use a 0D model and a 2D non-linear depth-averaged hydrodynamic model to aid understanding of outflow dynamics, specifically evaluating controls on the amount of water exiting the Lake Bonneville basin exerted by the Red Rock Pass outlet lithology and geometry as well as those imposed by the internal lake geometry of the Bonneville basin. These models are based on field evidence of prominent lake levels, hypsometry and terrain elevations corrected for post-flood isostatic deformation of the lake basin, as well as reconstructions of the topography at the outlet for both the initial and final stages of the flood. Internal flow dynamics in the northern Lake Bonneville basin during the flood were affected by the narrow passages separating the Cache Valley from the main body of Lake Bonneville. This constriction imposed a water-level drop of up to 2.7 m at the time of peak-flow conditions and likely reduced the peak discharge at the lake outlet by about 6%. The modeled peak outlet flow is 0.85·106 m3 s−1. Energy balance calculations give an estimate for the erodibility coefficient for the alluvial Marsh Creek divide of ∼0.005 m y−1 Pa−1.5, at least two orders of magnitude greater than for the underlying bedrock at the outlet. 
Computing quasi-steady-state water flows, water elevations, currents, and shear stresses as a function of the water-level drop in the lake and for the sequential stages of erosion in the outlet yields estimates of the incision rates and of the outflow hydrograph during the Bonneville Flood: about 18 days would have been required for the outflow to grow from 10% to 100% of its peak value. At the time of peak flow, about 10% of the lake volume would already have exited, eroding about 1 km³ of alluvium from the outlet, and the lake level would have dropped by about 10.6 m.
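The lumped outflow reasoning in this abstract, a lake water balance coupled to progressive incision of the alluvial divide, can be sketched as a toy 0D model. Everything below is an invented illustration: the weir coefficient, constant lake area, erodibility, and elevations are placeholders, not the study's calibrated reconstruction.

```python
# Toy 0D sketch: dV/dt = -Q over an eroding outlet sill.
# All parameter values are illustrative assumptions, not the paper's.

def weir_outflow(head, width, c=1.7):
    """Broad-crested-weir discharge (m^3/s) for head (m) above the sill."""
    return c * width * max(head, 0.0) ** 1.5

def simulate(days, lake_level=1552.0, sill=1545.0, bedrock=1440.0,
             area=5.0e10, width=400.0, k_erode=1.0e-9, dt=600.0):
    """Explicit-Euler march of the lake water balance plus a stream-power-like
    incision law dz/dt = -k * Q * head; erosion stops at resistant bedrock."""
    hydrograph = []
    for _ in range(int(days * 86400 / dt)):
        head = lake_level - sill
        q = weir_outflow(head, width)
        lake_level -= q * dt / area                          # water balance
        sill = max(bedrock, sill - k_erode * q * head * dt)  # outlet incision
        hydrograph.append(q)
    return hydrograph

flows = simulate(30)
```

The hydrograph rises while the sill erodes down toward resistant bedrock, then declines as the remaining head drains, qualitatively echoing the rise-then-decline behavior described in the abstract.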

  8. 40 CFR 86.1860-04 - How to comply with the Tier 2 and interim non-Tier 2 fleet average NOX standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... HIGHWAY VEHICLES AND ENGINES (CONTINUED) General Compliance Provisions for Control of Air Pollution From New and In-Use Light-Duty Vehicles, Light-Duty Trucks, and Complete Otto-Cycle Heavy-Duty Vehicles...

  9. Structural Constraints of Vaccine-Induced Tier-2 Autologous HIV Neutralizing Antibodies Targeting the Receptor-Binding Site.

    PubMed

    Bradley, Todd; Fera, Daniela; Bhiman, Jinal; Eslamizar, Leila; Lu, Xiaozhi; Anasti, Kara; Zhang, Ruijung; Sutherland, Laura L; Scearce, Richard M; Bowman, Cindy M; Stolarchuk, Christina; Lloyd, Krissey E; Parks, Robert; Eaton, Amanda; Foulger, Andrew; Nie, Xiaoyan; Karim, Salim S Abdool; Barnett, Susan; Kelsoe, Garnett; Kepler, Thomas B; Alam, S Munir; Montefiori, David C; Moody, M Anthony; Liao, Hua-Xin; Morris, Lynn; Santra, Sampa; Harrison, Stephen C; Haynes, Barton F

    2016-01-05

    Antibodies that neutralize autologous transmitted/founder (TF) HIV occur in most HIV-infected individuals and can evolve to neutralization breadth. Autologous neutralizing antibodies (nAbs) against neutralization-resistant (Tier-2) viruses are rarely induced by vaccination. Whereas broadly neutralizing antibody (bnAb)-HIV-Envelope structures have been defined, the structures of autologous nAbs have not. Here, we show that immunization with TF mutant Envs gp140 oligomers induced high-titer, V5-dependent plasma neutralization for a Tier-2 autologous TF evolved mutant virus. Structural analysis of autologous nAb DH427 revealed binding to V5, demonstrating the source of narrow nAb specificity and explaining the failure to acquire breadth. Thus, oligomeric TF Envs can elicit autologous nAbs to Tier-2 HIVs, but induction of bnAbs will require targeting of precursors of B cell lineages that can mature to heterologous neutralization. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  10. Tier 2 Reading Interventions, K-2nd Grade Practices and Processes

    ERIC Educational Resources Information Center

    Allison, Tamara Alice

    2016-01-01

Due to variation that exists in providing Tier 2 reading intervention instruction, the purpose of the study was to identify processes and instructional strategies currently being utilized by K-2 teachers of the Gallup, New Mexico, elementary schools. Seventeen teachers from nine of the 10 elementary schools participated in the study. A survey instrument was…

  11. 77 FR 71371 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Redesignation of the Ohio...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-11-30

    ....5 , NO X and sulfur dioxide (SO 2 ) emissions inventories as satisfying the requirement in section... control measures include the following. Tier 2 Emission Standards for Vehicles and Gasoline Sulfur... vehicles replace older vehicles. The Tier 2 standards also reduced the sulfur content of gasoline to 30...

  12. Pervasive brain monitoring and data sharing based on multi-tier distributed computing and linked data technology

    PubMed Central

    Zao, John K.; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping

    2014-01-01

EEG-based brain-computer interfaces (BCIs) face fundamental challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems capable of making reliable real-time predictions of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication and distributed computing technologies offer promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implemented a pilot system employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers, and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March 2013 and then running a multi-player on-line EEG-BCI game in September 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring, and with the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capabilities to our system. PMID:24917804
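The three-tier dataflow the abstract describes (front-end headset, near-end Fog server, far-end Cloud server) can be caricatured as a pipeline of stages. The sketch below collapses the tiers into in-process functions; the stage behavior, feature choice, and threshold are invented placeholders, since the real system streams over wireless links and uses adaptive classifiers.

```python
# Toy sketch of a three-tier (edge -> fog -> cloud) processing pipeline.
# Stage logic and the 0.5 threshold are invented for illustration.
from statistics import mean

def edge_acquire(raw_samples, chunk=4):
    """Front-end tier: package raw EEG samples into fixed-size chunks."""
    return [raw_samples[i:i + chunk] for i in range(0, len(raw_samples), chunk)]

def fog_extract(chunks):
    """Near-end Fog tier: reduce each chunk to a lightweight feature
    (here simply the mean amplitude) before shipping it upstream."""
    return [mean(c) for c in chunks]

def cloud_classify(features, threshold=0.5):
    """Far-end Cloud tier: apply the (placeholder) cognitive-state model."""
    return ["alert" if f > threshold else "drowsy" for f in features]

raw = [0.2, 0.4, 0.9, 0.7, 0.1, 0.2, 0.3, 0.2]
states = cloud_classify(fog_extract(edge_acquire(raw)))
```

The design point the tiers illustrate is data reduction close to the sensor: the fog stage ships one feature per chunk upstream rather than every raw sample.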

  13. Pervasive brain monitoring and data sharing based on multi-tier distributed computing and linked data technology.

    PubMed

    Zao, John K; Gan, Tchin-Tze; You, Chun-Kai; Chung, Cheng-En; Wang, Yu-Te; Rodríguez Méndez, Sergio José; Mullen, Tim; Yu, Chieh; Kothe, Christian; Hsiao, Ching-Teng; Chu, San-Liang; Shieh, Ce-Kuen; Jung, Tzyy-Ping

    2014-01-01

EEG-based brain-computer interfaces (BCIs) face fundamental challenges in real-world applications. The technical difficulties in developing truly wearable BCI systems capable of making reliable real-time predictions of users' cognitive states in dynamic real-life situations may seem almost insurmountable at times. Fortunately, recent advances in miniature sensors, wireless communication and distributed computing technologies offer promising ways to bridge these chasms. In this paper, we report an attempt to develop a pervasive on-line EEG-BCI system using state-of-the-art technologies including multi-tier Fog and Cloud Computing, semantic Linked Data search, and adaptive prediction/classification models. To verify our approach, we implemented a pilot system employing wireless dry-electrode EEG headsets and MEMS motion sensors as the front-end devices, Android mobile phones as the personal user interfaces, compact personal computers as the near-end Fog Servers, and the computer clusters hosted by the Taiwan National Center for High-performance Computing (NCHC) as the far-end Cloud Servers. We succeeded in conducting synchronous multi-modal global data streaming in March 2013 and then running a multi-player on-line EEG-BCI game in September 2013. We are currently working with the ARL Translational Neuroscience Branch to use our system in real-life personal stress monitoring, and with the UCSD Movement Disorder Center to conduct in-home Parkinson's disease patient monitoring experiments. We shall proceed to develop the necessary BCI ontology and introduce automatic semantic annotation and progressive model refinement capabilities to our system.

  14. Critical Thinking Traits of Top-Tier Experts and Implications for Computer Science Education

    DTIC Science & Technology

    2007-08-01

field of cognitive theory," [Papert 1999] used his work while developing the Logo programming language. Although other researchers had developed ...of computer expert systems influenced the development of current theories dealing with cognitive abilities. One of the most important initiatives by...multitude of factors involved. He also builds on the cognitive development work of Piaget and is not ready to abandon the generalist approach. Instead, he

  15. Expansion of Enterprise Requirements and Acquisition Model

    DTIC Science & Technology

    2012-06-04

upgrades in technology that made it more lethal with a smaller force. Computer technology, GPS, and stealth are just a few examples that allowed...The facility consists of banks of networked computers and large displays, all built around a centralized workspace. It can be seen in Figure 3. The...first was to meet a gap in UHF satellite communications for the Navy. This was satisfied as a Tier-1 program by purchasing additional bandwidth

  16. 40 CFR 1042.245 - Deterioration factors.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to meet the Tier 1 and Tier 2 emission standards would qualify as established technology. We must... deterioration factors for Category 1 and Category 2 engines, either with an engineering analysis, with pre... deterioration factors for an engine family with established technology based on engineering analysis instead of...

  17. 40 CFR 1042.245 - Deterioration factors.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... to meet the Tier 1 and Tier 2 emission standards would qualify as established technology. We must... deterioration factors for Category 1 and Category 2 engines, either with an engineering analysis, with pre... deterioration factors for an engine family with established technology based on engineering analysis instead of...

  18. 40 CFR 1042.245 - Deterioration factors.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to meet the Tier 1 and Tier 2 emission standards would qualify as established technology. We must... deterioration factors for Category 1 and Category 2 engines, either with an engineering analysis, with pre... deterioration factors for an engine family with established technology based on engineering analysis instead of...

  19. 40 CFR 1042.245 - Deterioration factors.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... to meet the Tier 1 and Tier 2 emission standards would qualify as established technology. We must... deterioration factors for Category 1 and Category 2 engines, either with an engineering analysis, with pre... deterioration factors for an engine family with established technology based on engineering analysis instead of...

  20. 40 CFR 1042.245 - Deterioration factors.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... to meet the Tier 1 and Tier 2 emission standards would qualify as established technology. We must... deterioration factors for Category 1 and Category 2 engines, either with an engineering analysis, with pre... deterioration factors for an engine family with established technology based on engineering analysis instead of...

  1. Progress in landslide susceptibility mapping over Europe using Tier-based approaches

    NASA Astrophysics Data System (ADS)

    Günther, Andreas; Hervás, Javier; Reichenbach, Paola; Malet, Jean-Philippe

    2010-05-01

The European Thematic Strategy for Soil Protection aims, among other objectives, to ensure a sustainable use of soil. The legal instrument of the strategy, the proposed Framework Directive, suggests identifying priority areas of several soil threats including landslides using a coherent and compatible approach based on the use of common thematic data. In a first stage, this can be achieved through landslide susceptibility mapping using geographically nested, multi-step tiered approaches, where areas identified as highly susceptible by a first, synoptic-scale Tier ("Tier 1") can then be further assessed and mapped at larger scale by successive Tiers. In order to identify areas prone to landslides at European scale ("Tier 1"), a number of thematic terrain and environmental data sets already available for the whole of Europe can be used as input for a continental scale susceptibility model. However, since no coherent landslide inventory data is available at the moment over the whole continent, qualitative heuristic zonation approaches are proposed. For "Tier 1" a preliminary, simplified model has been developed. It consists of an equal-weight combination of a reduced, continent-wide common dataset of landslide conditioning factors including soil parent material, slope angle and land cover, to derive a landslide susceptibility index using raster mapping units consisting of 1 × 1 km pixels. A preliminary Europe-wide susceptibility map has thus been produced at 1:1 million scale, since this is compatible with the scale of the input datasets. The map has been validated by means of a ratio of effectiveness using samples from landslide inventories in Italy, Austria, Hungary, and the United Kingdom. Although not differentiated for specific geomorphological environments or specific landslide types, the experimental model reveals a relatively good performance in many European regions at a 1:1 million scale.
An additional "Tier 1" susceptibility map at the same scale and using the same or equivalent thematic data as for the one above has been generated for six French departments using a heuristic, weighting-based multi-criteria evaluation model applied also to raster-cell mapping units. In this experiment, thematic data class weights have been differentiated for two stratification areas, namely mountains and plains, and four main landslide types. Separate susceptibility maps for each landslide type and a combined map for all types have been produced. Results have been validated using BRGM's BDMvT landslide inventory. Unlike "Tier 1", "Tier 2" assessment requires landslide inventory data and additional thematic data on conditioning factors which may not be available for all European countries. For the "Tier 2", a nation-wide quantitative landslide susceptibility assessment has been performed for Italy by applying a statistical model. In this assessment, multivariate analysis was applied using bedrock, soil and climate data together with a number of derivatives from SRTM90 DEM. In addition, separate datasets from a historical landslide inventory were used for model training and validation respectively. The mapping units selected were based on administrative boundaries (municipalities). The performance of this nation-wide, quantitative susceptibility assessment has been evaluated using multi-temporal landslide inventory data. Finally, model limitations for "Tier 1" are discussed, and recommendations for enhanced Tier 1 and Tier 2 models including additional thematic data for conditioning factors are drawn. This project is part of the collaborative research carried out within the European Landslide Expert Group coordinated by JRC in support to the EU Soil Thematic Strategy. It is also supported by the International Programme on Landslides of the International Consortium on Landslides.
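The heuristic, equal-weight raster combination behind such a "Tier 1" index can be sketched in a few lines. The class-score tables, the slope normalization, and the toy 2 × 2 raster below are invented for illustration; they are not the weights or datasets used in the European assessment.

```python
# Illustrative "Tier 1"-style heuristic susceptibility index: reclassify each
# conditioning factor to a 0-1 score per raster cell, then combine with equal
# weights. All class scores here are invented examples.
import numpy as np

def susceptibility_index(parent_material, slope_deg, land_cover,
                         material_score, cover_score):
    """Equal-weight combination of three factor layers (2-D arrays of
    identical shape) into a susceptibility index in [0, 1]."""
    m = np.vectorize(material_score.get)(parent_material)  # lithology score
    s = np.clip(slope_deg / 45.0, 0.0, 1.0)                # steeper = more prone
    c = np.vectorize(cover_score.get)(land_cover)          # land-cover score
    return (m + s + c) / 3.0                               # equal weights

# Hypothetical class-score tables and a 2x2-cell toy raster:
material_score = {1: 0.2, 2: 0.8}   # 1 = hard rock, 2 = weak sediments
cover_score = {10: 0.1, 20: 0.7}    # 10 = forest, 20 = bare ground
pm = np.array([[1, 2], [2, 2]])
sl = np.array([[5.0, 30.0], [45.0, 60.0]])
lc = np.array([[10, 20], [10, 20]])
idx = susceptibility_index(pm, sl, lc, material_score, cover_score)
```

Working on raster cells (here 1 × 1 km pixels in the abstract's terms) keeps the synoptic-scale model simple; the weighting-based multi-criteria "Tier 1" variant for the French departments differs mainly in using differentiated rather than equal weights.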

  2. Achieving Potent Autologous Neutralizing Antibody Responses against Tier 2 HIV-1 Viruses by Strategic Selection of Envelope Immunogens.

    PubMed

    Hessell, Ann J; Malherbe, Delphine C; Pissani, Franco; McBurney, Sean; Krebs, Shelly J; Gomes, Michelle; Pandey, Shilpi; Sutton, William F; Burwitz, Benjamin J; Gray, Matthew; Robins, Harlan; Park, Byung S; Sacha, Jonah B; LaBranche, Celia C; Fuller, Deborah H; Montefiori, David C; Stamatatos, Leonidas; Sather, D Noah; Haigwood, Nancy L

    2016-04-01

    Advancement in immunogen selection and vaccine design that will rapidly elicit a protective Ab response is considered critical for HIV vaccine protective efficacy. Vaccine-elicited Ab responses must therefore have the capacity to prevent infection by neutralization-resistant phenotypes of transmitted/founder (T/F) viruses that establish infection in humans. Most vaccine candidates to date have been ineffective at generating Abs that neutralize T/F or early variants. In this study, we report that coimmunizing rhesus macaques with HIV-1 gp160 DNA and gp140 trimeric protein selected from native envelope gene sequences (envs) induced neutralizing Abs against Tier 2 autologous viruses expressing cognate envelope (Env). The Env immunogens were selected from envs emerging during the earliest stages of neutralization breadth developing within the first 2 years of infection in two clade B-infected human subjects. Moreover, the IgG responses in macaques emulated the targeting to specific regions of Env known to be associated with autologous and heterologous neutralizing Abs developed within the human subjects. Furthermore, we measured increasing affinity of macaque polyclonal IgG responses over the course of the immunization regimen that correlated with Tier 1 neutralization. In addition, we report firm correlations between Tier 2 autologous neutralization and Tier 1 heterologous neutralization, as well as overall TZM-bl breadth scores. Additionally, the activation of Env-specific follicular helper CD4 T cells in lymphocytes isolated from inguinal lymph nodes of vaccinated macaques correlated with Tier 2 autologous neutralization. These results demonstrate the potential for native Env derived from subjects at the time of neutralization broadening as effective HIV vaccine elements. Copyright © 2016 by The American Association of Immunologists, Inc.

  3. Cleavage-Independent HIV-1 Trimers From CHO Cell Lines Elicit Robust Autologous Tier 2 Neutralizing Antibodies

    PubMed Central

    Bale, Shridhar; Martiné, Alexandra; Wilson, Richard; Behrens, Anna-Janina; Le Fourn, Valérie; de Val, Natalia; Sharma, Shailendra K.; Tran, Karen; Torres, Jonathan L.; Girod, Pierre-Alain; Ward, Andrew B.; Crispin, Max; Wyatt, Richard T.

    2018-01-01

Native flexibly linked (NFL) HIV-1 envelope glycoprotein (Env) trimers are cleavage-independent and display a native-like, well-folded conformation that preferentially displays broadly neutralizing determinants. The NFL platform simplifies large-scale production of Env by eliminating the need to co-transfect the precursor-cleaving protease furin, which is required by the cleavage-dependent SOSIP trimers. Here, we report the development of a CHO-M cell line that expressed BG505 NFL trimers at a high level of homogeneity and yields of ~1.8 g/L. BG505 NFL trimers purified by single-step lectin-affinity chromatography displayed a native-like closed structure, efficient recognition by trimer-preferring bNAbs, no recognition by non-neutralizing CD4 binding site-directed and V3-directed antibodies, long-term stability, and proper N-glycan processing. Following negative selection, formulation in ISCOMATRIX adjuvant and inoculation into rabbits, the trimers rapidly elicited potent autologous tier 2 neutralizing antibodies. These antibodies targeted the N-glycan "hole" naturally present on the BG505 Env proximal to residues at positions 230, 241, and 289. The BG505 NFL trimers, which did not expose V3 in vitro, elicited low-to-no tier 1 virus neutralization in vivo, indicating that they remained intact during the immunization process, not exposing V3. In addition, BG505 NFL and BG505 SOSIP trimers expressed from 293F cells, when formulated in Adjuplex adjuvant, elicited equivalent BG505 tier 2 autologous neutralizing titers. These titers were lower in potency than those elicited by the CHO-M cell-derived trimers. In addition, increased neutralization of tier 1 viruses was detected. Taken together, these data indicate that both adjuvant and cell-type expression can affect the elicitation of tier 2 and tier 1 neutralizing responses in vivo.

  4. A sediment resuspension and water quality model of Lake Okeechobee

    USGS Publications Warehouse

    James, R.T.; Martin, J.; Wool, T.; Wang, P.-F.

    1997-01-01

The influence of sediment resuspension on the water quality of shallow lakes is well documented. However, a search of the literature reveals no deterministic mass-balance eutrophication models that explicitly include resuspension. We modified the Lake Okeechobee water quality model - which uses the Water Analysis Simulation Package (WASP) to simulate algal dynamics and phosphorus, nitrogen, and oxygen cycles - to include inorganic suspended solids and algorithms that: (1) define changes in depth with changes in volume; (2) compute sediment resuspension based on bottom shear stress; (3) compute partition coefficients for ammonia and ortho-phosphorus to solids; and (4) relate light attenuation to solids concentrations. The model calibration and validation were successful with the exception of dissolved inorganic nitrogen species, which did not correspond well to observed data in the validation phase. This could be attributed to an inaccurate formulation of algal nitrogen preference and/or the absence of nitrogen fixation in the model. The model correctly predicted that the lake is light-limited by resuspended solids, and that algae are primarily nitrogen limited. The model simulation suggested that biological fluxes greatly exceed external loads of dissolved nutrients, and that sediment-water interactions of organic nitrogen and phosphorus far exceed external loads. A sensitivity analysis demonstrated that parameters affecting resuspension, settling, sediment nutrient and solids concentrations, mineralization, algal productivity, and algal stoichiometry are factors requiring further study to improve our understanding of the Lake Okeechobee ecosystem.
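Algorithm (2) above, resuspension driven by bottom shear stress, is commonly written as an excess-shear-stress law. The sketch below shows that generic form; the quadratic friction law, the critical stress, and the erosion-rate constant are typical illustrative magnitudes, not the calibrated formulation or coefficients from this paper.

```python
# Generic excess-shear-stress resuspension term, of the kind a WASP
# modification might add; all coefficients here are illustrative.

def bottom_shear_stress(u_orbital, rho=1000.0, friction_factor=0.03):
    """Bottom shear stress (Pa) from near-bed wave orbital velocity (m/s),
    using a quadratic friction law: tau = 0.5 * rho * f_w * u^2."""
    return 0.5 * rho * friction_factor * u_orbital ** 2

def resuspension_flux(tau, tau_critical=0.1, erosion_rate=5e-5):
    """Erosion flux (kg m^-2 s^-1): zero below the critical shear stress,
    linear in the excess stress ratio above it."""
    if tau <= tau_critical:
        return 0.0
    return erosion_rate * (tau / tau_critical - 1.0)

flux_calm = resuspension_flux(bottom_shear_stress(0.05))   # below threshold
flux_storm = resuspension_flux(bottom_shear_stress(0.6))   # wind event
```

The threshold behavior is the point: in a shallow, wind-exposed lake like Okeechobee, solids stay on the bed under calm conditions and re-enter the water column, raising light attenuation, only when wind waves push the bottom stress past the critical value.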

  5. An introduction to indigenous health and culture: the first tier of the Three Tiered Plan.

    PubMed

    Sinnott, M J; Wittmann, B

    2001-06-01

    The objective of the present study was to prepare new doctors with an awareness of cultural and health issues to facilitate positive experiences with indigenous patients. The study incorporated the 1998 intern orientation programs in Queensland public hospitals. The study method included tier one of the Three Tiered Plan, which was implemented and audited. Indigenous liaison officers, directors of clinical training and medical education officers were surveyed prior to this implementation to determine whether any or similar initiatives had been carried out in previous years and/or were planned. Post-implementation feedback from interns was obtained by using questionnaires. Follow-up telephone interviews with the directors of clinical training, medical education officers and indigenous hospital liaison officers detailed the format and content of tier one at each hospital. The results indicate that this active intervention improved the implementation rate of tier one from nine of 19 (47%) Queensland public hospitals in 1997 to 17 (90%) in 1998. The 14 indigenous hospital liaison officers (100%) involved in the intervention perceived it as beneficial. Forty-three (67%) of interns who responded to the survey indicated they had encountered an indigenous patient within the last 2-4 months. The level of knowledge of indigenous health and culture self-reported by interns was between the categories 'enough to get by' and 'inadequate'. In conclusion, it appears that tier one has been successful and is to be a formal component of intern orientations in Queensland public hospitals. Further initiatives in indigenous health and culture targeting medical staff (i.e. tier two and tier three), are needed.

  6. Rankings matter: nurse graduates from higher-ranked institutions have higher productivity.

    PubMed

    Yakusheva, Olga; Weiss, Marianne

    2017-02-13

Increasing demand for baccalaureate-prepared nurses has led to rapid growth in the number of baccalaureate-granting programs, and to concerns about educational quality and potential effects on productivity of the graduating nursing workforce. We examined the association of individual productivity of a baccalaureate-prepared nurse with the ranking of the degree-granting institution. For a sample of 691 nurses from general medical-surgical units at a large magnet urban hospital between 6/1/2011 and 12/31/2011, we conducted multivariate regression analysis of nurse productivity on the ranking of the degree-granting institution, adjusted for age, hospital tenure, gender, and unit-specific effects. Nurse productivity was coded as "top"/"average"/"bottom" based on a computation of individual nurse value-added to patient outcomes. Ranking of the baccalaureate-granting institution was derived from the US News and World Report Best Colleges Rankings' categorization of the nurse's institution as the "first tier" or the "second tier", with diploma or associate degree as the reference category. Relative to diploma or associate degree nurses, nurses who had attended first-tier universities had three times the odds of being in the top productivity category (OR = 3.18, p < 0.001), while second-tier education had a non-significant association with productivity (OR = 1.73, p = 0.11). Being in the bottom productivity category was not associated with having a baccalaureate degree or the quality tier. The productivity boost from a nursing baccalaureate degree depends on the quality of the educational institution. Recognizing differences in educational outcomes, initiatives to build a baccalaureate-educated nursing workforce should be accompanied by improved access to high-quality educational institutions.
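The reported effect size (OR = 3.18) can be made concrete with a small worked example of what an odds ratio of about three means. The counts below are invented purely for illustration; the study estimated its odds ratios from an adjusted multivariate regression, not from a raw 2 × 2 table.

```python
# Worked example of an odds ratio near 3: the odds of landing in the "top"
# productivity category are about triple for group A versus group B.
# The counts are hypothetical, not data from the study.

def odds_ratio(top_a, not_top_a, top_b, not_top_b):
    """Unadjusted odds ratio of being 'top' for group A vs. group B."""
    return (top_a / not_top_a) / (top_b / not_top_b)

# Hypothetical counts: first-tier graduates vs. diploma/associate reference
or_first_tier = odds_ratio(top_a=30, not_top_a=70, top_b=12, not_top_b=88)
```

With these invented counts, 30/70 odds against 12/88 odds gives a ratio of about 3.14, i.e. roughly the magnitude the abstract reports for first-tier graduates.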

  7. Exploring Lake Ecology in a Computer-Supported Learning Environment

    ERIC Educational Resources Information Center

    Ergazaki, Marida; Zogza, Vassiliki

    2008-01-01

    This study highlights the computer-mediated discursive activity of two dyads of first year educational sciences students, each collaboratively exploring several options for increasing the equilibrium size of a fish population in a lake. Our focus is on peers' attempts to come up with justified predictions about the adequacy of several options for…

  8. A Spatial-Temporal Comparison of Lake Mendota CO2 Fluxes and Collection Methods

    NASA Astrophysics Data System (ADS)

    Baldocchi, A. K.; Reed, D. E.; Desai, A. R.; Loken, L. C.; Schramm, P.; Stanley, E. H.

    2017-12-01

Monitoring of carbon fluxes at the lake/atmosphere interface can help us determine baselines from which to understand responses in both space and time that may result from our warming climate or increasing nutrient inputs. Since recent research has shown lakes to be hotspots of global carbon cycling, it is important to quantify carbon sink and source dynamics as well as to verify observations between multiple methods in the context of long-term data collection efforts. Here we evaluate a new method for measuring space and time variation in CO2 fluxes based on a novel speedboat-based method for collecting aquatic greenhouse gas concentrations, combined with a flux computation and interpolation algorithm. Two hundred forty-nine consecutive days of spatial flux maps over the 2016 open ice period were compared to ongoing eddy covariance tower flux measurements on the shore of Lake Mendota, Wisconsin, USA, using a flux footprint analysis. Spatial and temporal alignments of the fluxes from these two observational datasets revealed both similar trends from daily to seasonal timescales as well as biases between methods. For example, throughout the spring, carbon fluxes from the two methods were strongly correlated, although they differed by an order of magnitude. Isolating physical patterns of agreement between the two methods of measuring lake/atmosphere CO2 fluxes allows us to pinpoint where biological and physical drivers contribute to the global carbon cycle, helps improve modelling of lakes, and supports the use of lakes as leading indicators of climate change.
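Concentration-based flux estimates like the speedboat method's typically rest on the standard bulk gas-transfer formula, F = k · K0 · (pCO2,water − pCO2,air). The sketch below shows that computation; the gas transfer velocity and solubility values are typical magnitudes, assumed for illustration, not Lake Mendota calibrations, and the abstract's own flux algorithm may differ in detail.

```python
# Bulk air-water CO2 flux: F = k * K0 * (pCO2_water - pCO2_air).
# k and K0 values below are typical magnitudes, not site calibrations.

def co2_flux(pco2_water_uatm, pco2_air_uatm=400.0,
             k_m_per_day=0.5, k0_mol_per_l_atm=0.045):
    """Air-water CO2 flux in mmol m^-2 d^-1 (positive = lake to atmosphere).
    k: gas transfer velocity (m/d); K0: CO2 solubility (mol L^-1 atm^-1)."""
    delta_pco2_atm = (pco2_water_uatm - pco2_air_uatm) * 1e-6
    # K0 in mol L^-1 atm^-1 -> mol m^-3 atm^-1 (x1000); flux in mol m^-2 d^-1
    flux_mol = k_m_per_day * (k0_mol_per_l_atm * 1000.0) * delta_pco2_atm
    return flux_mol * 1000.0   # mol -> mmol

efflux = co2_flux(1000.0)   # supersaturated water: lake emits CO2
uptake = co2_flux(250.0)    # undersaturated water: lake takes up CO2
```

The sign convention makes the comparison in the abstract natural: both the mapped concentrations and the tower measure the same quantity, the net CO2 exchange across the water surface, just from opposite sides of the interface.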

  9. 12 CFR 324.63 - Disclosures by FDIC-supervised institutions described in § 324.61.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., tier 2 capital, tier 1 and total capital ratios, including the regulatory capital elements and all the regulatory adjustments and deductions needed to calculate the numerator of such ratios; (2) Total risk... risk-weighted assets; (3) Regulatory capital ratios during any transition periods, including a...

  10. Testing the Efficacy of a Tier 2 Mathematics Intervention: A Conceptual Replication Study

    ERIC Educational Resources Information Center

    Doabler, Christian T.; Clarke, Ben; Kosty, Derek B.; Kurtz-Nelson, Evangeline; Fien, Hank; Smolkowski, Keith; Baker, Scott K.

    2016-01-01

    The purpose of this closely aligned conceptual replication study was to investigate the efficacy of a Tier 2 kindergarten mathematics intervention. The replication study differed from the initial randomized controlled trial on three important elements: geographical region, timing of the intervention, and instructional context of the…

  11. Examining the Efficacy of a Tier 2 Kindergarten Mathematics Intervention

    ERIC Educational Resources Information Center

    Clarke, Ben; Doabler, Christian T.; Smolkowski, Keith; Baker, Scott K.; Fien, Hank; Cary, Mari Strand

    2016-01-01

    This study examined the efficacy of a Tier 2 kindergarten mathematics intervention program, ROOTS, focused on developing whole number understanding for students at risk in mathematics. A total of 29 classrooms were randomly assigned to treatment (ROOTS) or control (standard district practices) conditions. Measures of mathematics achievement were…

  12. RE-AIM Checklist for Integrating and Sustaining Tier 2 Social-Behavioral Interventions

    ERIC Educational Resources Information Center

    Cheney, Douglas A.; Yong, Minglee

    2014-01-01

    Even though evidence-based Tier 2 programs are now more commonly available, integrating and sustaining these interventions in schools remain challenging. RE-AIM, which stands for Reach, Effectiveness, Adoption, Implementation, and Maintenance, is a public health framework used to maximize the effectiveness of health promotion programs in…

  13. INDUCTION OF SKIN PAPILLOMAS IN THE SENCAR MOUSE AS A TIER 2 CARCINOGENESIS BIOASSAY

    EPA Science Inventory

    The Toxic Substances Control Act mandates the testing of industrial chemicals for which insufficient evidence of safety exists. One of the more critical areas in chemical carcinogenesis testing is a dependable approach to confirmatory tests (tier 2) of identified positives at a s...

  14. Supporting Documentation Used in the Derivation of Selected Freshwater Tier 2 ESBs

    EPA Science Inventory

    Compilation of toxicity data used to derive secondary chronic values (SCVs) and tier 2 equilibrium partitioning sediment benchmarks (ESBs) for a selection of nonionic organic chemicals. The values are used in the following U.S. EPA document: U.S. EPA. 2008. Procedures for th...

  15. Explicit Instructional Interactions: Exploring the Black Box of a Tier 2 Mathematics Intervention

    ERIC Educational Resources Information Center

    Doabler, Christian T.; Clarke, Ben; Stoolmiller, Mike; Kosty, Derek B.; Fien, Hank; Smolkowski, Keith; Baker, Scott K.

    2017-01-01

    A critical aspect of intervention research is investigating the active ingredients that underlie intensive interventions and their theories of change. This study explored the rate of instructional interactions within treatment groups to determine whether they offered explanatory power of an empirically validated Tier 2 kindergarten mathematics…

  16. Predicting Response to Treatment in a Tier 2 Supplemental Vocabulary Intervention

    ERIC Educational Resources Information Center

    Kelley, Elizabeth; Leary, Emily; Goldstein, Howard

    2018-01-01

    Purpose: To effectively implement a response to intervention approach, there is a need for timely and specific information about student learning in response to treatment to ensure that treatment decisions are appropriate. This exploratory study examined responsivity to a supplemental, Tier 2 vocabulary intervention delivered to preschool children…

  17. 26 CFR 1.704-2 - Allocations attributable to nonrecourse liabilities.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... minimum gain. (iii) Carryover to succeeding taxable year. (k) Tiered partnerships. (1) Increase in upper... increase in the upper-tier partnership's minimum gain (under paragraph (k)(1) of this section) attributable... deductions. (2) Definition of and allocations pursuant to a minimum gain chargeback. (3) Definition of...

  18. 26 CFR 1.704-2 - Allocations attributable to nonrecourse liabilities.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... minimum gain. (iii) Carryover to succeeding taxable year. (k) Tiered partnerships. (1) Increase in upper... increase in the upper-tier partnership's minimum gain (under paragraph (k)(1) of this section) attributable... deductions. (2) Definition of and allocations pursuant to a minimum gain chargeback. (3) Definition of...

  19. 78 FR 11869 - Proposed Information Collection Request; Comment Request; Registration of Fuels and Fuel...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-20

    ... biodiesel, water/diesel emulsions, several atypical additives, and renewable gasoline and diesel fuels. Tier... health effects. Tier 2 data have been submitted for baseline diesel, biodiesel, and water/diesel...

  20. Learning To Work Smarter.

    ERIC Educational Resources Information Center

    Baldwin, Fred D.

    2001-01-01

    With support from federal grants and area industry, the Alfred State College of Technology in New York's Southern Tier is training future workers for high-skill manufacturing jobs. The college offers certification and associate's degree programs in welding and machine-tool technology and is developing a training program in computer technology.…

  1. 20 CFR 226.16 - Supplemental annuity.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Supplemental annuity. 226.16 Section 226.16... EMPLOYEE, SPOUSE, AND DIVORCED SPOUSE ANNUITIES Computing an Employee Annuity § 226.16 Supplemental annuity. A supplemental annuity is payable in addition to tiers I and II and the vested dual benefit to an...

  2. ExScal Backbone Network Architecture

    DTIC Science & Technology

    2005-01-01

802.11 battery powered nodes was laid over the sensor network. We adopted the Stargate platform for the backbone tier to serve as the basis for...its head. XSS Hardware and Network: XSS stands for eXtreme Scaling Stargate. A Stargate is a Linux-based single board computer. It has a 400 MHz

  3. A gene expression biomarker accurately predicts estrogen receptor α modulation in a human gene expression compendium

    EPA Science Inventory

    The EPA’s vision for the Endocrine Disruptor Screening Program (EDSP) in the 21st Century (EDSP21) includes utilization of high-throughput screening (HTS) assays coupled with computational modeling to prioritize chemicals with the goal of eventually replacing current Tier 1...

  4. A tiered approach to incorporate exposure and pharmacokinetics considerations in in vitro based safety assessment

    EPA Science Inventory

    Application of in vitro based safety assessment requires reconciling chemical concentrations sufficient to produce bioactivity in vitro with those that trigger a molecular initiating event at the relevant in vivo target site. To address such need, computational tools such as phy...

  5. Agreement of three interpretation systems of intrapartum foetal heart rate monitoring by different levels of physicians.

    PubMed

    Pruksanusak, Ninlapa; Thongphanang, Putthaporn; Chainarong, Natthicha; Suntharasaj, Thitima; Kor-Anantakul, Ounjai; Suwanrath, Chitkasaem; Petpichetchian, Chusana

    2017-11-01

A prospective study was conducted at a centre in Southern Thailand to evaluate agreement in EFM interpretation among physicians of various levels, in order to identify the most practical system for daily use. We found strong agreement on normal FHR tracings among the FIGO, NICHD 3-tier and 5-tier systems. The NICHD 3-tier system was more compatible with the FIGO system than the 5-tier system was. Overall inter-observer agreement was moderate for the NICHD 3-tier system, while inter-observer agreement for the 5-tier system was fair; intra-observer agreement was also higher with the NICHD 3-tier system. The 3-tier systems are therefore more suitable than the 5-tier system in general obstetric practice. Impact statement: What is already known on this subject: The 3-tier and 5-tier systems are widely used in general obstetric practice. What the results of this study add: The inter- and intra-observer agreement of the NICHD 3-tier system was higher than that of the 5-tier system. What the implications are of these findings for clinical practice and/or further research: The 3-tier systems are more suitable than the 5-tier systems in general obstetric practice.
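The inter- and intra-observer agreement described in this abstract is conventionally quantified with Cohen's kappa. As a hedged illustration (the abstract does not publish the raters' raw category assignments, so the tracing categories below are invented), a minimal kappa computation might look like:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of exact agreements
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal category counts
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (observed - expected) / (1 - expected)

# Two physicians assigning NICHD 3-tier categories (I, II, III) to 8 tracings
a = ["I", "I", "II", "II", "III", "I", "II", "I"]
b = ["I", "I", "II", "III", "III", "I", "II", "II"]
kappa = cohens_kappa(a, b)  # ~0.61 for this invented example
```

Values near 0.41-0.60 are usually read as "moderate" and 0.61-0.80 as "substantial" agreement, matching the qualitative labels used in the abstract.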

  6. Council Bluffs interstate system improvements project tier 2, segment 3 Pottawattamie County, Iowa IMN-029-2(55)49--13-78 : environmental assessment and section 4(f) de minimis impact finding.

    DOT National Transportation Integrated Search

    2011-03-14

    This Tier 2 Environmental Assessment (EA) presents the results of studies and analyses : conducted to determine the potential impacts of proposed improvements in Segment 3 of the : Council Bluffs Interstate System (CBIS) in the Council Bluffs metropo...

  7. Validation of the modified 4-tiered categorization system through comparison with the 5-tiered categorization system of the 2015 American Thyroid Association guidelines for classifying small thyroid nodules on ultrasound.

    PubMed

    Lee, Ji Hye; Han, Kyunghwa; Kim, Eun-Kyung; Moon, Hee Jung; Yoon, Jung Hyun; Park, Vivian Y; Kwak, Jin Young

    2017-11-01

The purpose of this study was to validate the modified 4-tiered categorization system and to compare its stratification of malignancy risk in small thyroid nodules with that of the 2015 American Thyroid Association (ATA) management guidelines. From January 2015 to December 2015, 737 thyroid nodules measuring ≥1 cm and <2 cm were included in this study. Each nodule was assigned a category based on the ultrasonographic patterns described by the 2015 ATA guidelines. On univariate analysis, there was no difference in malignancy risk between low suspicion and very low suspicion nodules (P = .584). Therefore, we suggested a modified 4-tiered categorization, which combines very low suspicion and low suspicion nodules into a "revised low suspicion" category. Specificity, positive predictive value (PPV) and accuracy were higher with the modified 4-tiered categorization system (P < .001 for all). The modified 4-tiered categorization system allows more efficient management with better diagnostic performance than the 2015 ATA categorization system in small thyroid nodules. © 2017 Wiley Periodicals, Inc.

  8. Schlumberger soundings near Medicine Lake, California

    USGS Publications Warehouse

    Zohdy, A.A.R.; Bisdorf, R.J.

    1990-01-01

The use of direct current resistivity soundings to explore the geothermal potential of the Medicine Lake area in northern California proved to be challenging because of high contact resistances and winding roads. Deep Schlumberger soundings were made by expanding current electrode spacings along the winding roads. Corrected sounding data were interpreted using an automatic interpretation method. Forty-two maps of interpreted resistivity were calculated for depths extending from 20 to 1000 m. Computer animation of these 42 maps revealed that: 1) certain subtle anomalies migrate laterally with depth and can be traced to their origin, 2) an extensive volume of low-resistivity material underlies the survey area, and 3) the three areas (east of Bullseye Lake, southwest of Glass Mountain, and northwest of Medicine Lake) may be favorable geothermal targets. Six interpreted resistivity maps and three cross-sections illustrate the above findings. -from Authors

  9. Molecular constituents of the extracellular matrix in rat liver mounting a hepatic progenitor cell response for tissue repair

    PubMed Central

    2013-01-01

    Background Tissue repair in the adult mammalian liver occurs in two distinct processes, referred to as the first and second tiers of defense. We undertook to characterize the changes in molecular constituents of the extracellular matrix when hepatic progenitor cells (HPCs) respond in a second tier of defense to liver injury. Results We used transcriptional profiling on rat livers responding by a first tier (surgical removal of 70% of the liver mass (PHx protocol)) and a second tier (70% hepatectomy combined with exposure to 2-acetylaminofluorene (AAF/PHx protocol)) of defense to liver injury and compared the transcriptional signatures in untreated rat liver (control) with those from livers of day 1, day 5 and day 9 post hepatectomy in both protocols. Numerous transcripts encoding specific subunits of collagens, laminins, integrins, and various other extracellular matrix structural components were differentially up- or down-modulated (P < 0.01). The levels of a number of transcripts were significantly up-modulated, mainly in the second tier of defense (Agrn, Bgn, Fbn1, Col4a1, Col8a1, Col9a3, Lama5, Lamb1, Lamb2, Itga4, Igtb2, Itgb4, Itgb6, Nid2), and their signal intensities showed a strong or very strong correlation with Krt1-19, a well-established marker of a ductular/HPC reaction. Furthermore, a significant up-modulation and very strong correlation between the transcriptional profiles of Krt1-19 and St14 encoding matriptase, a component of a novel protease system, was found in the second tier of defense. Real-time PCR confirmed the modulation of St14 transcript levels and strong correlation to Krt-19 and also showed a significant up-modulation and strong correlation to Spint1 encoding HAI-1, a cognate inhibitor of matriptase. Immunodetection and three-dimensional reconstructions showed that laminin, Collagen1a1, agrin and nidogen1 surrounded bile ducts, proliferating cholangiocytes, and HPCs in ductular reactions regardless of the nature of defense. 
Similarly, matriptase and HAI-1 were expressed in cholangiocytes regardless of the tier of defense, but in the second tier of defense, a subpopulation of HPCs in ductular reactions co-expressed HAI-1 and the fetal hepatocyte marker Dlk1. Conclusion Transcriptional profiling and immunodetection, including three-dimensional reconstruction, generated a detailed overview of the extracellular matrix constituents expressed in a second tier of defense to liver injury. PMID:24359594

  10. Environmental Analysis of Lake Pontchartrain, Louisiana, Its Surrounding Wetlands, and Selected Land Uses. Volume 2.

    DTIC Science & Technology

    1980-02-01

CHAPTER 1: PRELIMINARY MODELING OF THE LAKE PONTCHARTRAIN ECOSYSTEM BY COMPUTER SIMULATIONS. James H. Stone and Linda A. Deegan ...related to the extent and productivity of intertidal wetlands (Craig et al. 1979). The role of coastal wetlands in estuarine areas has been well documented...site and a bottomland hardwood site in a Louisiana swamp. Amer. J. Bot. 63(10):1354-1364. Craig, N. J., R. E. Turner, and J. W. Day, Jr. 197

  11. Aquatic insect community of lake, Phulbari anua in Cachar, Assam.

    PubMed

    Gupta, Susmita; Narzary, Rupali

    2013-05-01

An investigation of the water quality and aquatic insect community of an oxbow lake (Phulbari anua) of south Assam, North-East India was carried out during February to April, 2010. The aquatic insect community of the oxbow lake was represented by 9 species belonging to 9 families and 4 orders during the study period. The orders Ephemeroptera and Hemiptera were found to be dominant. The record of 5 species and 5 families from the order Hemiptera showed that this is the largest order in terms of aquatic insect diversity of the lake. Computation of the dominance status of different species of aquatic insects of the lake based on Engelmann's scale revealed that Anisops lundbladiana and Cloeon sp. were eudominant in the system. The Shannon-Wiener diversity index (H') and Shannon evenness (J') values were found to range from 0.3-0.69 and 0.53-0.97, respectively, indicating perturbation of the system. In terms of the physico-chemical properties of the water, the lake is in a satisfactory condition, with all parameters well within the range of IS 10500. The DO values were found to range from 6.8 to 14.8 mgl(-1). Free CO2 fluctuated from 1 to 4.98 mgl(-1) and nitrate in water ranged from 0.4 to 2.1 mgl(-1). Margalef's water quality index values for most of the samplings also indicated a clean water condition of the lake. Correlation coefficient analyses of the environmental variables, aquatic insect diversity and density of the lake revealed that aquatic insect diversity of the lake is mainly governed by dissolved oxygen, nitrate, and free carbon dioxide.
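The diversity and evenness indices cited in this record can be sketched as below. The species counts are hypothetical, and the natural logarithm is used here; the study does not state its log base, which shifts the numeric scale of H':

```python
import math

def shannon_diversity(counts):
    """Shannon-Wiener index H' = -sum(p_i * ln p_i) over species proportions."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

def evenness(counts):
    """Shannon evenness J' = H' / ln(S), where S is the number of species."""
    s = sum(1 for c in counts if c > 0)
    return shannon_diversity(counts) / math.log(s)

# Hypothetical counts for 4 species in one sampling
counts = [40, 30, 20, 10]
h = shannon_diversity(counts)  # higher => more diverse community
j = evenness(counts)           # 1.0 => all species equally abundant
```

J' is bounded in (0, 1], so the 0.53-0.97 range reported in the abstract indicates samplings that vary from moderately uneven to nearly even.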

  12. The rising prevalence of severe poverty in America: a growing threat to public health.

    PubMed

    Woolf, Steven H; Johnson, Robert E; Geiger, H Jack

    2006-10-01

The U.S. poverty rate has increased since 2000, but the depth of poverty experienced by Americans has been inadequately studied. Of particular concern is whether severe poverty is increasing, a trend that would carry important public health implications. Income-to-poverty (I/P) ratios and income deficits/surpluses were examined for the 1990-2004 period. The severely poor, moderately poor, and near-poor were classified as those with I/P ratios of less than 0.5, 0.5 to 1.0, or 1.0 to 2.0, respectively. Income deficits/surpluses were classified relative to the poverty threshold as Tier I (deficit of $8,000 or more), Tier II (deficit or surplus of less than $8,000), or Tier III (surplus of more than $8,000). Odds ratios for severe poverty and Tier I were also calculated. Severe poverty increased between 2000 and 2004: those with I/P ratios of less than 0.5 grew by 20%, and Tier I grew by 45% to 55%, while the prevalence of higher levels of income diminished. The population in severe poverty was over-represented by children (odds ratio [OR] = 1.69, confidence interval [CI] = 1.63-1.75), African Americans (OR = 2.84, CI = 2.74-2.95), and Hispanics (OR = 1.64, CI = 1.58-1.71). From 2000 to 2004, the prevalence of severe poverty increased sharply while the proportion of Americans in higher income tiers diminished. These trends have broad societal implications. Likely health consequences include a higher prevalence of chronic illnesses, more frequent and severe disease complications, and increased demands and costs for healthcare services. Adverse effects on children warrant special concern. The growth in the number of Americans living in poverty calls for the re-examination of policies enacted in recent years to foster economic progress.
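The I/P-ratio and income-tier classifications used in the study reduce to simple threshold rules. A minimal sketch, with boundary handling reflecting my reading of the abstract's cutoffs (the paper may break ties differently):

```python
def poverty_class(income, poverty_threshold):
    """Classify by income-to-poverty (I/P) ratio, per the abstract:
    severe (<0.5), moderate (0.5-1.0), near-poor (1.0-2.0)."""
    ratio = income / poverty_threshold
    if ratio < 0.5:
        return "severely poor"
    elif ratio < 1.0:
        return "moderately poor"
    elif ratio < 2.0:
        return "near-poor"
    return "above 2x poverty"

def income_tier(income, poverty_threshold, cutoff=8000):
    """Tier I: deficit of $8,000 or more; Tier II: deficit or surplus
    of less than $8,000; Tier III: surplus of more than $8,000."""
    surplus = income - poverty_threshold
    if surplus <= -cutoff:
        return "Tier I"
    elif surplus <= cutoff:
        return "Tier II"
    return "Tier III"
```

For example, a family with income $9,000 against a $20,000 threshold has an I/P ratio of 0.45 (severely poor) and an $11,000 deficit (Tier I).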

  13. Simple Objective Detection of Human Lyme Disease Infection Using Immuno-PCR and a Single Recombinant Hybrid Antigen

    PubMed Central

    Halpern, Micah D.; Molins, Claudia R.; Schriefer, Martin

    2014-01-01

    A serology-based tiered approach has, to date, provided the most effective means of laboratory confirmation of clinically suspected cases of Lyme disease, but it lacks sensitivity in the early stages of disease and is often dependent on subjectively scored immunoblots. We recently demonstrated the use of immuno-PCR (iPCR) for detecting Borrelia burgdorferi antibodies in patient serum samples that were positive for Lyme disease. To better understand the performance of the Lyme disease iPCR assay, the repeatability and variability of the background of the assay across samples from a healthy population (n = 36) were analyzed. Both of these parameters were found to have coefficients of variation of <3%. Using eight antigen-specific iPCR assays and positive call thresholds established for each assay, iPCR IgM and/or IgG diagnosis from Lyme disease patient serum samples (n = 12) demonstrated a strong correlation with that of 2-tier testing. Furthermore, a simplified iPCR approach using a single hybrid antigen and detecting only IgG antibodies confirmed the 2-tier diagnosis in the Lyme disease patient serum samples (n = 12). Validation of the hybrid antigen IgG iPCR assay using a blinded panel of Lyme disease and non-Lyme disease patient serum samples (n = 92) resulted in a sensitivity of 69% (95% confidence interval [CI], 50% to 84%), compared to that of the 2-tier analysis at 59% (95% CI, 41% to 76%), and a specificity of 98% (95% CI, 91% to 100%) compared to that of the 2-tier analysis at 97% (95% CI, 88% to 100%). A single-tier hybrid antigen iPCR assay has the potential to be an improved method for detecting host-generated antibodies against B. burgdorferi. PMID:24899074
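The sensitivity and specificity figures with 95% confidence intervals in this record are standard binomial-proportion statistics. A sketch with illustrative counts (back-calculated to approximate the reported 69% sensitivity; these are not the paper's raw data) using the Wilson score interval, one common choice when the CI method is not stated:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% CI for a binomial proportion (an approximation;
    the paper does not state which interval method it used)."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

def sensitivity(tp, fn):
    """True-positive rate among confirmed disease cases."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate among disease-free controls."""
    return tn / (tn + fp)

# Illustrative: 22 of 32 Lyme disease samples called positive -> ~69%
sens = sensitivity(22, 10)
lo, hi = wilson_ci(22, 32)  # roughly (0.51, 0.82) for these counts
```

With these invented counts the interval lands near the 50%-84% range quoted in the abstract, which suggests a validation panel of a few dozen disease-positive samples.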

  14. Crustal structure between Lake Mead, Nevada, and Mono Lake, California

    USGS Publications Warehouse

    Johnson, Lane R.

    1964-01-01

    Interpretation of a reversed seismic-refraction profile between Lake Mead, Nevada, and Mono Lake, California, indicates velocities of 6.15 km/sec for the upper layer of the crust, 7.10 km/sec for an intermediate layer, and 7.80 km/sec for the uppermost mantle. Phases interpreted to be reflections from the top of the intermediate layer and the Mohorovicic discontinuity were used with the refraction data to calculate depths. The depth to the Moho increases from about 30 km near Lake Mead to about 40 km near Mono Lake. Variations in arrival times provide evidence for fairly sharp flexures in the Moho. Offsets in the Moho of 4 km at one point and 2 1/2 km at another correspond to large faults at the surface, and it is suggested that fracture zones in the upper crust may displace the Moho and extend into the upper mantle. The phase P appears to be an extension of the reflection from the top of the intermediate layer beyond the critical angle. Bouguer gravity, computed for the seismic model of the crust, is in good agreement with the measured Bouguer gravity. Thus a model of the crustal structure is presented which is consistent with three semi-independent sources of geophysical data: seismic-refraction, seismic-reflection, and gravity.

  15. Ozone distributions over southern Lake Michigan: comparisons between ferry-based observations, shoreline-based DOAS observations and model forecasts

    NASA Astrophysics Data System (ADS)

    Cleary, P. A.; Fuhrman, N.; Schulz, L.; Schafer, J.; Fillingham, J.; Bootsma, H.; McQueen, J.; Tang, Y.; Langel, T.; McKeen, S.; Williams, E. J.; Brown, S. S.

    2015-05-01

Air quality forecast models typically predict large summertime ozone abundances over water relative to land in the Great Lakes region. While each state bordering Lake Michigan has dedicated monitoring systems, offshore measurements have been sparse, mainly executed through specific short-term campaigns. This study examines ozone abundances over Lake Michigan as measured on the Lake Express ferry, by shoreline differential optical absorption spectroscopy (DOAS) observations in southeastern Wisconsin and as predicted by the Community Multiscale Air Quality (CMAQ) model. From 2008 to 2009 measurements of O3, SO2, NO2 and formaldehyde were made in the summertime by DOAS at a shoreline site in Kenosha, WI. From 2008 to 2010 measurements of ambient ozone were conducted on the Lake Express, a high-speed ferry that travels between Milwaukee, WI, and Muskegon, MI, up to six times daily from spring to fall. Ferry ozone observations over Lake Michigan were an average of 3.8 ppb higher than those measured at the shoreline in Kenosha, with little dependence on the position of the ferry or on temperature, and with the greatest differences during evening and night. Concurrent 1-48 h forecasts from the CMAQ model in the upper Midwestern region surrounding Lake Michigan were compared to ferry ozone measurements, shoreline DOAS measurements and Environmental Protection Agency (EPA) station measurements. The bias of the model O3 forecast was computed and evaluated with respect to ferry-based measurements. Trends in the bias with respect to location and time of day were explored, showing non-uniformity in model bias over the lake. Model ozone bias was consistently high over the lake in comparison to land-based measurements, with the highest biases occurring 25-48 h after initialization.
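The forecast bias evaluated against the ferry measurements is, in its simplest form, the mean of paired model-minus-observation differences over matched times and locations. A minimal sketch with hypothetical O3 values (the study's pairing and averaging scheme may be more elaborate):

```python
def mean_bias(model, obs):
    """Mean forecast bias: average of (model - observation) over matched
    pairs; positive values mean the model overpredicts."""
    pairs = [(m, o) for m, o in zip(model, obs)
             if m is not None and o is not None]  # skip missing data
    return sum(m - o for m, o in pairs) / len(pairs)

# Hypothetical hourly O3 (ppb): CMAQ forecast vs. ferry measurement
model = [55.0, 60.0, 62.0, 58.0]
obs = [48.0, 54.0, 59.0, 55.0]
bias = mean_bias(model, obs)  # 4.75 ppb overprediction in this example
```

Binning such paired differences by ferry position or hour of day is what exposes the spatial and diurnal non-uniformity the abstract describes.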

  16. An analysis of potential water availability from the Charles Mill, Clendening, Piedmont, Pleasant Hill, Senecaville, and Wills Creek Lakes in the Muskingum River Watershed, Ohio

    USGS Publications Warehouse

    Koltun, G.F.

    2014-01-01

    This report presents the results of a study to assess potential water availability from the Charles Mill, Clendening, Piedmont, Pleasant Hill, Senecaville, and Wills Creek Lakes, located within the Muskingum River Watershed, Ohio. The assessment was based on the criterion that water withdrawals should not appreciably affect maintenance of recreation-season pool levels in current use. To facilitate and simplify the assessment, it was assumed that historical lake operations were successful in maintaining seasonal pool levels, and that any discharges from lakes constituted either water that was discharged to prevent exceeding seasonal pool levels or discharges intended to meet minimum in-stream flow targets downstream from the lakes. It further was assumed that the volume of water discharged in excess of the minimum in-stream flow target is available for use without negatively impacting seasonal pool levels or downstream water uses and that all or part of it is subject to withdrawal. Historical daily outflow data for the lakes were used to determine the quantity of water that potentially could be withdrawn and the resulting quantity of water that would flow downstream (referred to as “flow-by”) on a daily basis as a function of all combinations of three hypothetical target minimum flow-by amounts (1, 2, and 3 times current minimum in-stream flow targets) and three pumping capacities (1, 2, and 3 million gallons per day). Using both U.S. Geological Survey streamgage data (where available) and lake-outflow data provided by the U.S. Army Corps of Engineers resulted in analytical periods ranging from 51 calendar years for Charles Mill, Clendening, and Piedmont Lakes to 74 calendar years for Pleasant Hill, Senecaville, and Wills Creek Lakes. 
The observed outflow time series and the computed time series of daily flow-by amounts and potential withdrawals were analyzed to compute and report order statistics (95th, 75th, 50th, 25th, 10th, and 5th percentiles) and means for the analytical period, in aggregate, and broken down by calendar month. In addition, surplus-water mass curve data were tabulated for each of the lakes. Monthly order statistics of computed withdrawals indicated that, for the three pumping capacities considered, increasing the target minimum flow-by amount tended to reduce the amount of water that can be withdrawn. The reduction was greatest in the lower percentiles of withdrawal; however, increasing the flow-by amount had no impact on potential withdrawals during high flow. In addition, for a given target minimum flow-by amount, increasing the pumping rate typically increased the total amount of water that could be withdrawn; however, that increase was less than a direct multiple of the increase in pumping rate for most flow statistics. Potential monthly withdrawals were observed to be more variable and more limited in some calendar months than others. Monthly order statistics and means of computed daily mean flow-by amounts indicated that flow-by amounts generally tended to be lowest during June–October. Increasing the target minimum flow-by amount for a given pumping rate resulted in some small increases in the magnitudes of the mean and 50th percentile and lower order statistics of computed mean flow-by, but had no effect on the magnitudes of the higher percentile statistics. Increasing the pumping rate for a given target minimum flow-by amount resulted in decreases in magnitudes of higher-percentile flow-by statistics by an amount equal to the flow equivalent of the increase in pumping rate; however, some lower percentile statistics remained unchanged.
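The daily accounting described above (withdraw only the outflow in excess of the target minimum flow-by, capped by pumping capacity) reduces to a simple rule. A sketch under that reading; the report's actual bookkeeping, units, and edge cases may differ:

```python
def daily_withdrawal(outflow_mgd, target_flowby_mgd, pump_capacity_mgd):
    """One day's potential withdrawal and resulting flow-by, in Mgal/d.
    Only flow above the target minimum flow-by may be taken, and no more
    than the pump can move."""
    surplus = max(0.0, outflow_mgd - target_flowby_mgd)
    withdrawal = min(surplus, pump_capacity_mgd)
    flowby = outflow_mgd - withdrawal
    return withdrawal, flowby

# Outflow 4.5 Mgal/d, target flow-by 2.0 Mgal/d, 1 Mgal/d pump:
w, f = daily_withdrawal(4.5, 2.0, 1.0)  # withdraw 1.0, flow-by 3.5
```

Running this rule over a multi-decade daily outflow series and tabulating percentiles of `w` and `f` for each of the nine target/pump combinations reproduces the structure of the analysis: raising the target trims withdrawals mostly on low-flow days, while raising pump capacity helps mostly on high-flow days.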

  17. 40 CFR 63.11517 - What are my monitoring requirements?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) of this section. (1) Daily Method 9 testing for welding, Tier 2 or 3. Perform visual determination of... to the requirements of paragraph (d)(1) of this section. (3) Monthly Method 9 testing for welding... Method 22 testing for welding, Tier 2 or 3. If, after two consecutive months of testing, the average of...

  18. Factors Influencing Staff Perceptions of Administrator Support for Tier 2 and 3 Interventions: A Multilevel Perspective

    ERIC Educational Resources Information Center

    Debnam, Katrina J.; Pas, Elise T.; Bradshaw, Catherine P.

    2013-01-01

    Although the number of schools implementing School-Wide Positive Behavioral Interventions and Supports (SWPBIS) is increasing, and there is great demand for evidence-based Tier 2 and 3 interventions for students requiring additional support, little systematic research has examined administrator support for such programming. This article examines…

  19. Response to Intervention: Using Single-Case Design to Examine the Impact of Tier 2 Mathematics Interventions

    ERIC Educational Resources Information Center

    Valenzuela, Vanessa V.; Gutierrez, Gabriel; Lambros, Katina M.

    2014-01-01

    An A-B single-case design assessed at-risk students' responsiveness to mathematics interventions. Four culturally and linguistically diverse second-grade students were given a Tier 2 standard protocol mathematics intervention that included number sense instruction, modeling procedures, guided math drill and practice of addition and subtraction…

  20. Evaluating Technology-Based Self-Monitoring as a Tier 2 Intervention across Middle School Settings

    ERIC Educational Resources Information Center

    Bruhn, Allison Leigh; Woods-Groves, Suzanne; Fernando, Josephine; Choi, Taehoon; Troughton, Leonard

    2017-01-01

    Multitiered frameworks like Positive Behavior Interventions and Supports (PBIS) have been recommended for preventing and remediating behavior problems. In this study, technology-based self-monitoring was used as a Tier 2 intervention to improve the academic engagement and disruptive behavior of three middle school students who were identified as…

  1. Strain-Specific V3 and CD4 Binding Site Autologous HIV-1 Neutralizing Antibodies Select Neutralization-Resistant Viruses

    DOE PAGES

    Moody, M.  Anthony; Gao, Feng; Gurley, Thaddeus  C.; ...

    2015-09-09

The third variable (V3) loop and the CD4 binding site (CD4bs) of the viral envelope are frequently targeted by neutralizing antibodies (nAbs) in HIV-1-infected individuals. In chronic infection, virus escape mutants repopulate the plasma and V3 and CD4bs nAbs emerge that can neutralize heterologous tier 1 easy-to-neutralize, but not tier 2 difficult-to-neutralize HIV-1 isolates. However, neutralization sensitivity of autologous plasma viruses to this type of nAb response has not been studied. We describe the development and evolution in vivo of antibodies distinguished by their target specificity for V3 and CD4bs epitopes on autologous tier 2 viruses but not on heterologous tier 2 viruses. A surprisingly high fraction of autologous circulating viruses was sensitive to these antibodies. These findings demonstrate a role for V3 and CD4bs antibodies in constraining the native envelope trimer in vivo to a neutralization-resistant phenotype, explaining why HIV-1 transmission generally occurs by tier 2 neutralization-resistant viruses.

  2. Strain-Specific V3 and CD4 Binding Site Autologous HIV-1 Neutralizing Antibodies Select Neutralization-Resistant Viruses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moody, M.  Anthony; Gao, Feng; Gurley, Thaddeus  C.

The third variable (V3) loop and the CD4 binding site (CD4bs) of the viral envelope are frequently targeted by neutralizing antibodies (nAbs) in HIV-1-infected individuals. In chronic infection, virus escape mutants repopulate the plasma and V3 and CD4bs nAbs emerge that can neutralize heterologous tier 1 easy-to-neutralize, but not tier 2 difficult-to-neutralize HIV-1 isolates. However, neutralization sensitivity of autologous plasma viruses to this type of nAb response has not been studied. We describe the development and evolution in vivo of antibodies distinguished by their target specificity for V3 and CD4bs epitopes on autologous tier 2 viruses but not on heterologous tier 2 viruses. A surprisingly high fraction of autologous circulating viruses was sensitive to these antibodies. These findings demonstrate a role for V3 and CD4bs antibodies in constraining the native envelope trimer in vivo to a neutralization-resistant phenotype, explaining why HIV-1 transmission generally occurs by tier 2 neutralization-resistant viruses.

  3. A tiered approach for the human health risk assessment for consumption of vegetables grown on cadmium-contaminated land in urban areas.

    PubMed

    Swartjes, Frank A; Versluijs, Kees W; Otte, Piet F

    2013-10-01

    Consumption of vegetables that are grown in urban areas takes place worldwide. In developing countries, vegetables are traditionally grown in urban areas for cheap food supply. In developing and developed countries, urban gardening is gaining momentum. A problem that arises with urban gardening is the presence of contaminants in soil, which can be taken up by vegetables. In this study, a scientifically-based and practical procedure has been developed for assessing the human health risks from the consumption of vegetables from cadmium-contaminated land. Starting from a contaminated site, the procedure follows a tiered approach which is laid out as follows. In Tier 0, the plausibility of growing vegetables is investigated. In Tier 1 soil concentrations are compared with the human health-based Critical soil concentration. Tier 2 offers the possibility for a detailed site-specific human health risk assessment in which calculated exposure is compared to the toxicological reference dose. In Tier 3, vegetable concentrations are measured and tested following a standardized measurement protocol. To underpin the derivation of the Critical soil concentrations and to develop a tool for site-specific assessment the determination of the representative concentration in vegetables has been evaluated for a range of vegetables. The core of the procedure is based on Freundlich-type plant-soil relations, with the total soil concentration and the soil properties as variables. When a significant plant-soil relation is lacking for a specific vegetable a geometric mean of BioConcentrationFactors (BCF) is used, which is normalized according to soil properties. Subsequently, a 'conservative' vegetable-group-consumption-rate-weighted BioConcentrationFactor is calculated as basis for the Critical soil concentration (Tier 1). 
The tool to perform site-specific human health risk assessment (Tier 2) includes the calculation of a 'realistic worst case' site-specific vegetable-group-consumption-rate-weighted BioConcentrationFactor. © 2013 Elsevier Inc. All rights reserved.
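The Tier 1 comparison and the Freundlich-type plant-soil relation at the core of the procedure can be sketched as below. The functional form (log-linear in soil concentration and soil properties) follows the general approach described; the coefficient values and function names here are placeholders, not the paper's fitted parameters:

```python
import math

def plant_cadmium(soil_cd_mg_kg, ph, om_pct, a=-0.5, b=0.7, c=-0.1, d=-0.3):
    """Freundlich-type plant-soil relation of the general form
    log10(C_plant) = a + b*log10(C_soil) + c*pH + d*log10(OM).
    Coefficients are illustrative placeholders only."""
    log_plant = (a + b * math.log10(soil_cd_mg_kg)
                 + c * ph + d * math.log10(om_pct))
    return 10 ** log_plant  # mg Cd per kg vegetable (dry weight, assumed)

def tier1_exceeds(soil_cd_mg_kg, critical_soil_cd_mg_kg):
    """Tier 1: compare the measured soil concentration with the
    human-health-based Critical soil concentration; exceedance moves
    the assessment on to the site-specific Tier 2 calculation."""
    return soil_cd_mg_kg > critical_soil_cd_mg_kg
```

In Tier 2, a relation like `plant_cadmium` (normalized to the site's pH and organic matter) would feed a consumption-rate-weighted exposure estimate for comparison against the toxicological reference dose.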

  4. Towards an integrative soil health assessment strategy: a three tier (integrative biomarker response) approach with Eisenia fetida applied to soils subjected to chronic metal pollution.

    PubMed

    Asensio, Vega; Rodríguez-Ruiz, Amaia; Garmendia, Larraitz; Andre, Jane; Kille, Peter; Morgan, Andrew John; Soto, Manu; Marigómez, Ionan

    2013-01-01

This is a pilot study for assessing soil ecosystem health in chronically polluted sites on the basis of a 3-tier approach (screening+scoring+understanding) designed to be cost-effective and scientifically based, and to provide straightforward advice and support to managers and stakeholders involved in environmental protection. For the initial screening (Tier 1), the use of a highly sensitive, low-cost biomarker such as neutral red uptake (NRU) in earthworm coelomocytes is proposed. In sites where an alteration in NRU has been established, the stress level may be further assessed by utilising a suite of low-cost and rapid biomarkers of effect integrated in an integrative biological response (IBR) index to obtain an objective (scored) assessment of the induced stress syndrome (Tier 2). The IBR/n index is based on the integration of biomarkers at different levels of biological organisation. Acyl-CoA oxidase activity (AOX), catalase activity (CAT), lipofuscin optical density (LOD%), NRU and the mean epithelial thickness (MET) have been used to calculate the IBR/n index. Biomarkers were determined in earthworms, Eisenia fetida, exposed ex situ to real soils (three mining sites and a reference) for 3, 10 and 17 d. The 3 d NRU (Tier 1) provided a signal of stress. After 3 d, PCA based on the suite of biomarkers (Tier 2) discriminated reference and polluted sites according to toxicity profiles, and at 17 d the most polluted site was segregated from the less polluted and reference sites. Soils were classified as harmful, unhealthy (not apparently toxic) or healthy. Soils were investigated by microarray transcriptomics (Tier 3) to understand the causes (aetiology) and consequences (prognosis) of health impairment. Tier 3 discriminated, according to stress syndrome traits, soils that did not fall into the category of highly stressed and revealed the main agent causing toxicity at each site by identifying the toxicity mechanisms and biological responses.
Copyright © 2012 Elsevier B.V. All rights reserved.
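
    The star-plot integration behind the IBR/n index can be sketched as follows. This is a minimal illustration of the commonly used formulation (standardise each biomarker across sites, shift scores to be non-negative, sum the triangle areas between adjacent star-plot axes, and divide by the number of biomarkers), not the study's exact code:

```python
import math

def standardise(values):
    """Standardise one biomarker across sites (mean 0, sd 1), then
    shift so the minimum is 0, as in the usual IBR recipe."""
    m = sum(values) / len(values)
    sd = (sum((v - m) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    y = [(v - m) / sd for v in values]
    lo = min(y)
    return [v - lo for v in y]

def ibr_n(site_scores):
    """IBR/n for one site: site_scores are the standardised,
    min-shifted biomarker scores (S_i >= 0) arranged around a star
    plot. The star-plot area is the sum of the triangle areas between
    consecutive axes; dividing by n gives IBR/n."""
    n = len(site_scores)
    beta = 2 * math.pi / n              # angle between adjacent axes
    area = 0.0
    for i in range(n):
        s1 = site_scores[i]
        s2 = site_scores[(i + 1) % n]   # wrap around to close the polygon
        area += 0.5 * s1 * s2 * math.sin(beta)
    return area / n
```

In the study n = 5 (AOX, CAT, LOD%, NRU, MET); a higher IBR/n indicates a stronger integrated stress response.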

  5. Trends in phosphorus loading to the western basin of Lake Erie

    EPA Science Inventory

    Dave Dolan spent much of his career computing and compiling phosphorus loads to the Great Lakes. None of his work in this area has been more valuable than his continued load estimates to Lake Erie, which has allowed us to unambiguously interpret the cyanobacteria blooms and hypox...

  6. Collective screening tools for early identification of dyslexia

    PubMed Central

    Andrade, Olga V. C. A.; Andrade, Paulo E.; Capellini, Simone A.

    2015-01-01

    Current response to intervention models (RTIs) favor a three-tier system. In general, Tier 1 consists of evidence-based, effective reading instruction in the classroom and universal screening of all students at the beginning of the grade level to identify children for early intervention. Non-responders to Tier 1 receive small-group tutoring in Tier 2. Non-responders to Tier 2 are given still more intensive, individual intervention in Tier 3. Limited time, personnel and financial resources derail RTI's implementation in Brazilian schools because this approach involves procedures that require extra time and extra personnel in all three tiers, including screening tools which normally consist of tasks administered individually. We explored the accuracy of collectively and easily administered screening tools for the early identification of second graders at risk for dyslexia in a two-stage screening model. A first-stage universal screening based on collectively administered curriculum-based measurements was applied to 45 seven-year-old early readers of Portuguese from 4 second-grade classrooms at the beginning of the school year and identified an at-risk group of 13 academic low-achievers. Collectively administered tasks based on phonological judgments by matching figures and figures to spoken words [alternative tools for educators (ATE)] and a comprehensive cognitive-linguistic battery of collective and individual assessments were both administered to all children and constituted the second-stage screening. Low achievement on ATE tasks and on collectively administered writing tasks (scores at the 25th percentile) showed good sensitivity (true positives) and specificity (true negatives) for poor literacy status, defined as scores ≤1 SD below the mean on literacy abilities at the end of fifth grade. These results have implications for the use of a collectively administered screening tool for the early identification of children at risk for dyslexia in a classroom setting. 
PMID:25667575

  7. Grid site availability evaluation and monitoring at CMS

    DOE PAGES

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; ...

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  8. Grid site availability evaluation and monitoring at CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  9. Grid site availability evaluation and monitoring at CMS

    NASA Astrophysics Data System (ADS)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; Lammel, Stephan; Sciabà, Andrea

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.
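
Availability in this kind of monitoring is, at its core, the fraction of sampled intervals in which a site passes its critical functional tests; a minimal sketch (the conventions, such as excluding "unknown" intervals, are assumptions, not CMS's exact metric definition):

```python
def availability(test_results):
    """Fraction of sampled test intervals in which the site passed its
    critical functional tests. Intervals with 'unknown' status are
    excluded from the denominator, a common grid-dashboard convention."""
    known = [r for r in test_results if r in ("ok", "fail")]
    if not known:
        return None                       # no usable samples
    return sum(1 for r in known if r == "ok") / len(known)
```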

  10. Continuing Development of Alternative High-Throughput Screens to Determine Endocrine Disruption, Focusing on Androgen Receptor, Steroidogenesis, and Thyroid Pathways

    EPA Science Inventory

    The focus of this meeting is the SAP's review and comment on the Agency's proposed high-throughput computational model of androgen receptor pathway activity as an alternative to the current Tier 1 androgen receptor assay (OCSPP 890.1150: Androgen Receptor Binding Rat Prostate Cyt...

  11. 20 CFR 228.18 - Reduction for public pension.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Reduction for public pension. 228.18 Section... COMPUTATION OF SURVIVOR ANNUITIES The Tier I Annuity Component § 228.18 Reduction for public pension. (a) The... receipt of a public pension. (b) When reduction is required. Unless the survivor annuitant meets one of...

  12. Wavelet Decomposition for Discrete Probability Maps

    DTIC Science & Technology

    2007-08-01

    using other wavelet basis functions, such as those mentioned in Section 7 15 DSTO–TN–0760 References 1. P. M. Bentley and J. T. E. McDonnell. Wavelet...84, 1995. 0272-1716. 18. E. J. Stollnitz, T. D. DeRose, and D. H. Salesin. Wavelets for computer graphics: a primer. 2. Computer Graphics and...and Computer Modelling in 2006 from the University of South Australia, Mawson Lakes. Part of this degree was undertaken at the University of Twente

  13. Lake on life support: Evaluating urban lake management measures by using a coupled 1D-modelling approach

    NASA Astrophysics Data System (ADS)

    Ladwig, Robert; Kirillin, Georgiy; Hinkelmann, Reinhard; Hupfer, Michael

    2017-04-01

    Urban surface water systems, and lakes in particular, are heavily stressed and modified to comply with water management goals and expectations. In this study we focus on Lake Tegel in Berlin, Germany, as a representative of heavily modified urban lakes. In the 20th century, Lake Tegel received increased loadings of nutrients and leached heavy metals from an upstream sewage farm, resulting in severe eutrophication problems. The construction of two upstream treatment plants lowered nutrient concentrations and led to a re-oligotrophication of the lake. Additionally, artificial aerators, to keep the hypolimnion oxic, and a lake pipeline, to bypass water for maintaining a minimum discharge, went into operation. Lake Tegel is still heavily used for drinking water extraction by bank filtration. These interacting management measures make the system vulnerable to changing climate conditions and pollutant loads. Past modelling studies have shown the complex hydrodynamics of the lake. Here, we follow a simplified approach, using a computationally less expensive vertical 1D model to simulate the hydrodynamics and ecological interactions of the system by coupling the General Lake Model to the Aquatic Ecodynamics Model Library 2. For calibration of the multidimensional parameter space we applied the Covariance Matrix Adaptation-Evolution Strategy algorithm. The model is able to replicate the measured vertical temperature profiles of Lake Tegel sufficiently well and to simulate similar concentration ranges of phosphate, dissolved oxygen and nitrate. The calibrated model is used to run an uncertainty analysis by sampling with the Metropolis-Hastings algorithm. 
Finally, we evaluate different scenarios: (1) changing air temperatures, precipitation and wind speed due to effects of climate change; (2) decreased discharges into the lake due to bypassing treated effluents into a nearby stream instead of Lake Tegel; and (3) increased nutrient elimination at the upstream treatment plants. We focus on quantifying the impact of these scenarios on lake stability as well as on the abundance and distribution of nutrients.
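
The uncertainty analysis rests on the Metropolis-Hastings algorithm; a generic random-walk sketch follows (the study couples the sampler to the calibrated lake model, which is omitted here, so `log_post` is a stand-in for the model's log-posterior):

```python
import math
import random

def metropolis_hastings(log_post, x0, n_samples, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: draws samples from the
    distribution whose log-density is log_post. Candidates come from
    a symmetric Gaussian proposal, so the acceptance ratio reduces to
    the posterior ratio."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_samples):
        cand = x + rng.gauss(0.0, step)            # symmetric proposal
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:  # accept/reject step
            x, lp = cand, lp_cand
        samples.append(x)                          # rejected -> repeat x
    return samples
```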

  14. Advanced technologies for scalable ATLAS conditions database access on the grid

    NASA Astrophysics Data System (ADS)

    Basset, R.; Canali, L.; Dimitrov, G.; Girone, M.; Hawkings, R.; Nevski, P.; Valassi, A.; Vaniachine, A.; Viegas, F.; Walker, R.; Wong, A.

    2010-04-01

    During massive data reprocessing operations an ATLAS Conditions Database application must support concurrent access from numerous ATLAS data processing jobs running on the Grid. By simulating realistic workflows, ATLAS database scalability tests provided feedback for Conditions DB software optimization and allowed precise determination of required distributed database resources. In distributed data processing one must take into account the chaotic nature of Grid computing, characterized by peak loads that can be much higher than average access rates. To validate database performance at peak loads, we tested database scalability at very high concurrent job rates. This has been achieved through coordinated database stress tests performed in a series of ATLAS reprocessing exercises at the Tier-1 sites. The goal of the database stress tests is to detect scalability limits of the hardware deployed at the Tier-1 sites, so that server overload conditions can be safely avoided in a production environment. Our analysis of server performance under stress tests indicates that Conditions DB data access is limited by the disk I/O throughput. An unacceptable side-effect of the disk I/O saturation is a degradation of the WLCG 3D Services that update Conditions DB data at all ten ATLAS Tier-1 sites using the technology of Oracle Streams. To avoid such bottlenecks we prototyped and tested a novel approach for database peak load avoidance in Grid computing. Our approach is based upon the proven idea of pilot job submission on the Grid: instead of the actual query, an ATLAS utility library first sends a pilot query to the database server.
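
The pilot-query idea can be sketched as a simple probe-then-query protocol with back-off (the function names below are hypothetical, not the ATLAS utility library's API):

```python
import time

def query_with_pilot(run_pilot, run_query, max_retries=5, backoff_s=1.0):
    """Peak-load avoidance via a pilot query (sketch). run_pilot()
    probes the database server with a lightweight request and returns
    True when the server can take the real query; only then is
    run_query() sent. Otherwise the client backs off and retries."""
    for attempt in range(max_retries):
        if run_pilot():
            return run_query()
        time.sleep(backoff_s * (attempt + 1))   # linear back-off
    raise RuntimeError("server stayed overloaded; giving up")
```

The point of the pattern is that an overloaded server sheds cheap pilot probes instead of queueing expensive payload queries, so the load peak flattens rather than cascades.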

  15. Status of forensic odontology in metro and in tier 2 city in urban India.

    PubMed

    Khare, Parul; Chandra, Shaleen; Raj, Vineet; Verma, Poonam; Subha, G; Khare, Abhishek

    2013-07-01

    Dentists can play a significant role in identifying the victims or perpetrators of crime, as well as in disasters. Knowledge about the various aspects of forensic science, and about dental and related evidence, can help a dental practitioner in assisting the civil agencies in such cases. To evaluate the awareness and knowledge of forensic odontology among dentists in a metropolitan and a tier 2 city, seven hundred and seventy-four dentists were included in this survey. A questionnaire was designed to assess the knowledge, attitude, and status of practice of forensic odontology. Data were analyzed by comparing overall awareness of forensic odontology among dentists in the metro and tier 2 city, as well as between the different groups. Apart from the source of knowledge, no significant differences were seen between respondents of the metropolitan and tier 2 city. A significantly higher proportion of subjects in the metro reported journals as their source of knowledge (P < 0.001), whereas it was newspapers in the tier 2 city (P = 0.001). On comparing the mean scores of knowledge (k), attitude (a), and practice (p) among the different study groups, it was found that all three scores were highest for the practitioner cum academician (PA) group (k - 2.37, a - 0.69, p - 0.17). Knowledge scores were lowest for the pure practitioner (PP) group (1.98), and the attitude and practice scores of the pure academician (A) group were lowest (a - 0.53, p - 0.06). Respondents had low knowledge about the applications of forensic odontology in routine practice; hence, steps must be taken to educate dental practitioners about its clinical applications.

  16. Bathymetric contour maps of lakes surveyed in Iowa in 2004

    USGS Publications Warehouse

    Linhart, S. Mike; Lund, Kris D.

    2006-01-01

    Bathymetric data were collected using a boat-mounted, differential global positioning system, echo depth-sounding equipment, and computer software. Data were processed with commercial hydrographic software and exported into a geographic information system for mapping and calculating area and volume. Lake volume estimates ranged from 83,924,000 cubic feet (1,930 acre-feet) at Lake Darling to 5,967,000 cubic feet (140 acre-feet) at Upper Gar Lake. Surface area estimates ranged from 10,660,000 square feet (240 acres) at Lake Darling to 1,557,000 square feet (36 acres) at Upper Gar Lake.
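
The paired cubic-feet/acre-feet figures above follow from the standard conversion (1 acre-foot = 43,560 cubic feet), which can be checked directly; the report's acre-feet values are rounded, so the check uses a tolerance:

```python
CUBIC_FEET_PER_ACRE_FOOT = 43_560  # 1 acre (43,560 sq ft) x 1 ft of depth

def to_acre_feet(cubic_feet):
    """Convert a volume in cubic feet to acre-feet."""
    return cubic_feet / CUBIC_FEET_PER_ACRE_FOOT

# Lake Darling: 83,924,000 cubic feet -> about 1,930 acre-feet
# Upper Gar Lake: 5,967,000 cubic feet -> about 140 acre-feet
```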

  17. Bathymetric contour maps for lakes surveyed in Iowa in 2003

    USGS Publications Warehouse

    Linhart, S. Mike; Lund, Kris D.

    2006-01-01

    Bathymetric data were collected using boat-mounted, differential global positioning system (GPS) equipment, echo depth-sounding equipment, and computer software. Data were processed with commercial hydrographic software and exported into a geographic information system (GIS) for mapping and calculation of area and volume. Lake volume estimates ranged from 590,501,000 cubic feet (13,600 acre-feet) at Lake Macbride to 17,831,000 cubic feet (410 acre-feet) at Lake Meyer. Surface area estimates ranged from 38,118,000 square feet (875 acres) at Lake Macbride to 1,373,000 square feet (32 acres) at Lake Meyer.

  18. 75 FR 57958 - Solicitation of Written Comments on Draft Tier 2 Strategies/Modules for Inclusion in the “HHS...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-23

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Solicitation of Written Comments on Draft Tier 2...'' AGENCY: Department of Health and Human Services, Office of the Assistant Secretary for Health, Office of....'' To further the HHS mission to protect the health and well-being of the nation, the HHS Steering...

  19. Early Numeracy Intervention Program for First-Grade Students with Mathematics Difficulties

    ERIC Educational Resources Information Center

    Bryant, Diane Pedrotty; Bryant, Brian R.; Roberts, Greg; Vaughn, Sharon; Pfannenstiel, Kathleen Hughes; Porterfield, Jennifer; Gersten, Russell

    2011-01-01

    The purpose of this study was to determine the effects of an early numeracy preventative Tier 2 intervention on the mathematics performance of first-grade students with mathematics difficulties. Researchers used a pretest-posttest control group design with randomized assignment of 139 students to the Tier 2 treatment condition and 65 students to…

  20. Determining Responsiveness to Tier 2 Intervention in Response to Intervention: Level of Performance, Growth, or Both

    ERIC Educational Resources Information Center

    Milburn, Trelani F.; Lonigan, Christopher J.; Phillips, Beth M.

    2017-01-01

    This response to intervention study examined agreement between classification methods of preschool children's responsiveness to Tier 2 intervention using level of performance (25th percentile), growth (equivalent to small and medium effect sizes), and both level of performance and growth in a dual-discrepancy approach. Overall, 181 children…

  1. 77 FR 44696 - Agency Forms Submitted for OMB Review, Request for Comments

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-30

    ... the social security equivalent and non- social security equivalent portions of Tier I, Tier II, vested... OMB control number of the ICR. For proper consideration of your comments, it is best if the RRB and... and Compensation Reports. OMB Control Number: 3220-0014. Form(s) submitted: DC-2 and DC-2a. Type of...

  2. Examining the Efficacy of a Tier 2 Kindergarten Mathematics Intervention.

    PubMed

    Clarke, Ben; Doabler, Christian T; Smolkowski, Keith; Baker, Scott K; Fien, Hank; Strand Cary, Mari

    2016-01-01

    This study examined the efficacy of a Tier 2 kindergarten mathematics intervention program, ROOTS, focused on developing whole number understanding for students at risk in mathematics. A total of 29 classrooms were randomly assigned to treatment (ROOTS) or control (standard district practices) conditions. Measures of mathematics achievement were collected at pretest and posttest. Treatment and control students did not differ on mathematics assessments at pretest. Gain scores of at-risk intervention students were significantly greater than those of control peers, and the gains of at-risk treatment students were greater than the gains of peers not at risk, effectively reducing the achievement gap. Implications for Tier 2 mathematics instruction in a response to intervention (RtI) model are discussed. © Hammill Institute on Disabilities 2014.

  3. Newborn Screening for Vitamin B6 Non-responsive Classical Homocystinuria: Systematical Evaluation of a Two-Tier Strategy.

    PubMed

    Okun, Jürgen G; Gan-Schreier, Hongying; Ben-Omran, Tawfeq; Schmidt, Kathrin V; Fang-Hoffmann, Junmin; Gramer, Gwendolyn; Abdoh, Ghassan; Shahbeck, Noora; Al Rifai, Hilal; Al Khal, Abdul Latif; Haege, Gisela; Chiang, Chuan-Chi; Kasper, David C; Wilcken, Bridget; Burgard, Peter; Hoffmann, Georg F

    2017-01-01

    In classical homocystinuria (HCU, MIM# 236200), due to the deficiency of cystathionine β-synthase (EC 4.2.1.22), there is clear evidence for the success of early treatment. The aim of this study was to develop and evaluate a two-tier strategy for HCU newborn screening. We reevaluated data from our newborn screening programme for Qatar in a total of 125,047 neonates, including 30 confirmed HCU patients. Our hitherto existing screening strategy includes homocysteine (Hcy) measurements in every child, resulting in a unique dataset for the evaluation of two-tier strategies. Reevaluation included methionine (Met) levels, the Met to phenylalanine (Phe) ratio, and Hcy. Four HCU cases identified after database closure were also included in the evaluation. In addition, dried blood spot samples selected by Met values >P97 in the newborn screening programs in Austria, Australia, the Netherlands, and Taiwan were analyzed for Hcy. The Met to Phe ratio was found to be more effective as a first sieve than Met alone, sorting out nearly 90% of normal samples. Only 10% of the samples would have to be processed by second-tier measurement of Hcy in dried blood spots. As no patient with HCU was found either in the samples investigated for HCU or by clinical diagnosis in the other countries, the generalization of our two-tier strategy could only be tested indirectly. The finally derived two-tier algorithm, using the Met to Phe ratio as first tier and Hcy as second tier, requires the 10% of samples that are first-tier positive to be transferred to Hcy measurement, resulting in 100% sensitivity and specificity in HCU newborn screening.
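
The two-tier flow described above amounts to a cheap first filter followed by a costlier confirmatory measurement on the small first-tier-positive fraction; a sketch with invented cutoff values (the study's actual cutoffs are not reproduced here):

```python
def two_tier_hcu_screen(met, phe, measure_hcy, ratio_cutoff, hcy_cutoff):
    """Two-tier HCU screening sketch. Tier 1 filters on the Met/Phe
    ratio; only tier-1 positives (about 10% of samples in the study)
    receive the costlier second-tier Hcy measurement. Cutoff values
    are caller-supplied assumptions, not the study's values."""
    if met / phe <= ratio_cutoff:
        return "screen negative"           # ~90% of samples stop here
    hcy = measure_hcy()                    # second-tier test, tier-1 positives only
    return "refer" if hcy > hcy_cutoff else "screen negative"
```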

  4. Comparison of one-tier and two-tier newborn screening metrics for congenital adrenal hyperplasia.

    PubMed

    Sarafoglou, Kyriakie; Banks, Kathryn; Gaviglio, Amy; Hietala, Amy; McCann, Mark; Thomas, William

    2012-11-01

    Newborn screening (NBS) for the classic forms of congenital adrenal hyperplasia (CAH) is mandated in all states in the United States. Compared with other NBS disorders, the false-positive rate (FPR) of CAH screening remains high and has not been significantly improved by adjusting 17α-hydroxyprogesterone cutoff values for birth weight and/or gestational age. Minnesota was the first state to initiate, and is currently only 1 of 4 states performing, second-tier steroid profiling for CAH. False-negative rates (FNRs) for CAH are not well known. This is a population-based study of all Minnesota infants (769,834) born 1999-2009, grouped by screening protocol (one-tier with repeat screen, January 1999 to May 2004; two-tier with second-tier steroid profiling, June 2004 to December 2009). FPR, FNR, and positive predictive value (PPV) were calculated per infant, rather than per sample, and compared between protocols. Overall, 15 false-negatives (4 salt-wasting, 11 simple-virilizing) and 45 true-positives were identified from 1999 to 2009. With two-tier screening, FNR was 32%, FPR increased to 0.065%, and PPV decreased to 8%, but these changes were not statistically significant. Second-tier steroid profiling obviated repeat screens of borderline results (355 per year on average). In comparing the 2 screening protocols, the FPR of CAH NBS remains high, the PPV remains low, and false-negatives occur more frequently than has been reported. Physicians should be cautioned that a negative NBS does not necessarily rule out classic CAH; therefore, any patient for whom there is clinical concern for CAH should receive immediate diagnostic testing.
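
The per-infant metrics compared above are simple ratios of the four screening outcomes; a sketch with illustrative counts (not the study's data):

```python
def screening_metrics(tp, fp, fn, tn):
    """Per-infant screening metrics: FPR among unaffected infants,
    FNR among affected infants, and PPV among screen positives."""
    fpr = fp / (fp + tn)   # false positives / all unaffected
    fnr = fn / (fn + tp)   # false negatives / all affected
    ppv = tp / (tp + fp)   # true positives / all screen positives
    return fpr, fnr, ppv
```

Note the asymmetry the abstract highlights: with a rare condition, even a tiny FPR among hundreds of thousands of unaffected infants swamps the few true positives, keeping PPV low.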

  5. A Forecast Skill Comparison between CliPAS One-Tier and Two-Tier Hindcast Experiments

    NASA Astrophysics Data System (ADS)

    Lee, J.; Wang, B.; Kang, I.

    2006-05-01

    A 24-year (1981-2004) MME hindcast experimental dataset was produced under the "Climate Prediction and Its Application to Society" (CliPAS) project sponsored by the Korea Meteorological Administration (KMA). This dataset consists of 5 one-tier model systems from the National Aeronautics and Space Administration (NASA), the National Centers for Environmental Prediction (NCEP), the Frontier Research Center for Global Change (FRCGC), Seoul National University (SNU), and the University of Hawaii (UH), and 5 two-tier model systems from Florida State University (FSU), the Geophysical Fluid Dynamics Laboratory (GFDL), SNU, and UH. Multi-model ensemble (MME) forecast skills for seasonal mean precipitation and atmospheric circulation are compared between the CliPAS one-tier and two-tier hindcast experiments. For winter prediction, the two-tier MME has skill comparable to the one-tier MME. However, it is demonstrated that in the Asian-Australian monsoon (A-AM) heavy precipitation regions, one-tier systems are superior to two-tier systems in the summer season. The reason is that inclusion of the local warm pool-monsoon interaction in the one-tier systems improves the ENSO teleconnection with monsoon regions. Both the one-tier and two-tier MME fail to predict the Indian monsoon circulation, while they have significantly good skill for the broad-scale monsoon circulation defined by the Webster and Yang index. The one-tier systems have much better skill than the two-tier systems in predicting the monsoon circulation over the western North Pacific, where air-sea interaction plays an important role.

  6. Promoting the hydrostatic conceptual change test (HCCT) with four-tier diagnostic test item

    NASA Astrophysics Data System (ADS)

    Purwanto, M. G.; Nurliani, R.; Kaniawati, I.; Samsudin, A.

    2018-05-01

    The Hydrostatic Conceptual Change Test (HCCT) is a diagnostic test instrument to identify students' conceptions in the field of hydrostatics, which is very important for supporting the learning process in the classroom. From that point of view, the researchers decided to develop the HCCT instrument into four-tier diagnostic test items. This research is planned as the first step of the development of the four-tier-test-formatted HCCT as a diagnostic test instrument on hydrostatics. The research method used the 4D model, which has four comprehensive steps: 1) defining, 2) designing, 3) developing and 4) disseminating. The developed instrument was tried out with 30 students in a senior high school. The data showed that the four-tier-test-formatted HCCT is able to identify students' conception level of hydrostatics. In conclusion, the four-tier-test-formatted HCCT is a promising diagnostic test instrument, able to classify students into the categories of misconception, no understanding, understanding, partial understanding and not codeable regarding the concept of hydrostatics.

  7. Tier 3 Certification Fuel Impacts Test Program

    EPA Science Inventory

    The recent Tier 3 regulations for light duty vehicles introduced a new certification fuel designed to be more characteristic of current market fuels. A laboratory test program was conducted to measure differences in CO2 and fuel economy between the current and future certificatio...

  8. Great Lakes modeling: Are the mathematics outpacing the data and our understanding of the system?

    EPA Science Inventory

    Mathematical modeling in the Great Lakes has come a long way from the pioneering work done by Manhattan College in the 1970s, when the models operated on coarse computational grids (often lake-wide) and used simple eutrophication formulations. Moving forward 40 years, we are now...

  9. Middle School Students' Responses to Two-Tier Tasks

    ERIC Educational Resources Information Center

    Haja, Shajahan; Clarke, David

    2011-01-01

    The structure of two-tier testing is such that the first tier consists of a multiple-choice question and the second tier requires justifications for choices of answers made in the first tier. This study aims to evaluate two-tier tasks in "proportion" in terms of students' capacity to write and select justifications and to examine the effect of…

  10. Evaluation and Analysis of Urmia Lake Water Level Fluctuations Between 1998-2006 Using Landsat Images and TOPEX Altimetry Data

    NASA Astrophysics Data System (ADS)

    Zahir, N.; Ali, A.

    2015-12-01

    Lake Urmia has undergone a drastic shrinkage in size over the past few decades. The initial intention of this paper is to present an approach for determining the so-called "salient times" during which the trend of the shrinkage process accelerated or decelerated. To find these salient times, a quasi-continuous curve was optimally fitted to the TOPEX altimetry data within the period 1998 to 2006, and the inflection points of the fitted curve were computed using a second-derivative approach. The water volume was also computed using 16 cloud-free Landsat images of the lake within the period 1998 to 2006. In the first stage of the water volume calculation, the pixels of the lake were segmented using the Automated Water Extraction Index (AWEI), and the shorelines of the lake were extracted by a boundary-detecting operator applied to the generated binary image of the lake surface. The water volume fluctuation rate was then computed under the assumption that two successive lake surfaces and their corresponding water level difference approximately form a truncated pyramid. The analysis of the water level fluctuation rates was further extended by a sinusoidal curve fitted to the TOPEX altimetry data, intended to model the seasonal fluctuations of the water level. In the final stage, the correlations between the fluctuation rates and the precipitation and temperature variations were also numerically determined. This paper reports the stages mentioned above in some detail.
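
The truncated-pyramid assumption corresponds to the frustum volume formula V = (h/3)(A1 + A2 + sqrt(A1 A2)), with A1 and A2 the two successive lake surface areas and h the water-level difference:

```python
import math

def frustum_volume(a1, a2, h):
    """Volume between two lake surfaces of areas a1 and a2 separated
    by a water-level difference h, under the truncated-pyramid
    (frustum) assumption: V = h/3 * (A1 + A2 + sqrt(A1 * A2))."""
    return h / 3.0 * (a1 + a2 + math.sqrt(a1 * a2))
```

The formula degenerates correctly at both extremes: equal areas give a prism (V = A·h) and a vanishing area gives a pyramid (V = A·h/3).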

  11. Interoperating Cloud-based Virtual Farms

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Colamaria, F.; Colella, D.; Casula, E.; Elia, D.; Franco, A.; Lusso, S.; Luparello, G.; Masera, M.; Miniello, G.; Mura, D.; Piano, S.; Vallero, S.; Venaruzzo, M.; Vino, G.

    2015-12-01

    The present work aims at optimizing the use of computing resources available at the grid Italian Tier-2 sites of the ALICE experiment at the CERN LHC by making them accessible to interactive distributed analysis, thanks to modern solutions based on cloud computing. The scalability and elasticity of the computing resources via dynamic ("on-demand") provisioning is essentially limited by the size of the computing site, reaching the theoretical optimum only in the asymptotic case of infinite resources. The main challenge of the project is to overcome this limitation by federating different sites through a distributed cloud facility. Storage capacities of the participating sites are seen as a single federated storage area, obviating the need to mirror data across them: high data access efficiency is guaranteed by location-aware analysis software and storage interfaces, in a way transparent to the end user. Moreover, interactive analysis on the federated cloud reduces the execution time with respect to grid batch jobs. Tests of the investigated solutions for both cloud computing and distributed storage on a wide area network will be presented.

  12. Methane emission estimation from landfills in Korea (1978-2004): quantitative assessment of a new approach.

    PubMed

    Kim, Hyun-Sun; Yi, Seung-Muk

    2009-01-01

    Quantifying methane emission from landfills is important to evaluating measures for reduction of greenhouse gas (GHG) emissions. To quantify GHG emissions and identify sensitive parameters for their measurement, a new assessment approach consisting of six different scenarios was developed using Tier 1 (mass balance method) and Tier 2 (the first-order decay method) methodologies for GHG estimation from landfills, suggested by the Intergovernmental Panel on Climate Change (IPCC). Methane emissions using Tier 1 correspond to trends in disposed waste amount, whereas emissions from Tier 2 gradually increase as disposed waste decomposes over time. The results indicate that the amount of disposed waste and the decay rate for anaerobic decomposition were decisive parameters for emission estimation using Tier 1 and Tier 2. As for the different scenarios, methane emissions were highest under Scope 1 (scenarios I and II), in which all landfills in Korea were regarded as one landfill. Methane emissions under scenarios III, IV, and V, which separated the dissimilated fraction of degradable organic carbon (DOC(F)) by waste type and/or revised the methane correction factor (MCF) by waste layer, were underestimated compared with scenarios II and III. This indicates that the methodology of scenario I, which has been used in most previous studies, may lead to an overestimation of methane emissions. Additionally, separate DOC(F) and revised MCF were shown to be important parameters for methane emission estimation from landfills, and revised MCF by waste layer played an important role in emission variations. Therefore, more precise information on each landfill and careful determination of parameter values and characteristics of disposed waste in Korea should be used to accurately estimate methane emissions from landfills.
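
The Tier 2 first-order decay idea can be sketched as follows (parameter values below are placeholders, not the Korean inventory values): methane generated in year t sums, over all earlier deposits, the carbon that decays during that year.

```python
import math

def fod_methane(deposits, t, k, doc_f=0.5, frac_ch4=0.5, mcf=1.0):
    """First-order decay (IPCC Tier 2 style) sketch: methane generated
    in year t from waste deposited in earlier years. deposits is a
    list of (year, degradable organic carbon mass) pairs; k is the
    decay rate, doc_f the dissimilated fraction of DOC, frac_ch4 the
    methane fraction of landfill gas, mcf the methane correction
    factor. All parameter defaults are illustrative assumptions."""
    gen_c = 0.0
    for y, mass in deposits:
        if y <= t:
            # carbon decaying during year t from the year-y deposit
            gen_c += mass * doc_f * mcf * (
                math.exp(-k * (t - y)) - math.exp(-k * (t - y + 1)))
    return gen_c * frac_ch4 * (16.0 / 12.0)   # convert C mass to CH4 mass
```

Unlike the Tier 1 mass-balance method, emissions here lag deposition and decay off over decades, which is why the abstract sees gradually rising emissions rather than emissions tracking disposed-waste amounts.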

  13. Computing Evaporation Using Meteorological Data for Hydrological Budget of Lake Wapalanne in NJ School of Conservation

    NASA Astrophysics Data System (ADS)

    Jordan, J. J.; Barrett, K. R.; Galster, J. C.; Ophori, D. U.; Flores, D.; Kelly, S. A.; Lutey, A. M.

    2011-12-01

    Lake Wapalanne is a small manmade lake of about 5.4 hectares in northwest New Jersey, in the Highlands Physiographic Province within permanently protected land. The lake's surroundings consist of forested vegetation and are relatively unoccupied, which minimizes human influence. The lake's small size, minimal external influence, geographic isolation, and protected status provide an optimal research environment for recording the meteorological data used to calculate potential evaporation. Between July 7th and August 3rd, meteorological data were collected from a professional weather station placed on an island directly in the center of Lake Wapalanne. The Vantage Pro2 weather station provided accurate readings of temperature, humidity, wind speed and direction, precipitation, and atmospheric pressure. A bathymetric survey of the lake was conducted to determine how its surface area varies with water level. Using the collected weather station data, a rate of potential evaporation was determined with several evaporation equations. A volume was then derived from this rate and the surface area of the lake. The calculated potential evaporation was compared and validated against small-scale evaporation measurements of known volumes of water in pans placed in the lake and against National Oceanic and Atmospheric Administration evaporation stations near the experiment site. This three-year study is part of an ongoing NSF Research Experience for Undergraduates (REU) project that encompasses additional topics of lake research; see the abstract by Kelly et al., AGU 2011, for more information on the lake's hydrologic budget. The results and methods of this study will be of use in future forecasting and baseline measurements of hydrologic budgets for lakes and reservoirs in the region, which provide drinking water to over five million people in the State of New Jersey.
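
    The volume step described above (evaporation rate times surface area) is simple arithmetic; a minimal sketch follows, with a placeholder evaporation rate since the abstract does not report one.

    ```python
    # Converting a potential-evaporation rate into a daily volume loss for a
    # lake of known surface area. The 4 mm/day rate is a placeholder; the
    # study derived its rates from the weather-station data.

    def evaporation_volume_m3(rate_mm_per_day: float, area_hectares: float) -> float:
        """Volume of water (m^3) evaporated per day from the lake surface."""
        area_m2 = area_hectares * 10_000.0   # 1 ha = 10,000 m^2
        depth_m = rate_mm_per_day / 1000.0   # mm -> m
        return area_m2 * depth_m

    # Lake Wapalanne's ~5.4 ha surface at an assumed 4 mm/day:
    print(evaporation_volume_m3(4.0, 5.4))   # -> 216.0 m^3/day
    ```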

  14. Comparison of emission estimates for non-CO2 greenhouse gases from livestock and poultry in Korea from 1990 to 2010.

    PubMed

    Paik, Chunhyun; Chung, Yongjoo; Kim, Hugon; Kim, Young Jin

    2016-04-01

    It has often been claimed that non-carbon dioxide greenhouse gases (NCGGs), such as methane, nitrous oxide, and fluorinated greenhouse gases, are significant contributors to climate change. Here we investigate emission estimates of methane and nitrous oxide from livestock and poultry production, recognized as a major source of these NCGGs, in Korea over the period 1990 through 2010. Based on data on livestock and poultry populations, emission estimates of methane and nitrous oxide are first derived using the Tier 1 approach. The Tier 2 approach is then adopted to obtain emission estimates of methane and nitrous oxide from cattle, which are known to be the largest source of these NCGGs, accounting for about 70% of emissions from livestock and poultry in Korea. The results indicate that the Tier 2 estimates of methane and nitrous oxide emissions from enteric fermentation and manure management differ significantly from the Tier 1 estimates over the analysis period. © 2015 Japanese Society of Animal Science.
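
    The Tier 1 style of estimate mentioned above is essentially population times a default emission factor, summed over animal categories; a minimal sketch with invented populations and factors (the study used IPCC defaults and Korean activity data, which are not reproduced here):

    ```python
    # Tier 1-style enteric CH4: population x default emission factor, summed
    # over animal categories. Populations and factors are invented for
    # illustration only.

    def tier1_enteric_ch4_gg(populations: dict, ef_kg_per_head: dict) -> float:
        """Total enteric-fermentation CH4 in gigagrams (1 Gg = 1e6 kg)."""
        return sum(populations[a] * ef_kg_per_head[a] for a in populations) / 1e6

    pops = {"dairy_cattle": 400_000, "swine": 9_000_000}   # head (synthetic)
    efs  = {"dairy_cattle": 100.0,   "swine": 1.5}         # kg CH4/head/year
    print(tier1_enteric_ch4_gg(pops, efs))                 # -> 53.5 Gg CH4
    ```

    A Tier 2 estimate would instead derive cattle emission factors from gross energy intake and animal characteristics, which is why the two tiers can diverge as the abstract reports.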

  15. OMI observations of bromine monoxide emissions from salt lakes

    NASA Astrophysics Data System (ADS)

    Suleiman, R. M.; Chance, K.; Liu, X.; Gonzalez Abad, G.; Kurosu, T. P.

    2015-12-01

    In this study, we analyze bromine monoxide (BrO) data from the Ozone Monitoring Instrument (OMI) over various salt lakes, using OMI data from 2005 to 2014 to investigate BrO signatures. The salt lake regions covered include the Dead Sea; Salt Lake City, US; Salar de Uyuni, Bolivia; and Namtso, Tibet. Elevated BrO signatures were found in the July and August monthly averages over the Dead Sea. Similar results were found in the BrO monthly averages for August 2006 over the Bolivian salt flats. We present a detailed description of the retrieval algorithm for the OMI operational bromine monoxide (BrO) product. The algorithm is based on direct fitting of radiances from 319.0-347.5 nm, within the UV-2 channel of OMI. Radiances are modeled from the solar irradiance, attenuated by contributions from the target gas and interfering gases, rotational Raman scattering, additive and multiplicative closure polynomials, and a common-mode spectrum. The common-mode spectra (one per cross-track position, computed on-line) are the average of several hundred fitting residuals; they include any instrument effects that are unrelated to molecular scattering and absorption cross sections. The BrO retrieval uses albedo- and wavelength-dependent air mass factors (AMFs), pre-computed using climatological BrO profiles. The wavelength-dependent AMF is applied pre-fit to the BrO cross sections so that vertical column densities are retrieved directly. We validate OMI BrO with ground-based measurements from three stations (Harestua, Lauder, and Barrow) and with chemical transport model simulations. We analyze the global distribution and seasonal variation of BrO and investigate BrO emissions from volcanoes and salt lakes.
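
    The "pre-fit AMF" idea described above can be illustrated with a toy fit: scaling the absorption cross sections by a wavelength-dependent air-mass factor before fitting means the fitted coefficient is the vertical column density (VCD) directly, rather than a slant column. All numbers below are synthetic, not OMI data or the actual BrO cross section.

    ```python
    import numpy as np

    # Toy demonstration of fitting with AMF-scaled cross sections so the
    # retrieved coefficient is the vertical column density directly.
    # Cross section, AMF, and column are synthetic placeholders.

    wl = np.linspace(319.0, 347.5, 50)                 # nm, the fitting window
    sigma = 1e-17 * np.exp(-((wl - 330.0) / 5.0)**2)   # fake absorption cross section
    amf = 2.0 + 0.01 * (wl - 319.0)                    # fake wavelength-dependent AMF

    true_vcd = 5e13                                    # molecules/cm^2 (synthetic)
    optical_depth = true_vcd * amf * sigma             # simulated absorber signal

    # Least-squares fit using AMF-scaled cross sections: coefficient == VCD.
    sigma_eff = amf * sigma
    fitted_vcd = np.dot(sigma_eff, optical_depth) / np.dot(sigma_eff, sigma_eff)
    print(fitted_vcd)   # recovers ~5e13 directly, no separate slant-to-vertical step
    ```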

  16. Long-term simulations of dissolved oxygen concentrations in Lake Trout lakes

    NASA Astrophysics Data System (ADS)

    Jabbari, A.; Boegman, L.; MacKay, M.; Hadley, K.; Paterson, A.; Jeziorski, A.; Nelligan, C.; Smol, J. P.

    2016-02-01

    Lake Trout are a rare and valuable natural resource threatened by multiple environmental stressors. With the added threat of climate warming, there is growing concern among resource managers that increased thermal stratification will reduce the habitat quality of deep-water Lake Trout lakes through enhanced oxygen depletion. To address this issue, a three-part study is underway, which aims to analyze sediment cores to understand the past, develop empirical formulae to model the present, and apply computational models to forecast the future. This presentation reports on the computational modeling efforts. To this end, a simple dissolved oxygen sub-model has been embedded in the one-dimensional bulk mixed-layer thermodynamic Canadian Small Lake Model (CSLM). This model is currently being incorporated into the Canadian Land Surface Scheme (CLASS), the primary land surface component of Environment Canada's global and regional climate modelling systems. The oxygen model was calibrated and validated by hind-casting temperature and dissolved oxygen profiles from two Lake Trout lakes on the Canadian Shield. These data sets include 5 years of high-frequency (10 s to 10 min) data from Eagle Lake and 30 years of bi-weekly data from Harp Lake. Initial results show that temperature and dissolved oxygen were predicted with root mean square errors of <1.5 °C and <3 mg L-1, respectively. Ongoing work is validating the model, over climate-change-relevant timescales, against dissolved oxygen reconstructions from the sediment cores, and predicting future deep-water temperature and dissolved oxygen concentrations in Canadian Lake Trout lakes under future climate change scenarios. This model will provide a useful tool for managers to ensure sustainable fishery resources for future generations.
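
    The validation statistic quoted above, root mean square error, is straightforward to compute from paired observed and simulated profiles; the values below are illustrative, not the Eagle Lake or Harp Lake data.

    ```python
    import math

    # Root mean square error between observed and simulated profiles,
    # the fit statistic reported in the abstract. Data are synthetic.

    def rmse(observed, simulated):
        n = len(observed)
        return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated)) / n)

    obs_temp = [22.1, 18.4, 12.0, 8.5, 6.2]   # degrees C, synthetic depth profile
    sim_temp = [21.5, 19.0, 11.2, 9.1, 6.0]   # model output at the same depths
    print(rmse(obs_temp, sim_temp))           # ~0.59 C, well under the 1.5 C bound
    ```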

  17. Bimanual Psychomotor Performance in Neurosurgical Resident Applicants Assessed Using NeuroTouch, a Virtual Reality Simulator.

    PubMed

    Winkler-Schwartz, Alexander; Bajunaid, Khalid; Mullah, Muhammad A S; Marwa, Ibrahim; Alotaibi, Fahad E; Fares, Jawad; Baggiani, Marta; Azarnoush, Hamed; Zharni, Gmaan Al; Christie, Sommer; Sabbagh, Abdulrahman J; Werthner, Penny; Del Maestro, Rolando F

    Current selection methods for neurosurgical residents fail to include objective measurements of bimanual psychomotor performance. Advancements in computer-based simulation provide opportunities to assess cognitive and psychomotor skills in surgically naive populations during complex simulated neurosurgical tasks in risk-free environments. This pilot study was designed to answer 3 questions: (1) What are the differences in bimanual psychomotor performance among neurosurgical residency applicants using NeuroTouch? (2) Are there exceptionally skilled medical students in the applicant cohort? and (3) Is there an influence of previous surgical exposure on surgical performance? Participants were instructed to remove 3 simulated brain tumors with identical visual appearance, stiffness, and random bleeding points. Validated tier 1, tier 2, and advanced tier 2 metrics were used to assess bimanual psychomotor performance. Demographic data included weeks of neurosurgical elective and prior operative exposure. This pilot study was carried out at the McGill Neurosurgical Simulation Research and Training Center immediately following neurosurgical residency interviews at McGill University, Montreal, Canada. All 17 medical students interviewed were asked to participate, of which 16 agreed. Performances were clustered in definable top, middle, and bottom groups with significant differences for all metrics. Increased time spent playing music, increased applicant self-evaluated technical skills, high self-ratings of confidence, and increased skin closures statistically influenced performance on univariate analysis. A trend for both self-rated increased operating room confidence and increased weeks of neurosurgical exposure to increased blood loss was seen in multivariate analysis. Simulation technology identifies neurosurgical residency applicants with differing levels of technical ability. 
    These results provide information for the development of longitudinal studies on the acquisition, development, and maintenance of psychomotor skills. Customized training programs that maximize individual residents' bimanual psychomotor training, dependent on continuously updated and validated metrics from virtual reality simulation studies, should be explored. Copyright © 2016 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  18. 50 CFR 86.53 - What are funding tiers?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 50 Wildlife and Fisheries 9 2013-10-01 2013-10-01 false What are funding tiers? 86.53 Section 86... (BIG) PROGRAM How States Apply for Grants § 86.53 What are funding tiers? (a) This grant program will consist of two tiers of funding. (i) You may apply for one or both tiers. (ii) The two tiers will allow...

  19. Grow--a computer subroutine that projects the growth of trees in the Lake States' forests.

    Treesearch

    Gary J. Brand

    1981-01-01

    A computer subroutine, Grow, has been written in 1977 Standard FORTRAN to implement a distance-independent, individual tree growth model for Lake States' forests. Grow is a small and easy-to-use version of the growth model. All the user has to do is write a calling program to read initial conditions, call Grow, and summarize the results.

  20. 75 FR 7426 - Tier 2 Light-Duty Vehicle and Light-Duty Truck Emission Standards and Gasoline Sulfur Control...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-19

    ... 2060-AI23; 2060-AQ12 Tier 2 Light-Duty Vehicle and Light-Duty Truck Emission Standards and Gasoline... February 10, 2000 (65 FR 6698), EPA published emission standards for light-duty vehicles and light-duty... new passenger cars and light trucks, including pickup trucks, vans, minivans, and sport-utility...

  1. RTI in a Middle School: Findings and Practical Implications of a Tier 2 Reading Comprehension Study

    ERIC Educational Resources Information Center

    Faggella-Luby, Michael; Wardwell, Michelle

    2011-01-01

    Response to intervention (RTI) has received considerable attention from both researchers and practitioners as a schoolwide model for service delivery. However, research is limited on RTI applications in middle and high schools. The purpose of this article is to describe the outcomes of an experimental examination of a secondary (Tier 2) literacy…

  2. Intensity of Focus, Richness of Content: Crafting Tier 2 Response to Intervention in an Era of the Common Core

    ERIC Educational Resources Information Center

    Jaeger, Elizabeth L.

    2016-01-01

    This article describes a Tier 2 intervention program for fourth graders that is well suited to supporting implementation of the Common Core State Standards. Screening assessments and miscue analyses were used to clarify students' strengths and challenges. Students then attended only classes that were suited to their particular literacy needs,…

  3. Understanding Unresponsiveness to Tier 2 Reading Intervention: Exploring the Classification and Profiles of Adequate and Inadequate Responders in First Grade

    ERIC Educational Resources Information Center

    Toste, Jessica R.; Compton, Donald L.; Fuchs, Douglas; Fuchs, Lynn S.; Gilbert, Jennifer K.; Cho, Eunsoo; Barquero, Laura A.; Bouton, Bobette D.

    2014-01-01

    The purpose of the current study was to examine academic and cognitive profiles of first graders who responded adequately and inadequately to intensive small-group reading intervention (Tier 2), as well as assess how these profiles differ based on the criteria used for classification of unresponsiveness. Nonresponders were identified using two…

  4. The Targeted Reading Intervention (TRI): A Classroom Teacher Tier 2 Intervention to Help Struggling Readers in Early Elementary School

    ERIC Educational Resources Information Center

    Vernon-Feagans, Lynne; Amendum, Steve; Kainz, Kirsten; Ginsburg, Marnie

    2009-01-01

    The two studies presented in this report were designed to test the effectiveness of a new diagnostic-based reading intervention for classroom teachers, called the Targeted Reading Intervention (TRI). This TRI Tier 2 intervention stressed diagnostic teaching as the key to helping struggling readers make rapid progress in reading in the regular…

  5. Preliminary Evaluation of a Tier 2 Mathematics Intervention for First-Grade Students: Using a Theory of Change to Guide Formative Evaluation Activities

    ERIC Educational Resources Information Center

    Clarke, Ben; Doabler, Christian T.; Strand Cary, Mari; Kosty, Derek; Baker, Scott; Fien, Hank; Smolkowski, Keith

    2014-01-01

    This pilot study examined the efficacy of a Tier 2 first-grade mathematics intervention program targeting whole-number understanding for students at risk in mathematics. The study used a randomized block design. Students (N = 89) were randomly assigned to treatment (Fusion) or control (standard district practice) conditions. Measures of…

  6. Small Groups, Big Gains: Efficacy of a Tier 2 Phonological Awareness Intervention with Preschoolers with Early Literacy Deficits

    ERIC Educational Resources Information Center

    Kruse, Lydia G.; Spencer, Trina D.; Olszewski, Arnold; Goldstein, Howard

    2015-01-01

    Purpose: The purpose of the present study was to evaluate the efficacy of a phonological awareness (PA) intervention, designed for Tier 2 instruction in a Response to Intervention (RTI) model, delivered to small groups of preschoolers. Method: A multiple-baseline design across participants was used to evaluate the efficacy of the intervention on…

  7. Tier One: Draft Environmental Impact Statement. Volume 1. Realignment of Mountain Home Air Force Base and Proposed Expanded Range Capability

    DTIC Science & Technology

    1990-02-01

    one of the most valuable assets in the Air Force inventory. This Tier 1 EIS is one tool designed to contribute to the decisionmaking process. [The snippet continues with table-of-contents entries: M3.2.5 Emission Inventories; M3.5.3.2 Prehistoric Archaeological Resource Inventory; M3.5.4 Historic and Architectural.]

  8. Meeting the Needs of ELLs with Response to Instruction and Intervention: A Mixed Methods Case Study Focusing on the Implementation of Tier 2 Intervention

    ERIC Educational Resources Information Center

    Nguyen-Quang, Florence

    2012-01-01

    The purpose of this study was to examine how the Response to Instruction and Intervention (RtI2) framework was implemented at a culturally, ethnically, and linguistically diverse urban school. This study also evaluated the effectiveness of Burst: Early Literacy Intervention (Wireless Generation®, 2009), a Tier 2 intervention program, in regards to…

  9. Induction of a Tier-1-Like Phenotype in Diverse Tier-2 Isolates by Agents That Guide HIV-1 Env to Perturbation-Sensitive, Nonnative States.

    PubMed

    Johnson, Jacklyn; Zhai, Yinjie; Salimi, Hamid; Espy, Nicole; Eichelberger, Noah; DeLeon, Orlando; O'Malley, Yunxia; Courter, Joel; Smith, Amos B; Madani, Navid; Sodroski, Joseph; Haim, Hillel

    2017-08-01

    The envelope glycoproteins (Envs) on the surfaces of HIV-1 particles are targeted by host antibodies. Primary HIV-1 isolates demonstrate different global sensitivities to antibody neutralization; tier-1 isolates are sensitive, whereas tier-2 isolates are more resistant. Single-site mutations in Env can convert tier-2 into tier-1-like viruses. We hypothesized that such global change in neutralization sensitivity results from weakening of intramolecular interactions that maintain Env integrity. Three strategies commonly applied to perturb protein structure were tested for their effects on global neutralization sensitivity: exposure to low temperature, Env-activating ligands, and a chaotropic agent. A large panel of diverse tier-2 isolates from clades B and C was analyzed. Incubation at 0°C, which globally weakens hydrophobic interactions, causes gradual and reversible exposure of the coreceptor-binding site. In the cold-induced state, Envs progress at isolate-specific rates to unstable forms that are sensitive to antibody neutralization and then gradually lose function. Agents that mimic the effects of CD4 (CD4Ms) also induce reversible structural changes to states that exhibit isolate-specific stabilities. The chaotropic agent urea (at low concentrations) does not affect the structure or function of native Env. However, urea efficiently perturbs metastable states induced by cold and CD4Ms and increases their sensitivity to antibody neutralization and their inactivation rates. Therefore, chemical and physical agents can guide Env from the stable native state to perturbation-sensitive forms and modulate their stability to bestow tier-1-like properties on primary tier-2 strains. These concepts can be applied to enhance the potency of vaccine-elicited antibodies and microbicides at mucosal sites of HIV-1 transmission. IMPORTANCE An effective vaccine to prevent transmission of HIV-1 is a primary goal of the scientific and health care communities. 
Vaccine-elicited antibodies target the viral envelope glycoproteins (Envs) and can potentially inhibit infection. However, the potency of such antibodies is generally low. Single-site mutations in Env can enhance the global sensitivity of HIV-1 to neutralization by antibodies. We found that such a hypersensitivity phenotype can also be induced by agents that destabilize protein structure. Exposure to 0°C or low concentrations of Env-activating ligands gradually guides Env to metastable forms that expose cryptic epitopes and that are highly sensitive to neutralization. Low concentrations of the chaotropic agent urea do not affect native Env but destabilize perturbed states induced by cold or CD4Ms and increase their neutralization. The concept of enhancing antibody sensitivity by chemical agents that affect the structural stability of proteins can be applied to increase the potency of topical microbicides and vaccine-elicited antibodies. Copyright © 2017 American Society for Microbiology.

  10. The combined approach to lysis utilizing eptifibatide and rt-PA in acute ischemic stroke: the CLEAR stroke trial.

    PubMed

    Pancioli, Arthur M; Broderick, Joseph; Brott, Thomas; Tomsick, Thomas; Khoury, Jane; Bean, Judy; del Zoppo, Gregory; Kleindorfer, Dawn; Woo, Daniel; Khatri, Pooja; Castaldo, John; Frey, James; Gebel, James; Kasner, Scott; Kidwell, Chelsea; Kwiatkowski, Thomas; Libman, Richard; Mackenzie, Richard; Scott, Phillip; Starkman, Sidney; Thurman, R Jason

    2008-12-01

    Multiple approaches are being studied to enhance the rate of thrombolysis for acute ischemic stroke. Treatment of myocardial infarction with a combination of a reduced-dose fibrinolytic agent and a glycoprotein (GP) IIb/IIIa receptor antagonist has been shown to improve the rate of recanalization versus fibrinolysis alone. The combined approach to lysis utilizing eptifibatide and recombinant tissue-type plasminogen activator (rt-PA) (CLEAR) stroke trial assessed the safety of treating acute ischemic stroke patients within 3 hours of symptom onset with this combination. The CLEAR trial was a National Institutes of Health/National Institute of Neurological Disorders and Stroke-funded multicenter, double-blind, randomized, dose-escalation and safety study. Patients were randomized 3:1 to either low-dose rt-PA (tier 1=0.3 mg/kg, tier 2=0.45 mg/kg) plus eptifibatide (75 microg/kg bolus followed by 0.75 microg/kg per min infusion for 2 hours) or standard-dose rt-PA (0.9 mg/kg). The primary safety end point was the incidence of symptomatic intracerebral hemorrhage within 36 hours. Secondary analyses were performed regarding clinical efficacy. Ninety-four patients (40 in tier 1 and 54 in tier 2) were enrolled. The combination group of the 2 dose tiers (n=69) had a median age of 71 years and a median baseline National Institutes of Health Stroke Scale (NIHSS) score of 14, and the standard-dose rt-PA group (n=25) had a median age of 61 years and a median baseline NIHSS score of 10 (P=0.01 for NIHSS score). Fifty-two (75%) of the combination treatment group and 24 (96%) of the standard treatment group had a baseline modified Rankin scale score of 0 (P=0.04). There was 1 (1.4%; 95% CI, 0% to 4.3%) symptomatic intracranial hemorrhage in the combination group and 2 (8.0%; 95% CI, 0% to 19.2%) in the rt-PA-only arm (P=0.17). 
During randomization in tier 2, a review by the independent data safety monitoring board demonstrated that the safety profile of combination therapy at the tier 2 doses was such that further enrollment was statistically unlikely to indicate inadequate safety for the combination treatment group, the ultimate outcome of the study. Thus, the study was halted. There was a trend toward increased clinical efficacy of standard-dose rt-PA compared with the combination treatment group. The safety of the combination of reduced-dose rt-PA plus eptifibatide justifies further dose-ranging trials in acute ischemic stroke.

  11. Investigation of land subsidence due to climate changes in surrounding areas of Urmia Lake (located in northwest of Iran) using wavelet coherence analysis of geodetic measurements and meteorological data

    NASA Astrophysics Data System (ADS)

    Moghtased-Azar, K.; Mirzaei, A.; Nankali, H. R.; Tavakoli, F.

    2012-04-01

    Urmia Lake (a salt lake in northwest Iran) plays a valuable role in the environment, wildlife, and economy of Iran and the region, and now faces great challenges to its survival. The lake is in immediate and grave danger and is rapidly turning into a salt desert. In recent years, amid a new heat wave that Iran, like many other countries, is experiencing, the country has faced relatively reduced rainfall, and environmental activists have been warning of the potential dangers for several years. Geodetic measurements, e.g., repeated leveling of the first-order leveling network of Iran and continuous GPS measurements of the Iranian Permanent GPS Network (IPGN), show subsidence in the areas surrounding the lake. This paper investigates the relation between the subsidence and climate change in the area using the wavelet coherence of data from permanent GPS stations and daily meteorological data. The results show strong coherence between the subsidence inferred from GPS data and climate warming from January 2009 to the end of August 2009. Moreover, relative lake height variations computed from altimetry observations (TOPEX/POSEIDON (T/P), Jason-1, and Jason-2/OSTM) confirm maximum evaporation rates of the lake in this period.

  12. The Effects of a Tier 3 Intervention on the Mathematics Performance of Second Grade Students With Severe Mathematics Difficulties.

    PubMed

    Bryant, Brian R; Bryant, Diane Pedrotty; Porterfield, Jennifer; Dennis, Minyi Shih; Falcomata, Terry; Valentine, Courtney; Brewer, Chelsea; Bell, Kathy

    2016-01-01

    The purpose of this study was to determine the effectiveness of a systematic, explicit, intensive Tier 3 (tertiary) intervention on the mathematics performance of students in second grade with severe mathematics difficulties. A multiple-baseline design across groups of participants showed improved mathematics performance on number and operations concepts and procedures, which are the foundation for later mathematics success. In the previous year, 12 participants had experienced two doses (first and second semesters) of a Tier 2 intervention. In second grade, the participants continued to demonstrate low performance, falling below the 10th percentile on a researcher-designed universal screener and below the 16th percentile on a distal measure, thus qualifying for the intensive intervention. A project interventionist, who met with the students 5 days a week for 10 weeks (9 weeks for one group), conducted the intensive intervention. The intervention employed more intensive instructional design features than the previous Tier 2 secondary instruction, and also included weekly games to reinforce concepts and skills from the lessons. Spring results showed significantly improved mathematics performance (scoring at or above the 25th percentile) for most of the students, thus making them eligible to exit the Tier 3 intervention. © Hammill Institute on Disabilities 2014.

  13. Vaccine-Elicited Tier 2 HIV-1 Neutralizing Antibodies Bind to Quaternary Epitopes Involving Glycan-Deficient Patches Proximal to the CD4 Binding Site

    PubMed Central

    Crooks, Ema T.; Tong, Tommy; Chakrabarti, Bimal; Narayan, Kristin; Georgiev, Ivelin S.; Menis, Sergey; Huang, Xiaoxing; Kulp, Daniel; Osawa, Keiko; Muranaka, Janelle; Stewart-Jones, Guillaume; Destefano, Joanne; O’Dell, Sijy; LaBranche, Celia; Robinson, James E.; Montefiori, David C.; McKee, Krisha; Du, Sean X.; Doria-Rose, Nicole; Kwong, Peter D.; Mascola, John R.; Zhu, Ping; Schief, William R.; Wyatt, Richard T.; Whalen, Robert G.; Binley, James M.

    2015-01-01

    Eliciting broad tier 2 neutralizing antibodies (nAbs) is a major goal of HIV-1 vaccine research. Here we investigated the ability of native, membrane-expressed JR-FL Env trimers to elicit nAbs. Unusually potent nAb titers developed in 2 of 8 rabbits immunized with virus-like particles (VLPs) expressing trimers (trimer VLP sera) and in 1 of 20 rabbits immunized with DNA expressing native Env trimer, followed by a protein boost (DNA trimer sera). All 3 sera neutralized via quaternary epitopes and exploited natural gaps in the glycan defenses of the second conserved region of JR-FL gp120. Specifically, trimer VLP sera took advantage of the unusual absence of a glycan at residue 197 (present in 98.7% of Envs). Intriguingly, removing the N197 glycan (with no loss of tier 2 phenotype) rendered 50% or 16.7% (n = 18) of clade B tier 2 isolates sensitive to the two trimer VLP sera, showing broad neutralization via the surface masked by the N197 glycan. Neutralizing sera targeted epitopes that overlap with the CD4 binding site, consistent with the role of the N197 glycan in a putative “glycan fence” that limits access to this region. A bioinformatics analysis suggested shared features of one of the trimer VLP sera and monoclonal antibody PG9, consistent with its trimer-dependency. The neutralizing DNA trimer serum took advantage of the absence of a glycan at residue 230, also proximal to the CD4 binding site and suggesting an epitope similar to that of monoclonal antibody 8ANC195, albeit lacking tier 2 breadth. Taken together, our data show for the first time that strain-specific holes in the glycan fence can allow the development of tier 2 neutralizing antibodies to native spikes. Moreover, cross-neutralization can occur in the absence of protecting glycan. Overall, our observations provide new insights that may inform the future development of a neutralizing antibody vaccine. PMID:26023780

  14. Vaccine-Elicited Tier 2 HIV-1 Neutralizing Antibodies Bind to Quaternary Epitopes Involving Glycan-Deficient Patches Proximal to the CD4 Binding Site.

    PubMed

    Crooks, Ema T; Tong, Tommy; Chakrabarti, Bimal; Narayan, Kristin; Georgiev, Ivelin S; Menis, Sergey; Huang, Xiaoxing; Kulp, Daniel; Osawa, Keiko; Muranaka, Janelle; Stewart-Jones, Guillaume; Destefano, Joanne; O'Dell, Sijy; LaBranche, Celia; Robinson, James E; Montefiori, David C; McKee, Krisha; Du, Sean X; Doria-Rose, Nicole; Kwong, Peter D; Mascola, John R; Zhu, Ping; Schief, William R; Wyatt, Richard T; Whalen, Robert G; Binley, James M

    2015-05-01

    Eliciting broad tier 2 neutralizing antibodies (nAbs) is a major goal of HIV-1 vaccine research. Here we investigated the ability of native, membrane-expressed JR-FL Env trimers to elicit nAbs. Unusually potent nAb titers developed in 2 of 8 rabbits immunized with virus-like particles (VLPs) expressing trimers (trimer VLP sera) and in 1 of 20 rabbits immunized with DNA expressing native Env trimer, followed by a protein boost (DNA trimer sera). All 3 sera neutralized via quaternary epitopes and exploited natural gaps in the glycan defenses of the second conserved region of JR-FL gp120. Specifically, trimer VLP sera took advantage of the unusual absence of a glycan at residue 197 (present in 98.7% of Envs). Intriguingly, removing the N197 glycan (with no loss of tier 2 phenotype) rendered 50% or 16.7% (n = 18) of clade B tier 2 isolates sensitive to the two trimer VLP sera, showing broad neutralization via the surface masked by the N197 glycan. Neutralizing sera targeted epitopes that overlap with the CD4 binding site, consistent with the role of the N197 glycan in a putative "glycan fence" that limits access to this region. A bioinformatics analysis suggested shared features of one of the trimer VLP sera and monoclonal antibody PG9, consistent with its trimer-dependency. The neutralizing DNA trimer serum took advantage of the absence of a glycan at residue 230, also proximal to the CD4 binding site and suggesting an epitope similar to that of monoclonal antibody 8ANC195, albeit lacking tier 2 breadth. Taken together, our data show for the first time that strain-specific holes in the glycan fence can allow the development of tier 2 neutralizing antibodies to native spikes. Moreover, cross-neutralization can occur in the absence of protecting glycan. Overall, our observations provide new insights that may inform the future development of a neutralizing antibody vaccine.

  15. Hydrology of Crater, East and Davis Lakes, Oregon; with section on Chemistry of the Lakes

    USGS Publications Warehouse

    Phillips, Kenneth N.; Van Denburgh, A.S.

    1968-01-01

    Crater, East, and Davis Lakes are small bodies of fresh water that occupy topographically closed basins in Holocene volcanic terrane. Because the annual water supply exceeds annual evaporation, water must be lost by seepage from each lake. The seepage rates vary widely both in volume and in percentage of the total water supply. Crater Lake loses about 89 cfs (cubic feet per second), equivalent to about 72 percent of its average annual supply. East Lake loses about 2.3 cfs, or about 44 percent of its estimated supply. Davis Lake seepage varies greatly with lake level, but the average loss is about 150 cfs, more than 90 percent of its total supply. The destination of the seepage loss is not definitely known for any of the lakes. An approximate water budget was computed at stationary level for each lake, using estimates by the writer to supplement the available hydrologic data. The three lake waters are dilute. Crater Lake contains about 80 ppm (parts per million) of dissolved solids---mostly silica, sodium, and bicarbonate, with lesser amounts of calcium, sulfate, and chloride. Much of the dissolved-solids content of Crater Lake---especially the sulfate and chloride---may be related to fumarole and thermal-spring activity that presumably followed the collapse of Mount Mazama. Although Crater Lake loses an estimated 7,000 tons of its 1.5-million-ton salt content each year by leakage, the chemical character of the lake did not change appreciably between 1912 and 1964. East Lake contains 200 ppm of dissolved solids, including major proportions of calcium, sodium, bicarbonate, and sulfate, but almost no chloride. The lake apparently receives much of its dissolved solids from subsurface thermal springs. The annual solute loss from East Lake by leakage is about 450 tons, or 3 percent of the lake's estimated 15,000-ton solute content. Davis Lake contains only 48 ppm of dissolved solids, much of which is silica and bicarbonate; chloride is almost completely absent. 
Approximate physical and hydrologic data for the lakes are summarized in the following table.
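    The budget arithmetic in the abstract can be checked directly from the figures it reports. The sketch below uses only the stated seepage rates (cfs) and seepage percentages; the total-supply values are back-calculated from those two numbers, so they are illustrative consistency checks, not measured quantities from the report.

    ```python
    def implied_supply(seepage_cfs: float, seepage_fraction: float) -> float:
        """Back-calculate total annual supply (cfs) from seepage and its fraction of supply."""
        return seepage_cfs / seepage_fraction

    # Reported seepage (cfs) and the fraction of total supply it represents.
    lakes = {
        "Crater Lake": (89.0, 0.72),
        "East Lake": (2.3, 0.44),
        "Davis Lake": (150.0, 0.90),  # abstract says "more than 90 percent"
    }

    for name, (seepage, frac) in lakes.items():
        supply = implied_supply(seepage, frac)
        print(f"{name}: seepage {seepage} cfs -> implied total supply ~{supply:.0f} cfs")

    # Crater Lake solute budget: ~7,000 tons/yr leak from a ~1.5-million-ton
    # store, i.e. under 0.5 percent per year -- consistent with the observation
    # that the lake's chemical character changed little between 1912 and 1964.
    annual_loss_fraction = 7_000 / 1_500_000
    print(f"Crater Lake annual solute loss: {annual_loss_fraction:.2%} of stored solutes")
    ```

    At under half a percent of the stored solutes lost per year, detectable chemical change over a five-decade observation window would indeed be marginal, which is the point the abstract makes.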

  16. KSC-99pp1226

    NASA Image and Video Library

    1999-10-06

    Nancy Nichols, principal of South Lake Elementary School, Titusville, Fla., joins students in teacher Michelle Butler's sixth grade class who are unwrapping computer equipment donated by Kennedy Space Center. South Lake is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  17. Genomic sequencing in cystic fibrosis newborn screening: what works best, two-tier predefined CFTR mutation panels or second-tier CFTR panel followed by third-tier sequencing?

    PubMed

    Currier, Robert J; Sciortino, Stan; Liu, Ruiling; Bishop, Tracey; Alikhani Koupaei, Rasoul; Feuchtbaum, Lisa

    2017-10-01

    Purpose: The purpose of this study was to model the performance of several known two-tier predefined mutation panels and of three-tier algorithms for cystic fibrosis (CF) screening in the ethnically diverse California population. Methods: The cystic fibrosis transmembrane conductance regulator (CFTR) mutations identified among the 317 CF cases screened in California between 12 August 2008 and 18 December 2012 were used to compare the expected CF detection rates of several two- and three-tier screening approaches, including the current California approach, which consists of a population-specific 40-mutation panel followed by third-tier sequencing when indicated. Results: The data show that third-tier sequencing improves CF detection when an initial elevated immunoreactive trypsinogen is followed by detection of only one mutation on a second-tier panel. Conclusion: In a diverse population, a second-tier panel followed by third-tier CFTR gene sequencing provides a better CF detection rate than a second-tier approach alone, and is an effective way to minimize referrals of CF carriers for sweat testing. Restricting second-tier testing to predefined mutation panels, even broad ones, results in some missed CF cases and demonstrates the limited utility of that approach in states with diverse multiethnic populations.

  18. USE OF A LUMPED MODEL (MAGIC) TO BOUND THE ESTIMATION OF POTENTIAL FUTURE EFFECTS OF SULFUR AND NITROGEN DEPOSITION ON LAKE CHEMISTRY IN THE ADIRONDACK MOUNTAINS

    EPA Science Inventory

    Leaching of atmospherically deposited nitrogen from forested watersheds can acidify lakes and streams. Using a modified version of the Model of Acidification of Groundwater in Catchments, we made computer simulations of such effects for 36 lake catchments in the Adirondack Mount...

  19. The Design and Development of a Computerized Attention-Training Game System for School-Aged Children

    ERIC Educational Resources Information Center

    Wang, Tsui-Ying; Huang, Ho-Chuan

    2013-01-01

    A computerized attention-training game system has been developed to support attention training for school-aged children. The present system offers various types of computer games that provide training in different aspects of attention, such as selective attention, sustained attention, and divided attention. The N-tier architecture of the Web-based…

  20. SAGE as a Source for Undergraduate Research Projects

    ERIC Educational Resources Information Center

    Hutz, Benjamin

    2017-01-01

    This article examines the use of the computer algebra system SAGE for undergraduate student research projects. After reading this article, the reader should understand the benefits of using SAGE as a source of research projects and how to commence working with SAGE. The author proposes a tiered working group model to allow maximum benefit to the…
