Science.gov

Sample records for implementing high availability

  1. Implementation and use of a highly available and innovative IaaS solution: the Cloud Area Padovana

    NASA Astrophysics Data System (ADS)

    Aiftimiei, C.; Andreetto, P.; Bertocco, S.; Biasotto, M.; Dal Pra, S.; Costa, F.; Crescente, A.; Dorigo, A.; Fantinel, S.; Fanzago, F.; Frizziero, E.; Gulmini, M.; Michelotto, M.; Sgaravatto, M.; Traldi, S.; Venaruzzo, M.; Verlato, M.; Zangrando, L.

    2015-12-01

    While in the business world the cloud paradigm is typically implemented by purchasing resources and services from third-party providers (e.g. Amazon), in the scientific environment there is usually a need for on-premises IaaS infrastructures that allow efficient usage of hardware distributed among (and owned by) different scientific administrative domains. In addition, the requirement of open-source adoption has led many organizations to choose products such as OpenStack. We describe a use case of the Italian National Institute for Nuclear Physics (INFN) which resulted in the implementation of a unique cloud service, called 'Cloud Area Padovana', which encompasses resources spread over two different sites: the INFN Legnaro National Laboratories and the INFN Padova division. We describe how this IaaS has been implemented, which technologies have been adopted, and how services have been configured in high-availability (HA) mode. We also discuss how identity and authorization management were implemented, adopting a widely accepted standard architecture based on SAML2 and OpenID: by leveraging the versatility of those standards, integration with authentication federations such as IDEM was implemented. We also discuss some other innovative developments, such as a pluggable scheduler, implemented as an extension of the native OpenStack scheduler, which allocates resources according to a fair-share model and provides a persistent queuing mechanism for user requests that cannot be served immediately. The tools, technologies, and procedures used to install, configure, monitor, and operate this cloud service are also discussed. Finally, we present some examples of how this IaaS infrastructure is being used.
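    The fair-share scheduling extension described above can be sketched as a priority queue keyed on each project's usage-to-share ratio. The following is an illustrative toy in Python, not the actual OpenStack extension; the class, method, and project names are all hypothetical:

```python
import heapq
from collections import defaultdict

class FairShareQueue:
    """Toy sketch of a fair-share scheduler with a queue of pending
    requests (illustrative only, not the OpenStack extension itself)."""

    def __init__(self, shares):
        self.shares = shares                  # project -> share weight
        self.usage = defaultdict(float)       # project -> consumed units
        self.pending = []                     # min-heap of (priority, seq, ...)
        self._seq = 0                         # FIFO tie-breaker

    def submit(self, project, request):
        # The fair-share key: the more a project has already consumed
        # relative to its share, the lower its priority.
        prio = self.usage[project] / self.shares[project]
        heapq.heappush(self.pending, (prio, self._seq, project, request))
        self._seq += 1

    def dispatch(self, cost=1.0):
        # Pop the most "deserving" request and charge the project's usage.
        prio, _, project, request = heapq.heappop(self.pending)
        self.usage[project] += cost
        return project, request

q = FairShareQueue({"cms": 2.0, "lhcb": 1.0})
q.submit("cms", "vm-1")
q.submit("lhcb", "vm-2")
first, _ = q.dispatch()   # both start at usage 0, so FIFO breaks the tie
```

    A request from the project that has consumed the least relative to its share is dispatched first; requests that cannot be served immediately simply remain in the heap, mirroring the persistent-queue idea in the abstract.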

  2. High Availability Electronics Standards

    SciTech Connect

    Larsen, R.S.; /SLAC

    2006-12-13

    Availability modeling of the proposed International Linear Collider (ILC) predicts unacceptably low uptime with current electronics systems designs. High Availability (HA) analysis is being used as a guideline for all major machine systems including sources, utilities, cryogenics, magnets, power supplies, instrumentation and controls. R&D teams are seeking to achieve total machine high availability with nominal impact on system cost. The focus of this paper is the investigation of commercial standard HA architectures and packaging for Accelerator Controls and Instrumentation. Application of HA design principles to power systems and detector instrumentation are also discussed.
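    A common first-order model behind availability analyses of this kind treats each subsystem's steady-state availability as MTBF/(MTBF + MTTR), with a machine of subsystems in series being up only when every subsystem is up. The sketch below uses made-up numbers, not the ILC study's actual figures:

```python
def availability(mtbf_hours, mttr_hours):
    """Steady-state availability of one subsystem."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Illustrative numbers only -- not the ILC study's actual figures.
subsystems = {
    "power supplies": availability(5000, 2),
    "cryogenics":     availability(2000, 8),
    "controls":       availability(10000, 1),
}

# Subsystems in series: the machine is up only if all are up,
# so the machine availability is the product of the factors.
machine_availability = 1.0
for a in subsystems.values():
    machine_availability *= a
```

    Even with each subsystem above 99.9% available, the product over many series subsystems erodes quickly, which is why HA design targets both longer MTBF and shorter MTTR (e.g. via hot-swappable redundancy).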

  3. Small PACS implementation using publicly available software

    NASA Astrophysics Data System (ADS)

    Passadore, Diego J.; Isoardi, Roberto A.; Gonzalez Nicolini, Federico J.; Ariza, P. P.; Novas, C. V.; Omati, S. A.

    1998-07-01

    Building cost-effective PACS solutions is a main concern in developing countries. Hardware and software components are generally much more expensive than in developed countries, and tighter financial constraints further contribute to a slow rate of PACS implementation. The extensive use of the Internet for sharing resources and information has brought a broad number of freely available software packages to an ever-increasing number of users. In the field of medical imaging it is possible to find image format conversion packages, DICOM-compliant servers for all kinds of service classes, databases, web servers, and tools for image visualization, manipulation and analysis. This paper describes a PACS implementation for review and storage built on freely available software. It currently integrates four diagnostic modalities (PET, CT, MR and NM), a Radiotherapy Treatment Planning workstation and several computers in a local area network, for image storage, database management and image review, processing and analysis. It also includes a web-based application that allows remote users to query the archive for studies from any workstation and to view the corresponding images and reports. We conclude that the advantage of using this approach is twofold. It allows a full understanding of all the issues involved in the implementation of a PACS, and it also helps keep costs down while enabling the development of a functional system for storage, distribution and review that can prove helpful for radiologists and referring physicians.

  4. High Availability in Optical Networks

    NASA Astrophysics Data System (ADS)

    Grover, Wayne D.; Wosinska, Lena; Fumagalli, Andrea

    2005-09-01

    Call for Papers: High Availability in Optical Networks Submission Deadline: 1 January 2006 The Journal of Optical Networking (JON) is soliciting papers for a feature Issue pertaining to all aspects of reliable components and systems for optical networks and concepts, techniques, and experience leading to high availability of services provided by optical networks. Most nations now recognize that telecommunications in all its forms -- including voice, Internet, video, and so on -- are "critical infrastructure" for the society, commerce, government, and education. Yet all these services and applications are almost completely dependent on optical networks for their realization. "Always on" or apparently unbreakable communications connectivity is the expectation from most users and for some services is the actual requirement as well. Achieving the desired level of availability of services, and doing so with some elegance and efficiency, is a meritorious goal for current researchers. This requires development and use of high-reliability components and subsystems, but also concepts for active reconfiguration and capacity planning leading to high availability of service through unseen fast-acting survivability mechanisms. The feature issue is also intended to reflect some of the most important current directions and objectives in optical networking research, which include the aspects of integrated design and operation of multilevel survivability and realization of multiple Quality-of-Protection service classes. Dynamic survivable service provisioning, or batch re-provisioning is an important current theme, as well as methods that achieve high availability at far less investment in spare capacity than required by brute force service path duplication or 100% redundant rings, which is still the surprisingly prevalent practice. Papers of several types are envisioned in the feature issue, including outlook and forecasting types of treatments, optimization and analysis, new

  5. Implementation and Validation of Uncertainty Analysis of Available Energy and Available Power

    SciTech Connect

    Jon P. Christophersen; John L. Morrison; B. J. Schubert; Shawn Allred

    2007-04-01

    The Idaho National Laboratory does extensive testing and evaluation of state-of-the-art batteries and ultracapacitors for hybrid-electric vehicle applications as part of the FreedomCAR and Vehicle Technologies Program. Significant parameters of interest include Available Energy and Available Power. Documenting the uncertainty analysis of these derived parameters is a very complex problem. The error is an unknown combination of both linearity and offset; the analysis presented in this paper computes the uncertainty both ways and then assumes the most conservative result (the worst-case scenario). Each method requires the use of over 134 equations, some of which are derived and some of which use measured values. This includes the measurement device error (calibration error) as well as bit resolution and analog noise error (standard deviation error). The implementation of these equations to acquire a closed-form answer was done using Matlab (an array-based programming language) and validated using Monte Carlo simulations.
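    The validation approach described, comparing a closed-form uncertainty propagation against Monte Carlo simulation, can be illustrated on a deliberately tiny example: a single power measurement rather than the paper's ~134 equations, with entirely hypothetical numbers:

```python
import math
import random

random.seed(42)

# Hypothetical measurement: power P = V * I with independent Gaussian
# uncertainties on V and I (a stand-in for the paper's derived quantities).
V, u_V = 3.30, 0.01     # volts, 1-sigma uncertainty
I, u_I = 1.20, 0.005    # amperes, 1-sigma uncertainty

# Closed-form first-order propagation: u_P^2 = (I*u_V)^2 + (V*u_I)^2.
u_P_closed = math.sqrt((I * u_V) ** 2 + (V * u_I) ** 2)

# Monte Carlo validation: sample V and I, form P, take the sample std.
N = 200_000
samples = [random.gauss(V, u_V) * random.gauss(I, u_I) for _ in range(N)]
mean = sum(samples) / N
u_P_mc = math.sqrt(sum((s - mean) ** 2 for s in samples) / (N - 1))
```

    Agreement between `u_P_closed` and `u_P_mc` within sampling noise is the kind of cross-check the paper's Matlab/Monte Carlo comparison performs at much larger scale.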

  6. Designing and Running for High Accelerator Availability

    SciTech Connect

    Willeke,F.

    2009-05-04

    The report provides an overview and examples of high availability design considerations and operational aspects making references to some of the available methods to assess and improve on accelerator reliability.

  7. 45 CFR 162.920 - Availability of implementation specifications.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... incorporation by reference in subparts I through S of this part in accordance with 5 U.S.C. 552(a) and 1 CFR... Security Boulevard, Baltimore, Maryland 21244. For more information on the availability on the materials at... obtained from the National Council for Prescription Drug Programs, 9240 East Raintree Drive, Scottsdale,...

  8. GPS time transfer with implementation of selective availability

    NASA Astrophysics Data System (ADS)

    Allan, David W.; Granveaud, Michel P.; Klepczynski, William J.; Lewandowski, Wlodzimierz W.

    1990-05-01

    The international community of time metrology is facing a major challenge with the Selective Availability (SA) degradation of GPS satellite signals. At present there are 6 Block 1 satellites and 8 Block 2 satellites operating. According to the policy of the U.S. Department of Defense, the Block 1 satellite signals will not be degraded, but these satellites are old and have a finite life. The Block 2 satellites, which have all been launched since 1988, were subject to Selective Availability from March 25, 1990. The effect of SA should be to limit precision to about 100 meters for navigation and 167 ns for timing. A study was conducted in order to understand the nature of the actual introduced degradation, and to develop means of removing the effects of this degradation on time transfer. This study concerns the time extraction from GPS satellites at NIST, USNO and the Paris Observatory, and the comparison of atomic clocks between these laboratories by the common-view approach. The results show that when using data taken over several days, the time extraction can be achieved with an uncertainty of a few tens of nanoseconds, while strict common view entirely removed the effects of SA during the periods under study.
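    The common-view cancellation works because both laboratories observe the same satellite at the same epoch, so the SA dither enters both readings identically and drops out of their difference. A minimal numerical sketch, with entirely made-up values:

```python
import random

random.seed(1)

# Toy common-view illustration: two labs track the same satellite at the
# same epoch. The SA dither (~167 ns rms) is common to both readings and
# cancels in the difference; only receiver noise (a few ns) remains.
true_offset_ns = 250.0   # hypothetical clock A - clock B offset, in ns

diffs = []
for _ in range(100):
    sa_dither = random.gauss(0.0, 167.0)               # SA timing degradation
    meas_a = sa_dither + random.gauss(0, 3)            # (GPS - clock A) reading
    meas_b = sa_dither + true_offset_ns + random.gauss(0, 3)  # (GPS - clock B)
    diffs.append(meas_b - meas_a)                      # SA term cancels exactly

estimate = sum(diffs) / len(diffs)   # recovers clock A - clock B
```

    Averaging the differences over many common-view tracks leaves only receiver noise, which is how a few-tens-of-nanoseconds uncertainty survives a degradation of 167 ns.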

  9. GPS time transfer with implementation of selective availability

    NASA Technical Reports Server (NTRS)

    Allan, David W.; Granveaud, Michel P.; Klepczynski, William J.; Lewandowski, Wlodzimierz W.

    1990-01-01

    The international community of time metrology is facing a major challenge with the Selective Availability (SA) degradation of GPS satellite signals. At present there are 6 Block 1 satellites and 8 Block 2 satellites operating. According to the policy of the U.S. Department of Defense, the Block 1 satellite signals will not be degraded, but these satellites are old and have a finite life. The Block 2 satellites, which have all been launched since 1988, were subject to Selective Availability from March 25, 1990. The effect of SA should be to limit precision to about 100 meters for navigation and 167 ns for timing. A study was conducted in order to understand the nature of the actual introduced degradation, and to develop means of removing the effects of this degradation on time transfer. This study concerns the time extraction from GPS satellites at NIST, USNO and the Paris Observatory, and the comparison of atomic clocks between these laboratories by the common-view approach. The results show that when using data taken over several days, the time extraction can be achieved with an uncertainty of a few tens of nanoseconds, while strict common view entirely removed the effects of SA during the periods under study.

  10. Digitally Controlled High Availability Power Supply

    SciTech Connect

    MacNair, David; /SLAC

    2008-09-25

    This paper reports the design and test results of novel-topology, high-efficiency, low-operating-temperature, 1,320-watt power modules for high availability power supplies. The modules permit parallel operation for N+1 redundancy with hot-swap capability. An embedded DSP provides intelligent start-up and shutdown, output regulation, general control and fault detection. PWM modules in the DSP drive the FET switches at 20 to 100 kHz. The DSP also ensures current sharing between modules, synchronized switching, and soft start-up for hot swapping. The module voltage and current have dedicated ADCs (>200 kS/s) to provide pulse-by-pulse output control. A dual CAN bus interface provides low-cost redundant control paths. Over-rated module components provide high reliability and high efficiency at full load. Low on-resistance FETs replace conventional diodes in the buck regulator. Saturable inductors limit the FET reverse diode current during switching. The modules operate in a two-quadrant mode, allowing bipolar output from complementary module groups. Controllable, low-resistance FETs at the input and output provide fault isolation and allow module hot swapping.

  11. 75 FR 69373 - Source Specific Federal Implementation Plan for Implementing Best Available Retrofit Technology...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-12

    ... Federal Register on October 19, 2010 (75 FR 64221). Given the significant public interest in this source... rule was published in the Federal Register on October 19, 2010 (75 FR 64221) and can be accessed the... AGENCY 40 CFR Part 49 Source Specific Federal Implementation Plan for Implementing Best...

  12. ATF2 High Availability Power Supplies

    SciTech Connect

    Bellomo, A; Lira, C.de; Lam, B.; MacNair, D.; White, G.; /SLAC

    2008-06-27

    ATF2 is an accelerator test facility modeled after the final focus beamline envisioned for the ILC. By the end of 2008, KEK plans to commission the ATF2 [1]. SLAC and OCEM collaborated on the design of 38 power systems for beamline magnets, ranging in output power from 1.5 kW to 6 kW. Since high availability is essential for the success of the ILC, the collaborators employed an N+1 modular approach, allowing for redundancy and the use of a single power module rating. This approach increases the availability of the power systems, and common power modules reduce inventory and ease maintenance. Current stability requirements are as tight as 10 ppm. A novel, SLAC-designed 20-bit Ethernet power supply controller provides the required precision current regulation. In this paper, the collaborators present the power system design, the expected reliability, fault immunity features, and the methods for satisfying the control and monitoring challenges. Test results and the status of the power systems are also presented.

  13. Digitally Controlled High Availability Power Supply

    SciTech Connect

    MacNair, David; /SLAC

    2009-05-07

    This paper reports test results for a prototype 1,320-watt power module for a high availability power supply. The module allows parallel operation for N+1 redundancy with hot-swap capability. The two-quadrant output of each module allows pairs of modules to provide 4-quadrant (bipolar) operation. Each module employs a novel 4-FET buck regulator arranged in a bridge configuration. Each side of the bridge alternately conducts through a small saturable ferrite that limits the reverse current in the FET body diode during turn-off. This allows hard switching of the FETs with low switching losses. The module is designed with over-rated components to provide high reliability and better than 97% efficiency at full load. The modules use a Microchip DSP for control, monitoring, and fault detection. The switching FETs are driven by PWM modules in the DSP at 60 kHz. A dual CAN bus interface provides low-cost redundant control paths. The DSP also provides current sharing between modules, synchronized switching, and soft start-up for hot swapping. The input and output of each module have low-resistance FETs to allow hot swapping and isolation of faulted units.

  14. High Available COTS Based Computer for Space

    NASA Astrophysics Data System (ADS)

    Hartmann, J.; Magistrati, Giorgio

    2015-09-01

    The availability and reliability factors of a system are central requirements of a target application. From a simple fuel injection system used in cars up to a flight control system of an autonomous navigating spacecraft, each application defines its specific availability factor under the target application's boundary conditions. Increasing quality requirements on data processing systems used in space flight applications call for new architectures that fulfill the availability and reliability demands as well as the increase in required data processing power. Alongside these increased quality demands, simplification and the use of COTS components to decrease costs, while keeping interface compatibility with currently used system standards, are clear customer needs. Data processing system design is mostly dominated by strict fulfillment of customer requirements, and reuse of available computer systems has not always been possible because of obsolescence of EEE parts, insufficient I/O capabilities, or the fact that available data processing systems did not provide the required scalability and performance.

  15. 77 FR 43205 - Notice of Data Availability for Approval, Disapproval and Promulgation of Implementation Plans...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... FR 33022, June 4, 2012). III. New Information Placed in the Docket EPA requests comment on the... Implementation Plans; State of Wyoming; Regional Haze State Implementation Plan; Federal Implementation Plan for Regional Haze AGENCY: Environmental Protection Agency. ACTION: Notice of data availability (NODA)....

  16. 75 FR 44007 - Notice of Availability: Notice of Fiscal Year (FY) 2009 Implementation of the Veterans...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ... Development, HUD. ACTION: Notice. SUMMARY: Through this notice, HUD announces the availability on its website... URBAN DEVELOPMENT Notice of Availability: Notice of Fiscal Year (FY) 2009 Implementation of the Veterans... Development. BILLING CODE 4210-67-P...

  17. Status Update for Implementation of Best Available Technology per DOE Order 5400.5

    SciTech Connect

    C. A. Major

    1999-07-01

    This report provides an update, as of July 1999, on the implementation of best available technology to control or eliminate radionuclide discharges to soil columns at facilities at the Idaho National Engineering and Environmental Laboratory in accordance with DOE Order 5400.5, ''Radiation Protection of the Public and Environment.'' The best available technology to reduce or eliminate radionuclide discharges to soil columns currently implemented by the different facilities appears to be generally effective. Therefore, the different facilities should continue their current best available technology approaches, and also implement the specific recommendations listed in this report for their respective facility.

  18. A high-availability distributed hardware control system using Java

    NASA Astrophysics Data System (ADS)

    Niessner, Albert F.

    2010-07-01

    Two independent coronagraph experiments that require 24/7 availability, with different optical layouts and different motion control requirements, are commanded and controlled with the same Java software system executing on many geographically scattered computer systems interconnected via TCP/IP. High availability of a distributed system requires that the computers have a robust messaging system, making the mix of TCP/IP (a robust transport) and XML (a robust message format) a natural choice. XML also adds configuration flexibility. Java then adds object-oriented paradigms, exception handling, heavily tested libraries, and many third-party tools for implementation robustness. The result is a software system that provides users 24/7 access to two diverse experiments, with XML files defining the differences.

  19. A High-Availability, Distributed Hardware Control System Using Java

    NASA Technical Reports Server (NTRS)

    Niessner, Albert F.

    2011-01-01

    Two independent coronagraph experiments that require 24/7 availability, with different optical layouts and different motion control requirements, are commanded and controlled with the same Java software system executing on many geographically scattered computer systems interconnected via TCP/IP. High availability of a distributed system requires that the computers have a robust messaging system, making the mix of TCP/IP (a robust transport) and XML (a robust message format) a natural choice. XML also adds configuration flexibility. Java then adds object-oriented paradigms, exception handling, heavily tested libraries, and many third-party tools for implementation robustness. The result is a software system that provides users 24/7 access to two diverse experiments, with XML files defining the differences.

  20. Implementation of a versatile research data acquisition system using a commercially available medical ultrasound scanner.

    PubMed

    Hemmsen, Martin Christian; Nikolov, Svetoslav Ivanov; Pedersen, Mads Møller; Pihl, Michael Johannes; Enevoldsen, Marie Sand; Hansen, Jens Munk; Jensen, Jørgen Arendt

    2012-07-01

    This paper describes the design and implementation of a versatile, open-architecture research data acquisition system using a commercially available medical ultrasound scanner. The open architecture will allow researchers and clinicians to rapidly develop applications and move them relatively easily to the clinic. The system consists of a standard PC equipped with a camera link and an ultrasound scanner equipped with a research interface. The ultrasound scanner is an easy-to-use imaging device that is capable of generating high-quality images. In addition to supporting the acquisition of multiple data types, such as B-mode, M-mode, pulsed Doppler, and color flow imaging, the machine provides users with full control over imaging parameters such as transmit level, excitation waveform, beam angle, and focal depth. Beamformed RF data can be acquired from regions of interest throughout the image plane and stored to a file with a simple button press. For clinical trials and investigational purposes, when an identical image plane is desired for both an experimental and a reference data set, interleaved data can be captured. This form of data acquisition allows switching between multiple setups while maintaining identical transducer, scanner, region of interest, and recording time. Data acquisition is controlled through a graphical user interface running on the PC. This program implements an interface for third-party software to interact with the application. A software development toolkit was developed to give researchers and clinicians the ability to use third-party software for data analysis and flexible manipulation of control parameters. Because of its speed of acquisition and clinical benefit, research projects have successfully used the system to test and implement customized solutions for different applications. Three examples of system use are presented in this paper: evaluation of synthetic aperture sequential beamformation, transverse

  1. Universal Basic Education in Nigeria: Availability of Schools' Infrastructure for Effective Program Implementation

    ERIC Educational Resources Information Center

    Ikoya, Peter O.; Onoyase, D.

    2008-01-01

    This paper examines the availability and adequacy of schools' infrastructural facilities for implementation of the Universal Basic Education program in Nigeria. Adopting the "ex post facto" design, the researchers used existing school data on physical facilities, including a survey of key stakeholders in the education sector. Data analysed…

  2. 78 FR 4800 - Approval and Promulgation of Air Quality Implementation Plans; Connecticut; Reasonably Available...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-23

    ...EPA is proposing approval of State Implementation Plan revisions submitted by the State of Connecticut. These SIP revisions consist of a demonstration that Connecticut meets the requirements of reasonably available control technology for oxides of nitrogen and volatile organic compounds set forth by the Clean Air Act with respect to the 1997 8-hour ozone standard. Additionally, we are......

  3. Nationwide survey on resource availability for implementing current sepsis guidelines in Mongolia

    PubMed Central

    Bataar, Otgon; Lundeg, Ganbold; Tsenddorj, Ganbat; Jochberger, Stefan; Grander, Wilhelm; Baelani, Inipavudu; Wilson, Iain; Baker, Tim

    2010-01-01

    Objective: To assess whether secondary and tertiary hospitals in Mongolia have the resources needed to implement the 2008 Surviving Sepsis Campaign (SSC) guidelines. Methods: To obtain key informant responses, we conducted a nationwide survey by sending a 74-item questionnaire to head physicians of the intensive care unit or department for emergency and critically ill patients at 44 secondary and tertiary hospitals in Mongolia. The questionnaire inquired about the availability of the hospital facilities, equipment, drugs and disposable materials required to implement the SSC guidelines. Descriptive methods were used for statistical analysis. Comparisons between central and peripheral hospitals were performed using non-parametric tests. Findings: The response rate was 86.4% (38/44). No Mongolian hospital had the resources required to consistently implement the SSC guidelines. The median percentage of implementable recommendations and suggestions combined was 52.8% (interquartile range, IQR: 45.8–67.4%); of implementable recommendations only, 68% (IQR: 58.0–80.5%); and of implementable suggestions only, 43.5% (IQR: 34.8–57.6%). These percentages did not differ between hospitals located in the capital city and those located in rural areas. Conclusion: The results of this study strongly suggest that the most recent SSC guidelines cannot be implemented in Mongolia due to a dramatic shortage of the required hospital facilities, equipment, drugs and disposable materials. Further studies are needed on current awareness of the problem, development of national reporting systems and guidelines for sepsis care in Mongolia, as well as on the quality of diagnosis and treatment and of the training of health-care professionals. PMID:21076565

  4. SeaDataNet network services monitoring: Definition and Implementation of Service availability index

    NASA Astrophysics Data System (ADS)

    Lykiardopoulos, Angelos; Mpalopoulou, Stavroula; Vavilis, Panagiotis; Pantazi, Maria; Iona, Sissy

    2014-05-01

    SeaDataNet (SDN) is a standardized system for managing large and diverse data sets collected by oceanographic fleets and automatic observation systems. The SeaDataNet network is constituted of national oceanographic data centres of 35 countries active in data collection. The SeaDataNet II project's objective is to upgrade the present SeaDataNet infrastructure into an operationally robust and state-of-the-art infrastructure; Network Monitoring is a step in this direction. The term Network Monitoring describes the use of a system that constantly monitors a computer network for slow or failing components and notifies the network administrator in case of outages. Network monitoring is crucial when implementing widely distributed systems over the Internet and in real-time systems, as it detects malfunctions that may occur and notifies the system administrator, who can immediately respond and correct the problem. In the framework of the SeaDataNet II project, a monitoring system was developed in order to monitor the SeaDataNet components. The core system is based on Nagios software. Some plug-ins were developed to support SeaDataNet modules. On top of the Nagios engine, a web portal was developed to give local administrators of SeaDataNet components access to detailed logs of their own service(s). Currently the system monitors 35 SeaDataNet Download Managers, 9 SeaDataNet Services, 25 GeoSeas Download Managers and 23 UBSS Download Managers. Taking advantage of the continuous monitoring of SeaDataNet system components, a total availability index will be implemented. The term availability can be defined as the ability of a functional unit to be in a state to perform a required function under given conditions at a given instant of time or over a given time interval, assuming that the required external resources are provided. Availability measures can be considered a very important benefit because of the availability trends that can be
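    A total availability index of the kind described can, in its simplest form, be the fraction of successful polls per monitored service, averaged over services. This sketch is a guess at the general shape, not SeaDataNet's actual formula; the service names are invented:

```python
# Minimal sketch of a total availability index built from monitoring
# checks (names and weighting are illustrative only).
checks = {
    # service -> poll results over a time window (True = check passed)
    "download-manager-1": [True, True, True, False, True],
    "download-manager-2": [True] * 5,
    "cdi-portal":         [True, True, False, False, True],
}

def service_availability(results):
    """Fraction of polls in which the service answered correctly."""
    return sum(results) / len(results)

per_service = {name: service_availability(r) for name, r in checks.items()}

# Total index: unweighted mean over all monitored components.
total_index = sum(per_service.values()) / len(per_service)
```

    In a real deployment the poll results would come from the Nagios check history, and services could be weighted by importance rather than averaged uniformly.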

  5. High plant availability of phosphorus and low availability of cadmium in four biomass combustion ashes.

    PubMed

    Li, Xiaoxi; Rubæk, Gitte H; Sørensen, Peter

    2016-07-01

    For biomass combustion to become a sustainable energy production system, it is crucial to minimise landfill of biomass ashes, to recycle the nutrients and to minimise the undesirable impact of hazardous substances in the ash. In order to test the plant availability of phosphorus (P) and cadmium (Cd) in four biomass ashes, we conducted two pot experiments on a P-depleted soil and one mini-plot field experiment on a soil with adequate P status. Test plants were spring barley and Italian ryegrass. Ash applications were compared to triple superphosphate (TSP) and a control without P application. Both TSP and ash significantly increased crop yields and P uptake on the P-depleted soil. In contrast, on the adequate-P soil, the barley yield showed little response to soil amendment, even at applications of 300-500 kg P ha(-1), although the barley took up more P at higher applications. The apparent P use efficiency of the additive was 20% in ryegrass - much higher than that of barley, for which P use efficiencies varied on the two soils. Generally, crop Cd concentrations were little affected by the increasing and high applications of ash, except for relatively high Cd concentrations in barley after applying 25 Mg ha(-1) of straw ash. In contrast, even modest increases in the TSP application markedly increased Cd uptake in plants. This might be explained by the low Cd solubility in the ash or by reduced Cd availability due to the liming effect of ash. High concentrations of resin-extractable P (available P) in the ash-amended soil after harvest indicate that the ash may also contribute to P availability for the following crops. In conclusion, the biomass ashes in this study had P availability similar to the TSP fertiliser and did not contaminate the crop with Cd during the first year. PMID:27082447

  6. Comprehensive evaluation and clinical implementation of commercially available Monte Carlo dose calculation algorithm.

    PubMed

    Zhang, Aizhen; Wen, Ning; Nurushev, Teamour; Burmeister, Jay; Chetty, Indrin J

    2013-01-01

    A commercial electron Monte Carlo (eMC) dose calculation algorithm has become available in the Eclipse treatment planning system. The purpose of this work was to evaluate the eMC algorithm and investigate the clinical implementation of this system. The beam modeling of the eMC algorithm was performed for beam energies of 6, 9, 12, 16, and 20 MeV for a Varian Trilogy and all available applicator sizes in the Eclipse treatment planning system. The accuracy of the eMC algorithm was evaluated in a homogeneous water phantom, solid water phantoms containing lung and bone materials, and an anthropomorphic phantom. In addition, dose calculation accuracy was compared between the pencil beam (PB) and eMC algorithms in the same treatment planning system for heterogeneous phantoms. The overall agreement between eMC calculations and measurements was within 3%/2 mm, while the PB algorithm had large errors (up to 25%) in predicting dose distributions in the presence of inhomogeneities such as bone and lung. The clinical implementation of the eMC algorithm was investigated by performing treatment planning for 15 patients with lesions in the head and neck, breast, chest wall, and sternum. The dose distributions were calculated using the PB and eMC algorithms with no smoothing and with all three levels of 3D Gaussian smoothing for comparison. Based on a routine electron beam therapy prescription method, the number of eMC-calculated monitor units (MUs) was found to increase with increased 3D Gaussian smoothing levels. 3D Gaussian smoothing greatly improved the visual usability of dose distributions and produced better target coverage. Differences in calculated MUs and dose distributions between the eMC and PB algorithms could be significant when oblique beam incidence, surface irregularities, and heterogeneous tissues were present in the treatment plans. In our patient cases, monitor unit differences of up to 7% were observed between the PB and eMC algorithms. Monitor unit calculations were also performed

  7. SLAC Next-Generation High Availability Power Supply

    SciTech Connect

    Bellomo, P.; MacNair, D.

    2010-06-11

    SLAC recently commissioned forty high availability (HA) magnet power supplies for Japan's ATF2 project. SLAC is now developing a next-generation N+1 modular power supply with even better availability and versatility. The goal is to have unipolar and bipolar output capability. It has a novel topology and components to achieve very low output voltage to drive superconducting magnets. A redundant, embedded digital controller in each module provides increased bandwidth for use in beam-based alignment and orbit correction systems. The controllers have independent inputs for connection to two external control nodes. Under fault conditions, they sense failures and isolate the modules. Power supply speed mitigates the effects of fault transients and obviates subsequent magnet standardization. Hot swap capability promises higher availability and other exciting benefits for future, more complex accelerators and, eventually, the International Linear Collider project.
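
    The N+1 claim can be illustrated with a textbook redundancy model: the supply stays up as long as no more than one of its modules is down. A sketch assuming independent module failures and a hypothetical per-module availability:

```python
from math import comb

def n_plus_1_availability(a_module, n_required):
    """Steady-state availability of an N+1 redundant supply: the system is
    up when at least n_required of its n_required + 1 identical modules are
    up.  Assumes independent module failures -- an illustrative
    simplification that ignores common-mode faults."""
    m = n_required + 1
    return sum(comb(m, k) * a_module ** k * (1 - a_module) ** (m - k)
               for k in range(n_required, m + 1))

# A load that needs 3 modules, served by 4 (N+1), each 99.9% available:
print(n_plus_1_availability(0.999, 3))   # ~0.999994, vs 0.999 for one module
```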

  8. Estimating the Cost-Effectiveness of Implementation: Is Sufficient Evidence Available?

    PubMed Central

    Whyte, Sophie; Dixon, Simon; Faria, Rita; Walker, Simon; Palmer, Stephen; Sculpher, Mark; Radford, Stefanie

    2016-01-01

    Background Timely implementation of recommended interventions can provide health benefits to patients and cost savings to the health service provider. Effective approaches to increase the implementation of guidance are needed. Since investment in activities that improve implementation competes for funding against other health-generating interventions, it should be assessed in terms of its costs and benefits. Objective In 2010, the National Institute for Health and Care Excellence released a clinical guideline recommending natriuretic peptide (NP) testing in patients with suspected heart failure. However, its implementation in practice was variable across the National Health Service in England. This study demonstrates the use of multi-period analysis together with diffusion curves to estimate the value of investing in implementation activities to increase uptake of NP testing. Methods Diffusion curves were estimated based on historic data to produce predictions of future utilization. The value of an implementation activity (given its expected costs and effectiveness) was estimated. Both a static population and a multi-period analysis were undertaken. Results The value of implementation interventions encouraging the utilization of NP testing is shown to decrease over time as natural diffusion occurs. Sensitivity analyses indicated that the value of the implementation activity depends on its efficacy and on the population size. Conclusions Value of implementation can help inform policy decisions on how to invest in implementation activities even in situations in which data are sparse. Multi-period analysis is essential to accurately quantify the time profile of the value of implementation given the natural diffusion of the intervention and the incidence of the disease. PMID:27021746
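
    The multi-period logic can be sketched numerically: model uptake with a diffusion curve, then value an implementation activity as the extra uptake it buys, applied to each year's incident population and per-patient net benefit. Every number below is a hypothetical placeholder, not the study's data:

```python
import numpy as np

def logistic_uptake(t, ceiling, rate, midpoint):
    """Logistic diffusion curve: share of eligible patients tested in year t."""
    return ceiling / (1.0 + np.exp(-rate * (t - midpoint)))

def value_of_implementation(years, incidence, net_benefit_per_patient,
                            uptake_with, uptake_without):
    """Incremental net benefit of an implementation activity: the extra
    uptake it buys each year, applied to that year's incident population."""
    extra = uptake_with(years) - uptake_without(years)
    return float(np.sum(extra * incidence * net_benefit_per_patient))

years = np.arange(0, 10)
natural = lambda t: logistic_uptake(t, 0.9, 0.5, 4.0)  # diffusion without action
boosted = lambda t: logistic_uptake(t, 0.9, 0.5, 2.0)  # activity pulls uptake earlier
print(value_of_implementation(years, 10_000, 150.0, boosted, natural))
```

    Because natural diffusion eventually catches up, an activity that shifts uptake earlier by less (a later midpoint) is worth less, matching the paper's finding that the value of implementation decays over time.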

  9. High Availability Instrumentation Packaging Standards for the ILC and Detectors

    SciTech Connect

    Downing, R.W.; Larsen, R.S.; /SLAC

    2006-11-30

    ILC designers are exploring new packaging standards for Accelerator Controls and Instrumentation, particularly high-speed serial interconnect systems for intelligent instruments versus the existing parallel backplanes of VME, VXI and CAMAC. The High Availability Advanced Telecom Computing Architecture (ATCA) system is a new industrial open standard designed to withstand single-point hardware or software failures. The standard crate, controller, applications module and sub-modules are being investigated. All modules and sub-modules are hot-swappable. A single crate is designed for a communications data throughput of 2 Tb/s and an availability of 0.99999, which translates into a downtime of roughly five minutes per year. The ILC is planning to develop HA architectures for controls, beam instrumentation and detector systems.
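
    The two figures quoted are consistent with each other: an availability of 0.99999 ("five nines") implies about five minutes of downtime per year:

```python
def downtime_minutes_per_year(availability):
    """Annual downtime implied by a steady-state availability figure."""
    return (1.0 - availability) * 365.25 * 24 * 60

print(round(downtime_minutes_per_year(0.99999), 2))  # ~5.26 minutes/year
```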

  10. Examining Perceptions over the Effectiveness of Professional Development and Available Resources on the Common Core State Standards Implementation in Arkansas

    ERIC Educational Resources Information Center

    Sheppard, Julie Trammell

    2013-01-01

    The purpose of this qualitative case study is to examine the perceptions of teachers and curriculum specialists over the effectiveness of professional development and available resources of the Common Core State Standards (CCSS) implementation process in Arkansas. Arkansas divided the implementation process into three stages: Phase I implemented…

  11. 77 FR 27162 - Notice of Data Availability Supporting Approval and Promulgation of Air Quality Implementation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-09

    ... Generating Station, Kodak Operations at Eastman Business Park, Oswego Harbor Power Owens Corning Delmar Plant... proposed on April 25, 2012 (77 FR 24794) to take action on a revision to the state implementation plan...

  12. GEODE An electrical energy supply with high availability

    SciTech Connect

    Mertz, J.L.; Gerard, M.J.; Girard, J.

    1983-10-01

    Project GEODE describes an electrical energy supply characterized by its very high availability. It is to be used in the PTT (French telephone company) telephone exchanges and is targeted for an unavailability of better than 10⁻⁶. To achieve this performance, Merlin Gerin has adopted: a double bus bar architecture, remote-controlled electrical equipment, a motor-generator set specifically designed for this project, and computer-assisted surveillance. The authors present the overall reliability calculations for this project along with those for the energy sources: the EdF (French utility company) network and the motor-generators.
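
    The double bus bar architecture targets the 10⁻⁶ unavailability figure through parallel redundancy: service is lost only if both paths are down at once. With independently failing paths (an idealization that ignores common-mode failures, and with hypothetical path figures), unavailabilities multiply:

```python
def parallel_unavailability(u_a, u_b):
    """Unavailability of two redundant, independently failing supply paths:
    service is lost only when both paths are down simultaneously."""
    return u_a * u_b

# Two paths, each unavailable 0.1% of the time (hypothetical figures):
print(parallel_unavailability(1e-3, 1e-3))   # on the order of 1e-6
```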

  13. Instrumentation Standard Architectures for Future High Availability Control Systems

    SciTech Connect

    Larsen, R.S.; /SLAC

    2005-10-13

    Architectures for next-generation modular instrumentation standards should aim to meet a requirement of High Availability, or robustness against system failure. This is particularly important for experiments both large and small mounted on production accelerators and light sources. New standards should be based on architectures that (1) are modular in both hardware and software for ease in repair and upgrade; (2) include inherent redundancy at internal module, module assembly and system levels; (3) include modern high speed serial inter-module communications with robust noise-immune protocols; and (4) include highly intelligent diagnostics and board-management subsystems that can predict impending failure and invoke evasive strategies. These simple design principles lead to fail-soft systems that can be applied to any type of electronics system, from modular instruments to large power supplies to pulsed power modulators to entire accelerator systems. The existing standards in use are briefly reviewed and compared against a new commercial standard that suggests a powerful model for future laboratory standard developments. The past successes of undertaking such projects through inter-laboratory engineering-physics collaborations will be briefly summarized.

  14. 78 FR 19599 - Approval and Promulgation of Implementation Plans; Texas; Reasonably Available Control Technology...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-02

    ...The EPA is finalizing its proposal to approve revisions to the Texas State Implementation Plan (SIP) for the Houston/Galveston/ Brazoria (HGB) 1997 8-Hour ozone nonattainment Area (Area). The HGB Area consists of Brazoria, Chambers, Fort Bend, Galveston, Harris, Liberty, Montgomery and Waller counties. Specifically, we are finalizing our proposed approval of portions of two revisions to the......

  15. 76 FR 48754 - Approval and Promulgation of Air Quality Implementation Plans; Ohio; Reasonably Available Control...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-09

    ... Rule To Implement the 8-Hour Ozone National Ambient Air Quality Standard--Phase 2'' (70 FR 71612... and Budget under Executive Order 12866 (58 FR 51735, October 4, 1993); Does not impose an information...); Does not have Federalism implications as specified in Executive Order 13132 (64 FR 43255, August...

  16. The availability of novelty sweets within high school localities.

    PubMed

    Aljawad, A; Morgan, M Z; Rees, J S; Fairchild, R

    2016-06-10

    Background Reducing sugar consumption is a primary focus of current global public health policy. Achieving 5% of total energy from free sugars will be difficult, acknowledging the concentration of free sugars in sugar-sweetened beverages and confectionery, and as hidden sugars in many savoury items. The expansion of the novelty sweet market in the UK has significant implications for children and young adults as they contribute to dental caries, dental erosion and obesity. Objective To identify the most available types of novelty sweets within the high school fringe in Cardiff, UK, and to assess their price range and where and how they were displayed in shops. Subjects and methods Shops within a ten-minute walking distance around five purposively selected high schools in the Cardiff area representing different levels of deprivation were visited. Shops in Cardiff city centre and three supermarkets were also visited to identify the most commonly available novelty sweets. Results The ten most popular novelty sweets identified in these scoping visits were (in descending order): Brain Licker, Push Pop, Juicy Drop, Lickedy Lips, Big Baby Pop, Vimto candy spray, Toxic Waste, Tango candy spray, Brain Blasterz Bitz and Mega Mouth candy spray. Novelty sweets were located on low shelves accessible to all age groups in 73% (14 out of 19) of the shops. Novelty sweets were displayed in the checkout area in 37% (seven out of 19) of the shops. The price of the top ten novelty sweets ranged from 39p to £1. Conclusion A wide range of acidic and sugary novelty sweets were easily accessible and priced within pocket-money range. Those personnel involved in delivering dental and wider health education or health promotion need to be aware of recent developments in children's confectionery. The potential effects of these novelty sweets on both general and dental health require further investigation. PMID:27283564

  17. 75 FR 23640 - Approval and Promulgation of Implementation Plans; New York Reasonably Available Control...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-04

    ... control measure analysis and New York's efforts to meet the reasonably available control technology... addition, EPA is proposing a conditional approval of the reasonably available control measure analysis... New York's reasonably available control measure (RACM) analysis and New York's efforts to meet...

  18. 75 FR 71548 - Availability of Federally-Enforceable State Implementation Plans for All States

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-24

    ... Register on November 1, 1995 at 60 FR 55459. The second notice of availability was published in the Federal Register on November 18, 1998 at 63 FR 63986. The third notice of availability was published in the Federal Register on November 20, 2001 at 66 FR 58070. The fourth notice of availability was published in...

  19. 45 CFR 162.920 - Availability of implementation specifications and operating rules.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....S.C. 552(a) and 1 CFR part 51. To enforce any edition other than that specified in this section, the..., call (202) 714-6030, or go to: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr... availability on the materials at CMS, call (410) 786-6597. The materials are also available from the...

  20. 45 CFR 162.920 - Availability of implementation specifications and operating rules.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ....S.C. 552(a) and 1 CFR part 51. To enforce any edition other than that specified in this section, the..., call (202) 714-6030, or go to: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr... availability on the materials at CMS, call (410) 786-6597. The materials are also available from the...

  1. 45 CFR 162.920 - Availability of implementation specifications and operating rules.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ....S.C. 552(a) and 1 CFR part 51. To enforce any edition other than that specified in this section, the..., call (202) 714-6030, or go to: http://www.archives.gov/federal_register/code_of_federal_regulations/ibr... availability on the materials at CMS, call (410) 786-6597. The materials are also available from the...

  2. 78 FR 71508 - Availability of Federally-Enforceable State Implementation Plans for All States

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-29

    ... FR 55459. Subsequent notices of availability were published in the Federal Register on November 18, 1998 (63 FR 63986), November 20, 2001 (66 FR 58070), December 22, 2004 (69 FR 76617), November 15, 2007 (72 FR 64158), and November 24, 2010 (75 FR 71548). This is the seventh notice of availability of...

  3. Developing and Implementing a Discipline Plan for Hawthorne High School.

    ERIC Educational Resources Information Center

    Evanac, Diane M.

    This practicum paper describes a method for developing and implementing a student-discipline plan in a small, rural high school in north central Florida. The combined middle- and high-school is the poorest in the county. When corporal punishment was banned in the county and no alternatives were implemented, the number of suspensions increased. An…

  4. Engineering in High School: Implementing TMMW & TPE.

    ERIC Educational Resources Information Center

    Bordoloi, Kiron C.; Cole, Joseph D.

    1979-01-01

    The success of two engineering- and technology-oriented secondary school programs is discussed. Also presented are The Man Made World and Technology-People-Environment programs at two suburban high schools. (BB)

  5. A DSP Based POD Implementation for High Speed Multimedia Communications

    NASA Astrophysics Data System (ADS)

    Zhang, Chang Nian; Li, Hua; Zhang, Nuannuan; Xie, Jiesheng

    2002-12-01

    In cable network services, audio/video entertainment content should be protected from unauthorized copying, interception, and tampering. The point-of-deployment (POD) security module, proposed by [InlineEquation not available: see fulltext.], allows viewers to receive secure cable services such as premium subscription channels, impulse pay-per-view, and video-on-demand, as well as other interactive services. In this paper, we present a digital signal processor (DSP) (TMS320C6211) based POD implementation for real-time applications, covering the elliptic curve digital signature algorithm (ECDSA), elliptic curve Diffie-Hellman (ECDH) key exchange, the elliptic curve key derivation function (ECKDF), cellular automata (CA) cryptography, communication processes between POD and Host, and Host authentication. In order to obtain different security levels and different encryption/decryption rates, a CA-based symmetric key cryptography algorithm is used whose encryption/decryption rate can be up to [InlineEquation not available: see fulltext.]. The experimental results indicate that the DSP-based POD implementation provides high speed and flexibility, and satisfies the requirements of real-time video data transmission.
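
    The ECDH exchange mentioned above follows the general Diffie-Hellman pattern: each side publishes a value derived from its secret, and both arrive at the same shared key. A toy finite-field version (deliberately tiny parameters, purely to show the shape of the POD/Host handshake; real systems use elliptic curves and a vetted cryptography library) might look like:

```python
# Toy finite-field Diffie-Hellman key agreement -- NOT secure parameters.
p = 4294967291          # largest 32-bit prime; far too small for real use
g = 5                   # public generator
pod_secret, host_secret = 123456789, 987654321   # hypothetical private keys

pod_public = pow(g, pod_secret, p)     # sent POD -> Host
host_public = pow(g, host_secret, p)   # sent Host -> POD

# Each side combines its own secret with the other's public value:
shared_pod = pow(host_public, pod_secret, p)
shared_host = pow(pod_public, host_secret, p)
assert shared_pod == shared_host       # both derive the same session key
print(hex(shared_pod))
```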

  6. Implementation of High Speed Distributed Data Acquisition System

    NASA Astrophysics Data System (ADS)

    Raju, Anju P.; Sekhar, Ambika

    2012-09-01

    This paper introduces a high-speed distributed data acquisition system based on a field programmable gate array (FPGA). The aim is to develop a "distributed" data acquisition interface. The development of instruments such as personal computers and engineering workstations based on "standard" platforms is the motivation behind this effort. Using standard platforms as the controlling unit allows independence in hardware from a particular vendor and hardware platform. The distributed approach also has advantages from a functional point of view: acquisition resources become available to multiple instruments, and the acquisition front-end can be physically remote from the rest of the instrument. The high-speed data acquisition system transmits data to a remote computer system through an Ethernet interface. The data are acquired through 16 analog input channels. The inputs are multiplexed and digitized, and the data are then stored in a 1K buffer for each input channel. The main control unit in this design is a 16-bit processor implemented in the FPGA. This processor is used to set up and initialize the data source and the Ethernet controller, as well as to control the flow of data from the memory element to the NIC. Using this processor, the different configuration registers in the Ethernet controller can be initialized and controlled in an easy manner. The data packets are then sent to the remote PC through the Ethernet interface. The main advantages of using an FPGA as the standard platform are its flexibility, low power consumption, short design duration, fast time to market, programmability and high density. The main advantages of using the AX88796 Ethernet controller over others are its non-PCI interface, its embedded SRAM where the transmit and reception buffers are located, and its high-performance SRAM-like interface. The paper describes the implementation of the distributed data acquisition system on an FPGA in VHDL. The main advantages of this system are high
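
    The per-channel buffering and Ethernet transmission described above amount to framing digitized samples behind a small header before handing them to the NIC. A hedged sketch (the header layout here is invented for illustration, not the paper's actual packet format) might be:

```python
import struct

def frame_samples(channel, samples, seq):
    """Pack one channel's 16-bit samples behind a small big-endian header
    (channel id, flags, sequence number).  Illustrative framing only; the
    AX88796-based design in the paper defines its own format."""
    header = struct.pack(">BBH", channel, 0, seq)          # 4-byte header
    payload = struct.pack(">%dH" % len(samples), *samples)  # 2 bytes/sample
    return header + payload

frame = frame_samples(channel=3, samples=[100, 200, 300], seq=7)
print(len(frame))   # → 10 (4-byte header + 3 two-byte samples)
```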

  7. 77 FR 58063 - Approval and Promulgation of Implementation Plans; Texas; Reasonably Available Control Technology...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-19

    ... submittal on October 24, 2008, at 73 FR 63378. The revision to these sections has been in effect, federally... helped lower ozone levels in the HGB Area. See 75 FR 15348 of March 29, 2010. The revisions to these... reasonably available, considering technological and economic feasibility. See 44 FR 53761, September 17,...

  8. 77 FR 28338 - Approval and Promulgation of Air Quality Implementation Plans; Maryland; Reasonably Available...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-14

    ... technology that is reasonably available considering technological and economic feasibility. See 44 FR 53761... regulations to control VOCs. See 58 FR 63085, November 30, 1993; 59 FR 46180, September 7, 1994; 59 FR 60908, November 29, 1994; and 60 FR 2018, January 6, 1995. The second requirement, set forth in section...

  9. 75 FR 4769 - Availability of Grant Funds and Proposed Implementation Guidelines; Withdrawal of Solicitation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-29

    ...The National Oceanic and Atmospheric Administration publishes this notice to announce the withdrawal of the solicitation of applications for the NOAA Marine Aquaculture Initiative 2010, which was published in the NOAA ``Availability of Grant Funds for Fiscal Year 2010'' on January 19, 2010. A new funding opportunity with revised requirements and goals is under development and will be published......

  10. 76 FR 44279 - Implementation of Section 304 of the Telecommunications Act of 1996: Commercial Availability of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-25

    ...(b)(5), and 76.1602(b), published at 76 FR 40263, July 8, 2011, are effective on August 8, 2011. FOR..., FCC 10-181, and published in the Federal Register on July 8, 2011, 76 FR 40263, the Federal...: Commercial Availability of Navigation Devices; Compatibility Between Cable Systems and Consumer...

  11. 75 FR 14116 - Approval of Implementation Plans of Wisconsin: Nitrogen Oxides Reasonably Available Control...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-24

    ... approved in the Federal Register on January 26, 1996 (61 FR 2428). This NO X waiver, issued under section... (66 FR 56931). These other NO X rules were submitted as part of Wisconsin's reasonable further... reasonably available considering technological and economic feasibility (44 FR 53762). Section 302 of the...

  12. Status Update for Implementing Best Available Technology per DOE Order 5400.5 - September 2002

    SciTech Connect

    Lewis, Michael George

    2002-09-01

    This report identifies discharges of liquid waste streams that require documentation of the Best Available Technology selection process at Bechtel BWXT Idaho, LLC, operated facilities at the Idaho National Engineering and Environmental Laboratory. The Best Available Technology selection process is conducted according to Department of Energy Order 5400.5, Chapter II (3), “Management and Control of Radioactive Materials in Liquid Discharges and Phaseout of Soil Columns” and Department of Energy guidance. Only those liquid waste streams and facilities requiring the Best Available Technology selection process are evaluated in further detail. In addition, this report will be submitted to the Department of Energy Idaho Operations Office Field Office manager for approval according to DOE Order 5400.5, Chapter II, Section 3.b.(1). Two facilities (Idaho Nuclear Technology and Engineering Center existing Percolation Ponds and Test Area North/Technical Support Facility Disposal Pond) at the Idaho National Engineering and Environmental Laboratory required documentation of the Best Available Technology selection process (Section 4). These two facilities required documentation of the Best Available Technology selection process because they discharge wastewater that may contain process-derived radionuclides to a soil column even though the average radioactivity levels are typically below drinking water maximum contaminant levels. At the request of the Department of Energy Idaho Operations Office, the 73.5-acre Central Facilities Area Sewage Treatment Plant land application site is included in Section 4 of this report to ensure the requirements of DOE Order 5400.5, Chapter II, Section 3 are met. The Central Facilities Area Sewage Treatment Plant effluent contains process-derived radionuclides from radioactive tracers used in certain analytical procedures. The radioactivity levels of these radionuclides are below maximum contaminant levels. According to Department of Energy

  13. Status Update for Implementing Best Available Technology per DOE Order 5400.5

    SciTech Connect

    Michael G. Lewis

    2003-09-01

    This report identifies discharges of liquid waste streams that require documentation of the best available technology selection process at Bechtel BWXT Idaho, LLC, operated facilities at the Idaho National Engineering and Environmental Laboratory. The best available technology selection process is conducted according to Department of Energy Order 5400.5, Chapter II (3), “Management and Control of Radioactive Materials in Liquid Discharges and Phaseout of Soil Columns” and Department of Energy guidance. This report evaluates only those liquid waste streams and facilities where the best available technology selection process was determined to be applicable. In addition, the Department of Energy Idaho Operations Office will submit this report to their field office manager for approval according to DOE Order 5400.5, Chapter II, Section 3.b.(1). According to Department of Energy guidance, “If the liquid waste stream is below maximum contaminant levels, then the goals of the best available technology selection process are being met and the liquid waste stream is considered ‘clean water.’ However, it is necessary to document this through the best available technology selection process.” Because liquid waste streams below drinking water maximum contaminant levels are already considered “clean water,” additional treatment technologies are considered unnecessary and unjustifiable on a cost-benefit basis and are not addressed in this report. Two facilities (Idaho Nuclear Technology and Engineering Center New Percolation Ponds and Test Area North/Technical Support Facility Disposal Pond) at the Idaho National Engineering and Environmental Laboratory required documentation of the best available technology selection process (Section 4). These two facilities required documentation of the best available technology selection process because they discharge wastewater that may contain process-derived radionuclides to a soil column even though the average radioactivity levels

  14. Status Update for Implementing Best Available Technology per DOE Order 5400.5 (2003)

    SciTech Connect

    Michael Lewis

    2004-09-01

    This report identifies discharges of liquid waste streams that require documentation of the best available technology selection process at Bechtel BWXT Idaho, LLC, operated facilities at the Idaho National Engineering and Environmental Laboratory. The best available technology selection process is conducted according to Department of Energy Order 5400.5, Chapter II (3), “Management and Control of Radioactive Materials in Liquid Discharges and Phaseout of Soil Columns” and Department of Energy guidance. This report evaluates only those liquid waste streams and facilities where the best available technology selection process was determined to apply. Two facilities (Idaho Nuclear Technology and Engineering Center New Percolation Ponds and Test Area North/Technical Support Facility Sewage Treatment Plant Disposal Pond) at the Idaho National Engineering and Environmental Laboratory required documentation of the best available technology selection process. These two facilities required documentation of the best available technology selection process because they discharge wastewater that may contain process-derived radionuclides to a soil column even though the average radioactivity levels are typically below drinking water maximum contaminant levels. At the request of the Department of Energy Idaho Operations Office, the 73.5-acre Central Facilities Area Sewage Treatment Plant land application site is included in this report to ensure the requirements of DOE Order 5400.5, Chapter II, Section 3 are met. The Central Facilities Area Sewage Treatment Plant effluent contains process-derived radionuclides from radioactive tracers used in certain analytical procedures. The radioactivity levels of these radionuclides are below maximum contaminant levels. The Department of Energy Idaho Operations Office will submit this report to their field office manager for approval according to DOE Order 5400.5, Chapter II, Section 3.b.(1).

  15. Profits, Commercial Food Supplier Involvement, and School Vending Machine Snack Food Availability: Implications for Implementing the New Competitive Foods Rule

    ERIC Educational Resources Information Center

    Terry-McElrath, Yvonne M.; Hood, Nancy E.; Colabianchi, Natalie; O'Malley, Patrick M.; Johnston, Lloyd D.

    2014-01-01

    Background: The 2013-2014 school year involved preparation for implementing the new US Department of Agriculture (USDA) competitive foods nutrition standards. An awareness of associations between commercial supplier involvement, food vending practices, and food vending item availability may assist schools in preparing for the new standards.…

  16. Status Update for Implementing Best Available Technology per DOE Order 5400.5

    SciTech Connect

    Lewis, Michael George

    2001-09-01

    This report documents the Bechtel BWXT Idaho, LLC, operated facilities at the Idaho National Engineering and Environmental Laboratory that require the Best Available Technology selection process in accordance with Department of Energy Order 5400.5, Chapter II (3), “Management and Control of Radioactive Materials in Liquid Discharges.” This report differs from previous reports in that only those liquid waste streams and facilities requiring the Best Available Technology selection process will be evaluated in detail. In addition, this report will be submitted to the DOE-ID Field Office Manager for approval in accordance with DOE Order 5400.5, Chapter II, Section 3.b.(1). The report also identifies facilities addressed in last year’s report that do not require the Best Available Technology selection process to be completed. These facilities will not be addressed in future reports. This report reviews the following facilities:

    • Auxiliary Reactor Area
    • Idaho National Engineering and Environmental Laboratory Block Areas
    • Central Facilities Area
    • Idaho Nuclear Technology and Engineering Center
    • Idaho Falls Facilities
    • Power Burst Facility
    • Radioactive Waste Management Complex
    • Test Area North
    • Test Reactor Area

    Three facilities (Central Facilities Area Sewage Treatment Plant, Idaho Nuclear Technology and Engineering Center Percolation Ponds, and Test Area North/Technical Support Facility Disposal Pond) at the Idaho National Engineering and Environmental Laboratory required documentation of the Best Available Technology selection process. The Idaho Nuclear Technology and Engineering Center Percolation Ponds and Test Area North/Technical Support Facility Disposal Pond discharge wastewater that may contain process-derived radionuclides to a soil column with average radionuclide concentrations below drinking water MCLs. At the request of the Department of Energy Idaho Operations Office, Bechtel BWXT Idaho, LLC has included the 73.5-acre

  17. 2006 Update for Implementing Best Available Technology per DOE Order 5400.5

    SciTech Connect

    Michael G. Lewis

    2007-09-01

    In accordance with Contract Data Requirements List F.19, this report addresses the Best Available Technology requirements per Department of Energy (DOE) Order 5400.5, “Radiation Protection of the Public and the Environment,” as they apply to radiological discharges to the soil for Calendar Year 2006. The report includes a review of discharges for both Battelle Energy Alliance, LLC and CH2M WG Idaho, LLC. The Best Available Technology selection process is applicable to wastewater discharges containing process-derived radionuclides to surface waters, to sanitary sewerages at greater than five times the Derived Concentration Guideline (found in DOE Order 5400.5), and to the soil. Wastewater at the Idaho National Laboratory Site is not discharged to surface water (Big Lost River and Birch Creek), nor is it discharged to sanitary sewerages at activity levels greater than five times a Derived Concentration Guideline. Therefore, this report focuses on radiological discharges to the soil.

  18. High Molybdenum availability for evolution in a Mesoproterozoic lacustrine environment

    NASA Astrophysics Data System (ADS)

    Parnell, John; Spinks, Samuel; Andrews, Steven; Thayalan, Wanethon; Bowden, Stephen

    2015-05-01

    Trace metal data for Proterozoic marine euxinic sediments imply that the expansion of nitrogen-fixing cyanobacteria and the diversification of eukaryotes were delayed while the availability of bioessential metals such as molybdenum in the ocean was limited. However, there is increasing recognition that the Mesoproterozoic evolution of nitrogen fixation and eukaryotic life may have been promoted in marginal marine and terrestrial environments, including lakes, rather than in the deep ocean. Molybdenum availability is critical to life in lakes, just as it is in the oceans. It is, therefore, important to assess molybdenum availability in the lacustrine environment during the Mesoproterozoic. Here we show that the flux of molybdenum to a Mesoproterozoic lake was 1 to 2 orders of magnitude greater than typical fluxes in the modern and ancient marine environment. Thus, molybdenum availability posed no barrier to evolution in the terrestrial environment, in contrast to the nutrient-limited Mesoproterozoic oceans.

  19. High Molybdenum availability for evolution in a Mesoproterozoic lacustrine environment.

    PubMed

    Parnell, John; Spinks, Samuel; Andrews, Steven; Thayalan, Wanethon; Bowden, Stephen

    2015-01-01

    Trace metal data for Proterozoic marine euxinic sediments imply that the expansion of nitrogen-fixing cyanobacteria and diversification of eukaryotes were delayed while the availability of bioessential metals such as molybdenum in the ocean was limited. However, there is increasing recognition that the Mesoproterozoic evolution of nitrogen fixation and eukaryotic life may have been promoted in marginal marine and terrestrial environments, including lakes, rather than in the deep ocean. Molybdenum availability is critical to life in lakes, just as it is in the oceans. It is, therefore, important to assess molybdenum availability to the lacustrine environment in the Mesoproterozoic. Here we show that the flux of molybdenum to a Mesoproterozoic lake was 1 to 2 orders of magnitude greater than typical fluxes in the modern and ancient marine environment. Thus, there was no barrier to availability to prevent evolution in the terrestrial environment, in contrast to the nutrient-limited Mesoproterozoic oceans. PMID:25988499

  20. 2005 Update for Implementing Best Available Technology per DOE Order 5400.5

    SciTech Connect

    INL

    2006-09-01

    The report addresses Best Available Technology per DOE Order 5400.5 in relation to wastewater discharges to the soil. In accordance with Contract Data Requirements List F.19, this report addresses the Best Available Technology requirements per Department of Energy (DOE) Order 5400.5, "Radiation Protection of the Public and the Environment", as they apply to radiological discharges to the soil for Calendar Year 2005. The report includes a review of discharges for both Battelle Energy Alliance, LLC, and CH2M WG Idaho, LLC. The Best Available Technology selection process is applicable to wastewater discharges containing process-derived radionuclides to surface waters; to sanitary sewerages at concentrations greater than five times the Derived Concentration Guideline (found in DOE Order 5400.5); and to the soil. Wastewater at the Idaho National Laboratory Site is not discharged to surface water (Big Lost River and Birch Creek), nor is it discharged to sanitary sewerages at activity levels greater than five times a Derived Concentration Guideline. Therefore, this report focuses on radiological discharges to the soil.

  1. 75 FR 27256 - Implementation of Section 304 of the Telecommunications Act of 1996: Commercial Availability of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-14

    ..., (ii) an Ethernet interface, (iii) Wi-Fi connectivity, or (iv) USB 3.0 on all high-definition set-top... the mandate of Section 629, the FCC adopted rules in its First Report and Order, 63 FR 38089, that... Report and Order, 68 FR 66728, that largely reflected the terms of a Memorandum of Understanding...

  2. Implementation of high throughput experimentation techniques for kinetic reaction testing.

    PubMed

    Nagy, Anton J

    2012-02-01

    Successful implementation of high throughput experimentation (HTE) tools has resulted in their increased acceptance as essential tools in chemical, petrochemical, and polymer R&D laboratories. This article provides a number of concrete examples of HTE systems which have been designed and successfully implemented in studies focused on deriving reaction kinetic data. The implementation of HTE tools for performing kinetic studies of both catalytic and non-catalytic systems results in significantly faster acquisition of the high-quality kinetic modeling data required to quantitatively predict the behavior of complex, multistep reactions. PMID:21902639

  3. Does High Educational Attainment Limit the Availability of Romantic Partners?

    ERIC Educational Resources Information Center

    Burt, Isaac; Lewis, Sally V.; Beverly, Monifa G.; Patel, Samir H.

    2010-01-01

    Research indicates that highly educated individuals endure hardships in finding suitable romantic partners. Romantic hardships affect social and emotional adjustment levels, leading to low self-efficacy in relationship decision making. To address the need for research pertaining to this topic, the authors explored the experiences of eight…

  4. High sensitivity of northeastern broadleaf forest trees to water availability

    NASA Astrophysics Data System (ADS)

    Levesque, M.; Pederson, N.; Andreu-Hayles, L.

    2015-12-01

    Temperate deciduous forests of the eastern US provide goods and services to millions of people and play a vital role in the terrestrial carbon and hydrological cycles. However, ongoing climate change and the increase in atmospheric CO2 concentration (ca) are expected to alter tree growth and gas exchange, and ultimately forest productivity. Still, the magnitude of these effects is unclear. A better comprehension of species-specific responses to environmental change will better inform models and managers on the vulnerability and resiliency of these forests. Tree-ring analysis was combined with δ13C and δ18O measurements to investigate the growth and physiological responses of red oak (Quercus rubra L.) and tulip poplar (Liriodendron tulipifera L.) in the northeastern US to changes in water availability and ca for the period 1950-2014. We found very strong correlations between summer climatic water balance (June-August) and isotopic tree-ring series for δ13C (r = -0.65 and -0.73) and δ18O (r = -0.59 and -0.70) for red oak and tulip poplar, respectively. In contrast, tree-ring width was less sensitive to summer water availability (r = 0.33-0.39). Prior to the mid-1980s, low water availability resulted in low stomatal conductance, photosynthesis, and growth. Since that period, pluvial conditions in the northeastern US have increased stomatal conductance, carbon uptake, and growth of both species. These findings demonstrate that broadleaf trees in this region could be more sensitive to drought than expected. This appears especially true since much of the calibration period looks wet in a multi-centennial perspective. Further, stronger spatial correlations were found between climate data and tree-ring isotopes than with tree-ring width, and the geographical area of the observed δ18O-precipitation response (i.e., the area over which correlations are > 0.5) covers most of the northeastern US. Given the good fit between the isotopic time series and water

  5. Availability of high-pressure safety injection system in PWRs

    SciTech Connect

    Sun, Y.H.; Fresco, A.; Papazoglou, I.A.

    1983-01-01

    This paper presents an evaluation of the impact of typical variations in the design configuration of the High Pressure Safety Injection (HPSI) system on system unavailability. The HPSI systems in seventeen nuclear power plants were reviewed for variations in design, system operation, testing and maintenance policies, and possible sources of common cause failures. The plants reviewed include PWRs with two-, three- and four-loop Reactor Coolant Systems and cover all three PWR vendors. As a result of this effort, the following five representative configurations (along with some variations) were identified and their unavailability to initiate injection was estimated.

  6. Dynamically tuned high-Q AC-dipole implementation

    SciTech Connect

    Oddo, P.; Bai, M.; Dawson, W.C.; Meng, W.; Mernick, K.; Pai, C.; Roser, T.; Russo, T.

    2010-05-02

    AC-dipole magnets are typically implemented as a parallel LC resonant circuit. To maximize efficiency, it is beneficial to operate at a high Q; this, however, limits the magnet to a narrow frequency range. Current designs therefore operate at a low Q to provide a wider bandwidth at the cost of efficiency. Dynamically tuning a high-Q resonant circuit maintains high efficiency while providing a wide frequency range. The results of ongoing efforts at BNL to implement dynamically tuned high-Q AC dipoles will be presented.
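    The trade-off described above follows from the parallel RLC formulas f0 = 1/(2π√(LC)) and Q = R√(C/L): retuning the capacitance moves f0 without lowering Q. A minimal numeric sketch, with illustrative component values that are assumptions rather than the BNL design:

```python
import math

# Illustrative parallel-RLC values (assumptions, not the actual magnet design).
L = 100e-6   # inductance, henries
C = 1e-6     # capacitance, farads
R = 1000.0   # effective parallel resistance, ohms

def resonant_freq_hz(L, C):
    # Resonant frequency of an LC tank: f0 = 1 / (2*pi*sqrt(L*C))
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

def parallel_q(R, L, C):
    # Quality factor of a parallel RLC circuit: Q = R * sqrt(C/L)
    return R * math.sqrt(C / L)

f0_base = resonant_freq_hz(L, C)        # ~15.9 kHz
q = parallel_q(R, L, C)                 # Q = 100
f0_tuned = resonant_freq_hz(L, C / 4)   # retuning C by 4x doubles f0
assert abs(f0_tuned / f0_base - 2.0) < 1e-9   # Q is unchanged by this C-tuning
```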

  7. Availability of High School Extracurricular Sports Programs and High-Risk Behaviors

    ERIC Educational Resources Information Center

    Cohen, Deborah A.; Taylor, Stephanie L.; Zonta, Michela; Vestal, Katherine D.; Schuster, Mark A.

    2007-01-01

    Background: The Surgeon General has called for an expansion of school-based extracurricular sports programs to address the obesity epidemic. However, little is known about the availability of and participation in high school extracurricular sports and how participation in these sports is related to high-risk behaviors. Methods: We surveyed Los…

  8. Symmetric Active/Active Metadata Service for High Availability Parallel File Systems

    SciTech Connect

    He, X.; Ou, Li; Engelmann, Christian; Chen, Xin; Scott, Stephen L

    2009-01-01

    High availability data storage systems are critical for many applications as research and business become more data-driven. Since metadata management is essential to system availability, multiple metadata services are used to improve the availability of distributed storage systems. Past research focused on the active/standby model, where each active service has at least one redundant idle backup. However, interruption of service and even some loss of service state may occur during a fail-over, depending on the replication technique used. In addition, the replication overhead for multiple metadata services can be very high. The research in this paper targets the symmetric active/active replication model, which uses multiple redundant service nodes running in virtual synchrony. In this model, service node failures do not cause a fail-over to a backup and there is no disruption of service or loss of service state. We further discuss a fast delivery protocol to reduce the latency of the needed total order broadcast. Our prototype implementation shows that metadata service high availability can be achieved with an acceptable performance trade-off using our symmetric active/active metadata service solution.
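    The virtual-synchrony idea above can be sketched in miniature: if every metadata operation is delivered to all replicas in one agreed order, the replicas stay identical and any of them can answer requests, so a node loss needs no fail-over. All names below are illustrative, not the paper's code:

```python
# Sketch of symmetric active/active replication via total order broadcast.
# Every op reaches every replica in the same global order, so each replica
# deterministically reaches the same state; no replica is a passive standby.

class MetadataReplica:
    def __init__(self, name):
        self.name = name
        self.metadata = {}  # path -> attributes

    def apply(self, op):
        kind, path, value = op
        if kind == "set":
            self.metadata[path] = value
        elif kind == "delete":
            self.metadata.pop(path, None)

def total_order_broadcast(replicas, ops):
    # Stand-in for a group-communication layer (e.g. virtual synchrony):
    # one agreed sequence of operations, delivered to all members.
    for op in ops:
        for r in replicas:
            r.apply(op)

replicas = [MetadataReplica("mds0"), MetadataReplica("mds1"), MetadataReplica("mds2")]
ops = [("set", "/a", {"mode": 0o644}),
       ("set", "/b", {"mode": 0o755}),
       ("delete", "/a", None)]
total_order_broadcast(replicas, ops)

# All replicas converge; any one of them can serve reads or fail safely.
assert all(r.metadata == replicas[0].metadata for r in replicas)
```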

  9. Implementation of weather stations at Ghanaian high schools

    NASA Astrophysics Data System (ADS)

    Pieron, M.

    2012-04-01

    The Trans-African Hydro-Meteorological Observatory (www.tahmo.org) is an initiative that aims to develop a dense weather observation network in Sub-Saharan Africa. The ambition is to have 20,000 low-cost, innovative weather stations in place by 2015. More weather data is needed locally to provide weather-dependent stakeholders, such as farmers and fishermen, with accurate forecasts. As a first proof of concept, showing that sensors can be built at lower cost than commercially available instruments, a disdrometer was developed. In parallel with the design of the measurement instruments, a high school curriculum covering environmental sciences is being developed. In order to find out which requirements the TAHMO weather station and accompanying educational materials should meet for optimal use at Junior High Schools, research was done at Ghanaian schools. Useful insights regarding the future African context of the weather station and requirements for an implementation strategy were obtained during workshops with teachers and students, visits to WMO observatories, and case studies regarding the use of educational materials. The poster presents the conclusions of this research, which is part of the bigger TAHMO framework.

  10. High School Reform Implementation: Principals' Perceptions on Their Leadership Role

    ERIC Educational Resources Information Center

    White-Smith, Kimberly A.; White, Monica A.

    2009-01-01

    This research is a collection of comparative case studies that examine the perspectives of four principals in their 1st year of implementing the High School College Collaborative (HSCC), which works to provide traditionally underserved high school students with the opportunity to receive college credit, possibly an associate of arts degree,…

  11. Performing Arts Program, Badger High School: Justification, Proposal, Implementation, Stage One Implementation.

    ERIC Educational Resources Information Center

    Holmes, Dan

    This document presents a justification, proposal, and implementation plan for a comprehensive theatre arts program at Badger High School, Lake Geneva, Wisconsin that would offer a full schedule of amateur and professional arts programs involving the students and the community. The brief Justification section notes that every elementary and…

  12. On implementing MPI-IO portably and with high performance.

    SciTech Connect

    Thakur, R.; Gropp, W.; Lusk, E.

    1998-11-30

    We discuss the issues involved in implementing MPI-IO portably on multiple machines and file systems and also achieving high performance. One way to implement MPI-IO portably is to implement it on top of the basic Unix I/O functions (open, seek, read, write, and close), which are themselves portable. We argue that this approach has limitations in both functionality and performance. We instead advocate an implementation approach that combines a large portion of portable code and a small portion of code that is optimized separately for different machines and file systems. We have used such an approach to develop a high-performance, portable MPI-IO implementation, called ROMIO. In addition to basic I/O functionality, we consider the issues of supporting other MPI-IO features, such as 64-bit file sizes, noncontiguous accesses, collective I/O, asynchronous I/O, consistency and atomicity semantics, user-supplied hints, shared file pointers, portable data representation, file preallocation, and some miscellaneous features. We describe how we implemented each of these features on various machines and file systems. The machines we consider are the HP Exemplar, IBM SP, Intel Paragon, NEC SX-4, SGI Origin2000, and networks of workstations; and the file systems we consider are HP HFS, IBM PIOFS, Intel PFS, NEC SFS, SGI XFS, NFS, and any general Unix file system (UFS). We also present our thoughts on how a file system can be designed to better support MPI-IO. We provide a list of features desired from a file system that would help in implementing MPI-IO correctly and with high performance.
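    ROMIO's layering (a large portable layer over a small per-file-system layer) can be sketched as follows. The class and function names are hypothetical, with a POSIX `pread`/`pwrite` driver standing in for the portable Unix (UFS) fallback; optimized drivers for specific file systems would be registered alongside it:

```python
import os
import tempfile

class UFSDriver:
    """Portable fallback driver: plain POSIX-style calls, works on any Unix FS."""
    def write_at(self, fd, offset, data):
        os.pwrite(fd, data, offset)
    def read_at(self, fd, offset, n):
        return os.pread(fd, n, offset)

# Small machine/FS-specific part: one driver per file system (only the
# portable fallback is shown here; e.g. an XFS-optimized driver could be added).
DRIVERS = {"ufs": UFSDriver}

class File:
    """Large portable part: an MPI-IO-like explicit-offset API over a driver."""
    def __init__(self, path, fstype="ufs"):
        self.fd = os.open(path, os.O_RDWR | os.O_CREAT, 0o644)
        self.driver = DRIVERS[fstype]()
    def write_at(self, offset, data):   # cf. MPI_File_write_at
        self.driver.write_at(self.fd, offset, data)
    def read_at(self, offset, n):       # cf. MPI_File_read_at
        return self.driver.read_at(self.fd, offset, n)
    def close(self):
        os.close(self.fd)

fd0, path = tempfile.mkstemp()
os.close(fd0)
f = File(path)
f.write_at(100, b"hello")        # explicit offsets, no shared seek pointer
readback = f.read_at(100, 5)
assert readback == b"hello"
f.close()
os.remove(path)
```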

  13. The Design, Development, and Implementation of LUDA Virtual High School

    ERIC Educational Resources Information Center

    Vrasidas, Charalambos

    2003-01-01

    The purpose of this paper is to present the Large Unit District Association virtual high school (LUDA-VHS) project and discuss its design, development, and implementation. A model developed at the Center for the Application of Information Technologies for designing online classes will be presented and discussed. The focus of the paper will be to…

  14. Implementing RTI in a High School: A Case Study

    ERIC Educational Resources Information Center

    Fisher, Douglas; Frey, Nancy

    2013-01-01

    This case study chronicles the efforts of a small high school over a 2-year period as it designed and implemented a response to intervention (RTI) program for students at the school. Their efforts were largely successful, with improved achievement, attendance, and grade point averages and a decrease in special education referrals. Major themes…

  15. Technology's Achilles Heel: Achieving High-Quality Implementation

    ERIC Educational Resources Information Center

    Hall, Gene E.

    2010-01-01

    An inherent characteristic of technology education is the continual development of new technologies and creating innovative applications of already existing technologies. As exciting as these innovations can be, technology educators and school staffs are frequently challenged to accomplish high levels of implementation. The metaphor of the…

  16. Cooperative implementation of a high temperature acoustic sensor

    NASA Astrophysics Data System (ADS)

    Baldini, S. E.; Nowakowski, Edward; Smith, Herbert G.; Friebele, E. J.; Putnam, Martin A.; Rogowski, Robert; Melvin, Leland D.; Claus, Richard O.; Tran, Tuan; Holben, Milford S., Jr.

    1991-12-01

    The current status and results of a cooperative program aimed at the implementation of a high-temperature acoustic/strain sensor onto metallic structures are reported. The sensor systems that are to be implemented under this program will measure thermal expansion, maneuver loads, aircraft buffet, sonic fatigue, and acoustic emissions in environments that approach 1800 F. The discussion covers fiber development, fabrication of an extrinsic Fabry-Perot interferometer acoustic sensor, sensor mounting/integration, and results of an evaluation of the sensor capabilities.

  17. High-Performance CCSDS Encapsulation Service Implementation in FPGA

    NASA Technical Reports Server (NTRS)

    Clare, Loren P.; Torgerson, Jordan L.; Pang, Jackson

    2010-01-01

    The Consultative Committee for Space Data Systems (CCSDS) Encapsulation Service is a convergence layer between lower-layer space data link framing protocols, such as the CCSDS Advanced Orbiting Systems (AOS) protocol, and higher-layer networking protocols, such as CFDP (CCSDS File Delivery Protocol) and Internet Protocol Extension (IPE). The CCSDS Encapsulation Service is considered part of the data link layer. The CCSDS AOS implementation is described in the preceding article. Recent advancement in RF modem technology has allowed multi-megabit transmission over space links. With this increase in data rate, the CCSDS Encapsulation Service needs to be optimized to both reduce energy consumption and operate at a high rate. The CCSDS Encapsulation Service has been implemented as an intellectual property core so that the aforementioned problems are solved by operating the service inside an FPGA. The FPGA implementation of the CCSDS Encapsulation Service provides both packetizing and de-packetizing features.
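    The packetizing/de-packetizing pair can be illustrated with a toy framing layer. The field layout below is deliberately simplified (one protocol-ID octet plus a four-octet big-endian length), not the normative CCSDS bit layout, which packs version, protocol ID, and length-of-length into bit fields and should be taken from the CCSDS Encapsulation Service recommendation:

```python
import struct

HEADER_LEN = 5  # 1-octet protocol ID + 4-octet big-endian total length

def encapsulate(protocol_id, payload):
    """Packetize: prepend a simplified header carrying ID and total length."""
    total = HEADER_LEN + len(payload)
    return struct.pack(">BI", protocol_id, total) + payload

def decapsulate(frame):
    """De-packetize: recover the protocol ID and the exact payload bytes."""
    protocol_id, total = struct.unpack(">BI", frame[:HEADER_LEN])
    return protocol_id, frame[HEADER_LEN:total]

frame = encapsulate(2, b"CFDP PDU bytes")   # higher-layer PDU as opaque payload
assert decapsulate(frame) == (2, b"CFDP PDU bytes")
```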

  18. FPGA Implementation of Highly Modular Fast Universal Discrete Transforms

    NASA Astrophysics Data System (ADS)

    Potipantong, Panan; Sirisuk, Phaophak; Oraintara, Soontorn; Worapishet, Apisak

    This paper presents an FPGA implementation of highly modular universal discrete transforms. The implementation relies upon the unified discrete Fourier Hartley transform (UDFHT), based on which essential sinusoidal transforms including the discrete Fourier transform (DFT), discrete Hartley transform (DHT), discrete cosine transform (DCT) and discrete sine transform (DST) can be realized. It employs a reconfigurable, scalable and modular architecture that consists of a memory-based FFT processor equipped with pre- and post-processing units. In addition, a pipelining technique is exploited to seamlessly harmonize the operation between each sub-module. Experimental results based on Xilinx Virtex-II Pro are given to examine the performance of the proposed UDFHT implementation. Two practical applications are also shown to demonstrate the flexibility and modularity of the proposed work.

  19. A high performance hardware implementation image encryption with AES algorithm

    NASA Astrophysics Data System (ADS)

    Farmani, Ali; Jafari, Mohamad; Miremadi, Seyed Sohrab

    2011-06-01

    This paper describes the implementation of a high-speed, high-throughput algorithm for encrypting images. We select AES (Advanced Encryption Standard), a highly secure symmetric-key encryption algorithm, and increase speed and throughput using a four-stage pipeline, a control unit based on logic gates, optimal design of the multiplier blocks in the MixColumns phase, and simultaneous production of keys and rounds. This procedure makes AES suitable for fast image encryption. A 128-bit AES was implemented on an Altera FPGA, achieving a throughput of 6 Gbps at 471 MHz. The encryption time for a 32×32 test image is 1.15 ms.
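    The reported figures are self-consistent under the assumption (ours, not stated in the abstract) that the pipeline retires one 128-bit block every 10 clock cycles, i.e. one cycle per AES-128 round:

```python
# Sanity check of the reported throughput: 6 Gbps at 471 MHz, assuming one
# 128-bit block completes every 10 clock cycles (AES-128 has 10 rounds).
clock_hz = 471e6
bits_per_block = 128
cycles_per_block = 10   # assumption: one cycle per round

throughput_gbps = clock_hz * bits_per_block / cycles_per_block / 1e9
assert round(throughput_gbps) == 6   # ~6.03 Gbps, matching the reported figure
```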

  20. Design and implementation of super broadband high speed waveguide switches

    NASA Astrophysics Data System (ADS)

    Zhu, Wenbin; Chao, Ju-Hung; Wang, Chao; Yao, Jimmy; Yin, Stuart

    2015-08-01

    In this paper, based on the theory of the dynamic waveguiding effect in nanodisordered KTN crystals, a detailed design and implementation of a super-broadband 1x2 high-speed waveguide switch is presented. The important waveguide parameters, including the dimensions, the refractive index distribution, and the electric field distribution within the waveguide, are quantitatively simulated and analyzed. An experimental verification of the switching effect based on the design was also conducted, which confirmed the design. The broadband and high-speed nature of this kind of switch can play a key role in data center networks and cloud computing, which need low power consumption and high-speed switches.

  1. Developing and implementing a high precision setup system

    NASA Astrophysics Data System (ADS)

    Peng, Lee-Cheng

    High-precision radiotherapy (HPRT) was first implemented in stereotactic radiosurgery using a rigid, invasive stereotactic head frame. Fractionated stereotactic radiotherapy (SRT) with a frameless device was developed alongside growing interest in sophisticated treatment with tight margins and high dose gradients. This dissertation establishes complete management for HPRT in the process of frameless SRT, including image-guided localization, immobilization, and dose evaluation. An ideal positioning system allows ease of relocation, real-time assessment of patient movement, high accuracy, and no additional dose in daily use. A new image-guided stereotactic positioning system (IGSPS), the Align RT3C 3D surface camera system (ART, VisionRT), which combines 3D surface images and uses a real-time tracking technique, was developed to ensure accurate positioning in the first place. Uncertainties were found in the current optical tracking system, which causes patient discomfort due to the additional bite plates of the dental impression technique and external markers. The accuracy and feasibility of ART are validated by comparisons with the optical tracking and cone-beam computed tomography (CBCT) systems. Additionally, an effective daily quality assurance (QA) program for the linear accelerator and multiple IGSPSs is the most important factor in ensuring system performance in daily use. Systematic errors from phantom variety and long measurement times caused by switching phantoms were discovered. We investigated the use of a commercially available daily QA device to improve efficiency and thoroughness. A reasonable action level has been established by considering dosimetric relevance and clinic flow. As for intricate treatments, the effect of dose deviation caused by setup errors on tumor coverage and toxicity to OARs remains uncertain.
The lack of adequate dosimetric simulations based on the true treatment coordinates from

  2. High performance computing and communications: FY 1996 implementation plan

    SciTech Connect

    1995-05-16

    The High Performance Computing and Communications (HPCC) Program was formally authorized by passage of the High Performance Computing Act of 1991, signed on December 9, 1991. Twelve federal agencies, in collaboration with scientists and managers from US industry, universities, and research laboratories, have developed the Program to meet the challenges of advancing computing and associated communications technologies and practices. This plan provides a detailed description of the agencies' HPCC implementation plans for FY 1995 and FY 1996. This Implementation Plan contains three additional sections. Section 3 provides an overview of the HPCC Program definition and organization. Section 4 contains a breakdown of the five major components of the HPCC Program, with an emphasis on the overall directions and milestones planned for each one. Section 5 provides a detailed look at HPCC Program activities within each agency.

  3. Implementing High Performance Remote Method Invocation in CCA

    SciTech Connect

    Yin, Jian; Agarwal, Khushbu; Krishnan, Manoj Kumar; Chavarría-Miranda, Daniel; Gorton, Ian; Epperly, Thomas G.

    2011-09-30

    We report our effort in engineering a high performance remote method invocation (RMI) mechanism for the Common Component Architecture (CCA). This mechanism provides a highly efficient and easy-to-use means of distributed computing in CCA, enabling CCA applications to effectively leverage parallel systems to accelerate computations. This work is built on the previous work of Babel RMI. Babel is a high performance language interoperability tool that is used in CCA so that scientific application writers can share, reuse, and compose applications from software components written in different programming languages. Babel provides a transparent and flexible RMI framework for distributed computing. However, the existing Babel RMI implementation is built on top of TCP and does not provide the level of performance required to distribute fine-grained tasks. We observed that the main reason the TCP-based RMI does not perform well is that it does not efficiently utilize the high performance interconnect hardware on a cluster. We have implemented a high performance RMI protocol, HPCRMI. HPCRMI achieves low latency by building on top of a low-level portable communication library, the Aggregate Remote Memory Copy Interface (ARMCI), and minimizing communication for each RMI call. Our design allows an RMI operation to be completed with only two RDMA operations. We also aggressively optimize our system to reduce copying. In this paper, we discuss the design and our experimental evaluation of this protocol. Our experimental results show that our protocol can improve RMI performance by an order of magnitude.
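    The two-RDMA-operation design can be sketched conceptually: the caller marshals all arguments into a single buffer, writes it into the callee's memory (one RDMA put), and reads the marshalled result back (one RDMA get). The code below simulates this flow with hypothetical names; it is not the HPCRMI or ARMCI API:

```python
import pickle

class RemoteWindow:
    """Stand-in for an ARMCI-style remotely accessible memory region."""
    def __init__(self, handler):
        self.handler = handler
        self.transfers = 0
    def put(self, request_bytes):
        # Simulates one one-sided RDMA write carrying the whole request.
        self.transfers += 1
        method, args = pickle.loads(request_bytes)
        self._result = pickle.dumps(self.handler(method, args))
    def get(self):
        # Simulates one one-sided RDMA read of the whole response.
        self.transfers += 1
        return self._result

def dispatch(method, args):
    # Trivial remote method table for the sketch.
    return sum(args) if method == "add" else None

win = RemoteWindow(dispatch)
win.put(pickle.dumps(("add", (2, 3))))   # all arguments marshalled into one buffer
result = pickle.loads(win.get())
assert result == 5 and win.transfers == 2   # the whole call took two transfers
```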

  4. Implementation Challenges for Ceramic Matrix Composites in High Temperature Applications

    NASA Technical Reports Server (NTRS)

    Singh, Mrityunjay

    2004-01-01

    Ceramic matrix composites are leading candidate materials for a number of applications in the aeronautics, space, energy, electronics, nuclear, and transportation industries. In aeronautics and space exploration systems, these materials are being considered for applications in hot sections of jet engines, such as the combustor liner and nozzle components, as well as nose cones, leading edges of reentry vehicles, and space propulsion components. Applications in the energy and environmental industries include radiant heater tubes, heat exchangers, heat recuperators, gas and diesel particulate filters (DPFs), and components for land-based turbines for power generation. These materials are also being considered for use in the first wall and blanket components of fusion reactors. There are a number of critical issues and challenges related to the successful implementation of composite materials. Fabrication of net-shape and complex-shape components with high density and tailorable matrix properties is quite expensive, and even then various desirable properties are not achievable. In this presentation, the microstructure and thermomechanical properties of composites fabricated by two techniques (chemical vapor infiltration and melt infiltration) will be presented. In addition, the critical need for robust joining and assembly technologies in the successful implementation of these systems will be discussed. Other implementation issues will be discussed, along with the advantages and benefits of using these materials for various components in high temperature applications.

  5. High performance computing and communications: FY 1995 implementation plan

    SciTech Connect

    1994-04-01

    The High Performance Computing and Communications (HPCC) Program was formally established following passage of the High Performance Computing Act of 1991, signed on December 9, 1991. Ten federal agencies, in collaboration with scientists and managers from US industry, universities, and laboratories, have developed the HPCC Program to meet the challenges of advancing computing and associated communications technologies and practices. This plan provides a detailed description of the agencies' HPCC implementation plans for FY 1994 and FY 1995. This Implementation Plan contains three additional sections. Section 3 provides an overview of the HPCC Program definition and organization. Section 4 contains a breakdown of the five major components of the HPCC Program, with an emphasis on the overall directions and milestones planned for each one. Section 5 provides a detailed look at HPCC Program activities within each agency. Although the Department of Education is an official HPCC agency, its current funding and reporting of crosscut activities goes through the Committee on Education and Health Resources, not the HPCC Program. For this reason the Implementation Plan covers nine HPCC agencies.

  6. High Performance Storage System Scalability: Architecture, Implementation, and Experience

    SciTech Connect

    Watson, R W

    2005-01-05

    The High Performance Storage System (HPSS) provides scalable hierarchical storage management (HSM), archive, and file system services. Its design, implementation and current dominant use are focused on HSM and archive services. It is also a general-purpose, global, shared, parallel file system, potentially useful in other application domains. When HPSS design and implementation began over a decade ago, scientific computing power and storage capabilities at a site, such as a DOE national laboratory, were measured in a few 10s of gigaops, data archived in HSMs in a few 10s of terabytes at most, data throughput rates to an HSM in a few megabytes/s, and daily throughput with the HSM in a few gigabytes/day. At that time, the DOE national laboratories and IBM HPSS design team recognized that we were headed for a data storage explosion driven by computing power rising to teraops/petaops, requiring data stored in HSMs to rise to petabytes and beyond, data transfer rates with the HSM to rise to gigabytes/s and higher, and daily throughput with an HSM to 10s of terabytes/day. This paper discusses HPSS architectural, implementation and deployment experiences that contributed to its success in meeting the above orders-of-magnitude scaling targets. We also discuss areas that need additional attention as we continue significant scaling into the future.

  7. Compliance with best practice: implementing the best available evidence in the use of physical restraint in residential aged care.

    PubMed

    Timmins, Janet

    2008-09-01

    The Aged Care Clinical Fellowship, funded by the Commonwealth Department of Health and Ageing and conducted through the Joanna Briggs Institute, is an initiative designed to improve the care of older Australians through clinical leadership and promotion of best practice. This paper outlines one of the projects undertaken at Carinya of Bicton, a residential aged high care facility, using an audit and feedback process to implement best practice standards in the use of physical restraint. Aims: Between 12% and 47% of residents in residential care facilities are restrained; however, initial observation of residents restrained in the project facility showed that restraint devices were utilised in up to 40% of residents. Within the aged care sector there has been a shift in attitude towards reducing or eliminating restraint in aged care facilities. Restraint is seen as a negative experience for the resident, being associated with physical discomfort, embarrassment, and restriction of freedom and of movement. The purpose of the project was to improve practice in the area of physical restraint through the process of auditing current practice against evidence-based best practice criteria and, ultimately, to reduce the level of restraint in the facility. Methods: This practice improvement project utilised an audit and implementation cycle. The Joanna Briggs Institute Practical Application of Clinical Evidence System and best practice criteria developed from a systematic review were used to determine compliance with best practice. The Getting Research into Practice module was then employed to develop strategies to improve practice. Results: The follow-up audit indicated there has been a reduction in the number of residents restrained, increased use of alternatives to restraint, and an awareness on the part of all care staff of the policies and procedures which govern the use of restraint in the facility. Conclusions: It is recognised that the success of this project is in

  8. Implementing RTI in a high school: a case study.

    PubMed

    Fisher, Douglas; Frey, Nancy

    2013-01-01

    This case study chronicles the efforts of a small high school over a 2-year period as it designed and implemented a response to intervention (RTI) program for students at the school. Their efforts were largely successful, with improved achievement, attendance, and grade point averages and a decrease in special education referrals. Major themes include the need to focus on quality core instruction as a means for preventing school failure, adopting a schoolwide approach, and developing curriculum-based assessments that make intervention meaningful. PMID:21685349

  9. High-Performance CCSDS AOS Protocol Implementation in FPGA

    NASA Technical Reports Server (NTRS)

    Clare, Loren P.; Torgerson, Jordan L.; Pang, Jackson

    2010-01-01

    The Consultative Committee for Space Data Systems (CCSDS) Advanced Orbiting Systems (AOS) space data link protocol provides a framing layer between channel coding such as LDPC (low-density parity-check) and higher-layer link multiplexing protocols such as CCSDS Encapsulation Service, which is described in the following article. Recent advancement in RF modem technology has allowed multi-megabit transmission over space links. With this increase in data rate, the CCSDS AOS protocol implementation needs to be optimized to both reduce energy consumption and operate at a high rate.
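To make the framing layer concrete, the sketch below packs the 6-byte AOS transfer frame primary header (version, spacecraft ID, virtual channel ID, 24-bit frame count, signaling field), following the field widths of the public CCSDS 732.0-B standard. The IDs are arbitrary examples, and this is an illustrative software model, not the FPGA implementation described above.

```python
import struct

def aos_primary_header(scid: int, vcid: int, frame_count: int,
                       replay: bool = False) -> bytes:
    """Pack a 6-byte AOS transfer frame primary header:
    2-bit version ('01' for AOS), 8-bit spacecraft ID, 6-bit virtual
    channel ID, 24-bit VC frame count, 8-bit signaling field."""
    version = 0b01
    first16 = (version << 14) | ((scid & 0xFF) << 6) | (vcid & 0x3F)
    signaling = 0x80 if replay else 0x00  # replay flag is the MSB
    return (struct.pack(">H", first16)
            + frame_count.to_bytes(3, "big")
            + bytes([signaling]))

hdr = aos_primary_header(scid=0x42, vcid=1, frame_count=7)
```

A hardware implementation would emit the same bit layout per frame; the optional frame header error control and insert zone fields are omitted here.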

  10. Systems and Methods for Implementing High-Temperature Tolerant Supercapacitors

    NASA Technical Reports Server (NTRS)

    Brandon, Erik J. (Inventor); West, William C. (Inventor); Bugga, Ratnakumar V. (Inventor)

    2016-01-01

    Systems and methods in accordance with embodiments of the invention implement high-temperature tolerant supercapacitors. In one embodiment, a high-temperature tolerant supercapacitor includes a first electrode that is thermally stable between at least approximately 80°C and approximately 300°C; a second electrode that is thermally stable between at least approximately 80°C and approximately 300°C; an ionically conductive separator that is thermally stable between at least approximately 80°C and approximately 300°C; an electrolyte that is thermally stable between at least approximately 80°C and approximately 300°C; where the first electrode and second electrode are separated by the separator such that the first electrode and second electrode are not in physical contact; and where each of the first electrode and second electrode is at least partially immersed in the electrolyte solution.

  11. Design and implementation of spaceborne high resolution infrared touch screen

    NASA Astrophysics Data System (ADS)

    Li, Tai-guo; Li, Wen-xin; Dong, Yi-peng; Ma, Wen; Xia, Jia-gao

    2015-10-01

    To address the special operating environment of electronic products used in aerospace, and to further improve human-computer interaction in manned spaceflight, this research presents the design and implementation of a high resolution spaceborne infrared touch screen based on an FPGA and DSP frame structure. Beyond introducing the overall structure of the high resolution spaceborne infrared touch screen system, this paper details the hardware design, the FPGA design, the GUI design, and a DSP algorithm design based on Lagrange interpolation. The paper also makes a comprehensive study of the reliability design of the touch screen in view of its special purpose. A system test was performed after installation of the spaceborne infrared touch screen. The test results show that the system is simple and reliable, runs stably, and offers high resolution, meeting the special requirements of manned aerospace instrument products.

  12. Highly parallel implementation of non-adiabatic Ehrenfest molecular dynamics

    NASA Astrophysics Data System (ADS)

    Kanai, Yosuke; Schleife, Andre; Draeger, Erik; Anisimov, Victor; Correa, Alfredo

    2014-03-01

    While the adiabatic Born-Oppenheimer approximation tremendously lowers computational effort, many questions in modern physics, chemistry, and materials science require an explicit description of coupled non-adiabatic electron-ion dynamics. Electronic stopping, i.e. the energy transfer of a fast projectile atom to the electronic system of the target material, is a notorious example. We recently implemented real-time time-dependent density functional theory based on the plane-wave pseudopotential formalism in the Qbox/qb@ll codes. We demonstrate that explicit integration using a fourth-order Runge-Kutta scheme is very suitable for modern highly parallelized supercomputers. Applying the new implementation to systems with hundreds of atoms and thousands of electrons, we achieved excellent performance and scalability on a large number of nodes, both on the BlueGene-based ``Sequoia'' system at LLNL and on the Cray architecture of ``Blue Waters'' at NCSA. As an example, we discuss our work on computing the electronic stopping power of aluminum and gold for hydrogen projectiles, showing excellent agreement with experiment. These first-principles calculations allow us to gain important insight into the fundamental physics of electronic stopping.
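The explicit fourth-order Runge-Kutta integration mentioned above can be illustrated on a toy Schrödinger-type equation. The two-level Hamiltonian below is only a stand-in for the plane-wave TDDFT Hamiltonian, chosen to show the integrator's structure and its near-conservation of the wavefunction norm:

```python
import numpy as np

def rk4_step(psi, H, dt):
    """One explicit fourth-order Runge-Kutta step of i d(psi)/dt = H psi
    (atomic units), the integrator family the abstract describes."""
    f = lambda p: -1j * (H @ p)
    k1 = f(psi)
    k2 = f(psi + 0.5 * dt * k1)
    k3 = f(psi + 0.5 * dt * k2)
    k4 = f(psi + dt * k3)
    return psi + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

# Toy two-level Hamiltonian standing in for the real one.
H = np.array([[0.0, 0.3], [0.3, 1.0]])
psi = np.array([1.0 + 0j, 0.0 + 0j])
for _ in range(1000):
    psi = rk4_step(psi, H, 1e-3)
norm = float(abs(np.vdot(psi, psi)))  # stays very close to 1.0
```

Because each step needs only four applications of H to a state vector, the scheme maps naturally onto distributed-memory parallelism, which is the property exploited at scale.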

  13. Design and implementation of a high performance network security processor

    NASA Astrophysics Data System (ADS)

    Wang, Haixin; Bai, Guoqiang; Chen, Hongyi

    2010-03-01

    The last few years have seen many significant advances in the field of application-specific processors. One example is network security processors (NSPs), which perform various cryptographic operations specified by network security protocols and help to offload the computation-intensive burden from network processors (NPs). This article presents a high performance NSP system architecture implementation intended for both internet protocol security (IPSec) and secure socket layer (SSL) protocol acceleration, which are widely employed in virtual private network (VPN) and e-commerce applications. The efficient dual one-way pipelined data transfer skeleton and optimised integration scheme of the heterogeneous parallel crypto engine arrays lead to a Gbps-rate NSP, which is programmable with domain-specific descriptor-based instructions. The descriptor-based control flow fragments large data packets and distributes them to the crypto engine arrays, which fully utilises the parallel computation resources and improves the overall system data throughput. A prototyping platform for this NSP design is implemented with a Xilinx XC3S5000-based FPGA chip set. Results show that the design gives a peak throughput for the IPSec ESP tunnel mode of 2.85 Gbps, with over 2100 full SSL handshakes per second, at a clock rate of 95 MHz.
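The descriptor-driven fragmentation and distribution idea can be caricatured in a few lines. The function names, fragment size, and round-robin policy below are illustrative assumptions, not details of the actual NSP:

```python
def fragment(packet: bytes, chunk: int):
    """Split a large packet into fixed-size fragments (hypothetical
    stand-in for the descriptor-based fragmentation step)."""
    return [packet[i:i + chunk] for i in range(0, len(packet), chunk)]

def dispatch(fragments, n_engines: int):
    """Round-robin the fragments across parallel crypto engine queues
    so that all engines stay busy on one large packet."""
    queues = [[] for _ in range(n_engines)]
    for i, frag in enumerate(fragments):
        queues[i % n_engines].append(frag)
    return queues

queues = dispatch(fragment(b"A" * 10, 4), n_engines=2)
```

The point of the hardware scheme is the same as this sketch: a single oversized packet no longer serializes on one engine, so aggregate throughput tracks the number of engines.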

  14. Formation Flying Control Implementation in Highly Elliptical Orbits

    NASA Technical Reports Server (NTRS)

    Capo-Lugo, Pedro A.; Bainum, Peter M.

    2009-01-01

    The Tschauner-Hempel equations are widely used to correct the separation distance drifts between a pair of satellites within a constellation in highly elliptical orbits [1]. This set of equations was discretized in the true anomaly angle [1] to be used in a digital steady-state hierarchical controller [2]. This controller [2] performed the drift correction between a pair of satellites within the constellation. The objective of a discretized system is to develop a simple algorithm to be implemented in the computer onboard the satellite. The main advantage of the discrete systems is that the computational time can be reduced by selecting a suitable sampling interval. For this digital system, the amount of data will depend on the sampling interval in the true anomaly angle [3]. The purpose of this paper is to implement the discrete Tschauner-Hempel equations and the steady-state hierarchical controller in the computer onboard the satellite. This set of equations is expressed in the true anomaly angle in which a relation will be formulated between the time and the true anomaly angle domains.
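Onboard, the discretized controller amounts to iterating a sampled linear system, which is what makes the per-sample computation cheap. The matrices and gain below are hypothetical placeholders (the real Tschauner-Hempel coefficients depend on eccentricity and the true-anomaly sampling interval); the sketch only illustrates the recurrence structure:

```python
import numpy as np

# Hypothetical 2-state discretized system x[k+1] = A x[k] + B u[k],
# stepped in the sampled true anomaly. Placeholder values only.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.005],
              [0.1]])
K = np.array([[2.0, 3.0]])    # placeholder feedback gain

x = np.array([[1.0], [0.0]])  # initial separation-distance drift state
for _ in range(200):
    u = -K @ x                # control computed onboard each sample
    x = A @ x + B @ u
residual = float(np.linalg.norm(x))  # drift driven toward zero
```

Each sample costs only a few matrix-vector products, so the choice of sampling interval directly trades tracking granularity against onboard computation, as the abstract notes.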

  15. Design and implementation of high throughput screening assays.

    PubMed

    Macarrón, Ricardo; Hertzberg, Robert P

    2011-03-01

    High throughput screening (HTS) is at the core of the drug discovery process, and so it is critical to design and implement HTS assays in a comprehensive fashion involving scientists from the disciplines of biology, chemistry, engineering, and informatics. This requires careful analysis of many variables, starting with the choice of assay target and ending with the discovery of lead compounds. At every step in this process, there are decisions to be made that can greatly impact the outcome of the HTS effort, to the point of making it a success or a failure. Although specific guidelines should be established to ensure that the screening assay reaches an acceptable level of quality, many choices require pragmatism and the ability to compromise between opposing forces. PMID:20865348
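One widely used quantitative guideline for the "acceptable level of quality" mentioned above is the Z'-factor of Zhang et al.; the abstract does not name a specific metric, so this is offered only as a representative example of how assay quality is checked before a screen is run:

```python
import statistics

def z_prime(pos, neg):
    """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
    Values above ~0.5 conventionally indicate an excellent
    separation between positive and negative plate controls."""
    mu_p, mu_n = statistics.mean(pos), statistics.mean(neg)
    sd_p, sd_n = statistics.stdev(pos), statistics.stdev(neg)
    return 1 - 3 * (sd_p + sd_n) / abs(mu_p - mu_n)

# Illustrative (made-up) control-well readings from one plate:
z = z_prime([100, 98, 102, 101, 99], [10, 11, 9, 10, 10])
```

In practice the metric is recomputed per plate so that drifting reagents or instrumentation are caught before hits are called.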

  16. Design and implementation of high dynamic GNSS digital receiver

    NASA Astrophysics Data System (ADS)

    Li, Hanmei; Geng, Shengqun; Wang, Ce; Xu, Yong; Zhang, Qishan

    2007-11-01

    The paper presents a scheme for a high dynamic GNSS digital receiver using a XILINX FPGA (xc4vsx55) and a TI DSP (TMS320VC6701) as core controllers. Besides a brief introduction to the scheme design and hardware structure, the paper describes in detail the design and implementation of algorithms for fast acquisition and tracking of spread spectrum signals in high dynamic environments. Through an optimized design, fast acquisition and tracking of both the C code (coarse ranging code) and the P code (precision ranging code) are realized in a single FPGA chip under DSP control. The acquisition unit employs an FFT-based fast acquisition algorithm, achieving fast acquisition by reusing two FFT/IFFT units in a time-sharing fashion together with other optimized FFT calculation structures. The carrier tracking loop adopts an FLL+PLL method: the FLL tracks the carrier Doppler shift with a larger bandwidth so that the loop closes rapidly, while the PLL precisely tracks the carrier phase to achieve good tracking performance. The PN code tracking loop uses multiple non-coherent DLLs with various correlator spacings, satisfying the requirements of a large tracking range as well as high tracking precision: wide spacing accomplishes initial tracking and narrow spacing realizes high-precision tracking.
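The FFT-based fast acquisition idea can be sketched in a few lines: circular correlation of the received samples against a local code replica is computed in the frequency domain, and the correlation peak reveals the code phase in one pass instead of a serial search over every phase. The random ±1 sequence below is a stand-in for the actual ranging codes, and Doppler search is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=1023)  # stand-in for a ranging code
delay = 217
received = np.roll(code, delay)            # noiseless received signal

# Circular correlation via the FFT:
#   corr = IFFT( FFT(received) * conj(FFT(code)) )
corr = np.fft.ifft(np.fft.fft(received) * np.conj(np.fft.fft(code)))
peak = int(np.argmax(np.abs(corr)))        # recovered code phase
```

A full receiver repeats this search over a grid of Doppler bins, which is why reusing two FFT/IFFT units in a time-sharing fashion, as the paper does, pays off.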

  17. A Very High Order, Adaptable MESA Implementation for Aeroacoustic Computations

    NASA Technical Reports Server (NTRS)

    Dyson, Roger W.; Goodrich, John W.

    2000-01-01

    Since computational efficiency and wave resolution scale with accuracy, the ideal would be infinitely high accuracy for problems with widely varying wavelength scales. Currently, many computational aeroacoustics methods are limited to 4th-order accurate Runge-Kutta methods in time, which limits their resolution and efficiency. However, a new procedure for implementing the Modified Expansion Solution Approximation (MESA) schemes, based upon Hermitian divided differences, is presented which extends the effective accuracy of the MESA schemes to 57th order in space and time when using 128-bit floating point precision. This new approach has the advantages of reducing round-off error, being easy to program, and being more computationally efficient when compared to previous approaches. Its accuracy is limited only by the floating point hardware. The advantages of this new approach are demonstrated by solving the linearized Euler equations in an open bi-periodic domain. A 500th-order MESA scheme can now be created in seconds, making these schemes ideally suited for the next generation of high performance 256-bit (double quadruple) or higher precision computers. This ease of creation makes it possible to adapt the algorithm to the mesh in time instead of its converse: this is ideal for resolving varying wavelength scales which occur in noise generation simulations. Finally, the sources of round-off error that affect the very high order methods are examined and remedies provided that effectively increase the accuracy of the MESA schemes while using current computer technology.

  18. Design and implementation of highly parallel pipelined VLSI systems

    NASA Astrophysics Data System (ADS)

    Delange, Alphonsus Anthonius Jozef

    A methodology, and its realization as a prototype CAD (Computer Aided Design) system, for the design and analysis of complex multiprocessor systems is presented. The design is an iterative process in which the behavioral specifications of the system components are refined into structural descriptions consisting of interconnections and lower-level components. A model for the representation and analysis of multiprocessor systems at several levels of abstraction is described, together with an implementation of a CAD system based on this model. The prototype integrates a high-level design language, an object-oriented development kit for tool design, a design data management system, and design and analysis tools such as a high-level simulator and a graphics design interface. Procedures are described for the synthesis of semiregular processor arrays; for computing the switching of input/output signals, memory management, and control of the processor array; and for sequencing and segmentation of input/output data streams due to partitioning and clustering of the processor array during the subsequent synthesis steps. The architecture and control of a parallel system are designed, and each component is mapped to a module or module generator in a symbolic layout library and compacted for the design rules of VLSI (Very Large Scale Integration) technology. An example is given of the design of a processor that is a useful building block for highly parallel pipelined systems in the signal/image processing domains.

  19. Assessing a parsimonious eco-hydrological model implementation to an Aleppo pine semiarid forest through available remote sensing data

    NASA Astrophysics Data System (ADS)

    Medici, C.; Pasquato, M.; Frances, F.

    2013-12-01

    Arid and semi-arid climates cover a large portion of Earth's terrestrial surface, and most of the ecosystems under these conditions represent hot spots in terms of Global Change consequences. These ecosystems are controlled by water availability, which induces a tight interconnection between the hydrological cycle and the vegetation dynamics; it is therefore essential to model the two systems, vegetational and hydrological, concurrently. In operational applications, however, the available information is frequently quite limited, so parsimonious models combined with available satellite information can be valuable tools for predicting vegetation dynamics. In this work, a parsimonious dynamic vegetation model is applied to a semi-arid Aleppo pine forest area in the south-east of Spain. The model simulates biomass increase as related to the absorbed photosynthetically active radiation (APAR) through the light use efficiency (LUE). The model is then tested against several available products from the MODIS instruments flying onboard the Terra and Aqua satellites. The satellite information used in this study is the following: the Normalized Difference Vegetation Index (NDVI) and the Enhanced Vegetation Index (EVI), both included in the products MOD13Q1 and MYD13Q1 and provided every 16 days at 250-meter spatial resolution; the Leaf Area Index (LAI), included in the products MOD15A2 and MYD15A2 and provided every 8 days at 1000-meter spatial resolution; and the actual evapotranspiration (ET), included in the MOD16A2 product and provided every 8 days at 1000-meter spatial resolution. These satellite data were analyzed for the period 2000-2011 over the study area, averaging the spatially distributed data to obtain the evolution through time. All four products showed a marked seasonal quasi-sinusoidal behavior, but differences between them were noticed regarding the timing of peaks. NDVI showed a strong dependence on soil moisture and leaf water content
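The production logic described above (biomass gain proportional to APAR through the LUE) follows the classic Monteith formulation. The sketch below is a minimal illustration; the linear fPAR-NDVI relation and every coefficient value are assumptions for demonstration, not values from the study:

```python
def fpar_from_ndvi(ndvi: float, a: float = 1.24, b: float = -0.168) -> float:
    """Hypothetical linear fPAR-NDVI relation, clipped to [0, 1].
    The coefficients are illustrative, not fitted values."""
    return max(0.0, min(1.0, a * ndvi + b))

def biomass_increment(par: float, ndvi: float, lue: float = 0.5) -> float:
    """Monteith-style production for one time step: LUE * APAR,
    with APAR = fPAR * PAR (units left abstract in this sketch)."""
    return lue * fpar_from_ndvi(ndvi) * par

gain = biomass_increment(par=10.0, ndvi=0.8)
```

This is the sense in which remotely sensed indices such as NDVI can drive, and also validate, a parsimonious vegetation model: the same index enters the production term and is available as an independent time series for comparison.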

  20. Clinical review: Strict or loose glycemic control in critically ill patients - implementing best available evidence from randomized controlled trials

    PubMed Central

    2010-01-01

    Glycemic control aiming at normoglycemia, frequently referred to as 'strict glycemic control' (SGC), decreased mortality and morbidity of adult critically ill patients in two randomized controlled trials (RCTs). Five successive RCTs, however, failed to show benefit of SGC with one trial even reporting an unexpected higher mortality. Consequently, enthusiasm for the implementation of SGC has declined, hampering translation of SGC into daily ICU practice. In this manuscript we attempt to explain the variances in outcomes of the RCTs of SGC, and point out other limitations of the current literature on glycemic control in ICU patients. There are several alternative explanations for why the five negative RCTs showed no beneficial effects of SGC, apart from the possibility that SGC may indeed not benefit ICU patients. These include, but are not restricted to, variability in the performance of SGC, differences among trial designs, changes in standard of care, differences in timing (that is, initiation) of SGC, and the convergence between the intervention groups and control groups with respect to achieved blood glucose levels in the successive RCTs. Additional factors that may hamper translation of SGC into daily ICU practice include the feared risk of severe hypoglycemia, additional labor associated with SGC, and uncertainties about who the primarily responsible caregiver should be for the implementation of SGC. PMID:20550725

  1. Building a highly available and intrusion tolerant Database Security and Protection System (DSPS).

    PubMed

    Cai, Liang; Yang, Xiao-Hu; Dong, Jin-Xiang

    2003-01-01

    Database Security and Protection System (DSPS) is a security platform for fighting malicious DBMS. The security and performance are critical to DSPS. The authors suggested a key management scheme by combining the server group structure to improve availability and the key distribution structure needed by proactive security. This paper detailed the implementation of proactive security in DSPS. After thorough performance analysis, the authors concluded that the performance difference between the replicated mechanism and proactive mechanism becomes smaller and smaller with increasing number of concurrent connections; and that proactive security is very useful and practical for large, critical applications. PMID:12765281

  2. High nutrient availability reduces the diversity and stability of the equine caecal microbiota

    PubMed Central

    Hansen, Naja C. K.; Avershina, Ekaterina; Mydland, Liv T.; Næsset, Jon A.; Austbø, Dag; Moen, Birgitte; Måge, Ingrid; Rudi, Knut

    2015-01-01

    Background It is well known that nutrient availability can alter the gut microbiota composition, while the effect on diversity and temporal stability remains largely unknown. Methods Here we address the equine caecal microbiota temporal stability, diversity, and functionality in response to diets with different levels of nutrient availability. Hay (low and slower nutrient availability) versus a mixture of hay and whole oats (high and more rapid nutrient availability) were used as experimental diets. Results We found major effects on the microbiota despite that the caecal pH was far from sub-clinical acidosis. We found that the low nutrient availability diet was associated with a higher level of both diversity and temporal stability of the caecal microbiota than the high nutrient availability diet. These observations concur with general ecological theories, suggesting a stabilising effect of biological diversity and that high nutrient availability has a destabilising effect through reduced diversity. Conclusion Nutrient availability does not only change the composition but also the ecology of the caecal microbiota. PMID:26246403

  3. SU-E-T-492: Implementing a Method for Brain Irradiation in Rats Utilizing a Commercially Available Radiosurgery Irradiator

    SciTech Connect

    Cates, J; Drzymala, R

    2014-06-01

    Purpose: The purpose of the study was to implement a method for accurate rat brain irradiation using the Gamma Knife Perfexion unit. The system needed to be repeatable, efficient, and dosimetrically and spatially accurate. Methods: A platform (“rat holder”) was made such that it is attachable to the Leksell Gamma Knife G Frame. The rat holder utilizes two ear bars contacting bony anatomy and a front tooth bar to secure the rat. The rat holder fits inside of the Leksell localizer box, which utilizes fiducial markers to register with the GammaPlan planning system. This method allows for accurate, repeatable setup. A cylindrical phantom was made so that film can be placed axially in the phantom. We then acquired CT image sets of the rat holder and localizer box with both a rat and the phantom. Three treatment plans were created: a plan on the rat CT dataset, a phantom plan with the same prescription dose as the rat plan, and a phantom plan with the same delivery time as the rat plan. Results: Film analysis from the phantom showed that our setup is spatially accurate and repeatable. It is also dosimetrically accurate, with a difference between predicted and measured dose of 2.9%. Film analysis with prescription dose equal between rat and phantom plans showed a difference of 3.8%, showing that our phantom is a good representation of the rat for dosimetry purposes, allowing for +/- 3mm diameter variation. Film analysis with treatment time equal showed an error of 2.6%, which means we can deliver a prescription dose within 3% accuracy. Conclusion: Our method for irradiation of rat brain has been shown to be repeatable, efficient, and accurate, both dosimetrically and spatially. We can treat a large number of rats efficiently while delivering prescription doses within 3% with millimeter-level accuracy.

  4. The High Plains Groundwater Availability Study: Abundant Groundwater Doesn't Necessarily Mean Abundant Surface Water

    NASA Astrophysics Data System (ADS)

    Peterson, S. M.; Stanton, J. S.; Flynn, A. T.

    2013-12-01

    The U.S. Geological Survey's Groundwater Resources Program is conducting an assessment of groundwater availability to gain a clearer understanding of the status of the Nation's groundwater resources and the natural and human factors that can affect those resources. Additional goals are to better estimate availability and suitability of those resources in the future for various uses. The High Plains aquifer is a nationally important water resource that underlies about 174,000 square miles in parts of eight western states. The aquifer serves as a primary source of drinking water for approximately 2.3 million people and also sustains more than one quarter of the Nation's agricultural production. In 2000, total water withdrawals of 17.5 billion gallons per day from the aquifer accounted for 20 percent of all groundwater withdrawn in the United States, making it the most intensively pumped aquifer in the Nation. In the Central and Southern High Plains, the aquifer historically had less saturated thickness, and current resource management issues are focused on the availability of water, and reduced ability to irrigate as water levels and well productivity have declined. In contrast, the Northern High Plains aquifer includes the thickest part of the aquifer and a larger saturated thickness than the other parts of the aquifer, and current water resource management issues are related to the interaction of groundwater with surface water and resource management triggered primarily by the availability of surface water. The presentation will cover major components of the High Plains Groundwater Availability Study, including estimating water budget components for the entire High Plains aquifer, building a refined groundwater model for the Northern High Plains aquifer, and using that model to better understand surface- and groundwater interaction and characterize water availability.

  5. What's the 411? High School Leaders' Perceptions of Inclusion Implementation

    ERIC Educational Resources Information Center

    Jamison, Arnella L.

    2013-01-01

    This quantitative, descriptive study explored and described urban and suburban school leaders' perceptions of the definition of "inclusion" and perceptions of their level of involvement in the implementation of inclusion. Additionally, the study determined if there was a significant difference in the urban and suburban school leaders'…

  6. High light-induced hydrogen peroxide production in Chlamydomonas reinhardtii is increased by high CO2 availability.

    PubMed

    Roach, Thomas; Na, Chae Sun; Krieger-Liszkay, Anja

    2015-03-01

    The production of reactive oxygen species (ROS) is an unavoidable part of photosynthesis. Stress that accompanies high light levels and low CO2 availability putatively includes enhanced ROS production in the so-called Mehler reaction. Such conditions are thought to encourage O2 to become an electron acceptor at photosystem I, producing the ROS superoxide anion radical (O2·-) and hydrogen peroxide (H2O2). In contrast, here it is shown in Chlamydomonas reinhardtii that CO2 depletion under high light levels lowered cellular H2O2 production, and that elevated CO2 levels increased H2O2 production. Using various photosynthetic and mitochondrial mutants of C. reinhardtii, the chloroplast was identified as the main source of elevated H2O2 production under high CO2 availability. High light levels under low CO2 availability induced photoprotective mechanisms called non-photochemical quenching, or NPQ, including state transitions (qT) and high energy state quenching (qE). The qE-deficient mutant npq4 produced more H2O2 than wild-type cells under high light levels, although less so under high CO2 availability, whereas it demonstrated equal or greater enzymatic H2O2-degrading capacity. The qT-deficient mutant stt7-9 produced the same H2O2 as wild-type cells under high CO2 availability. Physiological levels of H2O2 were able to hinder qT and the induction of state 2, providing an explanation for why under high light levels and high CO2 availability wild-type cells behaved like stt7-9 cells stuck in state 1. PMID:25619314

  7. On finite element implementation and computational techniques for constitutive modeling of high temperature composites

    NASA Technical Reports Server (NTRS)

    Saleeb, A. F.; Chang, T. Y. P.; Wilt, T.; Iskovitz, I.

    1989-01-01

    The research work performed during the past year on finite element implementation and computational techniques pertaining to high temperature composites is outlined. In the present research, two main issues are addressed: efficient geometric modeling of composite structures and expedient numerical integration techniques dealing with constitutive rate equations. In the first issue, mixed finite elements for modeling laminated plates and shells were examined in terms of numerical accuracy, locking property and computational efficiency. Element applications include (currently available) linearly elastic analysis and future extension to material nonlinearity for damage predictions and large deformations. On the material level, various integration methods to integrate nonlinear constitutive rate equations for finite element implementation were studied. These include explicit, implicit and automatic subincrementing schemes. In all cases, examples are included to illustrate the numerical characteristics of various methods that were considered.
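The trade-off between the integration families mentioned above already shows up for a stiff scalar rate equation: explicit Euler is cheap but unstable for large steps, while backward Euler remains stable (and, for this linear toy problem, has a closed-form update). This is only an illustration of the schemes' character, not of the actual constitutive equations studied:

```python
def explicit_step(s: float, k: float, dt: float) -> float:
    """Forward (explicit) Euler for ds/dt = -k*s: one function
    evaluation per step, but unstable when k*dt > 2."""
    return s + dt * (-k * s)

def implicit_step(s: float, k: float, dt: float) -> float:
    """Backward (implicit) Euler for the same equation: solve
    s_new = s + dt*(-k*s_new); here that solve is closed-form,
    and the update is unconditionally stable."""
    return s / (1.0 + k * dt)

# Stiff step (k*dt = 10): explicit overshoots wildly, implicit decays.
s_exp = explicit_step(1.0, 100.0, 0.1)   # -9.0
s_imp = implicit_step(1.0, 100.0, 0.1)   # ~0.0909
```

Automatic subincrementing, the third family the abstract mentions, would instead cut dt adaptively until the explicit update is accurate, trading extra function evaluations for avoiding the implicit solve.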

  8. High School Physics Availability: Results from the 2012-13 Nationwide Survey of High School Physics Teachers. Focus On

    ERIC Educational Resources Information Center

    White, Susan; Tesfaye, Casey Langer

    2014-01-01

    In this report, the authors share their analysis of the data from over 3,500 high schools in the U.S. beginning with an examination of the availability of physics in U.S. high schools. The schools in their sample are a nationally-representative random sample of the almost 25,000 high schools in forty-nine of the fifty states. Table 1 shows the…

  9. Is plant migration restrained by available nitrogen supply in high latitudes?

    NASA Astrophysics Data System (ADS)

    Lee, E.; Schlosser, C. A.; Felzer, B.; Kicklighter, D.; Cronin, T.; Melillo, J.; Prinn, R. G.

    2008-12-01

    Recent studies suggest that growth and distribution of natural vegetation in high latitudes may be controlled by the amount of available nitrogen. Yet few studies have examined the role of available nitrogen on plant migration in response to anticipated climate change. We use a modeling approach to explore this issue. With a projected climate dataset (GFDL CM 2.0) from the IPCC AR4 archive, we first estimate net nitrogen mineralization values for natural plant functional types in high latitudes (north of 52N), using the Terrestrial Ecosystem Model (TEM). Previous work with TEM indicates that warming increases the rates of net nitrogen mineralization in high latitudes (e.g. 10 percent increase in boreal forests), which may help support a pattern of increased woodiness in northern systems such as boreal woodlands filling in with trees and tundra becoming more shrubby. Constrained with the available nitrogen for each vegetation type, a simple rule- based model, which describes the migration process and adopts processes of climatic tolerances of trees from the BIOME biogeography model, is used to generate a newly projected vegetation map for high latitudes. Our study emphasizes the significance of the role of nitrogen in the high latitude plant distribution. We also investigate the climatic consequences of the changing albedo, resulting from shifts in the vegetation distribution.

  10. Implementing Schoolwide Positive Behavior Interventions and Supports in High Schools: Contextual Factors and Stages of Implementation

    ERIC Educational Resources Information Center

    Swain-Bradway, Jessica; Pinkney, Christopher; Flannery, K. Brigid

    2015-01-01

    Schoolwide positive behavior interventions and supports (SWPBIS) are an increasingly popular framework for school improvement practices, but many high schools are still lagging behind their elementary counterparts. High school leadership teams can struggle with merging the SWPBIS framework with current operations, and there are limited examples of…

  11. A Systemic Approach to Implementing Response to Intervention in Three Colorado High Schools

    ERIC Educational Resources Information Center

    Duffy, Helen; Scala, Jenny

    2012-01-01

    The National High School Center continues to receive inquiries about how to support high school implementation of response to intervention (RTI). Given the National High School Center's previous work on the topic, the authors wanted to better understand the conditions that contribute to or inhibit implementation of tiered frameworks in high…

  12. High-speed dynamic domino circuit implemented with GaAs MESFETs

    NASA Technical Reports Server (NTRS)

    Yang, Long (Inventor); Long, Stephen I. (Inventor)

    1990-01-01

    A dynamic logic circuit (AND or OR) utilizes one depletion-mode metal-semiconductor FET for precharging an internal node A, and a plurality of the same type of FETs in series, or a FET in parallel with one or more of the series connected FETs for implementing the logic function. A pair of FETs are connected to provide an output inverter with two series diodes for level shift. A coupling capacitor may be employed with a further FET to provide level shifting required between the inverter and the logic circuit output terminal. These circuits may be cascaded to form a domino chain.

  13. Implementation of the high-order schemes QUICK and LECUSSO in the COMMIX-1C Program

    SciTech Connect

    Sakai, K.; Sun, J.G.; Sha, W.T.

    1995-08-01

    Multidimensional analysis computer programs based on the finite volume method, such as COMMIX-1C, have been commonly used to simulate thermal-hydraulic phenomena in engineering systems such as nuclear reactors. In COMMIX-1C, the first-order schemes with respect to both space and time are used. In many situations such as flow recirculations and stratifications with steep gradients of velocity and temperature fields, however, high-order difference schemes are necessary for an accurate prediction of the fields. For these reasons, two second-order finite difference numerical schemes, QUICK (Quadratic Upstream Interpolation for Convective Kinematics) and LECUSSO (Local Exact Consistent Upwind Scheme of Second Order), have been implemented in the COMMIX-1C computer code. The formulations were derived for general three-dimensional flows with nonuniform grid sizes. Numerical oscillation analyses for QUICK and LECUSSO were performed. To damp the unphysical oscillations which occur in calculations with high-order schemes at high mesh Reynolds numbers, a new FRAM (Filtering Remedy and Methodology) scheme was developed and implemented. To be consistent with the high-order schemes, the pressure equation and the boundary conditions for all the conservation equations were also modified to be of second order. The new capabilities in the code are listed. Test calculations were performed to validate the implementation of the high-order schemes. They include the test of the one-dimensional nonlinear Burgers equation, two-dimensional scalar transport in two impinging streams, von Kármán vortex shedding, shear-driven cavity flow, Couette flow, and circular pipe flow. The calculated results were compared with available data; the agreement is good.
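The QUICK scheme named in this record can be illustrated with a minimal sketch. This is not code from COMMIX-1C; it assumes a uniform grid and flow in the +x direction, in which case the face value at i+1/2 is a quadratic fit through two upstream nodes and one downstream node.

```python
import numpy as np

def quick_face(phi, i):
    """QUICK face value at i + 1/2 for flow in the +x direction on a
    uniform grid: quadratic fit through nodes i-1, i (upstream) and
    i+1 (downstream):  phi_f = (6*phi[i] + 3*phi[i+1] - phi[i-1]) / 8."""
    return (6.0 * phi[i] + 3.0 * phi[i + 1] - phi[i - 1]) / 8.0

def advect_step(phi, u, dx, dt):
    """One explicit Euler step of 1D linear advection
    d(phi)/dt + u * d(phi)/dx = 0 (u > 0), periodic boundaries,
    convective fluxes evaluated with QUICK face values."""
    n = len(phi)
    face = np.array([(6.0 * phi[i] + 3.0 * phi[(i + 1) % n]
                      - phi[(i - 1) % n]) / 8.0 for i in range(n)])
    return phi - u * dt / dx * (face - np.roll(face, 1))
```

On linear data the quadratic fit reproduces the exact midpoint value (e.g. `quick_face([0.0, 1.0, 2.0, 3.0], 1)` gives 1.5); the oscillations the abstract's FRAM filter damps arise from this stencil at high mesh Reynolds numbers.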

  14. Implementing a Case Management Initiative in High-Need Schools

    PubMed Central

    Gifford, Elizabeth J.

    2013-01-01

    States continue to experiment with ways of improving health and human service use by people with complex needs. Such efforts have often sought to increase individual and family control over services as well as to enhance coordination among providers. Paths to achieving these goals are not well understood. This study draws on two previously distinct conceptual frameworks to examine how 71 public schools implemented a team approach to increasing family and agency engagement for children at risk. Results from longitudinal data fit the core components expected to affect implementation and also indicated sustainability, but in ways distinctive to the initiative's public school settings. Accountability to the state appeared to be a major catalyst, yet in some respects also constrained local agencies from participating as intended. School inertia may have both undermined the program through some evaluation practices and gaps in administrative support, and supported integration into organizational routines and successful experimentation over time in increasing caregiver involvement. Family hesitation about sharing information with multiple agencies may also help explain why the goal of seamless coordination remains elusive. PMID:23976809

  15. Intraspecific competition and high food availability are associated with insular gigantism in a lizard

    NASA Astrophysics Data System (ADS)

    Pafilis, Panayiotis; Meiri, Shai; Foufopoulos, Johannes; Valakos, Efstratios

    2009-09-01

    Resource availability, competition, and predation commonly drive body size evolution. We assess the impact of high food availability and the consequent increased intraspecific competition, as expressed by tail injuries and cannibalism, on body size in Skyros wall lizards (Podarcis gaigeae). Lizard populations on islets surrounding Skyros (Aegean Sea) all have fewer predators and competitors than on Skyros but differ in the numbers of nesting seabirds. We predicted the following: (1) the presence of breeding seabirds (providing nutrients) will increase lizard population densities; (2) dense lizard populations will experience stronger intraspecific competition; and (3) such aggression will be associated with larger average body size. We found a positive correlation between seabird and lizard densities. Cannibalism and tail injuries were considerably higher in dense populations. Increases in cannibalism and tail loss were associated with large body sizes. Adult cannibalism on juveniles may select for rapid growth, fuelled by high food abundance, setting thus the stage for the evolution of gigantism.

  16. Nitrogen saturation and soil N availability in a high-elevation spruce and fir forest

    SciTech Connect

    Garten Jr, Charles T

    2000-06-01

    A field study was conducted during the summer of 1995 to gain a better understanding of the causes of nitrate (NO{sub 3}-N) leaching and ongoing changes in soil nitrogen (N) availability in high-elevation (1524-2000 m) spruce (Picea rubens) and fir (Abies fraseri) forests of the Great Smoky Mountains National Park, Tennessee and North Carolina, U.S.A. Indicators of soil N availability (total soil N concentrations, extractable NH{sub 4}-N, extractable NO{sub 3}-N, and C/N ratios) were measured in Oa and A horizons at 33 study plots. Dynamic measures included potential net soil N mineralization determined in 12-week aerobic laboratory incubations at 22°C. Potential net nitrification in the A horizon was correlated (r = +0.83, P < 0.001) with total soil N concentrations. Most measures of soil N availability did not exhibit significant trends with elevation, but there were topographic differences. Potential net soil N mineralization and net nitrification in the A horizon were higher in coves than on ridges. Relative amounts of particulate and organomineral soil organic matter influenced potential net N mineralization and nitrification in the A horizon. Calculations indicate that soil N availability and NO{sub 3}-N leaching in high-elevation spruce and fir forests of the Great Smoky Mountains National Park will increase in response to regional warming.

  17. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    NASA Technical Reports Server (NTRS)

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.

    2010-01-01

    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, in a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to verify compliance with requirements and to highlight design or performance shortcomings for further decision-making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and present findings and observations based on analysis leading to the Ground Systems Preliminary Design Review milestone.

  18. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    NASA Technical Reports Server (NTRS)

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.

    2010-01-01

    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, within a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used to calculate failure rates for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to assess compliance with requirements and to highlight design or performance shortcomings for further decision making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and present findings and observations based on analysis leading to the Ground Operations Project Preliminary Design Review milestone.
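The subsystem availability estimates described in these two records rest on standard reliability arithmetic. As an illustrative sketch only (the MTBF/MTTR figures below are hypothetical, not Constellation data): a subsystem's steady-state availability is MTBF/(MTBF + MTTR), and launch-critical subsystems that must all be up combine as a product.

```python
def inherent_availability(mtbf, mttr):
    """Steady-state availability of one subsystem: A = MTBF / (MTBF + MTTR)."""
    return mtbf / (mtbf + mttr)

def series_availability(subsystems):
    """Subsystems in series (all must be up to launch): the system
    availability is the product of the subsystem availabilities."""
    total = 1.0
    for mtbf, mttr in subsystems:
        total *= inherent_availability(mtbf, mttr)
    return total

# Hypothetical ground subsystems as (MTBF hours, MTTR hours) pairs.
ground = [(999.0, 1.0), (499.0, 1.0), (1999.0, 1.0)]
```

With these made-up numbers, `series_availability(ground)` shows how even three individually high-availability subsystems (0.999, 0.998, 0.9995) yield a noticeably lower combined launch availability, which is why requirements were allocated per subsystem.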

  19. Design and Implementation of High-Throughput Screening Assays.

    PubMed

    Powell, David J; Hertzberg, Robert P; Macarrón, Ricardo

    2016-01-01

    HTS remains at the core of the drug discovery process, and so it is critical to design and implement HTS assays in a comprehensive fashion involving scientists from the disciplines of biology, chemistry, engineering, and informatics. This requires careful consideration of many options and variables, starting with the choice of screening strategy and ending with the discovery of lead compounds. At every step in this process, there are decisions to be made that can greatly impact the outcome of the HTS effort, to the point of making it a success or a failure. Although specific guidelines should be established to ensure that the screening assay reaches an acceptable level of quality, many choices require pragmatism and the ability to compromise opposing forces. PMID:27316985

  20. Design and implementation of high throughput screening assays.

    PubMed

    Macarrón, Ricardo; Hertzberg, Robert P

    2002-01-01

    HTS is at the core of the drug-discovery process, and so it is critical to design and implement HTS assays in a comprehensive fashion involving scientists from the disciplines of biology, chemistry, engineering, and informatics. This requires careful analysis of many variables, starting with the choice of assay target and ending with the discovery of lead compounds. At every step in this process, there are decisions to be made that can greatly impact the outcome of the HTS effort, to the point of making it a success or a failure. Although specific guidelines can be established to ensure that the screening assay reaches an acceptable level of quality, many choices require pragmatism and the ability to compromise opposing forces. PMID:12029816

  1. Design and implementation of high-throughput screening assays.

    PubMed

    Macarrón, Ricardo; Hertzberg, Robert P

    2009-01-01

    HTS is at the core of the drug discovery process, and so it is critical to design and implement HTS assays in a comprehensive fashion involving scientists from the disciplines of biology, chemistry, engineering, and informatics. This requires careful analysis of many variables, starting with the choice of assay target and ending with the discovery of lead compounds. At every step in this process, there are decisions to be made that can greatly impact the outcome of the HTS effort, to the point of making it a success or a failure. Although specific guidelines should be established to ensure that the screening assay reaches an acceptable level of quality, many choices require pragmatism and the ability to compromise opposing forces. PMID:19551355

  2. The California High School Proficiency Examination Six Years After Implementation.

    ERIC Educational Resources Information Center

    Katz, Cynthia L.; Padia, William L.

    In 1975 California became the first state to offer an "early exit" proficiency test from high school to persons 16 years of age or older (California law normally requires young persons to attend school until they reach 18 or graduate regularly from high school). More than 87,000 persons have passed the early exit examination; a pass rate of…

  3. Implementation of a College Biology Course in High School.

    ERIC Educational Resources Information Center

    Druger, Marvin; Spector, Barbara S.

    1979-01-01

    Suggestions are given for dealing with "senioritis" in high school students. These include offering college-level courses at a reduced cost. A description of introducing the college freshman biology course at Syracuse University into a high school program is given, with guidelines to promote success. (Author/SA)

  4. Unusually high food availability in Kaikoura Canyon linked to distinct deep-sea nematode community

    NASA Astrophysics Data System (ADS)

    Leduc, D.; Rowden, A. A.; Nodder, S. D.; Berkenbusch, K.; Probert, P. K.; Hadfield, M. G.

    2014-06-01

    Kaikoura Canyon, on the eastern New Zealand continental margin, is the most productive, non-chemosynthetic deep-sea habitat described to date, with megafaunal biomass 100-fold higher than that of other deep-sea habitats. The present study, which focused on free-living nematodes, provides the first comparison of faunal community structure and diversity between Kaikoura Canyon and nearby open slope habitats. Results show substantially higher food availability in the canyon relative to open slope sediments, which probably reflects greater levels of primary productivity above the canyon, coupled with downwelling and/or topographically-induced channelling, which serves to concentrate surface-derived organic matter along the canyon axis. This high food availability appears to be responsible for the elevated nematode biomass in Kaikoura Canyon, with values exceeding all published nematode biomass data from canyons elsewhere. There was also markedly lower local species diversity of nematodes inside the canyon relative to the open slope habitat, as well as a distinct community structure. The canyon community was dominated by species, such as Sabateria pulchra, which were absent from the open slope and are typically associated with highly eutrophic and/or disturbed environments. The presence of these taxa, as well as the low observed diversity, is likely to reflect the high food availability, and potentially the high levels of physically and biologically induced disturbance within the canyon. Kaikoura Canyon is a relatively small habitat characterised by different environmental conditions that makes a disproportionate contribution to deep-sea diversity in the region, despite its low species richness.

  5. High responsivity CMOS imager pixel implemented in SOI technology

    NASA Technical Reports Server (NTRS)

    Zheng, X.; Wrigley, C.; Yang, G.; Pain, B.

    2000-01-01

    Availability of mature sub-micron CMOS technology and the advent of the new low noise active pixel sensor (APS) concept have enabled the development of low power, miniature, single-chip, CMOS digital imagers in the decade of the 1990's.

  6. Successful Strategies for Implementation of a High School Standards-Based Integrated Mathematics Curriculum

    ERIC Educational Resources Information Center

    Brown, Linda

    2012-01-01

    Math achievement for students in the United States is not as high as in other countries. In response, one state implemented a new standards-based, integrated math curriculum that combines traditional high school math courses and emphasizes student centered instruction. The purpose of this study was to examine the implementation of a standards…

  7. Factors Influencing the Implementation of an International Baccalaureate Diploma Program in a Diverse Urban High School

    ERIC Educational Resources Information Center

    Mayer, Anysia P.

    2010-01-01

    This article identifies factors that promoted the successful implementation of an International Baccalaureate Diploma Program in an urban high school. The study draws on data from an in-depth case study at a large high school serving an urban community in a Western state. The study investigates seven implementation mechanisms that research…

  8. Career Guidance: An Implementation Model for Small High Schools. A Maxi I Practicum.

    ERIC Educational Resources Information Center

    Stevens, Richard; And Others

    The purpose of this practicum was to design, develop, and implement a career guidance program for small high schools. The program description would act as a model for implementation at other high schools desiring a career guidance program. The method of communicating the program to others was the writing of a "how to" book which others would use…

  9. Urban Students Achieve When High Schools Implement Proven Practices. Research Brief

    ERIC Educational Resources Information Center

    Bottoms, Gene; Han, Lingling; Presson, Alice

    2006-01-01

    Students benefit from a year or more gain in student achievement when urban district and high school leaders commit to the implementation of the "High Schools That Work" ("HSTW") design. It is not enough to be a "HSTW" site--it is about taking effective actions to implement the design. Schools that do take action witness significant progress in…

  10. High School Principals' Rating of Success in Implementation of 21st Century Skills

    ERIC Educational Resources Information Center

    Sam, Sonn

    2011-01-01

    The purpose of this quantitative study was to investigate how Rhode Island high school principals rate success in implementing 21st century skills in their schools. Secondly, this study investigated how high school principals rate the influence of implementing of 21st century skills in curriculum and instruction in their schools. The high…

  11. High Availability On-line Relational Databases for Accelerator Control and Operation

    SciTech Connect

    Dohan,D.; Dalesio, L.; Carcassi, G.

    2009-05-04

    The role that relational database (RDB) technology plays in accelerator control and operation continues to grow in such areas as electronic logbooks, machine parameter definitions, and facility infrastructure management. RDBs are increasingly relied upon to provide the official 'master' copy of these data. Whereas the services provided by the RDB have traditionally not been 'mission critical', the availability of modern RDB management systems is now equivalent to that of standard computer file-systems. RDBs can be relied on to supply pseudo real-time response to operator and machine physicist requests. This paper describes recent developments in the IRMIS RDB project. Generic lattice support has been added, serving as the driver for model-based machine control. Abstract physics name service and process variable introspection have been added. Specific emphasis has been placed both on providing fast response times to accelerator operator and modeling code requests and on high (24/7) availability of the RDB service.

  12. Availability and cost estimate of a high naphthene, modified aviation turbine fuel

    NASA Technical Reports Server (NTRS)

    Prok, George M.

    1988-01-01

    Information from an Air Force study was used to determine the potential availability and cost of a modified conventional fuel with a naphthene content which could have a thermal stability near that of JP-7 for high-speed civil transports. Results showed sufficient capacity for a fuel made of a blend of 50 percent naphthenic straight run kerosene and 50 percent hydrocracked product, assuming a near-term requirement of 210,000 BBL per day. Fuel cost would be as low as 62.5 to 64.5 cents per gallon, assuming 20 dollars per barrel for crude.

  13. Innovative use of controlled availability fertilizers with high performance for intensive agriculture and environmental conservation.

    PubMed

    Shoji, Sadao

    2005-12-01

    A variety of slow release fertilizers, controlled release (availability) fertilizers (CAFs), and stability fertilizers have been developed in response to the serious drawbacks of the conventional fertilizers since the early 1960s. Of these fertilizers, CAFs which are coated with resin are consumed in the largest quantity in the world. Selecting CAFs with higher performance, the author discusses: 1) innovation of agro-technologies for various field crops, including new concepts of fertilizer application; 2) high yielding of field crops; 3) enhancing the quality and safety of farm products; and 4) controlling the adverse effects of intensive agriculture on the environment. PMID:16512212

  14. Innovative use of controlled availability fertilizers with high performance for intensive agriculture and environmental conservation.

    PubMed

    Shoji, Sadao

    2005-09-01

    A variety of slow release fertilizers, controlled release (availability) fertilizers (CAFs), and stability fertilizers have been developed in response to the serious drawbacks of the conventional fertilizers since the early 1960s. Of these fertilizers, CAFs which are coated with resin are consumed in the largest quantity in the world. Selecting CAFs with higher performance, the author discusses: 1) innovation of agro-technologies for various field crops, including new concepts of fertilizer application; 2) high yielding of field crops; 3) enhancing the quality and safety of farm products; and 4) controlling the adverse effects of intensive agriculture on the environment. PMID:20549445

  15. Competitive foods in schools: Availability and purchasing in predominately rural small and large high schools

    PubMed Central

    Nollen, Nicole L.; Befort, Christie; Davis, Ann McGrath; Snow, Tricia; Mahnken, Jonathan; Hou, Qingjiang; Story, Mary; Ahluwalia, Jasjit S.

    2013-01-01

    OBJECTIVES Schools have an important role to play in obesity prevention, but little is known about the food environment in small, predominately rural schools. The primary purpose of this study was to compare the availability and student purchasing of foods sold outside of the reimbursable meals program through a la carte (ALC) or vending (i.e., competitive foods) in small (n = 7) and large (n = 6) Kansas high schools. METHODS A cross-sectional observational study design was used to capture the number of ALC and vending items available and purchased, and the fat and caloric content of all available and purchased items on a single school day between January and May 2005. RESULTS Small schools had significantly fewer vending machines than large schools [median=3.0 (range=2.0–5.0) versus 6.5 (range=4.0–8.0), p<0.01]. Vending and ALC items at small schools contained a median of 2.3 fewer fat grams per item (p≤0.05), while vending products contained a median of 25.0 fewer calories per item (p≤0.05) than at large schools. Significantly less fat (median= −15.4 grams/student) and fewer calories (median= −306.8 kcal/student) were purchased per student from all competitive food sources and from ALC (median= −12.9 fat grams and −323.3 kcal/student) by students in small schools compared to students in large schools (p≤0.05). CONCLUSIONS The findings, which highlight less availability and lower energy content from competitive foods at small compared to large schools, have implications for understanding how small schools support their food service programs with limited dependence on competitive foods and the impact that food and nutrition professionals can have on the school environment by providing more oversight into the nutritional quality of foods available. PMID:19394472

  16. High performance computing and communications: FY 1997 implementation plan

    SciTech Connect

    1996-12-01

    The High Performance Computing and Communications (HPCC) Program was formally authorized by passage, with bipartisan support, of the High-Performance Computing Act of 1991, signed on December 9, 1991. The original Program, in which eight Federal agencies participated, has now grown to twelve agencies. This Plan provides a detailed description of the agencies' FY 1996 HPCC accomplishments and FY 1997 HPCC plans. Section 3 of this Plan provides an overview of the HPCC Program. Section 4 contains more detailed definitions of the Program Component Areas, with an emphasis on the overall directions and milestones planned for each PCA. Appendix A provides a detailed look at HPCC Program activities within each agency.

  17. High-frequency nutrient monitoring to infer seasonal patterns in catchment source availability, mobilisation and delivery.

    PubMed

    Bende-Michl, Ulrike; Verburg, Kirsten; Cresswell, Hamish P

    2013-11-01

    To explore the value of high-frequency monitoring to characterise and explain riverine nutrient concentration dynamics, total phosphorus (TP), reactive phosphorus (RP), ammonium (NH4-N) and nitrate (NO3-N) concentrations were measured hourly over a 2-year period in the Duck River, in north-western Tasmania, Australia, draining a 369-km(2) mixed land use catchment area. River discharge was observed at the same location and frequency, spanning a wide range of hydrological conditions. Nutrient concentrations changed rapidly and were higher than previously observed. Maximum nutrient concentrations were 2,577 μg L(-1) TP, 1,572 μg L(-1) RP, 972 μg L(-1) NH4-N and 1,983 μg L(-1) NO3-N, respectively. Different nutrient response patterns were evident at seasonal, individual event, and diurnal time scales: patterns that had gone largely undetected in previous, less frequent water quality sampling. Interpretation of these patterns in terms of nutrient source availability, mobilisation and delivery to the stream allowed the development of a conceptual model of catchment nutrient dynamics. Functional stages of nutrient release were identified for the Duck River catchment and were supported by a cluster analysis which confirmed the similarities and differences in nutrient responses caused by the sequence of hydrologic events: (1) a build-up of nutrients during periods with low hydrologic activity, (2) flushing of readily available nutrient sources at the onset of the high flow period, followed by (3) a switch from transport to supply limitation, (4) the accessibility of new nutrient sources with increasing catchment wetness and hydrologic connectivity and (5) high nutrient spikes occurring when new sources become available that are easily mobilised with quickly re-established hydrologic connectivity. Diurnal variations that could be influenced by riverine processes and/or localised point sources were also identified as part of stage (1) and during late recession of some of…

  18. The research and design for a high availability object storage system

    NASA Astrophysics Data System (ADS)

    Zhan, Ling; Tan, Zhihu; Gu, Peng; Wan, Jiguang

    2008-12-01

    With the growing scale of the computer storage systems, the likelihood of multi-disk failures happening in the storage systems has increased dramatically. Based on a thorough analysis on the fault-tolerance capability on various existing storage systems, we propose a new hierarchical, highly reliable, multi-disk fault-tolerant storage system architecture: High Availability Object Storage System (HAOSS). In the HAOSS, each object has an attribute field for reliability level, which can be set by the user according to the importance of data. Higher reliability level corresponds to better data survivability in case of multi-device failure. The HAOSS is composed of two layers: the upper-layer and the lower-layer. The upper-layer achieves the high availability by storing multiple replicas for each storage object in a set of storage devices. The individual replicas can service the I/O requests in parallel so as to obtain high performance. The lower-layer deploys RAID5, RAID6 or RAID_Blaum coding schemes to tolerate multi-disk failures. In addition, the disk utilization rate of RAID_Blaum is higher than that of multiple replicas, and it can be further improved by growing the RAID group size. These advantages come at the price of more complicated fault-tolerant coding schemes, which involve a large amount of calculation for encoding and cause an adverse impact on the I/O performance, especially on the write performance. Results from both our internal experiments and third-party independent tests have shown that HAOSS servers have better multi-disk-failure tolerance than existing similar products. In a 1000Mb Ethernet interconnection environment, with a request block size of 1024KB, the sequential read performance for a HAOSS server reaches 104MB/s, which is very close to the theoretical maximum effective bandwidth of Ethernet networks. The HAOSS offers a complete storage solution for high availability applications without the compromises that today's storage systems…
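The per-object reliability-level attribute described in this record can be pictured with a short sketch. The names, the level-to-replica mapping, and the level-to-coding mapping below are illustrative assumptions, not the HAOSS implementation: higher levels buy more upper-layer replicas and a stronger lower-layer coding scheme.

```python
from dataclasses import dataclass

# Assumed mappings for illustration: the abstract only says higher
# reliability levels give better survivability under multi-device failure.
REPLICAS_BY_LEVEL = {0: 1, 1: 2, 2: 3}                       # upper layer
CODING_BY_LEVEL = {0: "RAID5", 1: "RAID6", 2: "RAID_Blaum"}  # lower layer

@dataclass
class StorageObject:
    name: str
    reliability_level: int  # user-set attribute, per the data's importance

def placement(obj: StorageObject, devices: list):
    """Pick replica devices (upper layer) and a coding scheme (lower
    layer) for one object; first-fit device selection is illustrative."""
    n = REPLICAS_BY_LEVEL[obj.reliability_level]
    if n > len(devices):
        raise ValueError("not enough devices for requested reliability level")
    return devices[:n], CODING_BY_LEVEL[obj.reliability_level]
```

For example, `placement(StorageObject("vm-image", 2), ["d0", "d1", "d2", "d3"])` would place three replicas, each protected below by the RAID_Blaum scheme, matching the two-layer division of labor the abstract describes.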

  19. Complementarity among four highly productive grassland species depends on resource availability.

    PubMed

    Roscher, Christiane; Schmid, Bernhard; Kolle, Olaf; Schulze, Ernst-Detlef

    2016-06-01

    Positive species richness-productivity relationships are common in biodiversity experiments, but how resource availability modifies biodiversity effects in grass-legume mixtures composed of highly productive species is yet to be explicitly tested. We addressed this question by choosing two grasses (Arrhenatherum elatius and Dactylis glomerata) and two legumes (Medicago × varia and Onobrychis viciifolia) which are highly productive in monocultures and dominant in mixtures (the Jena Experiment). We established monocultures, all possible two- and three-species mixtures, and the four-species mixture under three different resource supply conditions (control, fertilization, and shading). Compared to the control, community biomass production decreased under shading (-56 %) and increased under fertilization (+12 %). Net diversity effects (i.e., mixture minus mean monoculture biomass) were positive in the control and under shading (on average +15 and +72 %, respectively) and negative under fertilization (-10 %). Positive complementarity effects in the control suggested resource partitioning and facilitation of growth through symbiotic N2 fixation by legumes. Positive complementarity effects under shading indicated that resource partitioning is also possible when growth is carbon-limited. Negative complementarity effects under fertilization suggested that external nutrient supply depressed facilitative grass-legume interactions due to increased competition for light. Selection effects, which quantify the dominance of species with particularly high monoculture biomasses in the mixture, were generally small compared to complementarity effects, and indicated that these species had comparable competitive strengths in the mixture. Our study shows that resource availability has a strong impact on the occurrence of positive diversity effects among tall and highly productive grass and legume species. PMID:26932467

  20. Implementing Concepts of Pharmaceutical Engineering into High School Science Classrooms

    ERIC Educational Resources Information Center

    Kimmel, Howard; Hirsch, Linda S.; Simon, Laurent; Burr-Alexander, Levelle; Dave, Rajesh

    2009-01-01

    The Research Experience for Teachers was designed to help high school science teachers develop skills and knowledge in research, science and engineering with a focus on the area of pharmaceutical particulate and composite systems. The experience included time for the development of instructional modules for classroom teaching. Results of the…

  1. Planning and Implementing a High Performance Knowledge Base.

    ERIC Educational Resources Information Center

    Cortez, Edwin M.

    1999-01-01

    Discusses the conceptual framework for developing a rapid-prototype high-performance knowledge base for the four mission agencies of the United States Department of Agriculture and their university partners. Describes the background of the project and methods used for establishing the requirements; examines issues and problems surrounding semantic…

  2. Biomedical Interdisciplinary Curriculum Project: Implementation Manual for High School Personnel.

    ERIC Educational Resources Information Center

    Biomedical Interdisciplinary Curriculum Project, Berkeley, CA.

    The Biomedical Interdisciplinary Curriculum Project (BICP) is a two-year interdisciplinary precollege curriculum designed to prepare high school students for entry into college and vocational programs leading to a career in the health field. Composed of three separate yet interrelated courses with interdisciplinary relationships between…

  3. Sexually transmitted infections treatment and care available to high risk populations in Pakistan.

    PubMed

    Rahimtoola, Minal; Hussain, Hamidah; Khowaja, Saira N; Khan, Aamir J

    2008-01-01

    Limited literature exists on the quality and availability of treatment and care of sexually transmitted infections (STIs) in Pakistan. This article aims to document existing services for the care and treatment of STIs available in Pakistan's public and private sectors to high risk groups (HRG), particularly the transgendered population. We conducted a cross-sectional survey to document STI services in Lahore, Karachi, Rawalpindi, Peshawar, and Quetta. Seventy-three interviews were administered with health service providers at the 3 largest public sector hospitals in each city, as well as with general physicians and traditional healers in the private sector. Twenty-five nongovernmental organizations (NGO) providing STI services were also interviewed. Fewer than 45% of private and public sector general practitioners had been trained in STI treatment after the completion of their medical curriculum, and none of the traditional healers had received any formal training or information on STIs. The World Health Organization (WHO) syndromic management guidelines were followed for STI management by 29% of public and private sector doctors and 5% of traditional healers. STI drugs were available at no cost at 44% of NGOs and at some public sector hospitals. Our findings show that although providers do treat HRGs for STIs, there are significant limitations in their ability to provide these services. These deterrents include, but are not limited to, a lack of STI training of service providers, privacy and adherence to recommended WHO syndromic management guidelines, and costly diagnostic and consultation fees. PMID:19856743

  4. HALR: A TCP Enhancement Scheme Using Local Caching in High-Availability Cluster

    NASA Astrophysics Data System (ADS)

    Feng, Yi-Hsuan; Huang, Nen-Fu; Wu, Yen-Min

In this paper, we study the end-to-end TCP performance over a path deploying a High-Availability cluster, whose characteristics are highlighted by the failover procedure used to remove single points of failure. This paper proposes an approach, called High-Availability Local Recovery (HALR), to enhance TCP performance in the face of a cluster failover. To minimize retransmission latency, HALR saves TCP packets selectively and resends them locally after the failover is finished. For better understanding, we further develop simple analytic models to predict TCP performance in terms of flow latency under a range of failover times, with and without HALR. Using simulation results, we validate our models and show that HALR improves TCP performance significantly over a failover event as compared with the original TCP. Typically, HALR reduces the flow latency from 4.1 s to less than 1.9 s when the failover time equals 500 ms. Simulation with a real packet trace further demonstrates that the memory requirement of the proposed solution is not a concern for modern network equipment.
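The abstract does not give HALR's data structures, so the following is a hypothetical sketch of the selective-caching idea it describes (all names are ours): cache in-flight segments at the cluster, evict them on cumulative ACKs, and replay whatever remains locally once failover completes, rather than waiting for the sender's end-to-end retransmission timeout.

```python
class LocalRecoveryCache:
    """Hypothetical sketch of HALR-style selective caching, not the paper's
    implementation: the cluster keeps copies of in-flight TCP segments and
    replays any still unacknowledged after a failover."""

    def __init__(self):
        self.cache = {}                 # seq -> payload of unacked segments

    def forward(self, seq, payload):
        self.cache[seq] = payload       # save selectively, then forward
        return seq, payload

    def on_ack(self, ack_seq):
        # cumulative ACK: everything below ack_seq is confirmed delivered,
        # so the cached copies can be dropped to bound memory use
        for seq in [s for s in self.cache if s < ack_seq]:
            del self.cache[seq]

    def on_failover_complete(self):
        # resend the remaining cached segments locally, in sequence order
        return sorted(self.cache.items())
```

Local replay converts a sender-side retransmission timeout into a cluster-local resend, which is the mechanism behind the latency reduction the abstract reports.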

  5. A High Throughput Medium Access Control Implementation Based on IEEE 802.11e Standard

    NASA Astrophysics Data System (ADS)

    Huang, Min Li; Lee, Jin; Setiawan, Hendra; Ochi, Hiroshi; Park, Sin-Chong

With the growing demand for high-performance multimedia applications over wireless channels, we need to develop a Medium Access Control (MAC) system that supports high throughput and quality of service enhancements. This paper presents the standard analysis, design architecture and design issues leading to the implementation of an IEEE 802.11e based MAC system that supports MAC throughput of over 100 Mbps. In order to meet the MAC layer timing constraints, a hardware/software co-design approach is adopted. The proposed MAC architecture is implemented on the Xilinx Virtex-II Pro Field-Programmable Gate Array (FPGA) (XC2VP70-5FF1704C) prototype, and connected to a host computer through an external Universal Serial Bus (USB) interface. The total FPGA resource utilization is 11,508 out of 33,088 (34%) available slices. The measured MAC throughput is 100.7 Mbps and 109.2 Mbps for voice and video access categories, transmitted at a data rate of 260 Mbps based on IEEE 802.11n Physical Layer (PHY), using the contention-based hybrid coordination function channel access mechanism.

  6. Implementing high-latitude biogeochemical processes into Earth System Models

    NASA Astrophysics Data System (ADS)

    Brovkin, Victor; Kleinen, Thomas; Cresto-Aleina, Fabio; Kloster, Silvia; Ilyina, Tatiana

    2016-04-01

Projections of future climate changes suggest that air temperatures in the Arctic could rise to levels unprecedented in the last million years. The sensitivity of carbon stores on land and on shelves to climate change of that scale is highly uncertain. Earth System models (ESMs), consisting of atmosphere, ocean, land, and cryosphere components, are the main tools to understand interactions between the carbon cycle and climate. However, ESM representation of ecological and biogeochemical processes in the Arctic is extremely simplistic. For example, all ESMs agree that tree cover under future warming scenarios will move northwards to the Arctic coast, but they ignore interactions between vegetation, permafrost, and disturbances such as fires, which are critical for vegetation dynamics in this region. Improving modeling of interactions between model components and their evaluation against growing observational evidence is a promising research area. The first attempts to account for permafrost carbon dynamics in the ESM framework suggest that CO2 and CH4 emissions from high-latitude regions in the 21st century are relatively small, but they become much more significant afterwards due to committed climate changes. Therefore, extension of ESM simulations beyond 2100 is essential to estimate the proper scale of the frozen carbon pool's response to human-induced climate change. Additionally, inclusion of a sub-sea permafrost component into ESMs is an active research area that brings together terrestrial and marine biogeochemical communities, as well as geologists analyzing climate proxies on glacial timescales. Another challenging aspect of biogeochemical interactions in the Arctic is the extreme heterogeneity of the land surface. A mixture of wetlands, lakes, and vegetation-covered surfaces at fine local scales is not properly reflected in the model structure. A promising approach to dealing with scaling gaps in modeling high-latitude biogeochemical processes in ESMs will be presented.

  7. High speed fiber optics local area networks: Design and implementation

    NASA Astrophysics Data System (ADS)

    Tobagi, Fouad A.

    1988-09-01

    The design of high speed local area networks (HSLAN) for communication among distributed devices requires solving problems in three areas: (1) the network medium and its topology; (2) the medium access control; and (3) the network interface. Considerable progress has been made in all areas. Accomplishments are divided into two groups according to their theoretical or experimental nature. A brief summary is given in Section 2, including references to papers which appeared in the literature, as well as to Ph.D. dissertations and technical reports published at Stanford University.

  8. Implementing high-performance work practices in healthcare organizations: qualitative and conceptual evidence.

    PubMed

    McAlearney, Ann Scheck; Robbins, Julie; Garman, Andrew N; Song, Paula H

    2013-01-01

    Studies across industries suggest that the systematic use of high-performance work practices (HPWPs) may be an effective but underused strategy to improve quality of care in healthcare organizations. Optimal use of HPWPs depends on how they are implemented, yet we know little about their implementation in healthcare. We conducted 67 key informant interviews in five healthcare organizations, each considered to have exemplary work practices in place and to deliver high-quality care, as part of an extensive study of HPWP use in healthcare. We analyzed interview transcripts inductively and deductively to examine why and how organizations implement HPWPs. We used an evidence-based model of complex innovation adoption to guide our exploration of factors that facilitate HPWP implementation. We found considerable variability in interviewees' reasons for implementing HPWPs, including macro-organizational (strategic level) and micro-organizational (individual level) reasons. This variability highlighted the complex context for HPWP implementation in many organizations. We also found that our application of an innovation implementation model helped clarify and categorize facilitators of HPWP implementation, thus providing insight on how these factors can contribute to implementation effectiveness. Focusing efforts on clarifying definitions, building commitment, and ensuring consistency in the application of work practices may be particularly important elements of successful implementation. PMID:24400459

  9. High sensitivity of broadleaf trees to water availability in northeastern United States

    NASA Astrophysics Data System (ADS)

    Levesque, Mathieu; Andreu-Hayles, Laia; Pederson, Neil

    2016-04-01

Broadleaf-dominated forests of the eastern US cover more than one million km2 and provide ecosystem services to millions of people. High species diversity and varied sensitivity to drought make it uncertain whether these forests will be carbon sinks or sources under climate change. Ongoing climate change, an increase in atmospheric CO2 concentration (ca) and strong reductions in acidic depositions are expected to alter growth and gas exchange of trees, and ultimately forest productivity. Still, the magnitude of these effects is unclear. A better comprehension of the species-specific responses to environmental changes will better inform models and managers on the vulnerability and resiliency of these forests. Here, we combined tree-ring width data with δ13C and δ18O measurements to investigate growth and physiological responses of red oak (Quercus rubra L.) and tulip poplar (Liriodendron tulipifera L.) in the northeastern US to changes in water availability, ca and acidic depositions for the period 1950-2014. Based on structural equation modeling approaches, we found that summer water availability (June-August) is the main environmental variable driving growth, water-use efficiency and δ18O of broadleaf trees, whereas ca and acidic depositions have little effect. This high sensitivity to moisture availability was also supported by the very strong correlations found between summer vapor pressure deficit (VPD) and tree-ring δ13C (r = 0.67 and 0.71), and δ18O series (r = 0.62 and 0.72), for red oak and tulip poplar, respectively. In contrast, tree-ring width was less sensitive to summer VPD (r = -0.44 and -0.31). Since the mid-1980s, pluvial conditions in the northeastern US have increased stomatal conductance, carbon uptake, and growth of both species. Further, the strong spatial field correlations found between the tree-ring δ13C and δ18O and summer VPD indicate a greater sensitivity of eastern US broadleaf forests to moisture availability than previously

  10. Implementing a High-Assurance Smart-Card OS

    NASA Astrophysics Data System (ADS)

    Karger, Paul A.; Toll, David C.; Palmer, Elaine R.; McIntosh, Suzanne K.; Weber, Samuel; Edwards, Jonathan W.

    Building a high-assurance, secure operating system for memory constrained systems, such as smart cards, introduces many challenges. The increasing power of smart cards has made their use feasible in applications such as electronic passports, military and public sector identification cards, and cell-phone based financial and entertainment applications. Such applications require a secure environment, which can only be provided with sufficient hardware and a secure operating system. We argue that smart cards pose additional security challenges when compared to traditional computer platforms. We discuss our design for a secure smart card operating system, named Caernarvon, and show that it addresses these challenges, which include secure application download, protection of cryptographic functions from malicious applications, resolution of covert channels, and assurance of both security and data integrity in the face of arbitrary power losses.

  11. Geographic Information Network of Alaska: Real-Time Synoptic Satellite Data for Alaska and the High Arctic, Best Available DEMs, and Highest Available Resolution Imagery for Alaska

    NASA Astrophysics Data System (ADS)

    Heinrichs, T. A.; Sharpton, V. L.; Engle, K. E.; Ledlow, L. L.; Seman, L. E.

    2006-12-01

In support of the International Polar Year, the Geographic Information Network of Alaska (GINA) intends to make available to researchers three important Arctic data sets. The first is near-real-time synoptic scale data from GINA and NOAA/NESDIS satellite ground stations. GINA operates ground stations that receive direct readout from the AVHRR (1.1-km per pixel resolution) and MODIS (250- to 1000-meter) sensors carried on NOAA and NASA satellites. GINA works in partnership with NOAA/NESDIS's Fairbanks Command and Data Acquisition Station (FCDAS) to distribute real-time data captured by FCDAS facilities in Fairbanks and Barrow, Alaska. AVHRR and Feng Yun 1D (1.1-km) sensors are captured in Fairbanks by FCDAS and distributed by GINA. AVHRR data is captured by FCDAS in Barrow and distributed by GINA. Due to its high latitude, the station mask of the Barrow station extends well beyond the Pole, showing the status in real-time of Arctic basin cloud and sea ice conditions. Second, digital elevation models (DEM) for Alaska vary greatly in quality and availability. The best available DEMs for Alaska will be combined and served through a GINA gateway. Third, the best available imagery for more than three quarters of Alaska is 15-meter pan-sharpened Landsat data. Less than a quarter of the state is covered by 5-meter or better data. The best available imagery for Alaska will be combined and served through a GINA gateway. In accordance with the IPY Subcommittee on Data Policy and Management recommendations, all data will be made available via Open Geospatial Consortium protocols, including Web Mapping, Feature, and Coverage Services. Data will also be made available for download in georeferenced formats such as GeoTIFF, MrSID, or GRID. Metadata will be available through the National Spatial Data Infrastructure via Z39.50 GEO protocols and through evolving web-based metadata standards.

  12. The implementation of sea ice model on a regional high-resolution scale

    NASA Astrophysics Data System (ADS)

    Prasad, Siva; Zakharov, Igor; Bobby, Pradeep; McGuire, Peter

    2015-09-01

The availability of high-resolution atmospheric/ocean forecast models and satellite data, together with access to high-performance computing clusters, has provided the capability to build high-resolution models for regional ice condition simulation. The paper describes the implementation of the Los Alamos sea ice model (CICE) on a regional scale at high resolution. The advantage of the model is its ability to include oceanographic parameters (e.g., currents) to provide accurate results. The sea ice simulation was performed over Baffin Bay and the Labrador Sea to retrieve important parameters such as ice concentration, thickness, ridging, and drift. Two different forcing models, one with low resolution and another with high resolution, were used to estimate the sensitivity of the model results. Sea ice behavior over 7 years was simulated to analyze ice formation, melting, and conditions in the region. Validation was based on comparing model results with remote sensing data. The simulated ice concentration correlated well with Advanced Microwave Scanning Radiometer for EOS (AMSR-E) and Ocean and Sea Ice Satellite Application Facility (OSI-SAF) data. Visual comparison of ice thickness trends estimated from the Soil Moisture and Ocean Salinity satellite (SMOS) agreed with the simulation for the year 2010-2011.

  13. The Implementation of High-Leverage Teaching Practices: From the University Classroom to the Field Site

    ERIC Educational Resources Information Center

    Davin, Kristin J.; Troyan, Francis J.

    2015-01-01

    In response to the ACTFL's Research Priorities Initiative, the present study used a multiple case study design to examine teacher candidates' ability to implement two high-leverage teaching practices: increasing interaction and target language comprehensibility and questioning to build and assess student understanding. Candidates implemented these…

  14. On the use of fuzzy logic assessment for high consequence implementation risk analysis

    SciTech Connect

    Spray, S.; Cooper, A.; Bennett, R.

    1994-05-01

"High consequence" operations are systems, structures, and/or strategies for which it is crucial to provide assured protection against some potential catastrophe or catastrophes. The word "catastrophe" implies a significant loss of a resource (e.g., money, lives, health, environment, national security, etc.). The implementation of operations that are to be as catastrophe-free as possible must incorporate a very high level of protection. Unfortunately, real world limitations on available resources, mainly money and time, preclude absolute protection. For this reason, conventional "risk analysis" focuses on "cost-effective" protection, demonstrating through analysis that the benefits of any protective measures chosen outweigh their cost. This is a "crisp" one-parameter (usually monetary) comparison. A major problem with this approach, especially for high consequence operations, is that it may not be possible to accurately determine quantitative "costs," and furthermore, the costs may not be accurately quantifiable. Similarly, it may not be possible to accurately determine or to quantify the benefits of protection in high consequence operations. These weaknesses are addressed in this paper by introducing multiple parameters instead of a single monetary measure both for costs of implementing protective measures and their benefits. In addition, a fuzzy-algebra comparison based on fuzzy number theory is introduced as a tool in providing cost/benefit tradeoff depiction, with the incorporation of measures of the uncertainty that necessarily exists in the input information. The result allows a more informative comparison to be made through use of fuzzy results, especially at the extreme bounds of the uncertainty.
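The paper's exact fuzzy-algebra formulation is not reproduced in the abstract; one common concrete realization uses triangular fuzzy numbers, where addition and subtraction act on the (low, peak, high) supports and a defuzzified centroid gives a crude ranking. The sketch below, with hypothetical cost and benefit values, shows how uncertainty bounds survive the cost/benefit subtraction instead of collapsing to a single monetary number.

```python
from dataclasses import dataclass

@dataclass
class TriFuzzy:
    """Triangular fuzzy number (lo, peak, hi): membership rises linearly
    from 0 at lo to 1 at peak, then falls back to 0 at hi."""
    lo: float
    peak: float
    hi: float

    def __add__(self, other):
        return TriFuzzy(self.lo + other.lo, self.peak + other.peak,
                        self.hi + other.hi)

    def __sub__(self, other):
        # worst case pairs our low with the other's high, and vice versa
        return TriFuzzy(self.lo - other.hi, self.peak - other.peak,
                        self.hi - other.lo)

    def centroid(self):
        # simple defuzzification for comparing alternatives
        return (self.lo + self.peak + self.hi) / 3.0

# hypothetical estimates for one protective measure (arbitrary units)
benefit = TriFuzzy(8.0, 10.0, 15.0)
cost = TriFuzzy(4.0, 6.0, 7.0)
net = benefit - cost    # TriFuzzy(lo=1.0, peak=4.0, hi=11.0)
```

Here `net.lo > 0`, so even at the pessimistic bound the benefit exceeds the cost; the width of the interval is exactly the "extreme bounds of the uncertainty" the abstract refers to.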

  15. An intense, high-repetition nanosecond light source using a commercially available Xe-arc lamp

    NASA Astrophysics Data System (ADS)

    Araki, Tsutomu; Yamada, Akihisa; Uchida, Teruo

    1993-07-01

    We describe the construction and emission characteristics of a Xe light source that produces broadband emission spectrum (250-650 nm) and high-repetition pulsed light of nanosecond duration. The standard dc-operated Xe-arc lamp, which is commercially available, is employed as the primary light source, with modified circuitry to realize pulsed operation. A dc voltage higher than 5 kV is applied to the electrode gap through a high-value resistor in order to generate a periodical discharge of current between the electrode gap. In order to further increase the intensity of the light pulses, the electrical polarity of the electrode must be in inverse connection relative to the normal connection under the dc operation. Intense light pulses as large as 20 W (peak value) of 3 ns width were generated repetitively from the Xe lamp. Fluorescence lifetimes of a quinine-sulfate solution and a fluorescent cell nucleus were measured to demonstrate the usefulness of the light source.

  16. The Availability of Competitive Foods and Beverages to Middle School Students in Appalachian Virginia Before Implementation of the 2014 Smart Snacks in School Standards

    PubMed Central

    Kraak, Vivica; Serrano, Elena

    2015-01-01

    The study objective was to examine the nutritional quality of competitive foods and beverages (foods and beverages from vending machines and à la carte foods) available to rural middle school students, before implementation of the US Department of Agriculture’s Smart Snacks in School standards in July 2014. In spring 2014, we audited vending machines and à la carte cafeteria foods and beverages in 8 rural Appalachian middle schools in Virginia. Few schools had vending machines. Few à la carte and vending machine foods met Smart Snacks in School standards (36.6%); however, most beverages did (78.2%). The major challenges to meeting standards were fat and sodium content of foods. Most competitive foods (63.4%) did not meet new standards, and rural schools with limited resources will likely require assistance to fully comply. PMID:26378899

  17. Characterization of Depleted Monolithic Active Pixel detectors implemented with a high-resistive CMOS technology

    NASA Astrophysics Data System (ADS)

    Kishishita, T.; Hemperek, T.; Rymaszewski, P.; Hirono, T.; Krüger, H.; Wermes, N.

    2016-07-01

We present the recent development of DMAPS (Depleted Monolithic Active Pixel Sensor), implemented with a Toshiba 130 nm CMOS process. Unlike standard MAPS technologies, which are based on an epi-layer, this process provides a high-resistive substrate that enables larger signal and faster charge collection by drift in a 50 - 300 μm thick depleted layer. Since this process also enables the use of deep n-wells to isolate the collection electrodes from the thin active device layer, NMOS and PMOS transistors are available for the readout electronics in each pixel cell. In order to characterize the technology, we implemented a simple three-transistor readout with a variety of pixel pitches and input FET sizes. This layout variety provides insight into sensor characteristics relevant to future optimization, such as input detector capacitance and leakage current. In initial measurements, radiation spectra were obtained from 55Fe with an energy resolution of 770 eV (FWHM) and from 90Sr with a most probable value (MPV) of 4165 e-.

  18. High-performance FFT implementation on the BOPS ManArray parallel DSP

    NASA Astrophysics Data System (ADS)

    Pitsianis, Nikos P.; Pechanek, Gerald

    1999-11-01

    We present a high performance implementation of the FFT algorithm on the BOPS ManArray parallel DSP processor. The ManArray we consider for this application consists of an array controller and 2 to 4 fully interconnected processing elements. To expose the parallelism inherent to an FFT algorithm we use a factorization of the DFT matrix in Kronecker products, permutation and diagonal matrices. Our implementation utilizes the multiple levels of parallelism that are available on the ManArray. We use the special multiply complex instruction, that calculates the product of two complex 32-bit fixed point numbers in 2 cycles (pipelinable). Instruction level parallelism is exploited via the indirect Very Long Instruction Word (iVLIW). With an iVLIW, in the same cycle a complex number is read from memory, another complex number is written to memory, a complex multiplication starts and another finishes, two complex additions or subtractions are done and a complex number is exchanged with another processing element. Multiple local FFTs are executed in Single Instruction Multiple Data (SIMD) mode, and to avoid a costly data transposition we execute distributed FFTs in Synchronous Multiple Instructions Multiple Data (SMIMD) mode.
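The Kronecker-product factorization of the DFT matrix mentioned above can be checked numerically. For n = 4, one radix-2 recursion step reads F4 = (F2 ⊗ I2) · T4 · (I2 ⊗ F2) · P4, where T4 is the twiddle-factor diagonal and P4 the even/odd (perfect shuffle) permutation. The NumPy sketch below is ours, not the paper's fixed-point ManArray code:

```python
import numpy as np

def dft_matrix(n):
    """Dense DFT matrix: F[j, k] = exp(-2*pi*i*j*k/n)."""
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.exp(-2j * np.pi * j * k / n)

F2, I2 = dft_matrix(2), np.eye(2)
P4 = np.eye(4)[[0, 2, 1, 3]]                      # even/odd (perfect shuffle) permutation
T4 = np.diag([1, 1, 1, np.exp(-2j * np.pi / 4)])  # twiddle factors diag(I2, 1, w), w = e^{-2*pi*i/4}
F4 = np.kron(F2, I2) @ T4 @ np.kron(I2, F2) @ P4  # one radix-2 recursion step
```

Recursing on the inner F2 blocks yields the full radix-2 FFT; the appeal of this algebraic form for a machine like the ManArray is that the Kronecker factors map naturally onto SIMD lanes across processing elements, with the permutation realized by inter-PE data exchange.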

  19. Teaching High School Science Using Image Processing: A Case Study of Implementation of Computer Technology.

    ERIC Educational Resources Information Center

    Greenberg, Richard; Raphael, Jacqueline; Keller, Jill L.; Tobias, Sheila

    1998-01-01

    Outlines an in-depth case study of teachers' use of image processing in biology, earth science, and physics classes in one high school science department. Explores issues surrounding technology implementation. Contains 21 references. (DDR)

  20. Implementation of a Parallel High-Performance Visualization Technique in GRASS GIS

    SciTech Connect

    Sorokine, Alexandre

    2007-01-01

This paper describes an extension for GRASS GIS that enables users to perform geographic visualization tasks on tiled high-resolution displays powered by clusters of commodity personal computers. Parallel visualization systems are becoming more common in scientific computing due to decreasing hardware costs and the availability of open source software to support such architectures. High-resolution displays allow scientists to visualize very large datasets with minimal loss of detail. Such systems hold particular promise in the field of geographic information systems because users can naturally combine several geographic scales on a single display. The paper discusses the architecture, implementation and operation of pd-GRASS, a GRASS GIS extension for high-performance parallel visualization on tiled displays. pd-GRASS is especially well suited to very large geographic datasets such as LIDAR data or high-resolution nation-wide geographic databases. The paper also briefly touches on computational efficiency, performance and potential applications for such systems.

  1. A software and hardware architecture for a high-availability PACS.

    PubMed

    Gutiérrez-Martínez, Josefina; Núñez-Gaona, Marco Antonio; Aguirre-Meneses, Heriberto; Delgado-Esquerra, Ruth Evelin

    2012-08-01

The increasing number of radiology studies has led to new requirements for managing medical information, mainly affecting the storage of digital images. Today, interaction between workflow management and the legal rules that govern it is necessary to allow efficient control of medical technology and its associated costs. Another topic growing in importance within the healthcare sector is compliance, which includes the retention of studies, information security, and patient privacy. Previously, we conducted a series of extensive analyses and measurements of pre-existing operating conditions. These studies and projects have been described in other papers. The first phase, hardware and software installation and initial tests, was completed in March 2006. The storage phase was built step by step until the PACS-INR was fully completed. Two important aspects were considered in the integration of components: (1) the reliability and performance of the system in transferring and displaying DICOM images, and (2) the availability of data backups for disaster recovery and downtime scenarios. This paper describes the high-availability model for a large-scale PACS to support the storage and retrieval of data using CAS and DAS technologies to provide an open storage platform. This solution offers a simple framework that integrates and automates the information at low cost and minimum risk. Likewise, the model allows optimized use of the information infrastructure in the clinical environment. The tests of the model include massive data migration, openness, scalability, and standard compatibility to avoid locking data into a proprietary technology. PMID:22692771

  2. ArcticDEM; A Publically Available, High Resolution Elevation Model of the Arctic

    NASA Astrophysics Data System (ADS)

    Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Bates, Brian; Willamson, Cathleen; Peterman, Kennith

    2016-04-01

A Digital Elevation Model (DEM) of the Arctic is needed for a large number of reasons, including: measuring and understanding rapid, ongoing changes to the Arctic landscape resulting from climate change and human use, and mitigation and adaptation planning for Arctic communities. The topography of the Arctic is more poorly mapped than most other regions of Earth due to logistical costs and the limits of satellite missions with low-latitude inclinations. A convergence of civilian high-quality sub-meter stereo imagery, petascale computing, and open source photogrammetry software has made it possible to produce a complete, very high resolution (2 to 8-meter posting) elevation model of the Arctic. A partnership between the US National Geospatial-Intelligence Agency and a team led by the US National Science Foundation funded Polar Geospatial Center is using stereo imagery from DigitalGlobe's Worldview-1, 2 and 3 satellites and the Ohio State University's Surface Extraction with TIN-based Search-space Minimization (SETSM) software running on the University of Illinois's Blue Waters supercomputer to address this challenge. The final product will be a seamless, 2-m posting digital surface model mosaic of the entire Arctic above 60 North including all of Alaska, Greenland and Kamchatka. We will also make available the more than 300,000 individual time-stamped DSM strip pairs that were used to assemble the mosaic. The ArcticDEM will have a vertical precision of better than 0.5 m and can be used to examine changes in land surfaces such as those caused by permafrost degradation or the evolution of arctic rivers and floodplains. The data set can also be used to highlight changing geomorphology due to Earth surface mass transport processes occurring in active volcanic and glacial environments. When complete, the ArcticDEM will catapult the Arctic from the worst to among the best mapped regions on Earth.

  3. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected, large-scale, high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  4. Assessment of commercially available ion exchange materials for cesium removal from highly alkaline wastes

    SciTech Connect

    Brooks, K.P.; Kim, A.Y.; Kurath, D.E.

    1996-04-01

Approximately 61 million gallons of nuclear waste generated in plutonium production, radionuclide removal campaigns, and research and development activities is stored on the Department of Energy's Hanford Site, near Richland, Washington. Although the pretreatment process and disposal requirements are still being defined, most pretreatment scenarios include removal of cesium from the aqueous streams. In many cases, after cesium is removed, the dissolved salt cakes and supernates can be disposed of as LLW. Ion exchange has been a leading candidate for this separation. Ion exchange systems have the advantage of simplicity of equipment and operation and provide many theoretical stages in a small space. The organic ion exchange material Duolite™ CS-100 has been selected as the baseline exchanger for conceptual design of the Initial Pretreatment Module (IPM). CS-100 was chosen because it is considered a conservative, technologically feasible approach. During FY 96, final resin down-selection will occur for IPM Title 1 design. Alternate ion exchange materials for cesium exchange will be considered at that time. The purpose of this report is to survey commercially available ion exchange materials that could potentially replace CS-100. Where possible, this report compares these resins on their ability to remove low concentrations of cesium from highly alkaline solutions. Materials that show promise can be studied further, while less encouraging resins can be eliminated from consideration.

  5. Implementing California's School Funding Formula: Will High-Need Students Benefit? Technical Appendix

    ERIC Educational Resources Information Center

    Hill, Laura; Ugo, Iwunze

    2015-01-01

    Intended to accompany "Implementing California's School Funding Formula: Will High-Need Students Benefit?," this appendix examines the extent to which school shares of high-need students vary relative to their district concentrations by grouping approximately 950 school districts by their share of high-need students, arraying them into…

  6. Estimating photosynthesis with high resolution field spectroscopy in a Mediterranean grassland under different nutrient availability

    NASA Astrophysics Data System (ADS)

    Perez-Priego, O.; Guan, J.; Fava, F.; Rossini, M.; Wutzler, T.; Moreno, G.; Carrara, A.; Kolle, O.; Schrumpf, M.; Reichstein, M.; Migliavacca, M.

    2014-12-01

Recent studies have shown how human-induced N:P imbalances are affecting essential processes (e.g. photosynthesis, plant growth rate), leading to important changes in ecosystem structure and function. In this regard, the accuracy of approaches based on remotely sensed data for monitoring and modeling gross primary production (GPP) relies on the ability of vegetation indices (VIs) to track the dynamics of vegetation physiological and biophysical properties/variables. Promising results have recently been obtained when chlorophyll-sensitive VIs and chlorophyll fluorescence are combined with structural indices in the framework of Monteith's light use efficiency (LUE) model. However, further ground-based experiments are required to validate LUE model performance and its capability to generalize across different nutrient availability conditions. In this study, the overall objective was to investigate the sensitivity of VIs to track short- and long-term GPP variations in a Mediterranean grassland under different N and P fertilization treatments. Spectral VIs were acquired manually using high resolution spectrometers (HR4000, OceanOptics, USA) along a phenological cycle. The VIs examined included the photochemical reflectance index (PRI), the MERIS terrestrial chlorophyll index (MTCI) and the normalized difference vegetation index (NDVI). Solar-induced chlorophyll fluorescence calculated at the oxygen absorption band O2-A (F760) using spectral fitting methods was also used. Simultaneously, measurements of GPP and environmental variables were conducted using a transient-state canopy chamber. Overall, GPP, F760 and VIs showed a clear seasonal time-trend in all treatments, which was driven by the phenological development of the grassland. Results showed significant differences (p<0.05) in midday GPP values between plots with and without N addition, in particular at the peak of the growing season during the flowering stage and at the end of the season during senescence. While…
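
    The indices named in this record have standard reflectance-ratio definitions. A minimal sketch, using the widely published band combinations (band centres are approximate, and the MTCI form assumes the usual MERIS band-centre reflectances):

    ```python
    def ndvi(r_nir, r_red):
        """Normalized difference vegetation index."""
        return (r_nir - r_red) / (r_nir + r_red)

    def pri(r531, r570):
        """Photochemical reflectance index (531 nm band, 570 nm reference)."""
        return (r531 - r570) / (r531 + r570)

    def mtci(r754, r709, r681):
        """MERIS terrestrial chlorophyll index (approximate band centres)."""
        return (r754 - r709) / (r709 - r681)
    ```

    Each function takes reflectances from the field spectrometer resampled to the relevant bands; F760 retrieval is more involved (spectral fitting around the O2-A feature) and is not sketched here.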

  7. High-resolution prediction of soil available water content within the crop root zone

    NASA Astrophysics Data System (ADS)

    Haghverdi, Amir; Leib, Brian G.; Washington-Allen, Robert A.; Ayers, Paul D.; Buschermohle, Michael J.

    2015-11-01

A detailed understanding of soil hydraulic properties, particularly soil available water content (AWC) within the effective root zone, is needed to optimally schedule irrigation in fields with substantial spatial heterogeneity. However, it is difficult and time consuming to directly measure soil hydraulic properties. Therefore, easily collected and measured soil properties, such as soil texture and/or bulk density, that are well correlated with hydraulic properties are used as proxies to develop pedotransfer functions (PTF). In this study, multiple modeling scenarios were developed and evaluated to indirectly predict high resolution AWC maps within the effective root zone. The modeling techniques included kriging, co-kriging, regression kriging, artificial neural networks (NN) and geographically weighted regression (GWR). The efficiency of soil apparent electrical conductivity (ECa) as proximal data in the modeling process was assessed. There was good agreement (root mean square error (RMSE) = 0.052 cm3 cm-3 and r = 0.88) between observed and point predictions of water content using pseudo-continuous PTFs. We found that both GWR (mean RMSE = 0.062 cm3 cm-3) and regression kriging (mean RMSE = 0.063 cm3 cm-3) produced the best water content maps, with accuracies improving by up to 19% when ECa was used as an ancillary soil attribute in the interpolation process. The maps indicated fourfold differences in AWC between coarse- and fine-textured soils across the study site. This provided a template for future investigations evaluating the efficiency of variable rate irrigation management scenarios in accounting for the spatial heterogeneity of soil hydraulic attributes.
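
    The quantity being mapped has a conventional definition: plant-available water is the difference between volumetric water content at field capacity and at permanent wilting point, integrated over the root-zone layers. A minimal sketch of that bookkeeping (layer values would come from PTF predictions; the function name and units are illustrative):

    ```python
    def available_water_content(theta_fc, theta_pwp, layer_thickness_cm):
        """Plant-available water (cm) summed over root-zone layers.

        theta_fc / theta_pwp: volumetric water contents (cm3 cm-3) at
        field capacity and permanent wilting point for each layer, e.g.
        as predicted by a pedotransfer function;
        layer_thickness_cm: thickness of each layer (cm).
        """
        return sum((fc - pwp) * dz
                   for fc, pwp, dz in zip(theta_fc, theta_pwp, layer_thickness_cm))
    ```

    Mapping AWC then reduces to interpolating the layer water contents across the field (e.g. by regression kriging or GWR, optionally with ECa as a covariate) and applying this sum cell by cell.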

  8. Circulating tumour necrosis factor is highly correlated with brainstem serotonin transporter availability in humans.

    PubMed

    Krishnadas, Rajeev; Nicol, Alice; Sassarini, Jen; Puri, Navesh; Burden, A David; Leman, Joyce; Combet, Emilie; Pimlott, Sally; Hadley, Donald; McInnes, Iain B; Cavanagh, Jonathan

    2016-01-01

Preclinical studies demonstrate that pro-inflammatory cytokines increase serotonin transporter availability and function, leading to depressive symptoms in rodent models. Herein we investigate associations between circulating inflammatory markers and brainstem serotonin transporter (5-HTT) availability in humans. We hypothesised that higher circulating inflammatory cytokine concentrations, particularly of tumour necrosis factor (TNF-α), would be associated with greater 5-HTT availability, and that TNF-α inhibition with etanercept (sTNFR:Fc) would in turn reduce 5-HTT availability. In 13 neurologically healthy adult women, plasma TNF-α correlated significantly with 5-HTT availability (rho=0.6; p=0.03) determined by [(123)I]-beta-CIT SPECT scanning. This association was replicated in an independent sample of 12 patients with psoriasis/psoriatic arthritis (rho=0.76; p=0.003). Indirect effects analysis showed a significant overlap in the variance explained by 5-HTT availability and TNF-α concentrations on BDI scores. Treatment with etanercept for 6-8 weeks was associated with a significant reduction in 5-HTT availability (Z=2.09; p=0.03; r=0.6), consistent with a functional link. Our findings confirm an association between TNF-α and 5-HTT in both the basal physiological and pathological condition. Modulation of both TNF-α and 5-HTT by etanercept indicates the presence of a mechanistic pathway whereby circulating inflammatory cytokines are related to central nervous system substrates underlying major depression. PMID:26255693

  9. High-reliability teams and situation awareness: implementing a hospital emergency incident command system.

    PubMed

    Autrey, Pamela; Moss, Jacqueline

    2006-02-01

    To enhance disaster preparedness, hospitals are beginning to implement the Hospital Emergency Incident Command System. Although Hospital Emergency Incident Command System provides a template for disaster preparation, its successful implementation requires an understanding of situation awareness (SA) and high-reliability teams. The authors present the concept of SA and how this concept relates to team reliability in dynamic environments. Then strategies for increasing SA and team reliability through education, training, and improved communication systems are discussed. PMID:16528147

  10. Implementation of an Enzyme Linked Immunosorbent Assay for the Quantification of Allergenic Egg Residues in Red Wines Using Commercially Available Antibodies.

    PubMed

    Koestel, Carole; Simonin, Céline; Belcher, Sandrine; Rösti, Johannes

    2016-08-01

Since the early 2000s, labeling of potentially allergenic food components to protect people who suffer from food allergies has been compulsory in numerous industrialized countries. In Europe, milk and egg components used during the winemaking process must be indicated on the label since July 1, 2012. Several ELISA procedures have been developed to detect allergenic residues in wines. However, the complexity of the wine matrix can inhibit the immunoenzymatic reaction. The aim of this study was to implement an ELISA assay for the detection of ovalbumin in red wines using commercially available antibodies. The specificity of the acquired antibodies and the absence of cross-reactivity were assessed by immunoblotting and ELISA. An ELISA assay with an LOD of 14.2 μg/L and an LOQ of 56.4 μg/L of ovalbumin in aqueous solution was obtained. Differences in ELISA signals were observed when analyzing various fining agents, although reproducible conformation of the antigen could be reached for the comparison of ovalbumin and Ovicolle. The differences between samples in terms of pH could be leveled, but the inhibition of the ELISA signal, positively correlated with the tannin content of the wines, could not be suppressed. Thus, standard curves of ovalbumin in several wines were obtained by relative quantification. The control steps and difficulties presented in this study should be considered by anyone working toward the development of ELISA assays for the detection of allergenic residues in complex food matrices. PMID:27356183
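
    LOD and LOQ figures like those above are commonly derived from replicate blank measurements and the slope of the linear calibration curve via the 3σ/10σ convention; the record does not state which criterion was used here, so the sketch below is purely illustrative:

    ```python
    import statistics

    def lod_loq(blank_signals, slope):
        """Estimate (LOD, LOQ) in concentration units from replicate
        blank signals, using the common 3-sigma / 10-sigma convention.

        slope: slope of the linear calibration curve (signal per unit
        concentration). An illustrative convention, not necessarily
        the criterion applied in this study.
        """
        sd = statistics.stdev(blank_signals)
        return 3 * sd / slope, 10 * sd / slope
    ```

    Matrix effects like the tannin-correlated signal inhibition reported here are exactly why such aqueous-solution figures must be re-validated per wine, e.g. by building a standard curve in each matrix.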

  11. 78 FR 20503 - Energy Conservation Program: Availability of the Interim Technical Support Document for High...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-05

    ..., 2013, at 78 FR 13566, is extended. DOE will accept comments, data, and information regarding this... public meeting and availability of the interim analysis in the Federal Register (78 FR 13566) to...

  12. Early Findings from the Implementation and Impact Study of Early College High School

    ERIC Educational Resources Information Center

    Bernstein, Larry; Yamaguchi, Ryoko; Unlu, Fatih; Edmunds, Julie; Glennie, Elizabeth; Willse, John; Arshavsky, Nina; Dallas, Andrew

    2010-01-01

    The purpose of this study is to rigorously examine the implementation and impact of the Early College High School (ECHS) model in North Carolina. The primary goal of the ECHS model is to increase the number of students who graduate from high school and who continue on and succeed in college. Therefore, the anticipated long-term outcomes for the…

  13. Select Novice Elementary Teachers' Perceived Knowledge and Implementation of High-Quality Reading Instruction

    ERIC Educational Resources Information Center

    Bumstead, Stacey

    2012-01-01

    The purpose of this mixed methods study was to examine select novice teachers' perceived knowledge of high-quality reading instruction, explore the extent that select novice teachers implemented high-quality reading instruction into their own classrooms, and to investigate any factors that explain the similarities and differences between…

  14. Preparing Students for College: The Implementation and Impact of the Early College High School Model

    ERIC Educational Resources Information Center

    Edmunds, Julie A.; Bernstein, Lawrence; Glennie, Elizabeth; Willse, John; Arshavsky, Nina; Unlu, Fatih; Bartz, Deborah; Silberman, Todd; Scales, W. David; Dallas, Andrew

    2010-01-01

    As implemented in North Carolina, Early College High Schools are small, autonomous schools designed to increase the number of students who graduate from high school and are prepared for postsecondary education. Targeted at students who are underrepresented in college, these schools are most frequently located on college campuses and are intended…

  15. Roadmap for High School Feedback Reports: Key Focus Areas to Ensure Quality Implementation. Data for Action

    ERIC Educational Resources Information Center

    Data Quality Campaign, 2014

    2014-01-01

    High school feedback reports let school and district leaders know where their students go after graduation and how well they are prepared for college and beyond. This roadmap discusses the seven key focus areas the Data Quality Campaign (DQC) recommends states work on to ensure quality implementation of high school feedback reports.

  16. Analyzing the United States Department of Transportation's Implementation Strategy for High Speed Rail: Three Case Studies

    NASA Astrophysics Data System (ADS)

    Robinson, Ryan

High-speed rail (HSR) has become a major contributor to the transportation sector, with a strong push by the Obama Administration and the Department of Transportation to implement high-speed rail in the United States. High-speed rail is a costly transportation alternative that has the potential to displace some car and air travel while increasing energy security and environmental sustainability. This thesis examines the United States high-speed rail implementation strategy by comparing it to the implementation strategies of France, Japan, and Germany in a multiple case study under four main criteria of success: economic profitability, reliability, safety, and ridership. The analysis concludes with lessons to be taken from the case studies and applied to the United States strategy. It is important to understand that this project was not established to create a comprehensive implementation plan for high-speed rail in the United States; rather, it is intended to assess the depth and quality of the current United States implementation strategy and make additional recommendations by comparing it with those of France, Japan, and Germany.

  17. RELIABILITY, AVAILABILITY, AND SERVICEABILITY FOR PETASCALE HIGH-END COMPUTING AND BEYOND

    SciTech Connect

    Chokchai "Box" Leangsuksun

    2011-05-31

Our project is a multi-institutional research effort that adopts an interplay of reliability, availability, and serviceability (RAS) aspects for solving resilience issues in high-end scientific computing on the next generation of supercomputers. Results lie in the following tracks: failure prediction in large-scale HPC; investigation of reliability issues and mitigation techniques, including in GPGPU-based HPC systems; and HPC resilience runtime and tools.

  18. Making resonance a common case: a high-performance implementation of collective I/O on parallel file systems

    SciTech Connect

    Davis, Marion Kei; Zhang, Xuechen; Jiang, Song

    2009-01-01

Collective I/O is a widely used technique to improve I/O performance in parallel computing. It can be implemented as a client-based or server-based scheme. The client-based implementation is more widely adopted in MPI-IO software such as ROMIO because of its independence from the storage system configuration and its greater portability. However, existing implementations of client-side collective I/O do not take into account the actual pattern of file striping over multiple I/O nodes in the storage system. This can cause a significant number of requests for non-sequential data at I/O nodes, substantially degrading I/O performance. Investigating the surprisingly high I/O throughput achieved when there is an accidental match between a particular request pattern and the data striping pattern on the I/O nodes, we reveal the resonance phenomenon as the cause. Exploiting readily available information on data striping from the metadata server in popular file systems such as PVFS2 and Lustre, we design a new collective I/O implementation technique, resonant I/O, that makes resonance a common case. Resonant I/O rearranges requests from multiple MPI processes to transform non-sequential data accesses on I/O nodes into sequential accesses, significantly improving I/O performance without compromising the independence of a client-based implementation. We have implemented our design in ROMIO. Our experimental results show that the scheme can increase I/O throughput for some commonly used parallel I/O benchmarks such as mpi-io-test and ior-mpi-io over the existing implementation of ROMIO by up to 157%, with no scenario demonstrating significantly decreased performance.
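
    The core idea, rearranging client requests so that each I/O node sees sequential accesses, can be illustrated with a toy round-robin striping model. This is a sketch of the principle only, not ROMIO's actual aggregation logic:

    ```python
    def node_of(offset, stripe_size, n_nodes):
        """I/O node holding a byte offset under round-robin striping."""
        return (offset // stripe_size) % n_nodes

    def regroup_requests(offsets, stripe_size, n_nodes):
        """Rearrange the pooled per-process request offsets so that each
        aggregator receives only the offsets held by one I/O node, in
        ascending (i.e. sequential) order - a toy model of the request
        shuffling behind resonant I/O."""
        buckets = {n: [] for n in range(n_nodes)}
        for off in offsets:
            buckets[node_of(off, stripe_size, n_nodes)].append(off)
        return {n: sorted(b) for n, b in buckets.items()}
    ```

    With interleaved requests from many processes, each bucket becomes a sorted run over a single node's stripes, which is what turns non-sequential accesses at the I/O nodes into sequential ones.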

  19. The Design and Implementation of hypre, a Library of Parallel High Performance Preconditioners

    SciTech Connect

    Falgout, R D; Jones, J E; Yang, U M

    2004-07-17

The increasing demands of computationally challenging applications and the advance of larger, more powerful computers with more complicated architectures have necessitated the development of new solvers and preconditioners. Since the implementation of these methods is quite complex, the use of high performance libraries with the newest efficient solvers and preconditioners becomes more important for promulgating their use into applications with relative ease. The hypre library [14, 17] has been designed with the primary goal of providing users with advanced scalable parallel preconditioners. Issues of robustness, ease of use, flexibility and interoperability have also been important. It can be used both as a solver package and as a framework for algorithm development. Its object model is more general and flexible than most current generation solver libraries [9]. hypre also provides several of the most commonly used solvers, such as conjugate gradient for symmetric systems and GMRES for nonsymmetric systems, to be used in conjunction with the preconditioners. Design innovations have been made to enable access to the library in the way that application users naturally think about their problems. For example, application developers that use structured grids typically think of their problems in terms of stencils and grids. hypre's users do not have to learn complicated sparse matrix structures; instead, hypre does the work of building these data structures through various conceptual interfaces. The conceptual interfaces currently implemented include stencil-based structured and semi-structured interfaces, a finite-element based unstructured interface, and a traditional linear-algebra based interface. The primary focus of this paper is on the design and implementation of the conceptual interfaces in hypre. The paper is organized as follows. The first two sections are of general interest. We begin in Section 2 with an introductory discussion of conceptual interfaces and…
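
    The stencil-based conceptual interface can be illustrated with a toy assembler that lowers a `{(offset): coefficient}` stencil on a structured grid into a matrix, so the user never touches sparse data structures. This is an illustrative sketch of the idea only, not hypre's actual API (which builds distributed sparse structures behind its Struct/SStruct interfaces):

    ```python
    def assemble_from_stencil(nx, ny, stencil):
        """Lower a stencil, given as {(di, dj): coefficient}, on an
        nx-by-ny structured grid to a dense matrix (list of rows).
        Entries whose neighbour falls outside the grid are dropped,
        i.e. homogeneous Dirichlet-style boundary treatment."""
        n = nx * ny
        idx = lambda i, j: i * ny + j
        A = [[0.0] * n for _ in range(n)]
        for i in range(nx):
            for j in range(ny):
                for (di, dj), c in stencil.items():
                    ii, jj = i + di, j + dj
                    if 0 <= ii < nx and 0 <= jj < ny:
                        A[idx(i, j)][idx(ii, jj)] = c
        return A

    # The 5-point Laplacian: the kind of stencil a structured-grid user
    # would write down directly instead of a sparse matrix.
    laplace = {(0, 0): 4.0, (1, 0): -1.0, (-1, 0): -1.0,
               (0, 1): -1.0, (0, -1): -1.0}
    ```

    The point of such an interface is that the grid-plus-stencil description is the natural vocabulary of the application developer; the library owns the translation into whatever storage its solvers need.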

  20. RF/optical shared aperture for high availability wideband communication RF/FSO links

    DOEpatents

    Ruggiero, Anthony J; Pao, Hsueh-yuan; Sargis, Paul

    2015-03-24

    An RF/Optical shared aperture is capable of transmitting and receiving optical signals and RF signals simultaneously. This technology enables compact wide bandwidth communications systems with 100% availability in clear air turbulence, rain and fog. The functions of an optical telescope and an RF reflector antenna are combined into a single compact package by installing an RF feed at either of the focal points of a modified Gregorian telescope.

  1. RF/optical shared aperture for high availability wideband communication RF/FSO links

    DOEpatents

    Ruggiero, Anthony J; Pao, Hsueh-yuan; Sargis, Paul

    2014-04-29

    An RF/Optical shared aperture is capable of transmitting and receiving optical signals and RF signals simultaneously. This technology enables compact wide bandwidth communications systems with 100% availability in clear air turbulence, rain and fog. The functions of an optical telescope and an RF reflector antenna are combined into a single compact package by installing an RF feed at either of the focal points of a modified Gregorian telescope.

  2. The Availability and Delivery of Health Care to High School Athletes in Alabama.

    ERIC Educational Resources Information Center

    Culpepper, Michael I.

    1986-01-01

    A sports medicine survey of 119 public high schools in Alabama showed smaller schools at a disadvantage in offering health care for athletes relative to larger schools. Many schools rated the delivery and quality of medical care to the athletes as fair to very poor. (MT)

  3. A Synchronization Algorithm and Implementation for High-Speed Block Codes Applications. Part 4

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Zhang, Yu; Nakamura, Eric B.; Uehara, Gregory T.

    1998-01-01

Block codes have trellis structures and decoders amenable to high-speed CMOS VLSI implementation. For a given CMOS technology, these structures enable operating speeds higher than those achievable using convolutional codes, at the cost of only modest reductions in coding gain. As a result, block codes have tremendous potential for satellite trunk and other future high-speed communication applications. This paper describes a new approach for implementation of the synchronization function for block codes. The approach utilizes the output of the Viterbi decoder and therefore employs the strength of the decoder. Its operation requires no knowledge of the signal-to-noise ratio of the received signal, has a simple implementation, adds no overhead to the transmitted data, and has been shown to be effective in simulation for received SNR greater than 2 dB.

  4. System, apparatus and methods to implement high-speed network analyzers

    DOEpatents

    Ezick, James; Lethin, Richard; Ros-Giralt, Jordi; Szilagyi, Peter; Wohlford, David E

    2015-11-10

Systems, apparatus and methods for the implementation of high-speed network analyzers are provided. A set of high-level specifications is used to define the behavior of the network analyzer emitted by a compiler. An optimized inline workflow to process regular expressions is presented without sacrificing the semantic capabilities of the processing engine. An optimized packet dispatcher implements a subset of the functions implemented by the network analyzer, providing fast- and slow-path workflows used to accelerate specific processing units. Such a dispatcher facility can also be used as a cache of policies: if a policy is found, the packet manipulations associated with that policy can be performed quickly. An optimized method of generating DFA specifications for network signatures is also presented. The method accepts several optimization criteria, such as min-max allocations or optimal allocations based on the probability of occurrence of each signature input bit.
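
    The patent's DFA-generation method is not spelled out in this record, but the baseline construction it optimizes, compiling a set of literal signatures into a single DFA by subset construction over their prefixes, can be sketched. Illustrative only; a real analyzer must also handle regular expressions and the allocation optimizations mentioned above:

    ```python
    def compile_dfa(signatures):
        """Compile literal byte signatures into one DFA by subset
        construction. A state is a frozenset of (sig_index, prefix_len)
        pairs; every state keeps all (i, 0) pairs so overlapping and
        restarted matches are tracked."""
        start = frozenset((i, 0) for i in range(len(signatures)))
        trans, accept, seen, todo = {}, set(), {start}, [start]
        while todo:
            state = todo.pop()
            alphabet = {signatures[i][p] for i, p in state if p < len(signatures[i])}
            for byte in alphabet:
                nxt = {(i, 0) for i in range(len(signatures))}
                for i, p in state:
                    sig = signatures[i]
                    if p < len(sig) and sig[p] == byte:
                        nxt.add((i, p + 1))
                nxt = frozenset(nxt)
                trans[(state, byte)] = nxt
                if any(p == len(signatures[i]) for i, p in nxt):
                    accept.add(nxt)
                if nxt not in seen:
                    seen.add(nxt)
                    todo.append(nxt)
        return start, trans, accept

    def scan(packet, start, trans, accept):
        """Run the DFA over packet bytes; missing transitions mean no
        prefix survives, so fall back to the start state."""
        state = start
        for byte in packet:
            state = trans.get((state, byte), start)
            if state in accept:
                return True
        return False
    ```

    A table like `trans` is also the natural object to which per-state allocation policies (e.g. probability-weighted layouts) would be applied.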

  5. Implementation of checklists in health care; learning from high-reliability organisations

    PubMed Central

    2011-01-01

Background: Checklists are common in some medical fields, including surgery, intensive care and emergency medicine. They can be an effective tool to improve care processes and reduce mortality and morbidity. Despite the seemingly rapid acceptance and dissemination of the checklist, there are few studies describing the actual process of developing and implementing such tools in health care. The aim of this study is to explore the experiences from checklist development and implementation in a group of non-medical, high reliability organisations (HROs). Method: A qualitative study based on key informant interviews and field visits followed by a Delphi approach. Eight informants, each with 10-30 years of checklist experience, were recruited from six different HROs. Results: The interviews generated 84 assertions and recommendations for checklist implementation. To achieve checklist acceptance and compliance, there must be a predefined need for which a checklist is considered a well suited solution. The end-users ("sharp-end") are the key stakeholders throughout the development and implementation process. Proximity and ownership must be assured through a thorough and wise process. All informants underlined the importance of short, self-developed, and operationally-suited checklists. Simulation is a valuable and widely used method for training, revision, and validation. Conclusion: Checklists have been a cornerstone of safety management in HROs for nearly a century, and are becoming increasingly popular in medicine. Acceptance and compliance are crucial for checklist implementation in health care. Experiences from HROs may provide valuable input to checklist implementation in healthcare. PMID:21967747

  6. Future water availability in North African dams simulated by high-resolution regional climate models

    NASA Astrophysics Data System (ADS)

    Tramblay, Yves; Jarlan, Lionel; Hanich, Lahoucine; Somot, Samuel

    2016-04-01

In North Africa, the countries of Morocco, Algeria and Tunisia are already experiencing water scarcity and a strong interannual variability of precipitation. To better manage their existing water resources, several dams and reservoirs have been built on most large river catchments. The objective of this study is to provide quantitative scenarios of future changes in water availability for the 47 major dam and reservoir catchments located in North Africa. An ensemble of regional climate models (RCM) with a spatial resolution of 12 km, driven by different general circulation models (GCM) from the EuroCORDEX experiment, has been considered to analyze the projected changes in temperature, precipitation and potential evapotranspiration (PET) for two scenarios (RCP4.5 and RCP8.5) and two time horizons (2040-2065 and 2065-2090). PET is estimated from RCM outputs either with the FAO Penman-Monteith (PM) equation, requiring air temperature, relative humidity, net radiation and wind, or with the Hargreaves-Samani (HS) equation, requiring only air temperature. The water balance is analyzed by comparing the climatic demand and supply of water, considering that for most of these catchments groundwater storage is negligible over long time periods. Results indicated a future temperature increase for all catchments of between +1.8°C and +4.2°C, depending on the emission scenario and the time period considered. Precipitation is projected to decrease by between -14% and -27%, mainly in winter and spring, with a strong east-to-west gradient. PET computed from the PM and HS formulas provided very similar estimates and projections, ranging from +7% to +18%. Changes in PET are mostly driven by rising temperatures and are greater during the dry summer months than during the wet winter season. Therefore, the increased PET has a lower impact than declining precipitation on future water availability, which is expected to decrease by -19% to -33% on average.
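
    The temperature-only HS estimate used above has a standard closed form; a minimal sketch with the standard published coefficients (regional recalibrations exist), where Ra is extraterrestrial radiation expressed in mm/day of equivalent evaporation:

    ```python
    def hargreaves_samani(t_mean, t_min, t_max, ra):
        """Reference evapotranspiration (mm/day) from air temperature
        alone, via the standard Hargreaves-Samani equation:
        ET0 = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin).
        Temperatures in degrees C; ra in mm/day equivalent evaporation.
        """
        return 0.0023 * ra * (t_mean + 17.8) * (t_max - t_min) ** 0.5
    ```

    Because Ra depends only on latitude and day of year, the HS estimate needs just the RCM temperature fields, which is why it tracks the full PM estimate closely when, as found here, PET changes are mostly temperature-driven.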

  7. Implementation of Formative Assessment Strategies as Perceived by High School Students and Teachers: Professional Development Implications

    ERIC Educational Resources Information Center

    Burns, Rosemary

    2010-01-01

    The purpose of this research study was to investigate the level of implementation of formative assessment strategies among Rhode Island high school teachers and students in three districts. Furthermore, the research analyzed the relationship of the disciplines taught, the amount and kinds of professional development teachers had, and district…

  8. A Systematic Approach to Improving E-Learning Implementations in High Schools

    ERIC Educational Resources Information Center

    Pardamean, Bens; Suparyanto, Teddy

    2014-01-01

    This study was based on the current growing trend of implementing e-learning in high schools. Most endeavors have been inefficient, rendering an objective of determining the initial steps that could be taken to improve these efforts by assessing a student population's computer skill levels and performances in an IT course. Demographic factors…

  9. A Literature Review: The Effect of Implementing Technology in a High School Mathematics Classroom

    ERIC Educational Resources Information Center

    Murphy, Daniel

    2016-01-01

    This study is a literature review to investigate the effects of implementing technology into a high school mathematics classroom. Mathematics has a hierarchical structure in learning and it is essential that students get a firm understanding of mathematics early in education. Some students that miss beginning concepts may continue to struggle with…

  10. Serve and Learn: Implementing and Evaluating Service-Learning in Middle and High Schools

    ERIC Educational Resources Information Center

    Pritchard, Florence Fay; Whitehead, George I., III

    2004-01-01

    This volume makes two important contributions: First, it provides a framework grounded in theory and best professional practice that middle and high school teachers, their students, and community partners can use to design, implement, and evaluate service-learning projects that address authentic community needs. Second, it demonstrates ways…

  11. Implementing Student Information Systems in High Schools: An Embedded Single Case Study

    ERIC Educational Resources Information Center

    Rhodes-O'Neill, Tamyra LaShawn

    2014-01-01

    As new technologies are developed for teaching and learning, they hold the potential to transform education but have yet to be fully integrated into K-12 classrooms in the United States. The purpose of this study was to explore how a student information system was implemented in 2 urban public high schools and how stakeholders perceived that…

  12. Implementation of an Entry-Level Retention Program for High-Risk College Freshmen.

    ERIC Educational Resources Information Center

    Zanoni, Candido

    The specially funded program described in this report was implemented at the University of Minnesota's General College in Fall 1979 to promote the academic improvement and long-range retention of high-risk Black, Hispanic, and Native American students. After introductory material discussing the process involved in securing program funds from the…

  13. It Takes a Network: One Curriculum Leader Implements the Common Core High School Mathematics Standards

    ERIC Educational Resources Information Center

    Beckford, Franchetta Joenise

    2013-01-01

    This qualitative study was conducted for the purpose of determining whether a district mathematics curriculum leader's social network advanced the implementation of the high school mathematics Common Core State Standards (CCSS) (CCSSI, 2012c). The qualitative data was collected through an interview, a hand-drawn network map, observations, and…

  14. Negotiating Implementation of High-Stakes Performance Assessment Policies in Teacher Education: From Compliance to Inquiry

    ERIC Educational Resources Information Center

    Peck, Charles A.; Gallucci, Chrysan; Sloan, Tine

    2010-01-01

    Teacher education programs in the United States face a variety of new accountability policies at both the federal and the state level. Many of these policies carry high-stakes implications for students and programs and involve some of the same challenges for implementation as they have in the P-12 arena. Serious dilemmas for teacher educators…

  15. Academic-Career Integration in Magnet High Schools: Assessing the Level of Implementation.

    ERIC Educational Resources Information Center

    Tokarska, Barbara; And Others

    An ongoing study examined implementation and student response to academic career magnet (ACM) programs in New York City high schools. The programs emphasize both college preparation and career education, demonstrating one approach to the current emphasis on integrating academic and vocational education. New York City offers a wide array of magnet…

  16. The Three-Block Model of Universal Design for Learning Implementation in a High School

    ERIC Educational Resources Information Center

    Katz, Jennifer; Sugden, Ron

    2013-01-01

    The role of the school leader (principal) in supporting educational reform is explored through a case study of one high school implementing the Three Block Model of UDL (Katz, 2012a) in an effort to meet the needs of a diverse student population. This case study is a part of a much larger study exploring outcomes for students and teachers of…

  17. Highly Proficient Bilinguals Implement Inhibition: Evidence from N-2 Language Repetition Costs

    ERIC Educational Resources Information Center

    Declerck, Mathieu; Thoma, Aniella M.; Koch, Iring; Philipp, Andrea M.

    2015-01-01

    Several, but not all, models of language control assume that highly proficient bilinguals implement little to no inhibition during bilingual language production. In the current study, we tested this assumption with a less equivocal marker of inhibition (i.e., n-2 language repetition costs) than previous language switching studies have. N-2…

  18. A Comparison of the High Count Rate Performance of Three Commercially Available Digital Signal Processors

    SciTech Connect

    Dawn M. Scates; John K. Hartwell

    2005-10-01

    Three commercial γ-ray digital signal processors, a Canberra InSpector 2000, an ORTEC DigiDART, and an X-ray Instrumentation Associates Polaris system, coupled to a Canberra 2002C resistive-feedback preamplifier-equipped high-purity germanium detector, were performance tested at input rates up to 440 kHz. The spectrometers were evaluated on their throughput, stability and peak shape performance. The accuracy of their quantitative corrections for dead time and pile-up was also tested. All three of the tested units performed well at input rates that strain most analog spectroscopy systems.

  19. School Violence, Substance Use, and Availability of Illegal Drugs on School Property among U.S. High School Students.

    ERIC Educational Resources Information Center

    Lowry, Richard; Cohen, Lisa R.; Modzeleski, William; Kann, Laura; Collins, Janet L.; Kolbe, Lloyd J.

    1999-01-01

    Investigated whether school violence among high school students related to substance use and availability of illegal drugs at school, examining the associations of tobacco, alcohol, and marijuana and availability of illegal drugs with five school violence indicators. Data from the 1995 Youth Risk Behavior Survey indicated that school violence…

  20. NOAA Operational Model Archive Distribution System (NOMADS): High Availability Applications for Reliable Real Time Access to Operational Model Data

    NASA Astrophysics Data System (ADS)

    Alpert, J. C.; Wang, J.

    2009-12-01

    To reduce the impact of natural hazards and environmental changes, the National Centers for Environmental Prediction (NCEP) provide first-alert environmental prediction services, act as a preferred partner, and represent a critical national resource for operational and research communities affected by climate, weather and water. NOMADS is now delivering high-availability services as part of NOAA's official real-time data dissemination at its Web Operations Center (WOC) server. The WOC is a web service used by organizational units inside and outside NOAA, and acts as a data repository where public information can be posted to a secure and scalable content server. A goal is to foster collaborations among the research and education communities, value-added retailers, and public access for science and development efforts aimed at advancing modeling and GEO-related tasks. The client executes what is efficient to execute on the client, while the server provides format-independent access services. Client applications can execute on the server if desired, but the same program can be executed on the client side with no loss of efficiency. In this way the paradigm lends itself to aggregation servers that act as servers of servers: listing and searching catalogs of holdings, data mining, and updating information from the metadata descriptions, so that collections of data in disparate places can be simultaneously accessed, with results processed on servers and clients to produce a needed answer. The services used to access the operational model data output are the Open-source Project for a Network Data Access Protocol (OPeNDAP), implemented with the Grid Analysis and Display System (GrADS) Data Server (GDS), and applications for slicing, dicing and area sub-setting the large matrix of real-time model data holdings. This approach ensures an efficient use of computer resources because users transmit/receive only the data necessary for their tasks including

  1. Implementation method of a core SONET/SDH switch with high capacity

    NASA Astrophysics Data System (ADS)

    Zhang, JinQi

    2004-05-01

    An implementation method for a high-capacity core SONET/SDH switch is introduced in this paper. High-speed serial I/O, switching architectures and design considerations for the switching unit are covered. The switch is strictly non-blocking for unicast traffic and rearrangeably non-blocking for dual-cast traffic. Dual-cast traffic allows for efficient scheduling of working and protection paths in UPSR (Unidirectional Path Switched Ring)/BLSR (Bidirectional Line Switched Ring) applications.

  2. Hardware implementation of a scheduler for high performance switches with quality of service (QoS) support

    NASA Astrophysics Data System (ADS)

    Arteaga, R.; Tobajas, F.; De Armas, V.; Sarmiento, R.

    2009-05-01

    In this paper, the hardware implementation of a scheduler with QoS support is presented. The starting point is a Differentiated Services (DiffServ) network model. Each switch in this network classifies packets into flows, which are assigned to traffic classes depending on their requirements, with an independent queue available for each traffic class. Finally, the scheduler chooses the right queue in order to provide Quality of Service support. The scheduler considers the bandwidth distribution, introducing the time-frame concept, and the packet delay, assigning a priority to each traffic class. The architecture of this algorithm is also presented in this paper, along with its functionality and complexity. The architecture was described in Verilog HDL at RTL level. The complete system has been implemented in a Spartan-3 1000 FPGA device using ISE software from Xilinx, demonstrating that it is a suitable design for high-speed switches.
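    The per-class queueing and priority selection described above can be sketched in a few lines. This is a conceptual software model only: the class names and the strict-priority policy shown are illustrative simplifications of the paper's hardware scheduler, which also budgets bandwidth per time frame.

```python
from collections import deque

class PriorityScheduler:
    """Toy per-traffic-class queue scheduler (illustrative sketch)."""

    def __init__(self, num_classes):
        # one independent FIFO per traffic class; class 0 = highest priority
        self.queues = [deque() for _ in range(num_classes)]

    def enqueue(self, traffic_class, packet):
        self.queues[traffic_class].append(packet)

    def dequeue(self):
        # strict priority: serve the highest-priority non-empty queue
        for q in self.queues:
            if q:
                return q.popleft()
        return None

s = PriorityScheduler(3)
s.enqueue(2, "bulk")    # low-priority class
s.enqueue(0, "voice")   # high-priority class
first = s.dequeue()     # "voice" is served first despite arriving later
```

    A hardware version would replace the linear scan with a priority encoder and add the time-frame bandwidth accounting the abstract mentions.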

  3. Scalable Unix commands for parallel processors : a high-performance implementation.

    SciTech Connect

    Ong, E.; Lusk, E.; Gropp, W.

    2001-06-22

    We describe a family of MPI applications we call the Parallel Unix Commands. These commands are natural parallel versions of common Unix user commands such as ls, ps, and find, together with a few similar commands particular to the parallel environment. We describe the design and implementation of these programs and present some performance results on a 256-node Linux cluster. The Parallel Unix Commands are open source and freely available.
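    The pattern behind these commands is to run the same command on every node, then gather and merge the per-node results at a root. The sketch below illustrates that pattern with Python threads standing in for MPI ranks (the actual tools use MPI collectives; `node_ls` is a hypothetical helper, not from the paper).

```python
from concurrent.futures import ThreadPoolExecutor
import os

def node_ls(path):
    """Per-node worker: list a directory, as each rank of a
    parallel `ls` would list its local filesystem."""
    return sorted(os.listdir(path))

paths = [os.curdir, os.curdir]          # hypothetical per-node roots
with ThreadPoolExecutor(max_workers=2) as ex:
    per_node = list(ex.map(node_ls, paths))

# the "root" merges and de-duplicates the gathered listings
merged = sorted(set().union(*per_node))
```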

  4. Implementation of scalable video coding deblocking filter from high-level SystemC description

    NASA Astrophysics Data System (ADS)

    Carballo, Pedro P.; Espino, Omar; Neris, Romén.; Hernández-Fernández, Pedro; Szydzik, Tomasz M.; Núñez, Antonio

    2013-05-01

    This paper describes key concepts in the design and implementation of a deblocking filter (DF) for an H.264/SVC video decoder. The DF supports QCIF and CIF video formats with temporal and spatial scalability. The design flow starts from a SystemC functional model, which is refined to an RTL microarchitecture using high-level synthesis methodology. The process is guided by performance measurements (latency, cycle time, power, resource utilization) with the objective of assuring the quality of results of the final system. The functional model of the DF is created incrementally from the AVC DF model, using the OpenSVC source code as reference. The design flow continues with logic synthesis and implementation on the FPGA using various strategies; the final implementation is chosen among those that meet the timing constraints. The DF is capable of running at 100 MHz, and macroblocks are processed in 6,500 clock cycles, for a throughput of 130 fps in QCIF format and 37 fps in CIF format. The proposed architecture for the complete H.264/SVC decoder is composed of an OMAP 3530 SoC (ARM Cortex-A8 GPP + DSP) and a Virtex-5 FPGA acting as a coprocessor for the DF implementation. The DF is connected to the OMAP SoC using the GPMC interface. A validation platform has been developed using the embedded PowerPC processor in the FPGA, composing an SoC that integrates frame generation and visualization on a TFT screen. The FPGA implements both the DF core and a GPMC slave core; both cores are connected to the embedded PowerPC440 processor using LocalLink interfaces. The FPGA also contains a local memory capable of storing the information necessary to filter a complete frame and to store a decoded picture frame. The complete system is implemented in a Virtex-5 FX70T device.

  5. Implementation of physics and everyday thinking in a high school classroom: Concepts and argumentation

    NASA Astrophysics Data System (ADS)

    Belleau, Shelly N.; Ross, Mike J.; Otero, Valerie K.

    2012-02-01

    The Physics and Everyday Thinking (PET) curriculum is based on educational research and consists of carefully sequenced sets of activities intended to help students develop physics ideas through guided experimentation and questioning with extensive small group and whole class discussion. A high school physics teacher has adapted and implemented the PET curriculum in a low-income urban high school with the aim of removing barriers that typically limit access to traditional physics curriculum. Though PET was not designed for secondary physics students, this teacher has worked closely with physics education research faculty and graduate students to simultaneously modify, implement, and investigate the impact of PET on urban high school students' physics learning. Preliminary results indicate that the PET curriculum has great potential to provide students with opportunities for success in understanding physics concepts, as well as helping to develop scientific argumentation strategies.

  6. Implementing a Highly Specified Curricular, Instructional, and Organizational School Design in a High-Poverty Urban Elementary School: Three Year Results. Report No. 20.

    ERIC Educational Resources Information Center

    McHugh, Barbara; Stringfield, Sam

    This report provides background, implementation, and diverse outcome data from the first 3 years of an ongoing effort to implement a highly specified school reform design in a high-poverty, urban elementary school, Woodson Elementary School, Baltimore (Maryland). The design that is being implemented is the Calvert School model. The Calvert School…

  7. High School Physics Availability: Results from the 2008-09 Nationwide Survey of High School Physics Teachers. Focus On

    ERIC Educational Resources Information Center

    White, Susan; Tesfaye, Casey Langer

    2010-01-01

    In the fall of 2008, the authors contacted a representative sample of over 3,600 high schools in the U.S.--both public and private--to determine whether or not physics was taught there. They received responses from over 99% of the schools. For the schools which indicated they were offering physics, they obtained contact information for the…

  8. Implementing Molecular Dynamics on Hybrid High Performance Computers - Particle-Particle Particle-Mesh

    SciTech Connect

    Brown, W Michael; Kohlmeyer, Axel; Plimpton, Steven J; Tharrington, Arnold N

    2012-01-01

    The use of accelerators such as graphics processing units (GPUs) has become popular in scientific computing applications due to their low cost, impressive floating-point capabilities, high memory bandwidth, and low electrical power requirements. Hybrid high-performance computers, machines with nodes containing more than one type of floating-point processor (e.g. CPU and GPU), are now becoming more prevalent due to these advantages. In this paper, we present a continuation of previous work implementing algorithms for using accelerators into the LAMMPS molecular dynamics software for distributed memory parallel hybrid machines. In our previous work, we focused on acceleration for short-range models with an approach intended to harness the processing power of both the accelerator and (multi-core) CPUs. To augment the existing implementations, we present an efficient implementation of long-range electrostatic force calculation for molecular dynamics. Specifically, we present an implementation of the particle-particle particle-mesh method based on the work by Harvey and De Fabritiis. We present benchmark results on the Keeneland InfiniBand GPU cluster. We provide a performance comparison of the same kernels compiled with both CUDA and OpenCL. We discuss limitations to parallel efficiency and future directions for improving performance on hybrid or heterogeneous computers.

  9. An implementation of the SNR high speed network communication protocol (Receiver part)

    NASA Astrophysics Data System (ADS)

    Wan, Wen-Jyh

    1995-03-01

    This thesis work implements the receiver part of the SNR high speed network transport protocol. The approach was to use the Systems of Communicating Machines (SCM) as the formal definition of the protocol. Programs were developed on top of the Unix system using the C programming language. The Unix system features adopted for this implementation were multitasking, signals, shared memory, semaphores, sockets, timers and process control. The problems encountered, and solved, were signal loss, shared memory conflicts, process synchronization, scheduling, data alignment and errors in the SCM specification itself. The result was a correctly functioning program which implemented the SNR protocol. The system was tested using different connection modes, lost packets, duplicate packets and large data transfers. The contributions of this thesis are: (1) implementation of the receiver part of the SNR high speed transport protocol; (2) testing and integration with the transmitter part of the SNR transport protocol on an FDDI data link layered network; (3) demonstration of the functions of the SNR transport protocol such as connection management, sequenced delivery, flow control and error recovery using selective-repeat retransmission; and (4) modifications to the SNR transport protocol specification, such as corrections of incorrect predicate conditions, definition of additional packet type formats, and solutions for signal loss and process contention problems.
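    The selective-repeat error recovery mentioned among the contributions can be sketched as a receiver that buffers out-of-order packets and delivers contiguous runs in sequence. This is a minimal model; the names and API are illustrative, not the thesis's SCM-derived C code.

```python
class SelectiveRepeatReceiver:
    """Minimal sketch of selective-repeat in-order delivery."""

    def __init__(self):
        self.expected = 0    # next sequence number to deliver
        self.buffer = {}     # out-of-order packets held for later
        self.delivered = []

    def receive(self, seq, data):
        if seq < self.expected:
            return           # duplicate of an already-delivered packet: drop
        self.buffer[seq] = data
        # deliver any now-contiguous run starting at `expected`
        while self.expected in self.buffer:
            self.delivered.append(self.buffer.pop(self.expected))
            self.expected += 1

rx = SelectiveRepeatReceiver()
# packet 1 is "lost" initially, packet 2 arrives early and again as a duplicate
for seq, data in [(0, "a"), (2, "c"), (2, "c"), (1, "b")]:
    rx.receive(seq, data)
```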

  10. Implementing molecular dynamics on hybrid high performance computers - Particle-particle particle-mesh

    NASA Astrophysics Data System (ADS)

    Brown, W. Michael; Kohlmeyer, Axel; Plimpton, Steven J.; Tharrington, Arnold N.

    2012-03-01

    The use of accelerators such as graphics processing units (GPUs) has become popular in scientific computing applications due to their low cost, impressive floating-point capabilities, high memory bandwidth, and low electrical power requirements. Hybrid high-performance computers, machines with nodes containing more than one type of floating-point processor (e.g. CPU and GPU), are now becoming more prevalent due to these advantages. In this paper, we present a continuation of previous work implementing algorithms for using accelerators into the LAMMPS molecular dynamics software for distributed memory parallel hybrid machines. In our previous work, we focused on acceleration for short-range models with an approach intended to harness the processing power of both the accelerator and (multi-core) CPUs. To augment the existing implementations, we present an efficient implementation of long-range electrostatic force calculation for molecular dynamics. Specifically, we present an implementation of the particle-particle particle-mesh method based on the work by Harvey and De Fabritiis. We present benchmark results on the Keeneland InfiniBand GPU cluster. We provide a performance comparison of the same kernels compiled with both CUDA and OpenCL. We discuss limitations to parallel efficiency and future directions for improving performance on hybrid or heterogeneous computers.
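    The mesh half of the particle-particle particle-mesh method begins by spreading point charges onto a grid. A minimal 1D cloud-in-cell assignment conveys the idea (a sketch only; the LAMMPS GPU kernels described here use higher-order stencils in 3D):

```python
import numpy as np

# 1D cloud-in-cell charge assignment with periodic boundaries
L, M = 10.0, 8                  # box length, number of mesh points
h = L / M                       # mesh spacing
positions = np.array([2.3, 7.9])
charges = np.array([1.0, -1.0])

rho = np.zeros(M)
for x, q in zip(positions, charges):
    i = int(x // h)             # left-hand mesh point
    f = x / h - i               # fractional distance toward the right point
    rho[i % M] += q * (1 - f)   # linear weights conserve total charge
    rho[(i + 1) % M] += q * f
```

    After this step, PPPM solves Poisson's equation on the mesh (via FFTs) and interpolates forces back to the particles.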

  11. Implementation and modeling of parametrizable high-speed Reed Solomon decoders on FPGAs

    NASA Astrophysics Data System (ADS)

    Flocke, A.; Blume, H.; Noll, T. G.

    2005-05-01

    One of the most important error correction codes in digital signal processing is the Reed Solomon code. Many VLSI implementations have been described in the literature. This paper introduces a highly parametrizable RS decoder for FPGAs. By implementing resource sharing and by using a fully pipelined multiplier/adder unit in GF(2^m), it was possible to achieve high throughput rates of up to 1.3 Gbit/s on a standard FPGA, while using only an attractively small number of logic elements (LE). The implementation, written in a hardware description language (HDL), is based on an inversionless Berlekamp algorithm (iBA), whose structure leads to a chain of identical processing elements (PE). The critical path of one PE runs only through one adder and one multiplier. A detailed description of a resource-sharing methodology for this Berlekamp algorithm and the achievable gain are presented in this paper. The design was benchmarked for different 8-bit codes against state-of-the-art FPGA solutions and showed a gain of up to a factor of six in the AT product compared to other implementations.
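    The multiplier/adder unit operates in GF(2^m). For the common m = 8 Reed-Solomon field, a bit-serial software model of the multiplier looks as follows (the hardware unit is fully pipelined; the field polynomial shown is the usual RS choice, assumed here for illustration):

```python
def gf256_mul(a, b, poly=0x11D):
    """Multiply in GF(2^8) modulo x^8 + x^4 + x^3 + x^2 + 1."""
    r = 0
    while b:
        if b & 1:
            r ^= a          # addition in GF(2^m) is XOR
        b >>= 1
        a <<= 1
        if a & 0x100:       # reduce modulo the field polynomial
            a ^= poly
    return r
```

    In hardware this loop unrolls into a small combinational network, which is why the PE critical path is just one adder and one multiplier.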

  12. Design and implementation of interface units for high speed fiber optics local area networks and broadband integrated services digital networks

    NASA Technical Reports Server (NTRS)

    Tobagi, Fouad A.; Dalgic, Ismail; Pang, Joseph

    1990-01-01

    The design and implementation of interface units for high speed Fiber Optic Local Area Networks and Broadband Integrated Services Digital Networks are discussed. In recent years, a number of network adapters designed to support high speed communications have emerged. The approach taken to the design of a high speed network interface unit was to implement packet processing functions in hardware, using VLSI technology. The VLSI hardware implementation of a buffer management unit, which is required in such architectures, is described.

  13. A C++11 implementation of arbitrary-rank tensors for high-performance computing

    NASA Astrophysics Data System (ADS)

    Aragón, Alejandro M.

    2014-06-01

    This article discusses an efficient implementation of tensors of arbitrary rank using some of the idioms introduced by the recently published C++ ISO Standard (C++11). With the aim of providing a basic building block for high-performance computing, a single Array class template is carefully crafted, from which vectors, matrices, and even higher-order tensors can be created. An expression template facility is also built around the array class template to provide convenient mathematical syntax. As a result, by using templates, an extra high-level layer is added to the C++ language for dealing with algebraic objects and their operations, without compromising performance. The implementation is tested running on both CPU and GPU.
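    The expression-template idea is that operators build lightweight expression nodes, so a whole algebraic expression is evaluated in one fused loop with no temporaries. A conceptual Python analog illustrates the mechanism (the paper's facility is, of course, C++ templates resolved at compile time):

```python
class Expr:
    """Deferred elementwise sum: building it does no arithmetic."""
    def __init__(self, lhs, rhs):
        self.lhs, self.rhs = lhs, rhs
    def __add__(self, other):
        return Expr(self, other)
    def __getitem__(self, i):
        # work happens only here, element by element
        return self.lhs[i] + self.rhs[i]

class Array(list):
    def __add__(self, other):
        return Expr(self, other)   # no temporary array allocated

a, b, c = Array([1, 2]), Array([3, 4]), Array([5, 6])
expr = a + b + c                   # builds Expr(Expr(a, b), c); no loops yet
result = [expr[i] for i in range(2)]  # one fused loop over all three arrays
```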

  14. Implementation of a High Explosive Equation of State into an Eulerian Hydrocode

    NASA Astrophysics Data System (ADS)

    Littlefield, David L.; Baker, Ernest L.

    2004-07-01

    The implementation of a high explosive equation of state into the Eulerian hydrocode CTH is described. The equation of state is an extension of JWL referred to as JWLB, and is intended to model the thermodynamic state of detonation products from a high explosive reaction. The EOS was originally cast in the form p = p(ρ, e), where p is the pressure, ρ is the density and e is the internal energy. However, the target application code requires an EOS of the form p = p(ρ, T), where T is the temperature, so it was necessary to reformulate the EOS in a thermodynamically consistent manner. A Helmholtz potential, developed from the original EOS, ensures this consistency. Example calculations are shown that illustrate the veracity of this implementation.
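    For reference, the baseline JWL form p = p(V, E) that JWLB extends can be written directly. The parameter defaults below are nominal TNT-like values in SI units, included only for illustration; they are not taken from this paper.

```python
import math

def jwl_pressure(V, E, A=3.712e11, B=3.231e9, R1=4.15, R2=0.95, w=0.30):
    """Standard JWL detonation-product EOS.

    V: relative volume v/v0 (dimensionless)
    E: internal energy per unit initial volume (J/m^3)
    Returns pressure in Pa.
    """
    return (A * (1 - w / (R1 * V)) * math.exp(-R1 * V)
            + B * (1 - w / (R2 * V)) * math.exp(-R2 * V)
            + w * E / V)
```

    JWLB adds further exponential terms to this sum; the paper's contribution is recasting such a p(ρ, e) form as p(ρ, T) via a Helmholtz potential.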

  15. Computer simulations in teaching physics: Development and implementation of a hypermedia system for high school teachers

    NASA Astrophysics Data System (ADS)

    da Silva, A. M. R.; de Macêdo, J. A.

    2016-06-01

    Against the background of advancing technology and the difficulty students have in learning physics, this article describes the process of elaborating and implementing a hypermedia system for high school teachers, involving computer simulations for teaching basic concepts of electromagnetism using free tools. With the completion and publication of the project there will be a new possibility for students and teachers to interact with technology in the classroom and in labs.

  16. Parallel Implementation of a High Order Implicit Collocation Method for the Heat Equation

    NASA Technical Reports Server (NTRS)

    Kouatchou, Jules; Halem, Milton (Technical Monitor)

    2000-01-01

    We combine a high order compact finite difference approximation and collocation techniques to numerically solve the two dimensional heat equation. The resulting method is implicit and can be parallelized with a strategy that allows parallelization across both time and space. We compare the parallel implementation of the new method with a classical implicit method, namely the Crank-Nicolson method, where the parallelization is done across space only. Numerical experiments are carried out on the SGI Origin 2000.
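    The Crank-Nicolson baseline is easy to state in 1D: each step solves (I - k/2·A)u⁺ = (I + k/2·A)u, where A is the second-difference operator. A minimal sketch follows (the paper itself treats the 2D equation with a higher-order compact collocation scheme; this shows only the baseline it is compared against):

```python
import numpy as np

# One Crank-Nicolson step for u_t = u_xx on [0,1] with u = 0 at both ends
n, k = 49, 1e-3                      # interior points, time step
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
A = (np.diag(-2 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / h**2
I = np.eye(n)

u = np.sin(np.pi * x)                # exact solution decays like exp(-pi^2 t)
u_new = np.linalg.solve(I - 0.5 * k * A, (I + 0.5 * k * A) @ u)
```

    Because the sine mode is an eigenvector of A, one step damps it by almost exactly exp(-π²k), which is a quick sanity check on any implementation.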

  17. Highly-Parallel, Highly-Compact Computing Structures Implemented in Nanotechnology

    NASA Technical Reports Server (NTRS)

    Crawley, D. G.; Duff, M. J. B.; Fountain, T. J.; Moffat, C. D.; Tomlinson, C. D.

    1995-01-01

    In this paper, we describe work in which we are evaluating how the evolving properties of nano-electronic devices could best be utilized in highly parallel computing structures. Because of their combination of high performance, low power, and extreme compactness, such structures would have obvious applications in spaceborne environments, both for general mission control and for on-board data analysis. However, the anticipated properties of nano-devices mean that the optimum architecture for such systems is by no means certain. Candidates include single instruction multiple datastream (SIMD) arrays, neural networks, and multiple instruction multiple datastream (MIMD) assemblies.

  18. IMPLEMENTING SCIENTIFIC SIMULATION CODES HIGHLY TAILORED FOR VECTOR ARCHITECTURES USING CUSTOM CONFIGURABLE COMPUTING MACHINES

    NASA Technical Reports Server (NTRS)

    Rutishauser, David K.

    2006-01-01

    The motivation for this work comes from an observation that amidst the push for Massively Parallel (MP) solutions to high-end computing problems such as numerical physical simulations, large amounts of legacy code exist that are highly optimized for vector supercomputers. Because re-hosting legacy code often requires a complete re-write of the original code, which can be a very long and expensive effort, this work examines the potential to exploit reconfigurable computing machines in place of a vector supercomputer to implement an essentially unmodified legacy source code. Custom and reconfigurable computing resources could be used to emulate an original application's target platform to the extent required to achieve high performance. To arrive at an architecture that delivers the desired performance subject to limited resources involves solving a multi-variable optimization problem with constraints. Prior research in the area of reconfigurable computing has demonstrated that designing an optimum hardware implementation of a given application under hardware resource constraints is an NP-complete problem. The premise of the approach is that the general issue of applying reconfigurable computing resources to the implementation of an application, maximizing the performance of the computation subject to physical resource constraints, can be made a tractable problem by assuming a computational paradigm, such as vector processing. This research contributes a formulation of the problem and a methodology to design a reconfigurable vector processing implementation of a given application that satisfies a performance metric. A generic, parametric, architectural framework for vector processing implemented in reconfigurable logic is developed as a target for a scheduling/mapping algorithm that maps an input computation to a given instance of the architecture. 
This algorithm is integrated with an optimization framework to arrive at a specification of the architecture parameters

  19. Great Careers in Two Years: The Associate Degree Option. High Skill and High Wage Jobs Available through Two-Year Programs.

    ERIC Educational Resources Information Center

    Phifer, Paul

    This book explores high-skill and high-wage jobs available through two-year programs. It identifies 100 high-need occupational areas, and discusses "hot" programs and starting salaries for graduates of dental hygiene, manufacturing, process technology, telecommunications, physical therapy assisting, and registered nursing. Each career article…

  20. An Exploration of Support Factors Available to Higher Education Students with High Functioning Autism or Asperger Syndrome

    ERIC Educational Resources Information Center

    Rutherford, Emily N.

    2013-01-01

    This qualitative phenomenological research study used narrative inquiry to explore the support factors available to students with High Functioning Autism or Asperger Syndrome in higher education that contribute to their success as perceived by the students. Creswell's (2009) six step method for analyzing phenomenological studies was used to…

  1. The Availability and Utilization of School Library Resources in Some Selected Secondary Schools (High School) in Rivers State

    ERIC Educational Resources Information Center

    Owate, C. N.; Iroha, Okpa

    2013-01-01

    This study investigates the availability and utilization of school library resources by Secondary School (High School) Students. Eight Selected Secondary Schools in Rivers State, Nigeria were chosen based on their performance in external examinations and geographic locations. In carrying out the research, questionnaires were administered to both…

  2. International Conference on Harmonisation; Electronic Transmission of Postmarket Individual Case Safety Reports for Drugs and Biologics, Excluding Vaccines; Availability of Food and Drug Administration Regional Implementation Specifications for ICH E2B(R3) Reporting to the Food and Drug Administration Adverse Event Reporting System. Notice of Availability.

    PubMed

    2016-06-23

    The Food and Drug Administration (FDA) is announcing the availability of its FDA Adverse Event Reporting System (FAERS) Regional Implementation Specifications for the International Conference on Harmonisation (ICH) E2B(R3) Specification. FDA is making this technical specifications document available to assist interested parties in electronically submitting individual case safety reports (ICSRs) (and ICSR attachments) to the Center for Drug Evaluation and Research (CDER) and the Center for Biologics Evaluation and Research (CBER). This document, entitled "FDA Regional Implementation Specifications for ICH E2B(R3) Implementation: Postmarket Submission of Individual Case Safety Reports (ICSRs) for Drugs and Biologics, Excluding Vaccines" supplements the "E2B(R3) Electronic Transmission of Individual Case Safety Reports (ICSRs) Implementation Guide--Data Elements and Message Specification" final guidance for industry and describes FDA's technical approach for receiving ICSRs, for incorporating regionally controlled terminology, and for adding region-specific data elements when reporting to FAERS. PMID:27373012

  3. Power of the policy: how the announcement of high-stakes clinical examination altered OSCE implementation at institutional level

    PubMed Central

    2013-01-01

    Background The Objective Structured Clinical Examination (OSCE) has been widely applied as a high-stakes examination for assessing physicians' clinical competency. In 1992, the OSCE was first introduced in Taiwan, and the authorities announced that passing the OSCE would be a prerequisite for the step-2 medical licensure examination in 2013. This study aimed to investigate the impacts of the announced national OSCE policy on implementation of the OSCE at the institutional level. Further, the readiness for, and the recognized barriers to, a high-stakes examination were explored. Methods In 2007 and 2010, the years before and after the 2008 announcement of the high-stakes OSCE policy, questionnaires on the status of OSCE implementation were distributed to all hospitals with active OSCE programs in Taiwan. Information on OSCE facilities, equipment, station length, number of administrations per year, and the recognized barriers to successfully implementing an OSCE was collected. Missing data were filled in through telephone interviews. The OSCE format, administration, and facilities before and after the announcement of the nationwide OSCE policy were compared. Results The data were collected from 17 hospitals in 2007 and 21 in 2010. Comparing the OSCE formats between 2007 and 2010, the number of stations increased and the station length decreased. The designated space and the equipment for the OSCE were also found to have improved. As for OSCE implementation barriers, the hospital representatives were most concerned about the availability and quality of standardized patients in 2007, and about space and facilities in 2010. Conclusions The results of this study underscored an overall increase in the number of OSCE hospitals and changes in facilities and formats. While recruitment and training of standardized patients were the major concerns before the official disclosure of the policy, space and facilities became the focus of attention after

  4. A high-performance, portable implementation of the MPI message passing interface standard.

    SciTech Connect

    Gropp, W.; Lusk, E.; Doss, N.; Skjellum, A.; Mathematics and Computer Science; Mississippi State Univ.

    1996-09-01

    MPI (Message Passing Interface) is a specification for a standard library for message passing that was defined by the MPI Forum, a broadly based group of parallel computer vendors, library writers, and applications specialists. Multiple implementations of MPI have been developed. In this paper, we describe MPICH, unique among existing implementations in its design goal of combining portability with high performance. We document its portability and performance and describe the architecture by which these features are simultaneously achieved. We also discuss the set of tools that accompany the free distribution of MPICH, which constitute the beginnings of a portable parallel programming environment. A project of this scope inevitably imparts lessons about parallel computing, the specification being followed, the current hardware and software environment for parallel computing, and project management; we describe those we have learned. Finally, we discuss future developments for MPICH, including those necessary to accommodate extensions to the MPI Standard now being contemplated by the MPI Forum.

  5. Implementation of a High Throughput Variable Decimation Pane Filter Using the Xilinx System Generator

    SciTech Connect

    RADDER,JERAHMIE WILLIAM

    2003-01-01

    In a Synthetic Aperture Radar (SAR) system, the purpose of the receiver is to process incoming radar signals in order to obtain target information and ultimately construct an image of the target area. Incoming raw signals are usually in the microwave frequency range and are typically processed with analog circuitry, requiring hardware designed specifically for the desired signal processing operations. A more flexible approach is to process the signals in the digital domain. Recent advances in analog-to-digital converter (ADC) and Field Programmable Gate Array (FPGA) technology allow direct digital processing of wideband intermediate frequency (IF) signals. Modern ADCs can achieve sampling rates in excess of 1 GS/s, and modern FPGAs can contain millions of logic gates operating at frequencies over 100 MHz. The combination of these technologies makes it possible to implement a digital radar receiver capable of high speed, sophisticated, and scalable DSP designs that are not possible with analog systems. Additionally, FPGA technology allows designs to be modified as design parameters change, without the need to redesign circuit boards, potentially saving both time and money. Typical radar receivers need to operate at multiple ranges, which requires filters with multiple decimation rates, i.e., multiple bandwidths. In previous radar receivers, variable decimation was implemented by switching between SAW filters to achieve an acceptable filter configuration. While this method works, it is rather "brute force", because it duplicates a large amount of hardware and requires a new filter to be added for each IF bandwidth. By implementing the filter digitally in FPGAs, a larger number of decimation values (and consequently a larger number of bandwidths) can be implemented with no need for extra components. High performance, wide bandwidth radar systems also place high demands on the DSP throughput of a given digital receiver. In such
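    The filter-then-decimate operation at the heart of a variable-decimation receiver can be sketched in software. The sketch below is a plain direct-form FIR low-pass followed by sample dropping; the tap values and decimation factor are illustrative, and a real FPGA design would use an efficient polyphase structure rather than computing samples it then discards.

    ```python
    # Hedged sketch of "filter then decimate"; taps and rates are illustrative.
    def fir_filter(x, taps):
        """Direct-form FIR: y[n] = sum_k taps[k] * x[n-k], zero-padded history."""
        y = []
        for n in range(len(x)):
            acc = 0.0
            for k, h in enumerate(taps):
                if n - k >= 0:
                    acc += h * x[n - k]
            y.append(acc)
        return y

    def decimate(x, taps, m):
        """Anti-alias low-pass filter, then keep every m-th sample."""
        return fir_filter(x, taps)[::m]

    # 4-tap moving average as a crude anti-alias filter, decimate by 2.
    out = decimate([1.0] * 8, [0.25] * 4, 2)   # -> [0.25, 0.75, 1.0, 1.0]
    ```

    Changing `m` changes the output bandwidth with no extra hardware, which is exactly the flexibility the digital approach buys over switched SAW filter banks.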

  6. Principles and practical implementation for high resolution multi-sensor QPE

    NASA Astrophysics Data System (ADS)

    Chandra, C. V.; Lim, S.; Cifelli, R.

    2011-12-01

    Multi-sensor Quantitative Precipitation Estimation (MPE) is both a principle and a practical concept, and it has become a well-known term in the hydrology and atmospheric science communities. The main challenge in QPE is that precipitation is a highly variable quantity, with extensive spatial and temporal variability at multiple scales. MPE products are produced from satellites, radars, models, and ground sensors, at global scale (Heinemann et al. 2002), continental scale (Seo et al. 2010; Zhang et al. 2011), and regional scale (Kitzmiller et al. 2011). Many MPE products use one type of sensor to alleviate the problems of another, and some multi-sensor products are used to move across scales. This paper takes a comprehensive view of the "concept of multi-sensor precipitation estimation" from different perspectives, delineating the MPE problem into three categories: a) scale-based MPE, b) MPE for accuracy enhancement and coverage, and c) MPE integrative across scales. For example, by introducing dual-polarization radar data into the MPE system, QPE can be improved significantly. In the last decade, dual-polarization radars have become an important tool for QPE in operational networks. Dual-polarization radars make more accurate physical models possible by providing information on the size, shape, phase, and orientation of hydrometeors (Bringi and Chandrasekar 2001). In addition, these systems provide measurements that are immune to absolute radar calibration and partial beam blockage, and they help in data-quality enhancement. By integrating these characteristics of dual-polarization radar, QPE performance can be improved compared with single-polarization radar-based QPE (Cifelli and Chandrasekar 2010). Dual-polarization techniques have been applied to S- and C-band radar systems for several decades, and higher-frequency systems such as X band are now widely available to the
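    As a concrete single-polarization baseline for radar QPE, reflectivity is commonly converted to rain rate through a Z-R power law. The sketch below uses the classic Marshall-Palmer coefficients (a = 200, b = 1.6); operational multi-sensor systems refine this kind of estimate with dual-polarization variables and gauge data.

    ```python
    # Single-polarization Z-R conversion sketch (Marshall-Palmer Z = a * R**b).
    def rain_rate_mm_per_h(dbz, a=200.0, b=1.6):
        """Invert Z = a * R**b for rain rate R in mm/h, given reflectivity in dBZ."""
        z = 10.0 ** (dbz / 10.0)      # dBZ -> linear reflectivity factor (mm^6 m^-3)
        return (z / a) ** (1.0 / b)
    ```

    At about 23 dBZ this relation gives roughly 1 mm/h; the large sensitivity of R to calibration errors in Z is one reason dual-polarization estimators, which are immune to absolute calibration, improve QPE.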

  7. Numerical implementation of a crystal plasticity model with dislocation transport for high strain rate applications

    NASA Astrophysics Data System (ADS)

    Mayeur, Jason R.; Mourad, Hashem M.; Luscher, Darby J.; Hunter, Abigail; Kenamond, Mark A.

    2016-05-01

    This paper details a numerical implementation of a single crystal plasticity model with dislocation transport for high strain rate applications. Our primary motivation for developing the model is to study the influence of dislocation transport and conservation on the mesoscale response of metallic crystals under extreme thermo-mechanical loading conditions (e.g. shocks). To this end we have developed a single crystal plasticity theory (Luscher et al (2015)) that incorporates finite deformation kinematics, internal stress fields caused by the presence of geometrically necessary dislocation gradients, advection equations to model dislocation density transport and conservation, and constitutive equations appropriate for shock loading (equation of state, drag-limited dislocation velocity, etc). In the following, we outline a coupled finite element–finite volume framework for implementing the model physics, and demonstrate its capabilities in simulating the response of a [1 0 0] copper single crystal during a plate impact test. Additionally, we explore the effect of varying certain model parameters (e.g. mesh density, finite volume update scheme) on the simulation results. Our results demonstrate that the model performs as intended and establishes a baseline of understanding that can be leveraged as we extend the model to incorporate additional and/or refined physics and move toward a multi-dimensional implementation.
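    The dislocation-density transport equations referred to above are advection equations of the form ∂ρ/∂t + v ∂ρ/∂x = 0. As a hedged illustration of a finite-volume transport update (a generic first-order upwind scheme on a periodic 1D grid, not the paper's actual scheme or parameters):

    ```python
    # Generic first-order upwind advection step; scheme and values illustrative.
    def upwind_step(rho, v, dx, dt):
        """One upwind step for d(rho)/dt + v * d(rho)/dx = 0 on a periodic
        1D grid; stability requires the CFL condition |v*dt/dx| <= 1."""
        n = len(rho)
        c = v * dt / dx                  # CFL number
        new = rho[:]
        for i in range(n):
            if v >= 0:
                new[i] = rho[i] - c * (rho[i] - rho[i - 1])        # upwind = left
            else:
                new[i] = rho[i] - c * (rho[(i + 1) % n] - rho[i])  # upwind = right
        return new
    ```

    Because each cell's update is a flux difference, the total density is conserved exactly, which is the property the paper's coupled finite element-finite volume framework is designed to preserve for dislocation content.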

  8. High altitude mine waste remediation -- Implementation of the Idarado remedial action plan

    SciTech Connect

    Hardy, A.J.; Redmond, J.V.; River, R.A.; Davis, C.S.

    1999-07-01

    The Idarado Mine in Colorado's San Juan Mountains includes 11 tailing areas, numerous waste rock dumps, and a large number of underground openings connected by over 100 miles of raises and drifts. The tailings and mine wastes were generated from different mining and milling operations between 1975 and 1978. The Idarado Remedial Action Plan (RAP) was an innovative 5-year program developed for remediating the impacts of historic mining activities in the San Miguel River and Red Mountain Creek drainages. The challenges during implementation included seasonal access limitations due to the high-altitude construction areas, high volumes of runoff during snowmelt, numerous abandoned underground openings and stoped-out veins, and high-profile sites adjacent to busy jeep trails and a major ski resort town. Implementation of the RAP has included pioneering efforts in engineering design and construction of remedial measures. Innovative engineering designs included direct revegetation techniques for the stabilization of tailings piles; concrete cutoff walls and French drains to control subsurface flows; underground water controls including pipelines, weeplines, and portal collection systems; and various underground structures to collect and divert subsurface flows often exceeding 2,000 gpm. Remote work locations have also required innovative construction techniques such as heavy-lift helicopters to move construction materials to mines above 10,000 feet. This paper describes the 5-year implementation program, which has included over 1,000,000 cubic yards of tailing regrading, application of 5,000 tons of manure and 26,000 tons of limestone, and construction of over 10,000 feet of pipeline and approximately 45,000 feet of diversion channel.

  9. Variables that impact the implementation of project-based learning in high school science

    NASA Astrophysics Data System (ADS)

    Cunningham, Kellie

    Wagner and colleagues (2006) state the mediocrity of teaching and instructional leadership is the central problem that must be addressed if we are to improve student achievement. Educational reform efforts have been initiated to improve student performance and to hold teachers and school leaders accountable for student achievement (Wagner et al., 2006). Specifically, in the area of science, goals for improving student learning have led reformers to establish standards for what students should know and be able to do, as well as what instructional methods should be used. Key concepts and principles have been identified for student learning. Additionally, reformers recommend student-centered, inquiry-based practices that promote a deep understanding of how science is embedded in the everyday world. These new approaches to science education emphasize inquiry as an essential element for student learning (Schneider, Krajcik, Marx, & Soloway, 2002). Project-based learning (PBL) is an inquiry-based instructional approach that addresses these recommendations for science education reform. The objective of this research was to study the implementation of project-based learning (PBL) in an urban school undergoing reform efforts and identify the variables that positively or negatively impacted the PBL implementation process and its outcomes. This study responded to the need to change how science is taught by focusing on the implementation of project-based learning as an instructional approach to improve student achievement in science and identify the role of both school leaders and teachers in the creation of a school environment that supports project-based learning. A case study design using a mixed-method approach was used in this study. Data were collected through individual interviews with the school principal, science instructional coach, and PBL facilitator. A survey, classroom observations and interviews involving three high school science teachers teaching grades 9

  10. High School/High Tech Program Guide: An Implementation Guide for High School/High Tech Program Coordinators. Promoting Careers in Science and Technology for High School Students with Disabilities.

    ERIC Educational Resources Information Center

    Office of Disability Employment Policy (DOL), Washington, DC.

    This implementation guide is intended to assist educators in planning, establishing, building, and managing a High School/High Tech project for high school students with disabilities. The program is designed to develop career opportunities, provide activities that will spark an interest in high technology fields, and encourage students to pursue…

  11. Evaluate the Options of Implementing Skew Quadrupoles in the High Energy Ring

    SciTech Connect

    Cai, Yunhai

    1999-03-09

    Six skew quadrupoles are needed on each side of the interaction region to compensate for the effects of coupling and vertical dispersion due to the solenoid detector. Two of those skew quadrupoles are at the location of the first pair of local chromatic sextupoles in the arcs adjacent to the interaction region. To avoid introducing high-order aberrations, the skew quadrupoles cannot be placed between the sextupole pair. In this note, we evaluate two options for implementing the skew quadrupoles at those locations: adding trim coils to the sextupoles, or vertically displacing the sextupoles.

  12. Towards a High-Performance and Robust Implementation of MPI-IO on Top of GPFS

    SciTech Connect

    Prost, J.P.; Treumann, R.; Blackmore, R.; Hartman, C.; Hedges, R.; Jia, B.; Koniges, A.; White, A.

    2000-01-11

    MPI-IO/GPFS is a prototype implementation of the I/O chapter of the Message Passing Interface (MPI) 2 standard. It uses the IBM General Parallel File System (GPFS), with prototyped extensions, as the underlying file system. This paper describes the features of this prototype which support its high performance and robustness. The use of hints at the file system level and at the MPI-IO level allows tailoring the use of the file system to the application needs. Error handling in collective operations provides robust error reporting and deadlock prevention when errors are returned.

  13. Design and implementation of high sensitive CCD on gallium arsenide based miniaturized spectrometer

    NASA Astrophysics Data System (ADS)

    Zheng, Jiamin; Shen, Jianhua; Guo, Fangmin

    2013-08-01

    In this paper, a method for designing and implementing a miniaturized spectrometer with a low-light-level (LLL) CCD on GaAs is introduced. The optical system uses a blazed grating as the dispersive element and a 1×64 GaAs CCD as the sensor. We apply a highly integrated Cortex-M4 MCU (STM32F407) to build the data acquisition and analysis unit, providing a Wi-Fi interface to communicate with the PC software. It can complete tasks such as data acquisition, digital filtering, spectral display, network communication, and human-computer interaction.

  14. High Performance Data Clustering: A Comparative Analysis of Performance for GPU, RASC, MPI, and OpenMP Implementations.

    PubMed

    Yang, Luobin; Chiu, Steve C; Liao, Wei-Keng; Thomas, Michael A

    2014-10-01

    Compared to Beowulf clusters and shared-memory machines, GPUs and FPGAs are emerging alternative architectures that provide massive parallelism and great computational capability. These architectures can be used to run compute-intensive algorithms on ever-enlarging datasets and provide scalability. In this paper, we present four implementations of the K-means data clustering algorithm for different high-performance computing platforms: a CUDA implementation for GPUs, a Mitrion C implementation for FPGAs, an MPI implementation for Beowulf compute clusters, and an OpenMP implementation for shared-memory machines. Comparative analyses of the cost of each platform, the difficulty of programming for each platform, and the performance of each implementation are presented. PMID:25309040
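    For reference, the serial algorithm that each of the four platform-specific versions parallelizes can be sketched compactly. This is a generic Lloyd's-algorithm sketch, not code from the paper; the iteration count and seed are illustrative assumptions.

    ```python
    import random

    # Generic serial K-means (Lloyd's algorithm) sketch on tuples of equal dimension.
    def kmeans(points, k, iters=20, seed=0):
        rng = random.Random(seed)
        centers = rng.sample(points, k)      # initialize with k distinct points
        for _ in range(iters):
            # Assignment step: attach each point to its nearest center.
            clusters = [[] for _ in range(k)]
            for p in points:
                j = min(range(k),
                        key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
                clusters[j].append(p)
            # Update step: move each center to the mean of its cluster.
            for j, cl in enumerate(clusters):
                if cl:
                    dim = len(cl[0])
                    centers[j] = tuple(sum(q[d] for q in cl) / len(cl)
                                       for d in range(dim))
        return centers
    ```

    The assignment step is embarrassingly parallel over points, which is why the algorithm maps naturally onto CUDA threads, FPGA pipelines, MPI ranks, and OpenMP threads alike; the update step requires a reduction across the partial sums.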

  15. High Performance Data Clustering: A Comparative Analysis of Performance for GPU, RASC, MPI, and OpenMP Implementations*

    PubMed Central

    Yang, Luobin; Chiu, Steve C.; Liao, Wei-Keng; Thomas, Michael A.

    2013-01-01

    Compared to Beowulf clusters and shared-memory machines, GPUs and FPGAs are emerging alternative architectures that provide massive parallelism and great computational capability. These architectures can be used to run compute-intensive algorithms on ever-enlarging datasets and provide scalability. In this paper, we present four implementations of the K-means data clustering algorithm for different high-performance computing platforms: a CUDA implementation for GPUs, a Mitrion C implementation for FPGAs, an MPI implementation for Beowulf compute clusters, and an OpenMP implementation for shared-memory machines. Comparative analyses of the cost of each platform, the difficulty of programming for each platform, and the performance of each implementation are presented. PMID:25309040

  16. Three case studies of three high school teachers' definitions, beliefs, and implementation practices of inquiry-based science method including barriers to and facilitators of successful implementation

    NASA Astrophysics Data System (ADS)

    Blackburn-Morrison, Kimberly D.

    This study involved three teachers in various stages of implementation of inquiry-based science method. The cases were chosen because one participant was a novice in using inquiry-based science method, one participant was in her second year of implementation, and the third participant was experienced with inquiry-based science method. The cases were set in a rural high school in three different science classrooms. One of the classrooms was a regular biology class. One of the classrooms was an honors oceanography class and another was an advanced placement environmental science classroom. Data sources included interviews, observations, and document collection. Interviews, observations, and document collection were used to triangulate data. Each classroom was observed five times. Interviews were conducted at the beginning of the semester with each participant and at the end of the semester. Follow-up interviews were conducted after each observation. Documents were collected such as each teacher's lesson plans, student work, and assignments. Data was initially organized according to the research areas of teacher's definition, teacher's beliefs, teacher's barriers to implementation, and teacher's enablers to implementation. Then, patterns emerging from each of these cases were organized. Lastly, patterns emerging across cases were compared in a cross-case analysis. Patterns shared between cases were: Participants related inquiry-based science method with hands-on learning activities. Participants saw students as the center of the learning process. Participants had positive beliefs about constructivist learning practices that were strengthened after implementation of inquiry-based teaching. Facilitators of successful implementation of inquiry-based science method were positive student motivation, students' retention of knowledge, and a positive experience for lower level students. Barriers to successful implementation were teachers not having complete control of the

  17. Bringing High-Rate, CO2-Based Microbial Electrosynthesis Closer to Practical Implementation through Improved Electrode Design and Operating Conditions.

    PubMed

    Jourdin, Ludovic; Freguia, Stefano; Flexer, Victoria; Keller, Jurg

    2016-02-16

    The enhancement of microbial electrosynthesis (MES) of acetate from CO2 to performance levels that could potentially support practical implementations of the technology must go through the optimization of key design and operating conditions. We report that higher proton availability drastically increases the acetate production rate, with pH 5.2 found to be optimal, which will likely suppress methanogenic activity without inhibitor addition. An applied cathode potential as low as -1.1 V versus SHE still achieved 99% electron recovery in the form of acetate at a current density of around -200 A m^-2. These current densities lead to an exceptional acetate production rate of up to 1330 g m^-2 day^-1 at pH 6.7. Using highly open macroporous reticulated vitreous carbon electrodes with macropore sizes of about 0.6 mm in diameter was found to be optimal for achieving a good balance between the total surface area available for biofilm formation and effective mass transfer between the bulk liquid and the electrode and biofilm surface. Furthermore, we also successfully demonstrated the use of a synthetic biogas mixture as the carbon dioxide source, yielding similarly high MES performance as with pure CO2. This would allow the process to be used effectively for both biogas quality improvement and conversion of the available CO2 to acetate. PMID:26810392

  18. The Galaxy Mass Function at High-Redshift from the Largest Available Spitzer-Based Survey (SERVS)

    NASA Astrophysics Data System (ADS)

    Morice-Atkinson, Xan; Maraston, Claudia; Lacy, Mark; Capozzi, Diego

    2015-08-01

    We exploit the largest (18 deg^2) and deepest (AB = 23.1) galaxy and QSO survey available to date of five well-observed astronomical fields (SERVS) to derive the galaxy stellar mass function and detailed galaxy properties as a function of cosmic time. SERVS obtained Spitzer 3.6 µm and 4.5 µm magnitudes for ~1 million galaxies up to redshift ~6, which we complement with multi-wavelength data from other ongoing surveys, including VIDEO, GALEX, CFHTLS, and UKIDSS, in order to perform full SED fitting to models. The power of Spitzer data is its sensitivity to evolved stars at high redshift, which allows us to better constrain galaxy star formation histories. The wide area and depth of SERVS were designed precisely to capture the light from the most massive galaxies up to high redshift. Results and comparison with the literature will be presented.

  19. Architecture and implementation for high-bandwidth real-time radar signal transmission and computing application

    NASA Astrophysics Data System (ADS)

    Cho, Yoong-Goog; Chandrasekar, V.; Jayasumana, Anura P.; Brunkow, David

    2002-06-01

    The design, architecture, and implementation of high-throughput data transmission and high-performance computing, applicable to various real-time radar signal transmission applications over a data network, are presented. With a client-server model, multiple processes and threads on the end systems operate simultaneously and collaboratively to meet the real-time requirement. The design covers Digitized Radar Signal (DRS) data acquisition and data transmission on the DRS server end, as well as DRS data receiving, radar signal parameter computation, and parameter transmission on the DRS receiver end. Generic packet and data structures for transmission and inter-process data sharing are constructed. The architecture was successfully implemented on Sun/Solaris workstations with dual 750 MHz UltraSPARC-III processors and Gigabit Ethernet cards. The comparison of transmission throughput over the gigabit link with and without computation clearly shows the importance of signal processing capability for end-to-end performance. Profiling analysis of the DRS receiver process identifies the most heavily loaded functions and provides guidance for improving computing capabilities.
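    A minimal sketch of this kind of client-server transfer: a TCP server sends a length-prefixed "generic packet" that the client parses. The 8-byte header layout (sequence number, payload length) and the payload are assumptions for illustration, not the actual DRS packet structure.

    ```python
    import socket
    import struct
    import threading

    # Hypothetical 8-byte packet header: network byte order, (sequence, length).
    HEADER = struct.Struct("!II")

    def fetch(payload=b"IQ-samples"):
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.bind(("127.0.0.1", 0))           # let the OS choose a free port
        srv.listen(1)
        port = srv.getsockname()[1]

        def serve():
            # Server side: accept one client, send header followed by payload.
            conn, _ = srv.accept()
            conn.sendall(HEADER.pack(1, len(payload)) + payload)
            conn.close()

        t = threading.Thread(target=serve)
        t.start()

        # Client side: read the fixed-size header, then exactly `length` bytes.
        cli = socket.create_connection(("127.0.0.1", port))
        seq, length = HEADER.unpack(cli.recv(HEADER.size))
        data = b""
        while len(data) < length:
            data += cli.recv(length - len(data))
        cli.close()
        t.join()
        srv.close()
        return seq, data
    ```

    The receive loop matters: TCP is a byte stream, so a single `recv` is not guaranteed to return a whole packet, which is one reason real-time designs like the one above define explicit packet structures with length fields.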

  20. A high performance Josephson binary counter implemented in Nb and NbN technology

    SciTech Connect

    Kuo, F.; Whitely, S.R.

    1991-03-01

    This paper reports on a Josephson binary counter with nondestructive readout implemented and tested in both niobium and niobium nitride technology. Successful operation of the Nb version has been observed. The design incorporates an additional tapered-edge SiO{sub 2} level in the Nb processing sequence, which increases interferometer inductance, decreases capacitance, and ensures that geometric resonances are as high in frequency as possible. This new level has the added advantage of providing mask compatibility with the NbN process, as this level is skipped in the NbN flow, thereby compensating in part for the larger penetration depth of NbN. The counter cell is designed to be as compact as possible to minimize stray inductance, maximize the top count rate and high-count-rate bias margins, and keep the read-SQUID inductance low, and it requires no holes in the ground plane.

  1. Analysis of the Steady-State Eddy Available Energy Budget in the High-Latitude Lower Thermosphere

    NASA Astrophysics Data System (ADS)

    Richmond, A. D.; Kwak, Y.; Roble, R. G.

    2007-05-01

    Only part of the energy of the thermospheric gas is available for driving dynamics. This eddy available energy (EAE) is composed of an eddy kinetic energy (EKE) and an eddy available potential energy (EAPE). In the high-latitude thermosphere EKE is generated primarily where the ion-drag force associated with plasma convection accelerates the neutral gas, and is destroyed primarily where the ion-drag force opposes the wind. EAPE is generated primarily where Joule heat is deposited in regions of elevated temperatures, and destroyed where the heat is deposited in regions of reduced temperatures. We have evaluated the budgets of EAE production, transport, and loss under steady-state forcing of the high-latitude lower thermosphere, using the NCAR Thermosphere-Ionosphere-Electrodynamics General-Circulation Model. In general, ion-drag forcing is a larger contributor to both the production and destruction (depending on location) of EAE than is Joule heating for steady-state conditions, although Joule heating can play a more significant role for impulsive forcing. Transport of EAE by horizontal and vertical winds is a significant component of the EAE budget. Conversion of EAPE to EKE, and of EKE to EAPE, constitutes an important part of the budgets of these two components of EAE.

  2. A C++11 implementation of arbitrary-rank tensors for high-performance computing

    NASA Astrophysics Data System (ADS)

    Aragón, Alejandro M.

    2014-11-01

    This article discusses an efficient implementation of tensors of arbitrary rank using some of the idioms introduced by the recently published C++ ISO Standard (C++11). With the aim of providing a basic building block for high-performance computing, a single Array class template is carefully crafted, from which vectors, matrices, and even higher-order tensors can be created. An expression template facility is also built around the array class template to provide convenient mathematical syntax. As a result, by using templates, an extra high-level layer is added to the C++ language when dealing with algebraic objects and their operations, without compromising performance. The implementation is tested running on both CPU and GPU. Program summary: Catalogue identifier: AESA_v1_1. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AESA_v1_1.html. Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland. Licensing provisions: GNU Lesser General Public License, version 3. No. of lines in distributed program, including test data, etc.: 12 376. No. of bytes in distributed program, including test data, etc.: 81 669. Distribution format: tar.gz. Programming language: C++. Computer: all modern architectures. Operating system: Linux/Unix/Mac OS. RAM: problem dependent. Classification: 5. External routines: GNU CMake build system and a BLAS implementation; NVIDIA CUBLAS for GPU computing. Does the new version supersede the previous version?: Yes. Catalogue identifier of previous version: AESA_v1_0. Journal reference of previous version: Comput. Phys. Comm. 185 (2014) 1681. Nature of problem: Tensors are a basic building block for any program in scientific computing, yet tensors are not a built-in component of the C++ programming language. Solution method: An arbitrary-rank tensor class template is crafted by using the new features introduced by the C++11 set of requirements. In addition, an entire expression template facility is built on top, to provide mathematical

  3. The Relationship between Professional Learning Community Implementation and Academic Achievement and Graduation Rates in Georgia High Schools

    ERIC Educational Resources Information Center

    Hardinger, Regina Gail

    2013-01-01

    Many educational administrators in Georgia continue to struggle with low student academic achievement and low high school graduation rates. DuFour's professional learning community (PLC) theory suggests a positive relationship between levels of PLC implementation and academic achievement and between levels of PLC implementation and graduation…

  4. Vocational High School Teachers' Difficulties in Implementing the Assessment in Curriculum 2013 in Yogyakarta Province of Indonesia

    ERIC Educational Resources Information Center

    Retnawati, Heri; Hadi, Samsul; Nugraha, Ariadie Chandra

    2016-01-01

    The study aims to describe vocational high school teachers' difficulties in implementing the assessment within Curriculum 2013, which has been implemented since July 2013 in several Indonesian schools and which might have been in effect in all schools around 2014. The study was descriptive explorative research by means of qualitative data…

  5. Development, Implementation and Application of Micromechanical Analysis Tools for Advanced High Temperature Composites

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This document contains the final report to the NASA Glenn Research Center (GRC) for the research project entitled Development, Implementation, and Application of Micromechanical Analysis Tools for Advanced High-Temperature Composites. The research supporting this initiative has been conducted by Dr. Brett A. Bednarcyk, a Senior Scientist at OM in Brookpark, Ohio from the period of August 1998 to March 2005. Most of the work summarized herein involved development, implementation, and application of enhancements and new capabilities for NASA GRC's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) software package. When the project began, this software was at a low TRL (3-4) and at release version 2.0. Due to this project, the TRL of MAC/GMC has been raised to 7 and two new versions (3.0 and 4.0) have been released. The most important accomplishments with respect to MAC/GMC are: (1) A multi-scale framework has been built around the software, enabling coupled design and analysis from the global structure scale down to the micro fiber-matrix scale; (2) The software has been expanded to analyze smart materials; (3) State-of-the-art micromechanics theories have been implemented and validated within the code; (4) The damage, failure, and lifing capabilities of the code have been expanded from a very limited state to a vast degree of functionality and utility; and (5) The user flexibility of the code has been significantly enhanced. MAC/GMC is now the premier code for design and analysis of advanced composite and smart materials. It is a candidate for the 2005 NASA Software of the Year Award. The work completed over the course of the project is summarized below on a year by year basis. All publications resulting from the project are listed at the end of this report.

  6. The content of high-intensity sweeteners in different categories of foods available on the Polish market.

    PubMed

    Zygler, Agata; Wasik, Andrzej; Kot-Wasik, Agata; Namieśnik, Jacek

    2012-01-01

    The objective of this study was to measure the concentrations of nine high-intensity sweeteners (acesulfame-K, aspartame, alitame, cyclamate, dulcin, neohesperidin DC, neotame, saccharin and sucralose) in different categories of food available on the Polish market. Over 170 samples of different brands of beverages, yoghurts, fruit preparations, vegetable preserves and fish products were analysed using an analytical procedure based on SPE and LC/MS. The results indicated that foodstuffs under the study generally comply with European Union legislation in terms of sweetener content. However, a few cases of food product mislabelling were detected, i.e. the use of cyclamate for non-approved applications. PMID:22827164

  7. Implementation of a Novel Flight Tracking and Recovery Package for High Altitude Ballooning Missions

    NASA Astrophysics Data System (ADS)

    Fatima, Aqsa; Nekkanti, Sanjay; Mohan Suri, Ram; Shankar, Divya; Prasad Nagendra, Narayan

    High altitude ballooning is typically used for scientific missions including stratospheric observations, aerological observations, and near-space environment technology demonstration. Stratospheric balloons are a cost-effective way to pursue several scientific and technological avenues compared with using satellites in the void of space. Based on the Indian Institute of Astrophysics (IIA) ballooning program for studying Comet ISON using high altitude ballooning, a cost-effective flight tracking and recovery package for ballooning missions has been developed using open source hardware. The flight tracking and recovery package is based on the Automatic Packet Reporting System (APRS) and has a redundant Global System for Mobile Communications (GSM) based Global Positioning System (GPS) tracker. The APRS-based tracker uses the AX.25 protocol for transmission of the GPS coordinates (latitude, longitude, altitude, time) alongside the heading and the health parameters of the board (voltage, temperature). APRS uses amateur radio frequencies, where data is transmitted in a packet messaging format modulated onto radio signals. The receiver uses a Very High Frequency (VHF) transceiver to demodulate the APRS signals. The received data will be decoded using MixW (open source software). A bridge will be established between the decoding software and the APRS software. The flight path will be predicted before the launch, and the real-time position coordinates will be used to obtain the real-time flight path, which will be uploaded online using the bridge connection. We also use open source APRS software to decode, and Google Earth to display, the real-time flight path. Several ballooning campaigns do not employ payload data transmission in real time, which makes flight tracking and package recovery vital for data collection and recovery of flight instruments. The flight tracking and recovery package implemented in our missions allows independent development of the payload package
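    The position payload of an uncompressed APRS report encodes latitude as ddmm.hh and longitude as dddmm.hh. A hedged sketch of the decoding step the ground station performs after demodulation (field offsets follow the standard uncompressed format; the sample string is illustrative, not from the mission):

    ```python
    # Sketch: decode lat/lon (decimal degrees) from an uncompressed APRS
    # position report body such as "!4903.50N/07201.75W>".
    def parse_aprs_position(report):
        lat_raw, ns = report[1:8], report[8]      # "4903.50", "N"
        lon_raw, ew = report[10:18], report[18]   # "07201.75", "W"
        lat = int(lat_raw[:2]) + float(lat_raw[2:]) / 60.0   # degrees + minutes/60
        lon = int(lon_raw[:3]) + float(lon_raw[3:]) / 60.0
        return (lat if ns == "N" else -lat,
                lon if ew == "E" else -lon)
    ```

    The decoded coordinates are what a bridge like the one described above would forward to a mapping tool such as Google Earth to plot the real-time flight path.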

  8. Centrality dependence of high energy jets in p+Pb collisions at energies available at the CERN Large Hadron Collider

    DOE PAGESBeta

    Bzdak, Adam; Skokov, Vladimir; Bathe, Stefan

    2016-04-08

    We investigate the recently measured centrality dependence of high energy jets in proton-lead collisions at the LHC. We hypothesize that events with jets of very high energy (a few hundred GeV) are characterized by a suppressed number of soft particles, thus shifting these events into more peripheral bins. This naturally results in the suppression (enhancement) of the nuclear modification factor, RpA, in central (peripheral) collisions. Our calculations suggest that a moderate suppression of the order of 20%, for 10³ GeV jets, can quantitatively reproduce the experimental data. Finally, we extract the suppression factor as a function of jet energy and test our conjecture using available RpA data for various centralities.
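The bin-migration mechanism conjectured above can be illustrated with a toy Monte Carlo: if jet events carry a suppressed soft multiplicity, a multiplicity-based centrality estimator classifies too few of them as central and too many as peripheral, depressing RpA in central bins and enhancing it in peripheral ones. All numbers here (Poisson multiplicities, a two-bin centrality split) are illustrative assumptions, not the paper's model.

```python
import math
import random

random.seed(0)

def poisson(lam):
    """Knuth's algorithm for Poisson-distributed samples."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def toy_rpa(n_events=50_000, mean_mult=60, suppression=0.2):
    """Return the toy nuclear modification factor in the 'central' and
    'peripheral' halves of a multiplicity-based centrality estimator,
    when jet events have their soft multiplicity reduced by `suppression`."""
    # Minimum-bias events define the centrality cut (median multiplicity).
    mb = sorted(poisson(mean_mult) for _ in range(n_events))
    cut = mb[n_events // 2]
    # Jet events: same estimator, but with suppressed soft multiplicity.
    jet_central = sum(
        poisson(mean_mult * (1 - suppression)) >= cut
        for _ in range(n_events))
    frac_central = jet_central / n_events
    # R per bin = observed fraction / unbiased expectation (0.5 each).
    return frac_central / 0.5, (1 - frac_central) / 0.5

r_central, r_peripheral = toy_rpa()
print(f"toy R (central) = {r_central:.2f}, toy R (peripheral) = {r_peripheral:.2f}")
```

The toy exaggerates the effect relative to the data, since a real forward-energy centrality estimator fluctuates far more than a Poisson multiplicity, but it reproduces the qualitative pattern: suppression in central bins, enhancement in peripheral ones.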

  9. Next Generation Fast RF Interlock Module and ATCA Adapter for ILC High Availability RF Test Station Demonstration

    SciTech Connect

    Larsen, R

    2009-10-17

    High availability interlocks and controls are required for the ILC (International Linear Collider) L-Band high power RF stations. A new F3 (Fast Fault Finder) VME module has been developed to process both fast and slow interlocks, using FPGA logic to detect interlock trip excursions. This combination eliminates the need for separate PLC (Programmable Logic Controller) control of slow interlocks. Modules are chained together to accommodate as many inputs as needed. In the next phase of development, the F3s will be ported to the new industry standard ATCA (Advanced Telecom Computing Architecture) crate (shelf) via a specially designed VME adapter module with IPMI (Intelligent Platform Management Interface). The goal is to demonstrate auto-failover and hot-swap for future partially redundant systems.
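The trip-excursion detection described above can be sketched in software as a latching window comparator per channel: a sample outside the allowed window latches a fault until an explicit reset, which is the usual behavior of hardware interlocks. This is a conceptual illustration only; the channel name and thresholds are hypothetical, and the real F3 implements this in FPGA logic, not software.

```python
class FaultFinder:
    """Latching window-comparator interlock channel: any sample outside
    [low, high] trips the channel, and the trip persists until reset()."""

    def __init__(self, low: float, high: float):
        self.low = low
        self.high = high
        self.latched = False

    def sample(self, value: float) -> bool:
        """Process one sample; return the (possibly latched) fault state."""
        if not (self.low <= value <= self.high):
            self.latched = True  # trip latches until an explicit reset
        return self.latched

    def reset(self) -> None:
        self.latched = False

# Hypothetical channel with an illustrative 0-5 V trip window.
chan = FaultFinder(low=0.0, high=5.0)
for v in (1.2, 4.8, 5.6, 3.0):
    chan.sample(v)
print(chan.latched)  # remains tripped after the 5.6 V excursion
```

Chaining modules, as the abstract describes, amounts to OR-ing the latched outputs of many such channels into a single fault line.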

  10. Performance evaluation and capacity planning for a scalable and highly available virtualisation infrastructure for the LHCb experiment

    NASA Astrophysics Data System (ADS)

    Bonaccorsi, E.; Neufeld, N.; Sborzacchi, F.

    2014-06-01

    Virtualization is often adopted to satisfy different needs: reducing costs, saving resources, simplifying maintenance and, last but not least, adding flexibility. The use of virtualization in a complex system, such as a farm of PCs that controls the hardware of an experiment (PLCs, power supplies, gas systems, magnets, ...), requires not only that high performance requirements be carefully considered, but also a deep analysis of the strategies for achieving a given level of high availability. We conducted a performance evaluation on different, comparable storage/network/virtualization platforms. Performance was measured using a series of independent benchmarks, testing the speed and the stability of multiple VMs running heavy-load operations on the I/O of virtualized storage and the virtualized network. The results of the benchmark tests allowed us to study and evaluate how different VM workloads interact with the hardware/software resource layers.
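The benchmarking approach described above (multiple VMs concurrently running heavy-load I/O) can be sketched as follows. This is a minimal stand-in, not the benchmark suite used by the authors: each thread plays the role of one VM writing and syncing a file, and the reported aggregate throughput is whatever the host under test delivers.

```python
import os
import tempfile
import threading
import time

def io_worker(path, n_blocks=16, block_size=1 << 20):
    """Simulate one VM's heavy-load I/O: write n_blocks of 1 MiB and fsync."""
    with open(path, 'wb') as f:
        for _ in range(n_blocks):
            f.write(os.urandom(block_size))
        f.flush()
        os.fsync(f.fileno())

def run_benchmark(n_vms=4, n_blocks=16):
    """Run one identical I/O job per simulated VM concurrently and return
    aggregate write throughput in MiB/s. Parameters are illustrative."""
    with tempfile.TemporaryDirectory() as workdir:
        threads = [
            threading.Thread(target=io_worker,
                             args=(os.path.join(workdir, f'vm{i}.dat'), n_blocks))
            for i in range(n_vms)
        ]
        start = time.perf_counter()
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        elapsed = time.perf_counter() - start
        return (n_vms * n_blocks) / elapsed  # MiB written / seconds

print(f"aggregate write throughput: {run_benchmark():.1f} MiB/s")
```

Repeating such a run while varying the number of concurrent workers is the simplest way to observe how co-located workloads contend for the shared storage layer, which is the interaction the paper evaluates.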