Sample records for standard techniques basis

  1. [The evaluation of costs: standards of medical care and clinical statistical groups].

    PubMed

    Semenov, V Iu; Samorodskaia, I V

    2014-01-01

    The article presents a comparative analysis of techniques for evaluating the costs of hospital treatment using medical economic standards of medical care and clinical statistical groups. The technique of cost evaluation on the basis of clinical statistical groups was developed almost fifty years ago and is widely applied in a number of countries. Nowadays, in Russia the payment for a completed case of treatment on the basis of medical economic standards is the main mode of payment for hospital medical care. It is only loosely a Russian analogue of the internationally prevalent system of diagnosis-related groups. The tariffs for these cases of treatment, as opposed to clinical statistical groups, are calculated on the basis of standards of provision of medical care approved by the Minzdrav of Russia. Information derived from generalization of treatment cases of real patients is not applied.

  2. Hybrid Grid and Basis Set Approach to Quantum Chemistry DMRG

    NASA Astrophysics Data System (ADS)

    Stoudenmire, Edwin Miles; White, Steven

    We present a new approach for using DMRG for quantum chemistry that combines the advantages of a basis set with those of a grid approximation. Because DMRG scales linearly for quasi-one-dimensional systems, it is feasible to approximate the continuum with a fine grid in one direction while using a standard basis set approach for the transverse directions. Compared to standard basis set methods, we reach larger systems and achieve better scaling when approaching the basis set limit. The flexibility and reduced costs of our approach even make it feasible to incorporate advanced DMRG techniques such as simulating real-time dynamics. Supported by the Simons Collaboration on the Many-Electron Problem.

  3. 40 CFR 63.5799 - How do I calculate my facility's organic HAP emissions on a tpy basis for purposes of determining...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) National Emissions Standards for Hazardous Air Pollutants: Reinforced Plastic Composites Production... to incorporation of pollution-prevention control techniques, existing facilities may base the average...

  4. Testing Gravity and Cosmic Acceleration with Galaxy Clustering

    NASA Astrophysics Data System (ADS)

    Kazin, Eyal; Tinker, J.; Sanchez, A. G.; Blanton, M.

    2012-01-01

    The large-scale structure contains vast amounts of cosmological information that can help us understand the accelerating nature of the Universe and test gravity on large scales. Ongoing and future sky surveys are designed to test these questions using various techniques applied to clustering measurements of galaxies. We present redshift distortion measurements of the Sloan Digital Sky Survey II Luminous Red Galaxy sample. We find that when combining the normalized quadrupole Q with the projected correlation function wp(rp) along with cluster counts (Rapetti et al. 2010), results are consistent with General Relativity. The advantage of combining Q and wp is the addition of bias information when using the Halo Occupation Distribution framework. We also present improvements to the standard technique of measuring Hubble expansion rates H(z) and angular diameter distances DA(z) when using the baryonic acoustic feature as a standard ruler. We introduce clustering wedges as an alternative basis to the multipole expansion and show that it yields similar constraints. This alternative basis serves as a useful technique to test for systematics, and ultimately to improve measurements of the cosmic acceleration.

  5. [A method for inducing standardized spiral fractures of the tibia in the animal experiment].

    PubMed

    Seibold, R; Schlegel, U; Cordey, J

    1995-07-01

    A method for the deliberate weakening of cortical bone has been developed on the basis of an already established technique for creating butterfly fractures. It enables one to create the same type of fracture, i.e., a spiral fracture, every time. The fracturing process is recorded as a force-strain curve. The results of the in vitro investigations form a basis for the preparation of experimental tasks aimed at demonstrating internal fixation techniques and their influence on the vascularity of the bone in simulated fractures. Animal protection law requires that this fracture model not fail in animal experiments.

  6. Structural reliability assessment of the Oman India Pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Sharif, A.M.; Preston, R.

    1996-12-31

    Reliability techniques are increasingly finding application in design. The special design conditions for the deep water sections of the Oman India Pipeline dictate their use, since the experience basis for application of standard deterministic techniques is inadequate. The paper discusses the reliability analysis as applied to the Oman India Pipeline, including selection of a collapse model, characterization of the variability in the parameters that affect pipe resistance to collapse, and implementation of first- and second-order reliability analyses to assess the probability of pipe failure. The reliability analysis results are used as the basis for establishing the pipe wall thickness requirements for the pipeline.

  7. AWS breaks new ground with soldering specification.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vianco, Paul Thomas

    Joining technologies continue to advance with new materials, process innovations, and inspection techniques. An increasing number of high-valued, high-reliability applications -- from boilers and ship hulls to rocket motors and medical devices -- have required the development of industry standards and specifications in order to ensure that the best design and manufacturing practices are being used to produce safe, durable products and assemblies. Standards writing has always had an important role at the American Welding Society (AWS). The AWS standards and specifications cover such topics as filler materials, joining processes, inspection techniques, and qualification methods that are used in welding and brazing technologies. These AWS standards and specifications, all of which are approved by the American National Standards Institute (ANSI), have also provided the basis for many similar documents used in Europe and in Pacific Rim countries.

  8. Defining the IEEE-854 floating-point standard in PVS

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1995-01-01

    A significant portion of the ANSI/IEEE-854 Standard for Radix-Independent Floating-Point Arithmetic is defined in PVS (Prototype Verification System). Since IEEE-854 is a generalization of the ANSI/IEEE-754 Standard for Binary Floating-Point Arithmetic, the definition of IEEE-854 in PVS also formally defines much of IEEE-754. This collection of PVS theories provides a basis for machine-checked verification of floating-point systems. This formal definition illustrates that formal specification techniques are sufficiently advanced that it is reasonable to consider their use in the development of future standards.

  9. Comparison of data inversion techniques for remotely sensed wide-angle observations of Earth emitted radiation

    NASA Technical Reports Server (NTRS)

    Green, R. N.

    1981-01-01

    The shape factor, parameter estimation, and deconvolution data analysis techniques were applied to the same set of Earth emitted radiation measurements to determine the effects of different techniques on the estimated radiation field. All three techniques are defined and their assumptions, advantages, and disadvantages are discussed. Their results are compared globally, zonally, regionally, and on a spatial spectrum basis. The standard deviations of the regional differences in the derived radiant exitance varied from 7.4 W/m² to 13.5 W/m².

  10. Vaginal Vault Suspension at Hysterectomy for Prolapse – Myths and Facts, Anatomical Requirements, Fixation Techniques, Documentation and Cost Accounting

    PubMed Central

    Graefe, F.; Marschke, J.; Dimpfl, T.; Tunn, R.

    2012-01-01

    Vaginal vault suspension during hysterectomy for prolapse both treats apical insufficiency and helps prevent recurrence. Numerous techniques exist, with different anatomical results and differing complications. The description of the different approaches, together with a description of the vaginal vault suspension technique used at the Department for Urogynaecology at St. Hedwig Hospital, could serve as a basis for reassessment and for recommendations by scientific associations regarding general standards. PMID:25278621

  11. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    PubMed Central

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision.
In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626
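The noise-to-slope ratio can be illustrated with a toy calculation. The data, linear-model parameters, and the fact that truth is available here are all invented for illustration; the actual NGS technique estimates the slope and noise terms without access to true values:

```python
import numpy as np

rng = np.random.default_rng(0)
true = rng.uniform(1.0, 10.0, 200)   # latent true activity concentrations

# two hypothetical quantitative methods, each linearly related to truth
est_a = 1.2 * true + 0.5 + rng.normal(0, 0.4, true.size)   # precise method
est_b = 0.9 * true + 1.0 + rng.normal(0, 1.2, true.size)   # noisier method

def nsr(estimates, truth):
    """Noise-to-slope ratio: residual noise sigma divided by fitted slope."""
    slope, intercept = np.polyfit(truth, estimates, 1)
    resid = estimates - (slope * truth + intercept)
    return resid.std(ddof=2) / abs(slope)

# smaller NSR means better precision, so method A should rank first
print(nsr(est_a, true) < nsr(est_b, true))
```

Note that the ranking depends only on the noise-to-slope ratio, not on the bias terms, which is why NSR is a precision (not accuracy) figure of merit.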

  12. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision.
In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation.

  13. Direct magnitude estimates of speech intelligibility in dysarthria: effects of a chosen standard.

    PubMed

    Weismer, Gary; Laures, Jacqueline S

    2002-06-01

    Direct magnitude estimation (DME) has been used frequently as a perceptual scaling technique in studies of the speech intelligibility of persons with speech disorders. The technique is typically used with a standard, or reference stimulus, chosen as a good exemplar of "midrange" intelligibility. In several published studies, the standard has been chosen subjectively, usually on the basis of the expertise of the investigators. The current experiment demonstrates that a fixed set of sentence-level utterances, obtained from 4 individuals with dysarthria (2 with Parkinson disease, 2 with traumatic brain injury) as well as 3 neurologically normal speakers, is scaled differently depending on the identity of the standard. Four different standards were used in the main experiment, three of which were judged qualitatively in two independent evaluations to be good exemplars of midrange intelligibility. Acoustic analyses did not reveal obvious differences between these four standards but suggested that the standard with the worst-scaled intelligibility had much poorer voice source characteristics compared to the other three standards. Results are discussed in terms of possible standardization of midrange intelligibility exemplars for DME experiments.
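Direct magnitude estimation with a standard can be sketched numerically. The ratings below are invented, not data from the study: each listener assigns the standard a modulus of 100 and rates every other utterance proportionally, and ratio-scale judgments are conventionally summarized by a geometric mean across listeners:

```python
import numpy as np

# hypothetical DME ratings: rows = listeners, cols = utterances;
# column 0 is the standard, assigned a modulus of 100 by every listener
ratings = np.array([
    [100.,  50., 200., 150.],
    [100.,  40., 180., 160.],
    [100.,  60., 220., 140.],
])

# geometric mean across listeners (the usual summary for ratio-scale data)
scaled = np.exp(np.log(ratings).mean(axis=0))
print(scaled[0])   # the standard recovers its modulus of ~100
```

Because scaled values are ratios to the standard, choosing a different standard rescales (and, as the study shows, can perceptually distort) the whole intelligibility continuum.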

  14. Continuous internal channels formed in aluminum fusion welds

    NASA Technical Reports Server (NTRS)

    Gault, J.; Sabo, W.

    1967-01-01

    Process produces continuous internal channel systems on a repeatable basis in 2014-T6 aluminum. Standard machining forms the initial channel, which is filled with tungsten carbide powder. TIG machine fusion welding completes formation of the channel. Chem-mill techniques enlarge it to the desired size.

  15. GRACE L1b inversion through a self-consistent modified radial basis function approach

    NASA Astrophysics Data System (ADS)

    Yang, Fan; Kusche, Juergen; Rietbroek, Roelof; Eicker, Annette

    2016-04-01

    Implementing a regional geopotential representation such as mascons or, more generally, RBFs (radial basis functions) has been widely accepted as an efficient and flexible approach to recover the gravity field from GRACE (Gravity Recovery and Climate Experiment), especially in higher-latitude regions such as Greenland. This is because RBFs allow for regionally specific regularizations over areas that have sufficient and dense GRACE observations. Although existing RBF solutions show a better resolution than classical spherical harmonic solutions, the applied regularizations cause spatial leakage which should be carefully dealt with. It has been shown that leakage is a main error source which leads to an evident underestimation of the yearly ice-melting trend over Greenland. Unlike some popular post-processing techniques to mitigate leakage signals, this study, for the first time, attempts to reduce the leakage directly in the GRACE L1b inversion by constructing an innovative modified RBF (MRBF) basis in place of the standard RBFs to retrieve a more realistic temporal gravity signal along the coastline. Our point of departure is that the surface mass loading associated with a standard RBF is smooth but disregards physical consistency between continental mass and passive ocean response. In this contribution, based on earlier work by Clarke et al.(2007), a physically self-consistent MRBF representation is constructed from standard RBFs, with the help of the sea level equation: for a given standard RBF basis, the corresponding MRBF basis is first obtained by keeping the surface load over the continent unchanged, but imposing global mass conservation and equilibrium response of the oceans. Then, the updated set of MRBFs as well as standard RBFs are individually employed as the basis function to determine the temporal gravity field from GRACE L1b data. In this way, in the MRBF GRACE solution, the passive (e.g.
ice melting and land hydrology response) sea level is automatically separated from ocean dynamic effects, and our hypothesis is that in this way we improve the partitioning of the GRACE signals into land and ocean contributions along the coastline. In particular, we inspect the ice-melting over Greenland from real GRACE data, and we evaluate the ability of the MRBF approach to recover true mass variations along the coastline. Finally, using independent measurements from multiple techniques including GPS vertical motion and altimetry, a validation will be presented to quantify to what extent it is possible to reduce the leakage through the MRBF approach.

  16. Study of radar pulse compression for high resolution satellite altimetry

    NASA Technical Reports Server (NTRS)

    Dooley, R. P.; Nathanson, F. E.; Brooks, L. W.

    1974-01-01

    Pulse compression techniques are studied which are applicable to a satellite altimeter having a topographic resolution of ±10 cm. A systematic design procedure is used to determine the system parameters. The performance of an optimum maximum-likelihood processor is analysed, which provides the basis for modifying the standard split-gate tracker to achieve improved performance. Bandwidth considerations lead to the recommendation of a full-deramp STRETCH pulse compression technique followed by an analog filter bank to separate range returns. The implementation of the recommended technique is examined.

  17. A Deterministic Annealing Approach to Clustering AIRS Data

    NASA Technical Reports Server (NTRS)

    Guillaume, Alexandre; Braverman, Amy; Ruzmaikin, Alexander

    2012-01-01

    We will examine the validity of means and standard deviations as a basis for climate data products. We will explore the conditions under which these two simple statistics are inadequate summaries of the underlying empirical probability distributions by contrasting them with a nonparametric method called Deterministic Annealing.

  18. Diagnostic testing for Giardia infections.

    PubMed

    Heyworth, Martin F

    2014-03-01

    The traditional method for diagnosing Giardia infections involves microscopic examination of faecal specimens for Giardia cysts. This method is subjective and relies on observer experience. From the 1980s onwards, objective techniques have been developed for diagnosing Giardia infections, and are superseding diagnostic techniques reliant on microscopy. Detection of Giardia antigen(s) by immunoassay is the basis of commercially available diagnostic kits. Various nucleic acid amplification techniques (NAATs) can demonstrate DNA of Giardia intestinalis, and have the potential to become standard approaches for diagnosing Giardia infections. Of such techniques, methods involving either fluorescent microspheres (Luminex) or isothermal amplification of DNA (loop-mediated isothermal amplification; LAMP) are especially promising.

  19. Methodology to estimate particulate matter emissions from certified commercial aircraft engines.

    PubMed

    Wayson, Roger L; Fleming, Gregg G; Lovinelli, Ralph

    2009-01-01

    Today, about one-fourth of U.S. commercial service airports, including 41 of the busiest 50, are either in nonattainment or maintenance areas per the National Ambient Air Quality Standards. U.S. aviation activity is forecasted to triple by 2025, while at the same time, the U.S. Environmental Protection Agency (EPA) is evaluating stricter particulate matter (PM) standards on the basis of documented human health and welfare impacts. Stricter federal standards are expected to impede capacity and limit aviation growth if regulatory-mandated emission reductions occur as for other non-aviation sources (i.e., automobiles, power plants, etc.). In addition, strong interest exists as to the role aviation emissions play in air quality and climate change issues. These reasons underpin the need to quantify and understand PM emissions from certified commercial aircraft engines, which in turn requires a methodology to predict these emissions. Standardized sampling techniques to measure volatile and nonvolatile PM emissions from aircraft engines do not exist. As such, a first-order approximation (FOA) was derived to fill this need based on available information. FOA1.0 allowed prediction of nonvolatile PM only. FOA2.0 added volatile PM emissions on the basis of the ratio of nonvolatile to volatile emissions. Recent collaborative efforts by industry (manufacturers and airlines), research establishments, and regulators have begun to provide further insight into the estimation of PM emissions. The resultant PM measurement datasets are being analyzed to refine sampling techniques and progress towards standardized PM measurements. These preliminary measurement datasets also support the continued refinement of the FOA methodology. FOA3.0 disaggregated the prediction techniques to allow for independent prediction of nonvolatile and volatile emissions on a more theoretical basis.
The Committee for Aviation Environmental Protection of the International Civil Aviation Organization endorsed the use of FOA3.0 in February 2007. Further commitment was made to improve the FOA as new data become available, until such time the methodology is rendered obsolete by a fully validated database of PM emission indices for today's certified commercial fleet. This paper discusses related assumptions and derived equations for the FOA3.0 methodology used worldwide to estimate PM emissions from certified commercial aircraft engines within the vicinity of airports.

  20. Estimating standard errors in feature network models.

    PubMed

    Frank, Laurence E; Heiser, Willem J

    2007-05-01

    Feature network models are graphical structures that represent proximity data in a discrete space while using the same formalism that is the basis of least squares methods employed in multidimensional scaling. Existing methods to derive a network model from empirical data only give the best-fitting network and yield no standard errors for the parameter estimates. The additivity properties of networks make it possible to consider the model as a univariate (multiple) linear regression problem with positivity restrictions on the parameters. In the present study, both theoretical and empirical standard errors are obtained for the constrained regression parameters of a network model with known features. The performance of both types of standard error is evaluated using Monte Carlo techniques.
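The constrained-regression view can be sketched with synthetic data (the feature matrix, weights, and noise level below are invented; the paper also derives theoretical standard errors, which are omitted here). Nonnegative least squares plays the role of the positivity-restricted fit, and a nonparametric bootstrap yields empirical standard errors:

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)

# hypothetical design: rows = object pairs, cols = 0/1 indicators of whether
# a feature distinguishes the pair, as in an additive feature/network model
X = rng.integers(0, 2, size=(60, 4)).astype(float)
w_true = np.array([2.0, 0.0, 1.5, 3.0])    # nonnegative feature weights
y = X @ w_true + rng.normal(0, 0.3, 60)    # noisy dissimilarities

w_hat, _ = nnls(X, y)                      # positivity-constrained fit

# empirical standard errors via a nonparametric bootstrap over rows
boot = np.empty((500, 4))
for b in range(500):
    idx = rng.integers(0, 60, 60)
    boot[b], _ = nnls(X[idx], y[idx])
se = boot.std(axis=0, ddof=1)
print(np.round(se, 3))
```

Bootstrap distributions for parameters estimated at the positivity boundary (like the zero weight here) are asymmetric, which is one reason empirical and theoretical standard errors can disagree.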

  1. Demonstration of PIV in a Transonic Compressor

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    1997-01-01

    Particle Image Velocimetry (PIV) is a powerful measurement technique which can be used as an alternative or complementary approach to Laser Doppler Velocimetry (LDV) in a wide range of research applications. PIV data are measured simultaneously at multiple points in space, which enables the investigation of the non-stationary spatial structures typically encountered in turbomachinery. Many of the same issues encountered in the application of LDV techniques to rotating machinery apply in the application of PIV. Preliminary results from the successful application of the standard 2-D PIV technique to a transonic axial compressor are presented. The lessons learned from the application of the 2-D PIV technique will serve as the basis for applying 3-component PIV techniques to turbomachinery.

  2. Geology orbiter comparison study

    NASA Technical Reports Server (NTRS)

    Cutts, J. A. J.; Blasius, K. R.; Davis, D. R.; Pang, K. D.; Shreve, D. C.

    1977-01-01

    Instrument requirements of planetary geology orbiters were examined with the objective of determining the feasibility of applying standard instrument designs to a host of terrestrial targets. Within the basic discipline area of geochemistry, gamma-ray, X-ray fluorescence, and atomic spectroscopy remote sensing techniques were considered. Within the discipline area of geophysics, the complementary techniques of gravimetry and radar were studied. Experiments using these techniques were analyzed for comparison at the Moon, Mercury, Mars and the Galilean satellites. On the basis of these comparative assessments, the adaptability of each sensing technique was judged as a basic technique for many targets, as a single instrument applied to many targets, as a single instrument used in different mission modes, and as an instrument capability for nongeoscience objectives.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eppich, G.; Kips, R.; Lindvall, R.

    The CUP-2 uranium ore concentrate (UOC) standard reference material, a powder, was produced at the Blind River uranium refinery of Eldorado Resources Ltd. in Canada in 1986. This material was produced as part of a joint effort by the Canadian Certified Reference Materials Project and the Canadian Uranium Producers Metallurgical Committee to develop a certified reference material for uranium concentration and the concentration of several impurity constituents. This standard was developed to satisfy the requirements of the UOC mining and milling industry, and was characterized with this purpose in mind. To produce CUP-2, approximately 25 kg of UOC derived from the Blind River uranium refinery was blended, homogenized, and assessed for homogeneity by X-ray fluorescence (XRF) analysis. The homogenized material was then packaged into bottles, containing 50 g of material each, and distributed for analysis to laboratories in 1986. The CUP-2 UOC standard was characterized by an interlaboratory analysis program involving eight member laboratories, six commercial laboratories, and three additional volunteer laboratories. Each laboratory provided five replicate results on up to 17 analytes, including total uranium concentration and moisture content. The selection of analytical technique was left to each participating laboratory. Uranium was reported on an “as-received” basis; all other analytes (besides moisture content) were reported on a “dry-weight” basis. A 25 g bottle of the CUP-2 UOC standard described above was purchased by LLNL and characterized by the LLNL Nuclear Forensics Group. Non-destructive and destructive analytical techniques were applied to the UOC sample. Information obtained from short-term techniques such as photography, gamma spectrometry, and scanning electron microscopy was used to guide the performance of longer-term techniques such as ICP-MS. Some techniques, such as XRF and ICP-MS, provided complementary types of data.
    The results indicate that the CUP-2 standard has a natural isotopic ratio, does not appear to have been isotopically enriched or depleted in any way, and was not contaminated by a source of uranium with a non-natural isotopic composition. Furthermore, the lack of 233U and 236U above the instrumental detection limit indicates that this sample was not exposed to a neutron flux, which would have generated one or both of these isotopes in measurable concentrations.

  4. Calibrated LCD/TFT stimulus presentation for visual psychophysics in fMRI.

    PubMed

    Strasburger, H; Wüstenberg, T; Jäncke, L

    2002-11-15

    Standard projection techniques using liquid crystal display (LCD) or thin-film transistor (TFT) technology show drastic distortions in luminance and contrast characteristics across the screen and across grey levels. Common luminance measurement and calibration techniques are not applicable in the vicinity of MRI scanners. With the aid of a fibre optic, we measured screen luminances over the full space of screen positions and image grey values and on that basis developed a compensation technique that involves both luminance homogenisation and position-dependent gamma correction. With the technique described, images displayed to a subject in functional MRI can be specified with high precision by a matrix of desired luminance values rather than by local grey values.
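The compensation idea can be sketched with a toy model (the 2x2 grid, gamma map, and peak-luminance map below are invented; the paper's measured maps cover the full screen). If each position's response is modelled as L = L0 * (g/255)^gamma, inverting it per position gives the grey value that produces a desired luminance everywhere:

```python
import numpy as np

# hypothetical per-position display model, varying across the screen
gamma_map = np.array([[2.0, 2.2], [2.4, 2.1]])      # local gamma exponents
L0_map    = np.array([[180., 200.], [160., 190.]])  # peak luminance, cd/m^2

def grey_for_luminance(L_target, L0, gamma):
    """Invert the modelled response: grey value that yields L_target."""
    return np.clip(255.0 * (L_target / L0) ** (1.0 / gamma), 0, 255)

# homogenisation: aim for the same luminance at every position, limited in
# practice by the dimmest position's achievable range
L_target = 80.0
grey = grey_for_luminance(L_target, L0_map, gamma_map)
print(np.round(grey).astype(int))
```

Plugging the resulting grey values back into the model reproduces the target luminance at every position, which is the sense in which stimuli can be specified as a matrix of luminances rather than grey values.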

  5. Hyperhidrosis: review of recent advances and new therapeutic options for primary hyperhidrosis.

    PubMed

    Brown, Ashley L; Gordon, Jennifer; Hill, Samantha

    2014-08-01

    Primary focal hyperhidrosis is a common condition that negatively impacts quality of life for many pediatric patients and can be challenging to treat. Standard treatments for hyperhidrosis can be used with success in many patients, and newer therapies and techniques offer options that have demonstrated efficacy and safety. This review highlights standard therapies for primary focal hyperhidrosis as well as the most recent technique advancements and alternative treatment options. The standard approach to treating primary focal hyperhidrosis remains initiation of topical preparations, followed by advancement to systemic medications, local administration of medication and/or surgical procedures. Recent studies focus on enhancing tolerability of topical preparations as well as evaluating the efficacy of neuromodulator injections, oral anticholinergic medications and laser therapy. Microwave technology has also been introduced for the treatment of focal hyperhidrosis with promising results. Many therapies exist for hyperhidrosis, and each treatment plan must be evaluated on a patient-by-patient basis. Advances in standard therapies and emergence of new treatment techniques are the main emphases of current published literature on hyperhidrosis. This article presents recent therapeutic options as well as updates on more established strategies to help practitioners treat this challenging condition.

  6. Application of satellite remote sensing to North Carolina. Development of a monitoring methodology for trophic states of lakes in North Carolina

    NASA Technical Reports Server (NTRS)

    Welby, C. W.; Holman, R. E.

    1977-01-01

    Conjunctive study of four shallow coastal plain lakes in northeastern North Carolina and their LANDSAT-2 images demonstrates that it is possible to differentiate between the lakes and their respective trophic states on the basis of the multispectral scanner imagery. The year-long investigation established that monitoring of the trophic states of the lakes on a seasonal basis through application of color additive imagery enhancement techniques is possible. Utilizing a standard setting of the color additive viewer, an investigator can normalize the imagery to an internal standard of constant reflectance characteristics. By comparison of the false color renditions with a standard interference color chart combined with brightness measurements made on the viewer screen, one can relate the lake reflectances to their trophic states. Two or more bands of the imagery are required, and the present study established that for the lakes studied, Band 5 and Band 6 form a good combination.

  7. Advanced techniques and technology for efficient data storage, access, and transfer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Miller, Warner

    1991-01-01

    Advanced techniques for efficiently representing most forms of data are being implemented in practical hardware and software form through the joint efforts of three NASA centers. These techniques adapt to local statistical variations to continually provide near optimum code efficiency when representing data without error. Demonstrated in several earlier space applications, these techniques are the basis of initial NASA data compression standards specifications. Since the techniques clearly apply to most NASA science data, NASA invested in the development of both hardware and software implementations for general use. This investment includes high-speed single-chip very large scale integration (VLSI) coding and decoding modules as well as machine-transferable software routines. The hardware chips were tested in the laboratory at data rates as high as 700 Mbits/s. A coding module's definition includes a predictive preprocessing stage and a powerful adaptive coding stage. The function of the preprocessor is to optimally process incoming data into a standard form data source that the second stage can handle. The built-in preprocessor of the VLSI coder chips is ideal for high-speed sampled data applications such as imaging and high-quality audio, but additionally, the second stage adaptive coder can be used separately with any source that can be externally preprocessed into the 'standard form'. This generic functionality assures that the applicability of these techniques and their recent high-speed implementations should be equally broad outside of NASA.
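
The two-stage structure described here, a predictive preprocessor that maps data into a 'standard form' source followed by an adaptive entropy coder, underlies the Rice algorithm adopted in these standards. A much-simplified Python sketch with a fixed code parameter k (the real coder selects k adaptively per block of samples):

```python
def preprocess(samples):
    """Predictive preprocessing: delta-predict each sample from its
    predecessor, then interleave signs so residuals become the
    non-negative 'standard form' (0, -1, 1, -2, 2, ... -> 0, 1, 2, 3, 4)."""
    out, prev = [], 0
    for s in samples:
        d, prev = s - prev, s
        out.append(2 * d if d >= 0 else -2 * d - 1)
    return out

def rice_encode(value, k):
    """Golomb-Rice codeword: unary-coded quotient, then k low-order bits."""
    q, r = value >> k, value & ((1 << k) - 1)
    bits = '1' * q + '0'
    if k:
        bits += format(r, '0{}b'.format(k))
    return bits

bitstream = ''.join(rice_encode(v, 2) for v in preprocess([10, 12, 11, 11, 14]))
```

Small residuals get short codewords, which is why the predictive first stage matters: it concentrates the source near zero so the second stage stays near-optimal.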

  8. Towards improving the NASA standard soil moisture retrieval algorithm and product

    NASA Astrophysics Data System (ADS)

    Mladenova, I. E.; Jackson, T. J.; Njoku, E. G.; Bindlish, R.; Cosh, M. H.; Chan, S.

    2013-12-01

    Soil moisture mapping using passive microwave remote sensing techniques has proven to be one of the most effective ways of acquiring reliable global soil moisture information on a routine basis. An important step in this direction was made by the launch of the Advanced Microwave Scanning Radiometer (AMSR-E) on NASA's Earth Observing System Aqua satellite. Along with the standard NASA algorithm and operational AMSR-E product, the easy access and availability of the AMSR-E data promoted the development and distribution of alternative retrieval algorithms and products. Several evaluation studies have demonstrated issues with the standard NASA AMSR-E product, such as a dampened temporal response and a limited range of the final retrievals, and noted that the available global passive-based algorithms, even though based on the same electromagnetic principles, produce different results in terms of accuracy and temporal dynamics. Our goal is to identify the theoretical causes that determine the reduced sensitivity of the NASA AMSR-E product and outline ways to improve the operational NASA algorithm, if possible. Properly identifying the underlying reasons for the above-mentioned features of the NASA AMSR-E product and the differences between the alternative algorithms requires a careful examination of the theoretical basis of each approach, specifically the simplifying assumptions and parametrization approaches adopted by each algorithm to reduce the dimensionality of unknowns and characterize the observing system. Statistically based error analyses, which are useful and necessary, provide information on the relative accuracy of each product but give very little information on the theoretical causes, knowledge that is essential for algorithm improvement. Thus, we are currently examining the possibility of improving the standard NASA AMSR-E global soil moisture product by conducting a thorough theory-based review of, and inter-comparisons between, several well-established global retrieval techniques. A detailed discussion focused on the theoretical basis of each approach and each algorithm's sensitivity to assumptions and parametrization approaches will be presented. USDA is an equal opportunity provider and employer.

  9. Application of selection techniques to electric-propulsion options on an advanced synchronous satellite

    NASA Technical Reports Server (NTRS)

    Holcomb, L. B.; Degrey, S. P.

    1973-01-01

    This paper addresses the comparison of several candidate auxiliary-propulsion systems and system combinations for an advanced synchronous satellite. Economic selection techniques, evolved at the Jet Propulsion Laboratory, are used as a basis for system option comparisons. Electric auxiliary-propulsion types considered include pulsed plasma and ion bombardment, with hydrazine systems used as a state-of-the-art reference. Current and projected electric-propulsion system performance data are used, along with hydrazine system costs projected from the NASA standardization program.

  10. The Development of Maritime English Learning Model Using Authentic Assessment Based Bridge Simulator in Merchant Marine Polytechnic, Makassar

    ERIC Educational Resources Information Center

    Fauzi, Ahmad; Bundu, Patta; Tahmir, Suradi

    2016-01-01

    A bridge simulator constitutes a fundamental and vital tool to ensure that seamen or seafarers possess the standardized competence required. By using the bridge simulator technique, a reality-based study can be presented easily and delivered to the students on an ongoing basis in their classroom or study place. Afterwards, the validity…

  11. Novel Histogram Based Unsupervised Classification Technique to Determine Natural Classes From Biophysically Relevant Fit Parameters to Hyperspectral Data

    DOE PAGES

    McCann, Cooper; Repasky, Kevin S.; Morin, Mikindra; ...

    2017-05-23

    Hyperspectral image analysis has benefited from an array of methods that take advantage of the increased spectral depth compared to multispectral sensors; however, the focus of these developments has been on supervised classification methods. Lack of a priori knowledge regarding land cover characteristics can make unsupervised classification methods preferable under certain circumstances. An unsupervised classification technique is presented in this paper that utilizes physically relevant basis functions to model the reflectance spectra. The fit parameters used to generate the basis functions allow clustering based on spectral characteristics rather than spectral channels and provide both noise and data reduction. Histogram splitting of the fit parameters is then used as a means of producing an unsupervised classification. Unlike current unsupervised classification techniques that rely primarily on Euclidean distance measures to determine similarity, the unsupervised classification technique uses the natural splitting of the fit parameters associated with the basis functions, creating clusters that are similar in terms of physical parameters. The data set used in this work utilizes the publicly available data collected at Indian Pines, Indiana. This data set provides reference data allowing for comparisons of the efficacy of different unsupervised data analyses. The unsupervised histogram splitting technique presented in this paper is shown to be better than the standard unsupervised ISODATA clustering technique, with an overall accuracy of 34.3/19.0% before merging and 40.9/39.2% after merging. Finally, this improvement is also seen as an improvement in kappa before/after merging of 24.8/30.5 for the histogram splitting technique compared to 15.8/28.5 for ISODATA.
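
The key step, splitting histograms of the fit parameters at their natural breaks rather than clustering by Euclidean distance in channel space, can be illustrated in one dimension. The sketch below substitutes Otsu's between-class variance criterion for the authors' valley-detection procedure (a plainly named stand-in, not their algorithm), applied to a hypothetical bimodal fit parameter:

```python
import numpy as np

def split_parameter(values, bins=64):
    """Split a 1-D distribution of basis-function fit parameters into two
    natural classes at the break in its histogram (Otsu's criterion)."""
    counts, edges = np.histogram(values, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2
    total, grand_sum = counts.sum(), (counts * centers).sum()
    best_t, best_var, w0, sum0 = 1, -1.0, 0.0, 0.0
    for t in range(1, bins):
        w0 += counts[t - 1]
        sum0 += counts[t - 1] * centers[t - 1]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        # between-class variance; maximised at the natural break
        between = w0 * w1 * (sum0 / w0 - (grand_sum - sum0) / w1) ** 2
        if between > best_var:
            best_var, best_t = between, t
    return (values >= edges[best_t]).astype(int)

# Hypothetical bimodal fit parameter (e.g. a basis-function width)
rng = np.random.default_rng(0)
param = np.concatenate([rng.normal(0.2, 0.03, 500),
                        rng.normal(0.8, 0.05, 500)])
labels = split_parameter(param)   # two natural classes
```

In the paper this splitting is applied per fit parameter, so clusters are separated along physically meaningful axes rather than raw spectral channels.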

  13. [Standardization of Blastocystis hominis diagnosis using different staining techniques].

    PubMed

    Eymael, Dayane; Schuh, Graziela Maria; Tavares, Rejane Giacomelli

    2010-01-01

    The present study was carried out from March to May 2008, with the aim of evaluating the effectiveness of different techniques for diagnosing Blastocystis hominis in a sample of the population attended at the Biomedicine Laboratory of Feevale University, Novo Hamburgo, Rio Grande do Sul. One hundred feces samples from children and adults were evaluated. After collection, the samples were subjected to the techniques of spontaneous sedimentation (HPJ), sedimentation in formalin-ether (Ritchie) and staining by means of Gram and May-Grünwald-Giemsa (MGG). The presence of Blastocystis hominis was observed in 40 samples when staining techniques were used (MGG and Gram), while sedimentation techniques were less efficient (32 positive samples using the Ritchie technique and 20 positive samples using the HPJ technique). Our results demonstrate that HPJ was less efficient than the other methods, thus indicating the need to include laboratory techniques that enable parasite identification on a routine basis.

  14. Radiometry of water turbidity measurements

    NASA Technical Reports Server (NTRS)

    Mccluney, W. R.

    1974-01-01

    An examination of a number of measurements of turbidity reported in the literature reveals considerable variability in the definitions, units, and measurement techniques used. Many of these measurements differ radically in the optical quantity measured. The radiometric basis of each of the most common definitions of turbidity is examined. Several commercially available turbidimeters are described and their principles of operation are evaluated radiometrically. It is recommended that the term turbidity be restricted to measurements that compare the light scattered by the sample with that scattered by standard suspensions of known turbidity. It is also recommended that the measurement procedure be standardized by requiring the use of Formazin as the turbidity standardizing material and that the Formazin Turbidity Unit (FTU) be adopted as the standard unit of turbidity.
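
The recommended standardization amounts to calibrating each instrument's scattered-light signal against Formazin suspensions of known FTU. A minimal sketch with hypothetical readings (all numbers illustrative):

```python
import numpy as np

# Hypothetical detector readings for Formazin standards of known turbidity
standard_ftu = np.array([0.0, 10.0, 20.0, 40.0, 80.0])
detector_signal = np.array([0.02, 1.05, 2.01, 3.98, 8.10])   # arbitrary units

# Least-squares calibration line mapping signal -> FTU
slope, intercept = np.polyfit(detector_signal, standard_ftu, 1)

def to_ftu(signal):
    """Report a sample's turbidity in FTU via the Formazin calibration."""
    return slope * signal + intercept
```

Any turbidimeter calibrated this way reports in the same unit, which is the point of adopting Formazin and the FTU as the common reference.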

  15. The Use of Probability Theory as a Basis for Planning and Controlling Overhead Costs in Education and Industry. Final Report.

    ERIC Educational Resources Information Center

    Vinson, R. B.

    In this report, the author suggests changes in the treatment of overhead costs by hypothesizing that "the effectiveness of standard costing in planning and controlling overhead costs can be increased through the use of probability theory and associated statistical techniques." To test the hypothesis, the author (1) presents an overview of the…

  16. The comet assay: Reflections on its development, evolution and applications.

    PubMed

    Singh, Narendra P

    2016-01-01

    The study of DNA damage and its repair is critical to our understanding of human aging and cancer. This review reflects on the development of a simple technique, now known as the comet assay, to study the accumulation of DNA damage and its repair. It describes my journey into aging research and the need for a method that sensitively quantifies DNA damage on a cell-by-cell basis and on a day-by-day basis. My inspirations, obstacles and successes on the path to developing this assay and improving its reliability and sensitivity are discussed. Recent modifications, applications, and the process of standardizing the technique are also described. What was once untried and unknown has become a technique used around the world for understanding and monitoring DNA damage. The comet assay's use has grown exponentially in the new millennium, as emphasis on studying biological phenomena at the single-cell level has increased. I and others have applied the technique across cell types (including germ cells) and species (including bacteria). As it enters new realms and gains clinical relevance, the comet assay may very well illuminate human aging and its prevention. Copyright © 2016. Published by Elsevier B.V.

  17. Application of Fourier transform infrared (FTIR) spectroscopy for the identification of wheat varieties.

    PubMed

    Amir, Rai Muhammad; Anjum, Faqir Muhammad; Khan, Muhammad Issa; Khan, Moazzam Rafiq; Pasha, Imran; Nadeem, Muhammad

    2013-10-01

    Quality characteristics of wheat are determined by different physicochemical and rheological analyses using different AACC methods. AACC methods are expensive, time consuming and cause destruction of samples. Fourier transform infrared (FTIR) spectroscopy is one of the most important emerging tools for analyzing wheat for different quality parameters. This technique is rapid and sensitive, with a great variety of sampling techniques. In the present study, different wheat varieties were analyzed for quality assessment and were characterized using both AACC methods and the FTIR technique. The straight grade flour was analyzed for physical, chemical and rheological properties by standard methods. FTIR works on the basis of functional groups and provides information in the form of peaks. On the basis of these peaks, the values of moisture, protein, fat, ash, carbohydrates and grain hardness were determined. Peaks for water were observed at 1,640 cm(-1) and 3,300 cm(-1) on the basis of the H and OH functional groups. Protein was observed in the ranges 1,600 cm(-1) to 1,700 cm(-1) and 1,550 cm(-1) to 1,570 cm(-1), corresponding to the amide I and amide II bonds respectively. Fat was also observed within these ranges but on the basis of the C-H bond, and starch was observed in the ranges 2,800 to 3,000 cm(-1) (C-H stretch region) and 3,000 to 3,600 cm(-1) (O-H stretch region). As FTIR is a fast tool, it can easily be employed for wheat variety identification according to a set criterion.
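
The peak-to-constituent mapping quoted above can be condensed into a small lookup table. The sketch below is an illustrative reading of those ranges, not a validated assignment scheme; the 1,630-1,650 cm(-1) water window is my assumption around the quoted 1,640 cm(-1) peak:

```python
# Condensed from the wavenumber ranges quoted in the abstract (cm^-1)
BANDS = [
    ((1600, 1700), 'protein (amide I)'),
    ((1550, 1570), 'protein (amide II)'),
    ((1630, 1650), 'water (H-O-H bend)'),
    ((2800, 3000), 'fat, starch (C-H stretch)'),
    ((3000, 3600), 'water, starch (O-H stretch)'),
]

def assign_peak(wavenumber):
    """List candidate constituents for an absorbance peak (cm^-1).
    Overlapping bands (e.g. water inside the amide I region) return
    multiple candidates, which the analyst must disambiguate."""
    return [name for (lo, hi), name in BANDS if lo <= wavenumber <= hi]
```

This makes the overlap problem explicit: a peak near 1,640 cm(-1) is consistent with both water and amide I, so quantification needs more than a single band.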

  18. Fundamentals of quantitative dynamic contrast-enhanced MR imaging.

    PubMed

    Paldino, Michael J; Barboriak, Daniel P

    2009-05-01

    Quantitative analysis of dynamic contrast-enhanced MR imaging (DCE-MR imaging) has the power to provide information regarding physiologic characteristics of the microvasculature and is, therefore, of great potential value to the practice of oncology. In particular, these techniques could have a significant impact on the development of novel anticancer therapies as a promising biomarker of drug activity. Standardization of DCE-MR imaging acquisition and analysis to provide more reproducible measures of tumor vessel physiology is of crucial importance to realize this potential. The purpose of this article is to review the pathophysiologic basis and technical aspects of DCE-MR imaging techniques.

  19. Codex Alimentarius: food quality and safety standards for international trade.

    PubMed

    Randell, A W; Whitehead, A J

    1997-08-01

    Since 1962, the Codex Alimentarius Commission (CAC) of the Food and Agriculture Organisation/World Health Organisation has been responsible for developing standards, guidelines and other recommendations on the quality and safety of food to protect the health of consumers and to ensure fair practices in food trade. The mission of the CAC remains relevant, but a number of factors have shown the need for new techniques to form the basis of food standards, the most important of which is risk analysis. The authors give a brief description of the role and work of the CAC and the efforts deployed by the Commission to respond to the challenges posed by new approaches to government regulation, harmonisation of national requirements based on international standards and the role of civil society.

  20. Postmortem validation of breast density using dual-energy mammography

    PubMed Central

    Molloi, Sabee; Ducote, Justin L.; Ding, Huanjun; Feig, Stephen A.

    2014-01-01

    Purpose: Mammographic density has been shown to be an indicator of breast cancer risk and also reduces the sensitivity of screening mammography. Currently, there is no accepted standard for measuring breast density. Dual energy mammography has been proposed as a technique for accurate measurement of breast density. The purpose of this study is to validate its accuracy in postmortem breasts and compare it with other existing techniques. Methods: Forty postmortem breasts were imaged using a dual energy mammography system. Glandular and adipose equivalent phantoms of uniform thickness were used to calibrate a dual energy basis decomposition algorithm. Dual energy decomposition was applied after scatter correction to calculate breast density. Breast density was also estimated using radiologist reader assessment, standard histogram thresholding and a fuzzy C-mean algorithm. Chemical analysis was used as the reference standard to assess the accuracy of different techniques to measure breast composition. Results: Breast density measurements using radiologist reader assessment, standard histogram thresholding, fuzzy C-mean algorithm, and dual energy were in good agreement with the measured fibroglandular volume fraction using chemical analysis. The standard error estimates using radiologist reader assessment, standard histogram thresholding, fuzzy C-mean, and dual energy were 9.9%, 8.6%, 7.2%, and 4.7%, respectively. Conclusions: The results indicate that dual energy mammography can be used to accurately measure breast density. The variability in breast density estimation using dual energy mammography was lower than reader assessment rankings, standard histogram thresholding, and fuzzy C-mean algorithm. Improved quantification of breast density is expected to further enhance its utility as a risk factor for breast cancer. PMID:25086548
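
With calibration phantoms of known composition, the dual-energy basis decomposition reduces to solving a small linear system per pixel: two log-signals, two unknown tissue thicknesses. A schematic sketch with hypothetical calibration coefficients (not the study's values):

```python
import numpy as np

# Hypothetical effective attenuation (per cm) of glandular and adipose
# tissue at the low- and high-energy beams, from phantom calibration
A = np.array([[0.80, 0.45],    # low-energy beam
              [0.35, 0.25]])   # high-energy beam

def decompose(log_low, log_high):
    """Solve the 2x2 basis decomposition for tissue thicknesses (cm)
    and the fibroglandular volume fraction (the 'breast density')."""
    t_gland, t_adip = np.linalg.solve(A, [log_low, log_high])
    return t_gland, t_adip, t_gland / (t_gland + t_adip)

# Round-trip check: a column of 2 cm glandular + 3 cm adipose tissue
m_low, m_high = A @ [2.0, 3.0]
t_gland, t_adip, density = decompose(m_low, m_high)
```

Because the decomposition is solved per pixel, it yields a density map rather than the single global score produced by thresholding or reader assessment.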

  2. Baryon non-invariant couplings in Higgs effective field theory

    NASA Astrophysics Data System (ADS)

    Merlo, Luca; Saa, Sara; Sacristán-Barbero, Mario

    2017-03-01

    The basis of leading operators which are not invariant under baryon number is constructed within the Higgs effective field theory. This list contains 12 dimension six operators, which preserve the combination B-L, to be compared to only 6 operators for the standard model effective field theory. The discussion of the independent flavour contractions is presented in detail for a generic number of fermion families adopting the Hilbert series technique.

  3. Preconditioned MoM Solutions for Complex Planar Arrays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fasenfest, B J; Jackson, D; Champagne, N

    2004-01-23

    The numerical analysis of large arrays is a complex problem. There are several techniques currently under development in this area. One such technique is FAIM (Faster Adaptive Integral Method). This method uses a modification of the standard AIM approach which takes into account the reusability properties of matrices that arise from identical array elements. If the array consists of planar conducting bodies, the array elements are meshed using standard subdomain basis functions, such as the RWG basis. These bases are then projected onto a regular grid of interpolating polynomials. This grid can then be used in a 2D or 3D FFT to accelerate the matrix-vector product used in an iterative solver. The method has been proven to greatly reduce solve time by speeding the matrix-vector product computation. The FAIM approach also reduces fill time and memory requirements, since only the near element interactions need to be calculated exactly. The present work extends FAIM by modifying it to allow for layered-material Green's functions and dielectrics. In addition, a preconditioner is implemented to greatly reduce the number of iterations required for a solution. The general scheme of the FAIM method is reported elsewhere; this contribution is limited to presenting new results.
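
The FFT acceleration works because interactions between points on a regular grid depend only on their separation, so the grid interaction matrix is (block) Toeplitz and its matrix-vector product is a convolution. A 1-D sketch of that core trick, circulant embedding plus FFT, with an illustrative kernel rather than a layered-media Green's function:

```python
import numpy as np

def fft_matvec(kernel_col, x):
    """Multiply a symmetric Toeplitz matrix (first column kernel_col) by x
    via circulant embedding and the FFT: O(n log n) instead of O(n^2)."""
    n = len(x)
    # first column of the 2n x 2n circulant that embeds the Toeplitz matrix
    c = np.concatenate([kernel_col, [0.0], kernel_col[:0:-1]])
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(np.concatenate([x, np.zeros(n)])))
    return y[:n].real

n = 64
kernel = 1.0 / (1.0 + np.arange(n))                 # distance-dependent kernel
x = np.random.default_rng(1).standard_normal(n)

# Dense reference multiply for comparison
T = np.array([[kernel[abs(i - j)] for j in range(n)] for i in range(n)])
fast, dense = fft_matvec(kernel, x), T @ x
```

In AIM/FAIM the same idea runs in 2D or 3D inside each iteration of the solver, with only the near-field corrections computed exactly.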

  4. TECHNICAL BASIS FOR A CANDIDATE BUILDING MATERIALS RADIUM STANDARD

    EPA Science Inventory

    The report summarizes the technical basis for a candidate building materials radium standard. It contains the standard and a summary of the technical basis for the standard. (NOTE: The Florida Radon Research Program (FRRP), sponsored by the Environmental Protection Agency and the...

  5. Application of Multi-Criteria Decision Making (MCDM) Technique for Gradation of Jute Fibres

    NASA Astrophysics Data System (ADS)

    Choudhuri, P. K.

    2014-12-01

    Multi-Criteria Decision Making (MCDM) is a branch of Operations Research (OR) with a comparatively short history of about 40 years. It is popularly used in engineering, banking, policy making and similar fields, and can also be applied to decisions in daily life, such as selecting a car to purchase or selecting a bride or groom. Various MCDM methods, namely the Weighted Sum Model (WSM), Weighted Product Model (WPM), Analytic Hierarchy Process (AHP), Technique for Order Preference by Similarity to Ideal Solutions (TOPSIS) and Elimination and Choice Translating Reality (ELECTRE), are available to solve decision-making problems, each with its own limitations; it is therefore difficult to decide which MCDM method is best. MCDM methods are prospective quantitative approaches for solving decision problems involving a finite number of alternatives and criteria. Very few research works in textiles have been carried out with this technique, particularly where deciding among several alternatives on the basis of conflicting criteria is the major problem. Gradation of jute fibres on the basis of criteria such as strength, root content, defects, colour, density and fineness is an important task. The MCDM technique provides ample scope for grading jute fibres, or ranking several varieties with a particular objective in view, on the basis of selection criteria and their relative weightage. The present paper explores the application of the multiplicative AHP method to determine the quality values of selected jute fibres on the basis of the important criteria stated above and to rank them accordingly. Good agreement in ranking is observed between the existing Bureau of Indian Standards (BIS) grading and the proposed method.
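
At the heart of the multiplicative AHP is a weighted geometric aggregation of criterion scores. A toy sketch with hypothetical weights and jute-lot scores (all criteria oriented so that higher is better, hence 'low_defects'):

```python
# Hypothetical criterion weights (sum to 1) and per-lot scores on a 1-10 scale
weights = {'strength': 0.4, 'fineness': 0.3, 'colour': 0.2, 'low_defects': 0.1}

lots = {
    'lot_A': {'strength': 8.0, 'fineness': 6.0, 'colour': 7.0, 'low_defects': 9.0},
    'lot_B': {'strength': 6.0, 'fineness': 8.0, 'colour': 8.0, 'low_defects': 6.0},
    'lot_C': {'strength': 9.0, 'fineness': 5.0, 'colour': 6.0, 'low_defects': 7.0},
}

def wpm_score(scores):
    """Weighted product (multiplicative) aggregation of criterion scores."""
    s = 1.0
    for criterion, w in weights.items():
        s *= scores[criterion] ** w
    return s

ranking = sorted(lots, key=lambda name: wpm_score(lots[name]), reverse=True)
```

Because the aggregation is multiplicative, rescaling any single criterion multiplies every alternative's score by the same factor, so the ranking is unaffected by the units in which a criterion is measured.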

  6. Novel approach for tomographic reconstruction of gas concentration distributions in air: Use of smooth basis functions and simulated annealing

    NASA Astrophysics Data System (ADS)

    Drescher, A. C.; Gadgil, A. J.; Price, P. N.; Nazaroff, W. W.

    Optical remote sensing and iterative computed tomography (CT) can be applied to measure the spatial distribution of gaseous pollutant concentrations. We conducted chamber experiments to test this combination of techniques using an open path Fourier transform infrared spectrometer (OP-FTIR) and a standard algebraic reconstruction technique (ART). Although ART converged to solutions that showed excellent agreement with the measured ray-integral concentrations, the solutions were inconsistent with simultaneously gathered point-sample concentration measurements. A new CT method was developed that combines (1) the superposition of bivariate Gaussians to represent the concentration distribution and (2) a simulated annealing minimization routine to find the parameters of the Gaussian basis functions that result in the best fit to the ray-integral concentration data. This method, named smooth basis function minimization (SBFM), generated reconstructions that agreed well, both qualitatively and quantitatively, with the concentration profiles generated from point sampling. We present an analysis of two sets of experimental data that compares the performance of ART and SBFM. We conclude that SBFM is a superior CT reconstruction method for practical indoor and outdoor air monitoring applications.
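
The SBFM idea, parameterize the concentration field as a sum of smooth basis functions and let simulated annealing fit those parameters to the measured ray integrals, can be sketched in one dimension with a single Gaussian. This is a toy stand-in for the paper's bivariate-Gaussian, OP-FTIR setting; all numbers are synthetic:

```python
import math, random

def gauss(x, amp, mu, sig):
    return amp * math.exp(-0.5 * ((x - mu) / sig) ** 2)

def path_integral(params, a, b, steps=50):
    """Midpoint-rule integral of the concentration model along a ray [a, b]."""
    h = (b - a) / steps
    return sum(gauss(a + (i + 0.5) * h, *params) for i in range(steps)) * h

# Synthetic problem: one Gaussian plume observed by five overlapping paths
paths = [(0, 2), (1, 3), (2, 4), (0, 4), (3, 5)]
true_params = (5.0, 2.5, 0.6)                      # amplitude, centre, width
data = [path_integral(true_params, a, b) for a, b in paths]

def cost(p):
    return sum((path_integral(p, a, b) - d) ** 2
               for (a, b), d in zip(paths, data))

# Simulated annealing over the basis-function parameters
random.seed(0)
p = [1.0, 1.0, 1.0]
cp = cost(p)
best, best_c, T = list(p), cp, 1.0
for _ in range(4000):
    q = [max(0.05, v + random.gauss(0, 0.1)) for v in p]   # keep params > 0
    cq = cost(q)
    if cq < cp or random.random() < math.exp((cp - cq) / T):
        p, cp = q, cq
        if cp < best_c:
            best, best_c = list(p), cp
    T *= 0.999                                     # geometric cooling
```

In the full method the unknowns are the parameters of several bivariate Gaussians and the paths are the spectrometer's optical beam lines, but the structure of the fit is the same: minimize the mismatch between modelled and measured ray integrals.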

  7. Application of AIS Technology to Forest Mapping

    NASA Technical Reports Server (NTRS)

    Yool, S. R.; Star, J. L.

    1985-01-01

    Concerns about environmental effects of large scale deforestation have prompted efforts to map forests over large areas using various remote sensing data and image processing techniques. Basic research on the spectral characteristics of forest vegetation is required to form a basis for development of new techniques and for image interpretation. Examination of LANDSAT data and image processing algorithms over a portion of boreal forest has demonstrated the complexity of relations between the various expressions of forest canopies, environmental variability, and the relative capacities of different image processing algorithms to achieve high classification accuracies under these conditions. Airborne Imaging Spectrometer (AIS) data may in part provide the means to interpret the responses of standard data and techniques to the vegetation, based on its relatively high spectral resolution.

  8. Towards developing standard operating procedures for pre-clinical testing in the mdx mouse model of Duchenne muscular dystrophy

    PubMed Central

    Grounds, Miranda D.; Radley, Hannah G.; Lynch, Gordon S.; Nagaraju, Kanneboyina; De Luca, Annamaria

    2008-01-01

    This review discusses various issues to consider when developing standard operating procedures for pre-clinical studies in the mdx mouse model of Duchenne muscular dystrophy (DMD). The review describes and evaluates a wide range of techniques used to measure parameters of muscle pathology in mdx mice and identifies some basic techniques that might comprise standardised approaches for evaluation. While the central aim is to provide a basis for the development of standardised procedures to evaluate efficacy of a drug or a therapeutic strategy, a further aim is to gain insight into pathophysiological mechanisms in order to identify other therapeutic targets. The desired outcome is to enable easier and more rigorous comparison of pre-clinical data from different laboratories around the world, in order to accelerate identification of the best pre-clinical therapies in the mdx mouse that will fast-track translation into effective clinical treatments for DMD. PMID:18499465

  9. Scattering of cylindrical electric field waves from an elliptical dielectric cylindrical shell

    NASA Astrophysics Data System (ADS)

    Urbanik, E. A.

    1982-12-01

    This thesis examines the scattering of cylindrical waves by large dielectric scatterers of elliptic cross section. The solution method is the method of moments with a Galerkin approach. Sinusoidal basis and testing functions were used, resulting in a higher convergence rate. The higher rate of convergence made it possible for the program to run on the Aeronautical Systems Division's CYBER computers without any special storage methods. This report includes discussion of moment methods, the solution of integral equations, and the relationship between the electric field and the source region or self-cell singularity. Since the program produced unacceptable run times, no results are contained herein. The importance of this work is the evaluation of the practicality of moment methods using standard techniques. The long run times for a mid-sized scatterer demonstrate the impracticality of moment methods for dielectrics using standard techniques.

  10. Interpreting international governance standards for health IT use within general medical practice.

    PubMed

    Mahncke, Rachel J; Williams, Patricia A H

    2014-01-01

    General practices in Australia recognise the importance of comprehensive protective security measures. Some elements of information security governance are incorporated into recommended standards, however the governance component of information security is still insufficiently addressed in practice. The International Organization for Standardization (ISO) released a new global standard in May 2013 entitled ISO/IEC 27014:2013 Information technology - Security techniques - Governance of information security. This standard, applicable to organisations of all sizes, offers a framework against which to assess and implement the governance components of information security. The standard demonstrates the relationship between governance and the management of information security, provides strategic principles and processes, and forms the basis for establishing a positive information security culture. An interpretation of this standard for use in Australian general practice was performed. This work is unique, as such an interpretation for the Australian healthcare environment has not been undertaken before. It demonstrates an application of the standard at a strategic level to inform existing development of an information security governance framework.

  11. Training Challenges for the U.S. Army in the Pacific

    DTIC Science & Technology

    2013-03-01

    each other on a frequent enough basis to be thought of as "interoperable." All the parties involved have different standard operating procedures... All have different command and control technologies and all have different procedures to plan, prepare, coordinate and synchronize operations... does not fix it. Joint Task Force tactics, techniques and procedures are eventually developed, but they take time and amount to a band-aid as opposed

  12. Evaluation of liquefaction potential of soil based on standard penetration test using multi-gene genetic programming model

    NASA Astrophysics Data System (ADS)

    Muduli, Pradyut; Das, Sarat

    2014-06-01

    This paper discusses the evaluation of liquefaction potential of soil based on a standard penetration test (SPT) dataset using an evolutionary artificial intelligence technique, multi-gene genetic programming (MGGP). The liquefaction classification accuracy (94.19%) of the developed liquefaction index (LI) model is found to be better than that of the available artificial neural network (ANN) model (88.37%) and on par with the available support vector machine (SVM) model (94.19%) on the basis of the testing data. Further, an empirical equation is presented using MGGP to approximate the unknown limit state function representing the cyclic resistance ratio (CRR) of soil based on the developed LI model. Using an independent database of 227 cases, the overall rates of successful prediction of the occurrence of liquefaction and non-liquefaction are found to be 87, 86, and 84% for the developed MGGP-based model, the available ANN model, and the statistical model, respectively, on the basis of the calculated factor of safety (Fs) against liquefaction occurrence.
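The abstract classifies cases through a calculated factor of safety against liquefaction. Its working definition is not spelled out here; in standard practice it is the ratio of cyclic resistance ratio (CRR) to cyclic stress ratio (CSR), with liquefaction predicted when the ratio falls below 1. A minimal sketch of that classification step (the CRR would come from a model such as the paper's MGGP equation; the numeric values are illustrative, not from the paper):

```python
def factor_of_safety(crr, csr):
    """Factor of safety against liquefaction: Fs = CRR / CSR."""
    if csr <= 0:
        raise ValueError("cyclic stress ratio must be positive")
    return crr / csr

def classify(crr, csr):
    """Predict 'liquefaction' when Fs < 1, 'non-liquefaction' otherwise."""
    return "liquefaction" if factor_of_safety(crr, csr) < 1.0 else "non-liquefaction"
```

For example, `classify(0.18, 0.25)` flags a liquefiable case, since Fs = 0.72 < 1.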

  13. PIXE and XRF Analysis of Roman Denarii

    NASA Astrophysics Data System (ADS)

    Fasano, Cecilia; Raddell, Mark; Manukyan, Khachatur; Stech, Edward; Wiescher, Michael

    2017-09-01

    A set of Roman Denarii from the republican to the imperial period (140 BC to 240 AD) has been studied using X-ray fluorescence (XRF) scanning and proton-induced X-ray emission (PIXE) techniques. XRF and PIXE are commonly used in the study of cultural heritage objects because they are nondestructive. The combination of these two methods is also unique because of the ability to probe the sample over a broader range of depths and energies than either could achieve on its own. The coins span a large period of Roman history, and their analysis serves to follow the economic and political change of the era using the relative silver and copper contents in each sample. In addition to analyzing the samples, the study sought to compare these two common analysis techniques and to explore the use of a standard to examine any shortcomings in either of the methods. Data sets were compared and then adjusted to a calibration curve created from the analysis of a number of standard solutions. The concentrations of the standard solutions were confirmed using inductively coupled plasma spectroscopy. Through this we were able to assemble results which will advance the understanding of PIXE and XRF techniques as well as add to the knowledge of ancient Roman currency.

  14. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    Abstract New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  15. Ensuring safety of implanted devices under MRI using reversed RF polarization.

    PubMed

    Overall, William R; Pauly, John M; Stang, Pascal P; Scott, Greig C

    2010-09-01

    Patients with long-wire medical implants are currently prevented from undergoing magnetic resonance imaging (MRI) scans due to the risk of radio frequency (RF) heating. We have developed a simple technique for determining the heating potential of these implants using reversed RF polarization. This technique could be used on a patient-by-patient basis as part of the standard prescan procedure to ensure that the subject's device does not pose a heating risk. By using reversed quadrature polarization, the MR scan can be sensitized exclusively to the potentially dangerous currents in the device. Here, we derive the physical principles governing the technique and explore the primary sources of inaccuracy. These principles are verified through finite-difference simulations and through phantom scans of implant leads. These studies demonstrate the potential of the technique for sensitively detecting dangerous coupling conditions before they can do any harm. 2010 Wiley-Liss, Inc.

  16. Methods for assessing the quality of mammalian embryos: How far we are from the gold standard?

    PubMed

    Rocha, José C; Passalia, Felipe; Matos, Felipe D; Maserati, Marc P; Alves, Mayra F; Almeida, Tamie G de; Cardoso, Bruna L; Basso, Andrea C; Nogueira, Marcelo F G

    2016-08-01

    Morphological embryo classification is of great importance for many laboratory techniques, from basic research to those applied in assisted reproductive technology. However, the standard classification method for both human and cattle embryos is based on quality parameters that reflect the overall morphological quality of the embryo in cattle, or the quality of the individual embryonic structures, which is more relevant in human embryo classification. This assessment method is biased by the subjectivity of the evaluator, and even though several guidelines exist to standardize the classification, it is not a method capable of giving reliable and trustworthy results. The latest approaches to improving quality assessment include the use of data from cellular metabolism, a new morphological grading system, development kinetics and cleavage symmetry, embryo cell biopsy followed by pre-implantation genetic diagnosis, zona pellucida birefringence, ion release by the embryo cells, and so forth. There is now a great need for evaluation methods that are practical and non-invasive while being accurate and objective. A method along these lines would be of great importance to embryo evaluation by embryologists, clinicians and other professionals who work with assisted reproductive technology. Several techniques show promising results in this sense, one being the use of digital images of the embryo as the basis for feature extraction and classification by means of artificial intelligence techniques (such as genetic algorithms and artificial neural networks). This process has the potential to become an accurate and objective standard for embryo quality assessment.

  18. Straight-wire appliances: standard versus individual prescription.

    PubMed

    Farronato, Giampietro; Periti, Giulia; Giannini, Lucia; Farronato, Davide; Maspero, Cinzia

    2009-01-01

    In this article the individual patient (IP) appliance is described. It consists of 250 options of bracket and band variations, as in straight-wire appliances. Increasing the bracket capabilities means using an increasing number of brackets, each with a specific design created for a particular treatment situation. The objective of the IP appliance is to eliminate wire bending from orthodontic treatment and improve treatment results. To manage this technique, computer software is needed. The Internet offers significant possibilities for managing treatment on a patient-by-patient basis. The clinician is required to make the diagnosis and treatment plan before ordering the appliance. Two clinical cases are described to present the advantages of this technique.

  19. Architecture and Key Techniques of Augmented Reality Maintenance Guiding System for Civil Aircrafts

    NASA Astrophysics Data System (ADS)

    hong, Zhou; Wenhua, Lu

    2017-01-01

    Augmented reality technology is introduced into the maintenance field to strengthen information in real-world scenarios through the integration of virtual assistant maintenance information with real-world scenes. This can lower the difficulty of maintenance, reduce maintenance errors, and improve the maintenance efficiency and quality of civil aviation crews. The architecture of an augmented reality virtual maintenance guiding system is proposed on the basis of the definition of augmented reality and an analysis of the characteristics of augmented reality virtual maintenance. Key techniques involved, such as standardization and organization of maintenance data, 3D registration, modeling of maintenance guidance information, and virtual maintenance man-machine interaction, are elaborated, and solutions are given.

  20. Laser Propulsion Standardization Issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scharring, Stefan; Eckel, Hans-Albert; Roeser, Hans-Peter

    It is a relevant issue in the research on laser propulsion that experimental results are treated seriously and that meaningful scientific comparison is possible between groups using different equipment and measurement techniques. However, critical aspects of experimental measurements are sparsely addressed in the literature. In addition, few studies so far have the benefit of independent confirmation by other laser propulsion groups. In this paper, we recommend several approaches towards standardization of published laser propulsion experiments. Such standards are particularly important for the measurement of laser ablation pulse energy, laser spot area, imparted impulse or thrust, and mass removal during ablation. Related examples are presented from an actual scientific cooperation between NU and DLR. On the basis of a given standardization, researchers may better understand and contribute their findings more clearly in the future, and compare those findings confidently with those already published in the laser propulsion literature. Relevant ISO standards are analyzed, and revised formats are recommended for application to laser propulsion studies.

  1. Digital Spectral Analysis: A Guide Based on Experience with Aircraft Vibrations.

    DTIC Science & Technology

    1981-02-01

    possible in the 'standard' texts. In a number of respects, the present application was a severe test of the spectral techniques. The excitation of the... determined on the basis of experience. For example, when a lightly-damped structure is subjected to random excitation, the energy stored in the vibrations will... be far greater than the work done by the excitation in one cycle. The intensity of the response will tend to vary less than the intensity of the

  2. Sequential neural text compression.

    PubMed

    Schmidhuber, J; Heil, S

    1996-01-01

    The purpose of this paper is to show that neural networks may be promising tools for data compression without loss of information. We combine predictive neural nets and statistical coding techniques to compress text files. We apply our methods to certain short newspaper articles and obtain compression ratios exceeding those of the widely used Lempel-Ziv algorithms (which form the basis of the UNIX utilities "compress" and "gzip"). The main disadvantage of our methods is that they are about three orders of magnitude slower than standard methods.
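The combination described, a predictor feeding a statistical coder, can be sketched without the neural network: replace it with an adaptive symbol-frequency model and sum the ideal code length −log2 p that an arithmetic coder driven by those predictions would approach. This is only an illustration of the principle, not the authors' method (their predictor is a neural net):

```python
import math
from collections import Counter

def ideal_code_length(text):
    """Estimate compressed size (in bits) under an adaptive order-0 model.

    Each symbol is 'predicted' from the counts of all symbols seen so far
    (with add-one smoothing over a fixed alphabet); an arithmetic coder
    driven by these predictions would approach sum(-log2 p) bits.
    """
    alphabet = sorted(set(text))
    counts = Counter()
    bits = 0.0
    for ch in text:
        p = (counts[ch] + 1) / (sum(counts.values()) + len(alphabet))
        bits += -math.log2(p)
        counts[ch] += 1
    return bits
```

On highly repetitive text the estimate approaches the source entropy (about 1 bit per character for an alternating two-symbol string), far below the 8 bits per character of a raw byte encoding; a better predictor, such as the paper's neural net, pushes it lower still.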

  3. New Primary Standards for Establishing SI Traceability for Moisture Measurements in Solid Materials

    NASA Astrophysics Data System (ADS)

    Heinonen, M.; Bell, S.; Choi, B. Il; Cortellessa, G.; Fernicola, V.; Georgin, E.; Hudoklin, D.; Ionescu, G. V.; Ismail, N.; Keawprasert, T.; Krasheninina, M.; Aro, R.; Nielsen, J.; Oğuz Aytekin, S.; Österberg, P.; Skabar, J.; Strnad, R.

    2018-01-01

    A European research project METefnet addresses a fundamental obstacle to improving energy-intensive drying process control: due to ambiguous reference analysis methods and insufficient methods for estimating uncertainty in moisture measurements, the achievable accuracy in the past was limited and measurement uncertainties were largely unknown. This paper reports the developments in METefnet that provide a sound basis for the SI traceability: four new primary standards for realizing the water mass fraction were set up, analyzed and compared to each other. The operation of these standards is based on combining sample weighing with different water vapor detection techniques: cold trap, chilled mirror, electrolytic and coulometric Karl Fischer titration. The results show that an equivalence of 0.2 % has been achieved between the water mass fraction realizations and that the developed methods are applicable to a wide range of materials.

  4. Technique for simulating peak-flow hydrographs in Maryland

    USGS Publications Warehouse

    Dillow, Jonathan J.A.

    1998-01-01

    The efficient design and management of many bridges, culverts, embankments, and flood-protection structures may require the estimation of time-of-inundation and (or) storage of floodwater relating to such structures. These estimates can be made on the basis of information derived from the peak-flow hydrograph. Average peak-flow hydrographs corresponding to a peak discharge of specific recurrence interval can be simulated for drainage basins having drainage areas less than 500 square miles in Maryland, using a direct technique of known accuracy. The technique uses dimensionless hydrographs in conjunction with estimates of basin lagtime and instantaneous peak flow. Ordinary least-squares regression analysis was used to develop an equation for estimating basin lagtime in Maryland. Drainage area, main channel slope, forest cover, and impervious area were determined to be the significant explanatory variables necessary to estimate average basin lagtime at the 95-percent confidence interval. Qualitative variables included in the equation adequately correct for geographic bias across the State. The average standard error of prediction associated with the equation is approximated as plus or minus (+/-) 37.6 percent. Volume correction factors may be applied to the basin lagtime on the basis of a comparison between actual and estimated hydrograph volumes prior to hydrograph simulation. Three dimensionless hydrographs were developed and tested using data collected during 278 significant rainfall-runoff events at 81 stream-gaging stations distributed throughout Maryland and Delaware. The data represent a range of drainage area sizes and basin conditions. The technique was verified by applying it to the simulation of 20 peak-flow events and comparing actual and simulated hydrograph widths at 50 and 75 percent of the observed peak-flow levels. The events chosen are considered extreme in that the average recurrence interval of the selected peak flows is 130 years. 
The average standard errors of prediction were +/- 61 and +/- 56 percent at the 50 and 75 percent of peak-flow hydrograph widths, respectively.
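The simulation step the record describes, scaling a dimensionless hydrograph by an estimated peak flow and basin lagtime, can be sketched in a few lines. The ordinates below are illustrative placeholders, not the published Maryland curves, and the units are whatever consistent units the estimates carry:

```python
# Dimensionless ordinates as (t / lagtime, Q / Qpeak) pairs; these are
# placeholder values, not the dimensionless hydrographs from the report.
DIMENSIONLESS = [(0.0, 0.0), (0.5, 0.4), (1.0, 1.0), (1.5, 0.6), (2.0, 0.3), (3.0, 0.1)]

def simulate_hydrograph(q_peak, lagtime_hours, ordinates=DIMENSIONLESS):
    """Scale dimensionless ordinates into (time [h], discharge) pairs."""
    return [(t_ratio * lagtime_hours, q_ratio * q_peak)
            for t_ratio, q_ratio in ordinates]
```

With an estimated peak flow of 500 and a 4-hour lagtime, the peak ordinate (1.0, 1.0) maps to a discharge of 500 at t = 4 h; hydrograph widths at 50 and 75 percent of the peak follow directly from the scaled pairs.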

  5. In vitro study comparing fracture strength recovery of teeth restored with three esthetic bonding materials using different techniques.

    PubMed

    Rajput, Akhil; Ataide, Ida; Lambor, Rajan; Monteiro, Jeanne; Tar, Malika; Wadhawan, Neeraj

    2010-01-01

    Reattachment of the fractured fragment of a traumatized tooth (whenever available and usable) has become the treatment of choice in cases of uncomplicated crown fractures. Despite the presence of various bonding materials and techniques, laboratory data evaluating the biomechanical aspects of such procedures are largely lacking in the literature. The objective of this in vitro study was to evaluate the fracture strength recovery of incisors following fragment reattachment with three different techniques. A total of 90 extracted human maxillary central incisors were subjected to crown fracture under standard conditions. This was carried out by applying a compressive force to the buccal aspect of the clinical crown using a universal strength testing machine. The fractured teeth were equally distributed in three groups, defined on the basis of the technique used for reattachment: i) overcontour, ii) internal dentinal groove and iii) direct buildup. Each group was further subdivided into three subgroups on the basis of the intermediate restorative material used for reattachment, namely: i) hybrid composite (Filtek Z100 Universal Restorative), ii) nanocomposite (Filtek Z350) and iii) Ormocer (Voco Admira). Following reattachment, the crowns were re-fractured under standard conditions. The force required for fracture was recorded and expressed as a percentage of the fracture strength of the intact tooth. The data were analyzed using two-way ANOVA and Bonferroni tests for pair-wise comparison. The results showed no statistically significant differences in fracture strength between the three groups (P > 0.05). However, comparison of the subgroups revealed statistically significantly higher strength recovery percentages for the hybrid and the nanocomposite compared with the Ormocer material (P < 0.05). 
It was concluded that material properties have a significant influence on the success of reattachment procedures.

  6. Ab initio structural and spectroscopic study of HPS^x and HSP^x (x = 0, +1, −1) in the gas phase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yaghlane, Saida Ben; Cotton, C. Eric; Francisco, Joseph S., E-mail: francisc@purdue.edu, E-mail: hochlaf@univ-mlv.fr

    2013-11-07

    Accurate ab initio computations of structural and spectroscopic parameters for the HPS/HSP molecules and the corresponding cations and anions have been performed. For the electronic structure computations, standard and explicitly correlated coupled cluster techniques in conjunction with large basis sets have been adopted. In particular, we present equilibrium geometries, rotational constants, harmonic vibrational frequencies, adiabatic ionization energies, electron affinities, and, for the neutral species, singlet-triplet relative energies. Besides, the full-dimensional potential energy surfaces (PESs) for the HPS^x and HSP^x (x = −1, 0, +1) systems have been generated at the standard coupled cluster level with a basis set of augmented quintuple-zeta quality. By applying perturbation theory to the calculated PESs, an extended set of spectroscopic constants, including τ, first-order centrifugal distortion and anharmonic vibrational constants, has been obtained. In addition, the potentials have been used in a variational approach to deduce the whole pattern of vibrational levels up to 4000 cm^-1 above the minima of the corresponding PESs.

  7. Towards the methodological optimization of the moss bag technique in terms of contaminants concentrations and replicability values

    NASA Astrophysics Data System (ADS)

    Ares, A.; Fernández, J. A.; Carballeira, A.; Aboal, J. R.

    2014-09-01

    The moss bag technique is a simple and economical environmental monitoring tool used to monitor air quality. However, routine use of the method is not possible because the protocols involved have not yet been standardized. Some of the most variable methodological aspects include (i) selection of moss species, (ii) ratio of moss weight to surface area of the bag, (iii) duration of exposure, and (iv) height of exposure. In the present study, the best option for each of these aspects was selected on the basis of the mean concentrations and data replicability of Cd, Cu, Hg, Pb and Zn measured during at least two exposure periods in environments affected by different degrees of contamination. The optimal choices for the studied aspects were the following: (i) Sphagnum denticulatum, (ii) 5.68 mg of moss tissue per cm2 of bag surface, (iii) 8 weeks of exposure, and (iv) 4 m height of exposure. Duration of exposure and height of exposure accounted for most of the variability in the data. The aim of this methodological study was to provide data to help establish a standardized protocol that will enable use of the moss bag technique by public authorities.
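The selected loading of 5.68 mg of moss tissue per cm2 of bag surface fixes how much material to prepare for a bag of known area; a trivial helper (the default ratio is simply the study's selected value):

```python
def moss_mass_mg(bag_area_cm2, ratio_mg_per_cm2=5.68):
    """Moss tissue (mg) needed for a bag of the given surface area,
    using the loading selected in the study (5.68 mg per cm^2)."""
    return bag_area_cm2 * ratio_mg_per_cm2
```

A 100 cm2 bag, for instance, would call for about 568 mg of moss.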

  8. Theoretical basis of the DOE-2 building energy use analysis program

    NASA Astrophysics Data System (ADS)

    Curtis, R. B.

    1981-04-01

    A user-oriented, public-domain computer program was developed that enables architects and engineers to perform design and retrofit studies of the energy use of buildings under realistic weather conditions. DOE-2.1A has been named by the US DOE as the standard evaluation technique for the Congressionally mandated building energy performance standards (BEPS). A number of program design decisions were made that determine the breadth of applicability of DOE-2.1. Such design decisions are intrinsic to all building energy use analysis computer programs and determine the types of buildings or the kinds of HVAC systems that can be modeled. In particular, the weighting factor method used in DOE-2 has both advantages and disadvantages relative to other computer programs.

  9. Hybrid and Constrained Resolution-of-Identity Techniques for Coulomb Integrals.

    PubMed

    Duchemin, Ivan; Li, Jing; Blase, Xavier

    2017-03-14

    The introduction of auxiliary bases to approximate molecular orbital products has paved the way to significant savings in the evaluation of four-center two-electron Coulomb integrals. We present a generalized dual space strategy that sheds new light on variants of the standard density and Coulomb-fitting schemes, including the possibility of introducing minimization constraints. In particular, we improve the charge- or multipole-preserving strategies introduced by Baerends and Van Alsenoy, respectively, which we compare to a simple scheme where the Coulomb metric is used for the lowest angular momentum auxiliary orbitals only. We explore the merits of these approaches on the basis of extensive Hartree-Fock and MP2 calculations over a standard set of medium-size molecules.
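The fitting idea behind resolution-of-identity can be illustrated on a toy discretized problem: expand each orbital-pair product in an auxiliary basis by least squares, then evaluate pair-pair integrals through the expansions. This sketch is a stand-in, not the paper's scheme: the "orbitals" are arbitrary vectors on a tiny grid, and a plain overlap metric replaces the Coulomb metric. When the auxiliary set spans the products exactly, the fitted integrals reproduce the direct ones:

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def fit_coefficients(rho, aux):
    """Least-squares coefficients c with rho ~ sum_P c_P * aux_P."""
    S = [[dot(p, q) for q in aux] for p in aux]   # auxiliary metric
    t = [dot(p, rho) for p in aux]
    return solve(S, t)

def fitted_integral(rho1, rho2, aux):
    """(rho1|rho2) evaluated through the auxiliary expansions."""
    c1, c2 = fit_coefficients(rho1, aux), fit_coefficients(rho2, aux)
    S = [[dot(p, q) for q in aux] for p in aux]
    n = len(aux)
    return sum(c1[P] * S[P][Q] * c2[Q] for P in range(n) for Q in range(n))
```

With an auxiliary set equal to the distinct products the fit is exact; a smaller auxiliary set trades accuracy for the cost savings the paper discusses, and swapping the overlap products for Coulomb-metric ones gives the standard Coulomb-fitting variant.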

  10. Stable isotope dilution analysis of hydrologic samples by inductively coupled plasma mass spectrometry

    USGS Publications Warehouse

    Garbarino, John R.; Taylor, Howard E.

    1987-01-01

    Inductively coupled plasma mass spectrometry is employed in the determination of Ni, Cu, Sr, Cd, Ba, Ti, and Pb in nonsaline, natural water samples by stable isotope dilution analysis. Hydrologic samples were directly analyzed without any unusual pretreatment. Interference effects related to overlapping isobars, formation of metal oxide and multiply charged ions, and matrix composition were identified and suitable methods of correction evaluated. A comparability study showed that single-element isotope dilution analysis was only marginally better than sequential multielement isotope dilution analysis. Accuracy and precision of the single-element method were determined on the basis of results obtained for standard reference materials. The instrumental technique was shown to be ideally suited for programs associated with certification of standard reference materials.
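The working equation of isotope dilution is not given in the abstract. In the standard formulation, with a measured blend ratio R of isotope a to isotope b, sample abundances (Ax, Bx), spike abundances (As, Bs) and a known spike amount Ns, the analyte amount is Nx = Ns(As − R·Bs)/(R·Bx − Ax). A sketch of that relation (variable names are illustrative, not from the paper):

```python
def isotope_dilution_amount(n_spike, r_measured, ab_sample, ab_spike):
    """Amount of analyte from the standard isotope-dilution relation.

    ab_sample, ab_spike: (abundance of isotope a, abundance of isotope b)
    r_measured: measured ratio of isotope a to isotope b in the blend.
    """
    ax, bx = ab_sample
    asp, bsp = ab_spike
    return n_spike * (asp - r_measured * bsp) / (r_measured * bx - ax)
```

A quick consistency check: blend 2 units of analyte with 1 unit of an isotopically enriched spike, form the resulting a/b ratio, and back-computing through the relation recovers the 2 units.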

  11. RENEB - Running the European Network of biological dosimetry and physical retrospective dosimetry.

    PubMed

    Kulka, Ulrike; Abend, Michael; Ainsbury, Elizabeth; Badie, Christophe; Barquinero, Joan Francesc; Barrios, Lleonard; Beinke, Christina; Bortolin, Emanuela; Cucu, Alexandra; De Amicis, Andrea; Domínguez, Inmaculada; Fattibene, Paola; Frøvig, Anne Marie; Gregoire, Eric; Guogyte, Kamile; Hadjidekova, Valeria; Jaworska, Alicja; Kriehuber, Ralf; Lindholm, Carita; Lloyd, David; Lumniczky, Katalin; Lyng, Fiona; Meschini, Roberta; Mörtl, Simone; Della Monaca, Sara; Monteiro Gil, Octávia; Montoro, Alegria; Moquet, Jayne; Moreno, Mercedes; Oestreicher, Ursula; Palitti, Fabrizio; Pantelias, Gabriel; Patrono, Clarice; Piqueret-Stephan, Laure; Port, Matthias; Prieto, María Jesus; Quintens, Roel; Ricoul, Michelle; Romm, Horst; Roy, Laurence; Sáfrány, Géza; Sabatier, Laure; Sebastià, Natividad; Sommer, Sylwester; Terzoudi, Georgia; Testa, Antonella; Thierens, Hubert; Turai, Istvan; Trompier, François; Valente, Marco; Vaz, Pedro; Voisin, Philippe; Vral, Anne; Woda, Clemens; Zafiropoulos, Demetre; Wojcik, Andrzej

    2017-01-01

    A European network was initiated in 2012 by 23 partners from 16 European countries with the aim to significantly increase individualized dose reconstruction in case of large-scale radiological emergency scenarios. The network was built on three complementary pillars: (1) an operational basis with seven biological and physical dosimetric assays in ready-to-use mode, (2) a basis for education, training and quality assurance, and (3) a basis for further network development regarding new techniques and members. Techniques for individual dose estimation based on biological samples and/or inert personalized devices as mobile phones or smart phones were optimized to support rapid categorization of many potential victims according to the received dose to the blood or personal devices. Communication and cross-border collaboration were also standardized. To assure long-term sustainability of the network, cooperation with national and international emergency preparedness organizations was initiated and links to radiation protection and research platforms have been developed. A legal framework, based on a Memorandum of Understanding, was established and signed by 27 organizations by the end of 2015. RENEB is a European Network of biological and physical-retrospective dosimetry, with the capacity and capability to perform large-scale rapid individualized dose estimation. Specialized to handle large numbers of samples, RENEB is able to contribute to radiological emergency preparedness and wider large-scale research projects.

  12. Yielding physically-interpretable emulators - A Sparse PCA approach

    NASA Astrophysics Data System (ADS)

    Galelli, S.; Alsahaf, A.; Giuliani, M.; Castelletti, A.

    2015-12-01

    Projection-based techniques, such as Proper Orthogonal Decomposition (POD), are a common approach to surrogating high-fidelity process-based models with lower-order dynamic emulators. With POD, the dimensionality reduction is achieved by using observations, or 'snapshots', generated with the high-fidelity model, to project the entire set of input and state variables of this model onto a smaller set of basis functions that account for most of the variability in the data. While the reduction efficiency and variance control of POD techniques are usually very high, the resulting emulators are structurally highly complex and can hardly be given a physically meaningful interpretation, as each basis function is a projection of the entire set of inputs and states. In this work, we propose a novel approach based on Sparse Principal Component Analysis (SPCA) that combines the several assets of POD methods with the potential for ex-post interpretation of the emulator structure. SPCA reduces the number of non-zero coefficients in the basis functions by identifying a sparse matrix of coefficients. While the resulting set of basis functions may retain less variance of the snapshots, the presence of a few non-zero coefficients assists in the interpretation of the underlying physical processes. The SPCA approach is tested on the reduction of a 1D hydro-ecological model (DYRESM-CAEDYM) used to describe the main ecological and hydrodynamic processes in Tono Dam, Japan. An experimental comparison against a standard POD approach shows that SPCA achieves the same accuracy in emulating a given output variable, for the same level of dimensionality reduction, while yielding better insights into the main process dynamics.
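The contrast between dense and sparse basis functions can be caricatured in a few lines: extract the leading principal component of a snapshot matrix by power iteration, then impose sparsity by zeroing small loadings. Real SPCA solves a penalized optimization rather than hard-thresholding a dense component, so this stdlib-only sketch only illustrates why sparse loadings are easier to interpret:

```python
import math

def leading_pc(X, iters=200):
    """Leading principal component of snapshot rows X, by power iteration
    on the sample covariance matrix (dense loadings, as in POD)."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    Xc = [[row[j] - means[j] for j in range(d)] for row in X]
    C = [[sum(Xc[i][a] * Xc[i][b] for i in range(n)) / n
          for b in range(d)] for a in range(d)]
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

def sparsify(v, threshold):
    """Zero small loadings and renormalize: a crude stand-in for the
    sparsity constraint that SPCA imposes during the decomposition."""
    w = [x if abs(x) >= threshold else 0.0 for x in v]
    norm = math.sqrt(sum(x * x for x in w)) or 1.0
    return [x / norm for x in w]
```

On snapshots where two variables co-vary strongly and a third barely moves, the dense component assigns (small) weight to all three, while the sparsified component names only the two that matter, which is the interpretability gain the paper pursues.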

  13. Anatomic Double-Bundle Anterior Cruciate Ligament Reconstruction With a Free Quadriceps Tendon Autograft.

    PubMed

    Caterev, Sergiu; Nistor, Dan Viorel; Todor, Adrian

    2016-10-01

    Anatomic double-bundle anterior cruciate ligament (ACL) reconstruction aims to restore the 2 functional bundles of the ACL in an attempt to better reproduce the native biomechanics of the injured knee and promote long-term knee health. However, this concept is not fully accepted and is not performed on a standard basis. In addition, the superiority of this technique over the conventional single-bundle technique has been questioned, especially regarding the long-term clinical results. One of the downsides of the double-bundle reconstruction is the complexity of the procedure, with increased risks, operative time, and costs compared with the single-bundle procedure. Also, the revision procedure, if necessary, is more challenging. We propose a technique that has some advantages over the traditional double-bundle procedure, using a single femoral tunnel, 2 tibial tunnels, and a free quadriceps tendon autograft.

  14. Ion mobility spectrometry as a detector for molecular imprinted polymer separation and metronidazole determination in pharmaceutical and human serum samples.

    PubMed

    Jafari, M T; Rezaei, B; Zaker, B

    2009-05-01

    Application of ion mobility spectrometry (IMS) as the detection technique for a separation method based on a molecularly imprinted polymer (MIP) was investigated and evaluated for the first time. On the basis of the results obtained in this work, the MIP-IMS system can be used as a powerful technique for separation, preconcentration, and detection of the drug metronidazole in pharmaceutical and human serum samples. The method was exhaustively validated in terms of sensitivity, selectivity, recovery, reproducibility, and column capacity. A linear dynamic range of 0.05-70.00 µg/mL was obtained for the determination of metronidazole with IMS. The recovery of the analyzed drug was calculated to be above 89%, and the relative standard deviation (RSD) was lower than 6% for all experiments. Various real samples were analyzed with the coupled techniques, and the results obtained revealed the efficient cleanup of the samples using MIP separation before analysis by IMS as a detection technique.

  15. Security of Color Image Data Designed by Public-Key Cryptosystem Associated with 2D-DWT

    NASA Astrophysics Data System (ADS)

    Mishra, D. C.; Sharma, R. K.; Kumar, Manish; Kumar, Kuldeep

    2014-08-01

    The security of image data is a major issue in present times, so we have proposed a novel technique for securing color image data with a public-key (asymmetric) cryptosystem. In this technique, we secure color image data using the RSA (Rivest-Shamir-Adleman) cryptosystem combined with the two-dimensional discrete wavelet transform (2D-DWT). Earlier schemes for the security of color images were designed on the basis of keys alone, but this approach provides security with the help of keys together with the correct arrangement of the RSA parameters. If an attacker knows the exact keys but has no information about the exact arrangement of the RSA parameters, the original information cannot be recovered from the encrypted data. Computer simulation based on a standard example critically examines the behavior of the proposed technique. A security analysis and a detailed comparison between earlier schemes for the security of color images and the proposed technique are also presented to demonstrate the robustness of the cryptosystem.
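
    The RSA half of such a scheme can be sketched with a textbook-sized example (tiny primes for illustration only; the paper applies the cryptosystem to 2D-DWT coefficients rather than to a raw pixel value as done here):

    ```python
    # Toy RSA key generation and round trip on a single pixel value.
    # Tiny primes for illustration only; real keys use ~1024+ bit primes.
    p, q = 61, 53
    n = p * q                  # modulus, part of both keys
    phi = (p - 1) * (q - 1)    # Euler's totient of n
    e = 17                     # public exponent, coprime to phi
    d = pow(e, -1, phi)        # private exponent: modular inverse of e

    pixel = 200                # a sample 8-bit color channel value
    cipher = pow(pixel, e, n)  # encrypt: c = m^e mod n
    plain = pow(cipher, d, n)  # decrypt: m = c^d mod n
    print(plain)  # → 200
    ```

    Without the correct pairing of (e, d, n), i.e. the "arrangement of RSA parameters" the abstract mentions, the decryption step does not recover the original value.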

  16. Strong stabilization servo controller with optimization of performance criteria.

    PubMed

    Sarjaš, Andrej; Svečko, Rajko; Chowdhury, Amor

    2011-07-01

    Synthesis of a simple robust controller with a pole placement technique and an H∞ metric is the method used for control of a servo mechanism with BLDC and BDC electric motors. The method includes solving a polynomial equation on the basis of the chosen characteristic polynomial using the Manabe standard polynomial form and parametric solutions. Parametric solutions are introduced directly into the structure of the servo controller. On the basis of the chosen parametric solutions, the robustness of the closed-loop system is assessed through uncertainty models and assessment of the norm ‖·‖∞. The design procedure and the optimization are performed with a genetic algorithm, differential evolution (DE). The DE optimization method determines a suboptimal solution throughout the optimization on the basis of a spectrally square polynomial and Šiljak's absolute stability test. The stability of the designed controller during the optimization is checked with Lipatov's stability condition. Both approaches, Šiljak's test and Lipatov's condition, check the robustness and stability characteristics on the basis of the polynomial's coefficients, and are very convenient for automated design of closed-loop control and for application in optimization algorithms such as DE. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
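
    Differential evolution itself is available off the shelf; a minimal SciPy sketch, using the Rosenbrock function as a stand-in for the paper's closed-loop performance criterion (which is not reproduced in the abstract):

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution, rosen

    # Each candidate vector would encode controller parameters; here the
    # Rosenbrock function stands in for the closed-loop cost. The stability
    # checks described in the abstract (Šiljak's test, Lipatov's condition)
    # would be added as penalties or constraints on each candidate.
    bounds = [(-2.0, 2.0)] * 3
    result = differential_evolution(rosen, bounds, seed=1)
    print(result.x)  # near the optimum [1, 1, 1]
    ```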

  17. 7 CFR 810.1203 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD ADMINISTRATION (FEDERAL GRAIN INSPECTION SERVICE), DEPARTMENT OF AGRICULTURE OFFICIAL UNITED STATES STANDARDS FOR GRAIN United States Standards for Rye Principles Governing the Application of Standards § 810.1203 Basis...

  18. Standardization of uveitis nomenclature for reporting clinical data. Results of the First International Workshop.

    PubMed

    Jabs, Douglas A; Nussenblatt, Robert B; Rosenbaum, James T

    2005-09-01

    To begin a process of standardizing the methods for reporting clinical data in the field of uveitis. Consensus workshop. Members of an international working group were surveyed about diagnostic terminology, inflammation grading schema, and outcome measures, and the results were used to develop a series of proposals to better standardize the use of these entities. Small groups employed nominal group techniques to achieve consensus on several of these issues. The group affirmed that an anatomic classification of uveitis should be used as a framework for subsequent work on diagnostic criteria for specific uveitic syndromes, and that the classification of uveitis entities should be on the basis of the location of the inflammation and not on the presence of structural complications. Issues regarding the use of the terms "intermediate uveitis," "pars planitis," "panuveitis," and descriptors of the onset and course of the uveitis were addressed. The following were adopted: standardized grading schema for anterior chamber cells, anterior chamber flare, and vitreous haze; standardized methods of recording structural complications of uveitis; standardized definitions of outcomes, including "inactive" inflammation, "improvement" and "worsening" of the inflammation, and "corticosteroid sparing"; and standardized guidelines for reporting visual acuity outcomes. A process of standardizing the approach to reporting clinical data in uveitis research has begun, and several terms have been standardized.

  19. 7 CFR 810.403 - Basis of determination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... GRAIN United States Standards for Corn Principles Governing the Application of Standards § 810.403 Basis of determination. Each determination of class, damaged kernels, heat-damaged kernels, waxy corn, flint corn, and flint and dent corn is made on the basis of the grain after the removal of the broken...

  20. 7 CFR 810.403 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... GRAIN United States Standards for Corn Principles Governing the Application of Standards § 810.403 Basis of determination. Each determination of class, damaged kernels, heat-damaged kernels, waxy corn, flint corn, and flint and dent corn is made on the basis of the grain after the removal of the broken...

  1. Compressive auto-indexing in femtosecond nanocrystallography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maia, Filipe; Yang, Chao; Marchesini, Stefano

    2010-09-20

    Ultrafast nanocrystallography has the potential to revolutionize biology by enabling structural elucidation of proteins for which it is possible to grow crystals with 10 or fewer unit cells. The success of nanocrystallography depends on robust orientation-determination procedures that allow us to average diffraction data from multiple nanocrystals to produce a 3D diffraction data volume with a high signal-to-noise ratio. Such a 3D diffraction volume can then be phased using standard crystallographic techniques. "Indexing" algorithms used in crystallography enable orientation determination of diffraction data from a single crystal when a relatively large number of reflections are recorded. Here we show that it is possible to obtain the exact lattice geometry from a smaller number of measurements than standard approaches require, using a basis pursuit solver.
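
    The basis pursuit step, minimizing the ℓ1 norm subject to linear measurement constraints, can be sketched as a linear program with SciPy (the measurement matrix below is a random stand-in, not actual diffraction data):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.RandomState(0)
    m, n_vars = 10, 30
    A = rng.randn(m, n_vars)          # underdetermined measurement matrix
    x_true = np.zeros(n_vars)
    x_true[[3, 17]] = [2.0, -1.5]     # sparse signal to recover
    b = A @ x_true

    # Basis pursuit: min ||x||_1  s.t.  A x = b,
    # recast as an LP via the split x = xp - xn with xp, xn >= 0.
    c = np.ones(2 * n_vars)
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n_vars))
    x_hat = res.x[:n_vars] - res.x[n_vars:]
    # For sufficiently sparse signals, the l1 minimizer typically
    # coincides with x_true even though the system is underdetermined.
    print(res.success)
    ```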

  2. A Mobile Early Stimulation Program to Support Children with Developmental Delays in Brazil.

    PubMed

    Dias, Raquel da Luz; Silva, Kátia Cristina Correa Guimarães; Lima, Marcela Raquel de Oliveira; Alves, João Guilherme Bezerra; Abidi, Syed Sibte Raza

    2018-01-01

    Developmental delay is a deviation from normative developmental milestones during childhood and may be caused by neurological disorders. Early stimulation is a standardized and simple technique to treat developmental delays in children (aged 0-3 years), allowing them to reach the best development possible and mitigating neuropsychomotor sequelae. However, the outcomes of the treatment depend on the involvement of the family in continuing the activities at home on a daily basis. To empower and educate parents of children with neurodevelopmental delays to administer standardized early stimulation programs at home, we developed a mobile early stimulation program that provides timely and evidence-based clinical decision support to health professionals and personalized guidance to parents on how to administer early stimulation to their child at home.

  3. Optically sectioned in vivo imaging with speckle illumination HiLo microscopy

    PubMed Central

    Lim, Daryl; Ford, Tim N.; Chu, Kengyeh K.; Mertz, Jerome

    2011-01-01

    We present a simple wide-field imaging technique, called HiLo microscopy, that is capable of producing optically sectioned images in real time, comparable in quality to confocal laser scanning microscopy. The technique is based on the fusion of two raw images, one acquired with speckle illumination and another with standard uniform illumination. The fusion can be numerically adjusted, using a single parameter, to produce optically sectioned images of varying thicknesses with the same raw data. Direct comparison between our HiLo microscope and a commercial confocal laser scanning microscope is made on the basis of sectioning strength and imaging performance. Specifically, we show that HiLo and confocal 3-D imaging of a GFP-labeled mouse brain hippocampus are comparable in quality. Moreover, HiLo microscopy is capable of faster, near video rate imaging over larger fields of view than attainable with standard confocal microscopes. The goal of this paper is to advertise the simplicity, robustness, and versatility of HiLo microscopy, which we highlight with in vivo imaging of common model organisms including planaria, C. elegans, and zebrafish. PMID:21280920

  4. Optically sectioned in vivo imaging with speckle illumination HiLo microscopy.

    PubMed

    Lim, Daryl; Ford, Tim N; Chu, Kengyeh K; Mertz, Jerome

    2011-01-01

    We present a simple wide-field imaging technique, called HiLo microscopy, that is capable of producing optically sectioned images in real time, comparable in quality to confocal laser scanning microscopy. The technique is based on the fusion of two raw images, one acquired with speckle illumination and another with standard uniform illumination. The fusion can be numerically adjusted, using a single parameter, to produce optically sectioned images of varying thicknesses with the same raw data. Direct comparison between our HiLo microscope and a commercial confocal laser scanning microscope is made on the basis of sectioning strength and imaging performance. Specifically, we show that HiLo and confocal 3-D imaging of a GFP-labeled mouse brain hippocampus are comparable in quality. Moreover, HiLo microscopy is capable of faster, near video rate imaging over larger fields of view than attainable with standard confocal microscopes. The goal of this paper is to advertise the simplicity, robustness, and versatility of HiLo microscopy, which we highlight with in vivo imaging of common model organisms including planaria, C. elegans, and zebrafish.

  5. Optically sectioned in vivo imaging with speckle illumination HiLo microscopy

    NASA Astrophysics Data System (ADS)

    Lim, Daryl; Ford, Tim N.; Chu, Kengyeh K.; Mertz, Jerome

    2011-01-01

    We present a simple wide-field imaging technique, called HiLo microscopy, that is capable of producing optically sectioned images in real time, comparable in quality to confocal laser scanning microscopy. The technique is based on the fusion of two raw images, one acquired with speckle illumination and another with standard uniform illumination. The fusion can be numerically adjusted, using a single parameter, to produce optically sectioned images of varying thicknesses with the same raw data. Direct comparison between our HiLo microscope and a commercial confocal laser scanning microscope is made on the basis of sectioning strength and imaging performance. Specifically, we show that HiLo and confocal 3-D imaging of a GFP-labeled mouse brain hippocampus are comparable in quality. Moreover, HiLo microscopy is capable of faster, near video rate imaging over larger fields of view than attainable with standard confocal microscopes. The goal of this paper is to advertise the simplicity, robustness, and versatility of HiLo microscopy, which we highlight with in vivo imaging of common model organisms including planaria, C. elegans, and zebrafish.
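
    A rough numerical sketch of the two-image fusion idea (low spatial frequencies weighted by local speckle contrast, high frequencies taken from the uniform image); the contrast estimator and parameter names here are illustrative, not the authors' exact algorithm:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def hilo_fuse(uniform, speckle, sigma=4.0, eta=1.0):
        """Sketch of HiLo-style fusion: low-pass content is weighted by
        local speckle contrast (a proxy for in-focus signal); high-pass
        content comes from the standard uniform-illumination image."""
        mean = gaussian_filter(speckle, sigma)
        var = gaussian_filter(speckle**2, sigma) - mean**2
        contrast = np.sqrt(np.clip(var, 0, None)) / (mean + 1e-9)
        lo = gaussian_filter(contrast * uniform, sigma)  # sectioned low freqs
        hi = uniform - gaussian_filter(uniform, sigma)   # high freqs
        return eta * lo + hi                             # eta tunes the blend

    rng = np.random.RandomState(0)
    u = rng.rand(64, 64) + 1.0                 # uniform-illumination frame
    s = u * (1 + 0.3 * rng.randn(64, 64))      # speckle-illumination frame
    fused = hilo_fuse(u, s)
    print(fused.shape)  # → (64, 64)
    ```

    The single parameter eta corresponds to the numerical adjustment of sectioning thickness described in the abstract.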

  6. Towards Application of NASA Standard for Models and Simulations in Aeronautical Design Process

    NASA Astrophysics Data System (ADS)

    Vincent, Luc; Dunyach, Jean-Claude; Huet, Sandrine; Pelissier, Guillaume; Merlet, Joseph

    2012-08-01

    Even powerful computational techniques like simulation have limitations in their validity domain. Consequently, using simulation models requires caution to avoid making biased design decisions for new aeronautical products on the basis of inadequate simulation results. Thus the fidelity, accuracy and validity of simulation models shall be monitored in context all along the design phases to build confidence in achieving the goals of modelling and simulation. In the CRESCENDO project, we adapt the Credibility Assessment Scale method of the NASA standard for models and simulations from the space programme to aircraft design in order to assess the quality of simulations. The proposed eight quality assurance metrics aggregate information to indicate the level of confidence in the results. They are displayed in a management dashboard and can secure design trade-off decisions at programme milestones. The application of this technique is illustrated in an aircraft design context with a specific thermal Finite Elements Analysis. This use case shows how to judge the fitness-for-purpose of simulation as a virtual testing means and then green-light the continuation of the Simulation Lifecycle Management (SLM) process.

  7. Profiling and sorting Mangifera Indica morphology for quality attributes and grade standards using integrated image processing algorithms

    NASA Astrophysics Data System (ADS)

    Balbin, Jessie R.; Fausto, Janette C.; Janabajab, John Michael M.; Malicdem, Daryl James L.; Marcelo, Reginald N.; Santos, Jan Jeffrey Z.

    2017-06-01

    Mango production is highly vital in the Philippines. It is very essential in the food industry as it is used in markets and restaurants daily. The quality of mangoes can affect the income of a mango farmer; thus, an incorrect time of harvesting will result in the loss of quality mangoes and income. Scientific farming, together with new tools, is much needed nowadays because wastage of mangoes increases annually due to poor quality. This research paper focuses on profiling and sorting of Mangifera Indica using image processing techniques and pattern recognition. The image of a mango is captured on a weekly basis from its early stage. In this study, the researchers monitor the growth and color transition of a mango for profiling purposes. Actual dimensions of the mango are determined through image conversion and determination of pixel and RGB values in MATLAB. A program is developed to determine the range of the maximum size of a standard ripe mango. Hue, lightness, saturation (HSL) correction is used in the filtering process to assure the exactness of the RGB values of a mango subject. By a pattern recognition technique, the program can determine whether a mango is standard and ready to be exported.
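
    The HSL step rests on a standard RGB-to-hue/lightness/saturation conversion; the study uses MATLAB, but Python's colorsys gives an illustrative equivalent (the sample color is a hypothetical ripe-mango yellow, not data from the paper):

    ```python
    import colorsys

    # Convert an RGB pixel (0-255 per channel) to hue/lightness/saturation.
    def rgb_to_hsl(r, g, b):
        h, l, s = colorsys.rgb_to_hls(r / 255.0, g / 255.0, b / 255.0)
        return h * 360.0, l, s  # hue in degrees; lightness, saturation in [0, 1]

    # A ripe-mango-like yellow: hue near 48 degrees, full saturation.
    print(rgb_to_hsl(255, 204, 0))  # ≈ (48.0, 0.5, 1.0)
    ```

    Thresholding on hue in this representation is far more robust to lighting changes than thresholding raw RGB values, which is why an HSL correction stage precedes the pattern recognition step.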

  8. [Criteria of quality of structure in rehabilitation units with inpatient treatment].

    PubMed

    Klein, K; Farin, E; Jäckel, W H; Blatt, O; Schliehe, F

    2004-04-01

    The structure of a rehabilitation unit is an important feature of the quality of care. Adequate and qualitatively good structures provide the basis for appropriate therapy offers and treatment and eventually, a better health for rehabilitants. The quality of structures is generally recorded without any evaluation of the aspects in particular. The definition of standards is the basis for such an evaluation. The project presented is aimed at the definition of relevant structural standards for rehab units with inpatient treatment for musculoskeletal, cardiac, neurological, gastroenterological, oncological, pneumological and dermatological diseases. Here, the distinction between basal criteria which have to be fulfilled by every rehab unit with inpatient treatment and criteria important for a well-aimed assignment of patients with specific needs ("assignment criteria") should be made. Apart from the documentation of structural attributes, the structural quality of a rehab unit can be described individually as well as in comparison with other units. Relevant structural criteria were defined in expert meetings by means of a modified Delphi-technique with five inquiries. Overall, 199 "basal criteria" and "assignment criteria" were defined. All criteria can be assigned to the two domains general structural characteristics (general characteristics and equipment of rooms; medical/technical equipment; therapy, education, care; staff) and process-related structures (conceptual frames; internal quality management; internal communication and personnel development). The structural standards are applicable to units for musculoskeletal, cardiac, neurological, oncological, gastroenterological, dermatological and pneumological rehabilitation financed by the two main providers of rehabilitation, the statutory pension insurance scheme and the statutory health insurance scheme for all other five indications. 
The definition of structural standards agreed by experts in a formal consensus process provides comprehensive and concrete requirements for German rehab units offering inpatient medical rehabilitation. If the two main providers of rehabilitation both use the standards, this can be regarded as a milestone on the path to a unitary programme for quality management. The results enable units to analyse their weak points not just on an individual basis but also allow for a comparison between units, along with contributing to optimizing the structural quality of rehab units.

  9. Developing a typology of African Americans with limited literacy based on preventive health practice orientation: implications for colorectal cancer screening strategies.

    PubMed

    Gordon, Thomas F; Bass, Sarah Bauerle; Ruzek, Sheryl B; Wolak, Caitlin; Rovito, Michael J; Ruggieri, Dominique G; Ward, Stephanie; Paranjape, Anuradha; Greener, Judith

    2014-01-01

    Preventive health messages are often tailored to reach broad sociodemographic groups. However, within groups, there may be considerable variation in perceptions of preventive health practices, such as colorectal cancer screening. Segmentation analysis provides a tool for crafting messages that are tailored more closely to the mental models of targeted individuals or subgroups. This study used cluster analysis, a psychosocial marketing segmentation technique, to develop a typology of colorectal cancer screening orientation among 102 African American clinic patients between the ages of 50 and 74 years with limited literacy. Patients were from a general internal medicine clinic in a large urban teaching hospital, a subpopulation known to have high rates of colorectal cancer and low rates of screening. Preventive screening orientation variables included the patients' responses to questions involving personal attitudes and preferences toward preventive screening and general prevention practices. A k-means cluster analysis yielded three clusters of patients on the basis of their screening orientation: ready screeners (50.0%), cautious screeners (30.4%), and fearful avoiders (19.6%). The resulting typology clearly defines important subgroups on the basis of their preventive health practice perceptions. The authors propose that the development of a validated typology of patients on the basis of their preventive health perceptions could be applicable to a variety of health concerns. Such a typology would serve to standardize how populations are characterized and would provide a more accurate view of their preventive health-related attitudes, values, concerns, preferences, and behaviors. Used with standardized assessment tools, it would provide an empirical basis for tailoring health messages and improving medical communication.
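
    The k-means step of such a segmentation is straightforward to reproduce on synthetic data (the study's actual survey items are not public, so the features below are stand-ins for screening-orientation scores):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs

    # Stand-in for 102 respondents scored on 4 attitude/preference items,
    # with 3 latent orientation groups baked into the synthetic data.
    X, _ = make_blobs(n_samples=102, centers=3, n_features=4, random_state=0)

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
    sizes = np.bincount(km.labels_)
    print(sizes.sum())  # → 102
    ```

    In the study, the analogous three clusters were then characterized as ready screeners, cautious screeners, and fearful avoiders by inspecting each cluster's mean responses.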

  10. Discrete variable representation in electronic structure theory: quadrature grids for least-squares tensor hypercontraction.

    PubMed

    Parrish, Robert M; Hohenstein, Edward G; Martínez, Todd J; Sherrill, C David

    2013-05-21

    We investigate the application of molecular quadratures obtained from either standard Becke-type grids or discrete variable representation (DVR) techniques to the recently developed least-squares tensor hypercontraction (LS-THC) representation of the electron repulsion integral (ERI) tensor. LS-THC uses least-squares fitting to renormalize a two-sided pseudospectral decomposition of the ERI, over a physical-space quadrature grid. While this procedure is technically applicable with any choice of grid, the best efficiency is obtained when the quadrature is tuned to accurately reproduce the overlap metric for quadratic products of the primary orbital basis. Properly selected Becke DFT grids can roughly attain this property. Additionally, we provide algorithms for adopting the DVR techniques of the dynamics community to produce two different classes of grids which approximately attain this property. The simplest algorithm is radial discrete variable representation (R-DVR), which diagonalizes the finite auxiliary-basis representation of the radial coordinate for each atom, and then combines Lebedev-Laikov spherical quadratures and Becke atomic partitioning to produce the full molecular quadrature grid. The other algorithm is full discrete variable representation (F-DVR), which uses approximate simultaneous diagonalization of the finite auxiliary-basis representation of the full position operator to produce non-direct-product quadrature grids. The qualitative features of all three grid classes are discussed, and then the relative efficiencies of these grids are compared in the context of LS-THC-DF-MP2. Coarse Becke grids are found to give essentially the same accuracy and efficiency as R-DVR grids; however, the latter are built from explicit knowledge of the basis set and may guide future development of atom-centered grids. F-DVR is found to provide reasonable accuracy with markedly fewer points than either Becke or R-DVR schemes.
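
    The core DVR construction, diagonalizing the position operator in a finite basis to obtain quadrature points, can be sketched in one dimension with a harmonic-oscillator basis (a toy analogue of the paper's R-DVR/F-DVR grids, not their molecular implementation):

    ```python
    import numpy as np

    # Position operator in the first N harmonic-oscillator eigenfunctions
    # (hbar = m = omega = 1): <i|x|i+1> = sqrt((i+1)/2), zero elsewhere.
    N = 10
    off = np.sqrt(np.arange(1, N) / 2.0)
    X = np.diag(off, 1) + np.diag(off, -1)

    # Diagonalizing X yields the DVR grid; for this basis the eigenvalues
    # are exactly the N-point Gauss-Hermite quadrature nodes.
    grid = np.linalg.eigvalsh(X)
    nodes, _ = np.polynomial.hermite.hermgauss(N)
    print(np.allclose(np.sort(grid), np.sort(nodes)))  # → True
    ```

    The F-DVR variant described in the abstract generalizes this idea by approximately simultaneously diagonalizing all three Cartesian position operators in the auxiliary basis.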

  11. [The implementation of strategy of medicinal support in multi-type hospital].

    PubMed

    Ludupova, E Yu

    2016-01-01

    The article presents a brief review of the implementation of the strategy of medicinal support of the population of the Russian Federation and the experience of its application at the level of a regional hospital. The necessity and importance of implementing into hospital practice the methodology of pharmaco-economic management of medicinal care, using modern techniques of XYZ-, ABC- and VEN-analysis, is demonstrated. The stages of development and implementation of the process of medicinal support in a multi-field hospital, applying the principles of a quality management system (process and systemic approaches, risk management) on the basis of the ISO 9001 standards, are described. The significance of monitoring the results of the process of medicinal support, on the basis of the implementation of priority target programs (prevention of venous thromboembolic complications, a system of control of antibacterial therapy), is demonstrated in relation to a multi-field hospital using the ATC/DDD-analysis technique for evaluating indices of effectiveness and efficiency.

  12. An expanded calibration study of the explicitly correlated CCSD(T)-F12b method using large basis set standard CCSD(T) atomization energies.

    PubMed

    Feller, David; Peterson, Kirk A

    2013-08-28

    The effectiveness of the recently developed, explicitly correlated coupled cluster method CCSD(T)-F12b is examined in terms of its ability to reproduce atomization energies derived from complete basis set extrapolations of standard CCSD(T). Most of the standard method findings were obtained with aug-cc-pV7Z or aug-cc-pV8Z basis sets. For a few homonuclear diatomic molecules it was possible to push the basis set to the aug-cc-pV9Z level. F12b calculations were performed with the cc-pVnZ-F12 (n = D, T, Q) basis set sequence and were also extrapolated to the basis set limit using a Schwenke-style, parameterized formula. A systematic bias was observed in the F12b method with the (VTZ-F12/VQZ-F12) basis set combination. This bias resulted in the underestimation of reference values associated with small molecules (valence correlation energies <0.5 E(h)) and an even larger overestimation of atomization energies for bigger systems. Consequently, caution should be exercised in the use of F12b for high accuracy studies. Root mean square and mean absolute deviation error metrics for this basis set combination were comparable to complete basis set values obtained with standard CCSD(T) and the aug-cc-pVDZ through aug-cc-pVQZ basis set sequence. However, the mean signed deviation was an order of magnitude larger. Problems partially due to basis set superposition error were identified with second row compounds which resulted in a weak performance for the smaller VDZ-F12/VTZ-F12 combination of basis sets.
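
    For context, a common two-point n⁻³ extrapolation of correlation energies can be sketched as follows (the paper itself uses a Schwenke-style parameterized formula, which differs from this simple model; the energies below are synthetic, not from the study):

    ```python
    # Two-point complete-basis-set (CBS) extrapolation assuming the
    # correlation energy converges as E(n) = E_CBS + A / n**3.
    def cbs_two_point(e_small, e_large, n_small, n_large):
        n3s, n3l = n_small**3, n_large**3
        return (n3l * e_large - n3s * e_small) / (n3l - n3s)

    # Synthetic energies constructed to follow the model exactly:
    E_CBS, A = -76.40, 0.35
    e_tz = E_CBS + A / 3**3   # "triple-zeta" energy (n = 3)
    e_qz = E_CBS + A / 4**3   # "quadruple-zeta" energy (n = 4)
    print(round(cbs_two_point(e_tz, e_qz, 3, 4), 10))  # → -76.4
    ```

    When real energies deviate from the assumed n⁻³ form, as the abstract reports for the F12b (VTZ-F12/VQZ-F12) combination, such extrapolations inherit a systematic bias.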

  13. Simple and efficient LCAO basis sets for the diffuse states in carbon nanostructures.

    PubMed

    Papior, Nick R; Calogero, Gaetano; Brandbyge, Mads

    2018-06-27

    We present a simple way to describe the lowest unoccupied diffuse states in carbon nanostructures in density functional theory calculations using a minimal LCAO (linear combination of atomic orbitals) basis set. By comparing with plane-wave basis calculations, we show how these states can be captured by adding long-range orbitals to the standard LCAO basis sets for the extreme cases of planar sp2 (graphene) and curved carbon (C60). In particular, using long-range Bessel functions as additional basis functions retains a minimal basis size. This provides a smaller and simpler atom-centered basis set compared to the standard pseudo-atomic orbitals (PAOs) with multiple polarization orbitals or to adding non-atom-centered states to the basis.

  14. Simple and efficient LCAO basis sets for the diffuse states in carbon nanostructures

    NASA Astrophysics Data System (ADS)

    Papior, Nick R.; Calogero, Gaetano; Brandbyge, Mads

    2018-06-01

    We present a simple way to describe the lowest unoccupied diffuse states in carbon nanostructures in density functional theory calculations using a minimal LCAO (linear combination of atomic orbitals) basis set. By comparing with plane-wave basis calculations, we show how these states can be captured by adding long-range orbitals to the standard LCAO basis sets for the extreme cases of planar sp2 (graphene) and curved carbon (C60). In particular, using long-range Bessel functions as additional basis functions retains a minimal basis size. This provides a smaller and simpler atom-centered basis set compared to the standard pseudo-atomic orbitals (PAOs) with multiple polarization orbitals or to adding non-atom-centered states to the basis.

  15. Symmetry analysis of trimers rovibrational spectra: the case of Ne3★

    NASA Astrophysics Data System (ADS)

    Márquez-Mijares, Maykel; Roncero, Octavio; Villarreal, Pablo; González-Lezana, Tomás

    2018-05-01

    An approximate method to assign symmetry labels to the rovibrational spectrum of homonuclear trimers, based on the solution of the rotational Hamiltonian by means of a purely vibrational basis combined with standard rotational functions, is applied to Ne3. The neon trimer constitutes an ideal test case between heavier systems such as Ar3, for which the method proves to be an extremely useful technique, and other previously investigated cases such as H3+, where some limitations were observed. Comparisons of the calculated rovibrational energy levels are made with results from different calculations reported in the literature.

  16. Operations planning simulation: Model study

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The use of simulation modeling for the identification of system sensitivities to internal and external forces and variables is discussed. The technique provides a means of exploring alternate system procedures and processes, so that these alternatives may be considered on a mutually comparative basis, permitting the selection of a mode or modes of operation which have potential advantages to the system user and the operator. These advantages are measured in terms of system efficiency as: (1) the ability to meet specific schedules for operations, mission or mission-readiness requirements, or performance standards, and (2) the ability to accomplish the objectives within cost-effective limits.

  17. 7 CFR 810.1803 - Basis of determination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... GRAIN United States Standards for Sunflower Seed Principles Governing the Application of Standards § 810... per bushel, and dehulled seed is made on the basis of the grain when free from foreign material. Other...

  18. 7 CFR 810.1803 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... GRAIN United States Standards for Sunflower Seed Principles Governing the Application of Standards § 810... per bushel, and dehulled seed is made on the basis of the grain when free from foreign material. Other...

  19. 7 CFR 810.2203 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... GRAIN United States Standards for Wheat Principles Governing the Application of Standards § 810.2203..., wheat of other classes, contrasting classes, and subclasses is made on the basis of the grain when free...

  20. A simple linear regression method for quantitative trait loci linkage analysis with censored observations.

    PubMed

    Anderson, Carl A; McRae, Allan F; Visscher, Peter M

    2006-07-01

    Standard quantitative trait loci (QTL) mapping techniques commonly assume that the trait is both fully observed and normally distributed. When considering survival or age-at-onset traits these assumptions are often incorrect. Methods have been developed to map QTL for survival traits; however, they are both computationally intensive and not available in standard genome analysis software packages. We propose a grouped linear regression method for the analysis of continuous survival data. Using simulation we compare this method to both the Cox and Weibull proportional hazards models and a standard linear regression method that ignores censoring. The grouped linear regression method is of equivalent power to both the Cox and Weibull proportional hazards methods and is significantly better than the standard linear regression method when censored observations are present. The method is also robust to the proportion of censored individuals and the underlying distribution of the trait. On the basis of linear regression methodology, the grouped linear regression model is computationally simple and fast and can be implemented readily in freely available statistical software.
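
The attenuating effect of censoring that motivates the grouped method above can be illustrated with a short simulation (an illustrative sketch, not the paper's method; the effect size, censoring threshold, and sample size are all made-up values):

```python
import random

# Hypothetical demonstration: simulate a quantitative trait with a known
# genotype effect, right-censor the largest values, and show that a naive
# least-squares slope on the censored data is attenuated relative to the
# slope on the fully observed data.

def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    return sxy / sxx

random.seed(42)
genotype = [random.choice([0, 1, 2]) for _ in range(2000)]  # QTL allele count
true_effect = 1.5
trait = [true_effect * g + random.gauss(0.0, 1.0) for g in genotype]

censor_at = 2.5  # right-censoring threshold (e.g. end of follow-up)
censored_trait = [min(t, censor_at) for t in trait]

slope_full = ols_slope(genotype, trait)
slope_censored = ols_slope(genotype, censored_trait)
print(f"slope on full data:     {slope_full:.2f}")
print(f"slope on censored data: {slope_censored:.2f}")  # noticeably smaller
```

The bias grows with the fraction of censored individuals, which is why methods that model the censoring (Cox, Weibull, or the grouped regression above) are preferred.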

  1. Anticoagulative strategies in reconstructive surgery – clinical significance and applicability

    PubMed Central

    Jokuszies, Andreas; Herold, Christian; Niederbichler, Andreas D.; Vogt, Peter M.

    2012-01-01

    Advanced strategies in reconstructive microsurgery, especially free tissue transfer with advanced microvascular techniques, have been routinely applied and continuously refined for more than three decades of day-to-day clinical work. Bearing in mind success rates of more than 95%, the value of these techniques for patient care and comfort (one-step reconstruction of even the most complex tissue defects) cannot be overestimated. However, while the reconstructive and microsurgical methods are largely standardized, anticoagulative protocols and practices are far from generally accepted and, most importantly, lack an evidence base. The aim of our work was therefore to review the current literature and synoptically lay out the mechanisms of action of the plethora of anticoagulative substances. Pharmacologic prevention of thromboembolic events and surgical intervention when they occur represent an established and essential part of microsurgery. The high success rates of microvascular free tissue transfer today are due to treatment of patients in reconstructive centers where proper patient selection, excellent microsurgical technique, tissue transfer to adequate recipient vessels, and early anastomotic revision in case of thrombosis are provided. Whether the choice of antithrombotic agents is a factor in this success remains unclear. Undoubtedly, however, a lack of microsurgical experience and poor technique can never be compensated for by any regimen of antithrombotic therapy. The development of consistent standards and algorithms in reconstructive microsurgery is therefore absolutely essential to optimize clinical outcomes and to increase multicentric and international comparability of postoperative results and complications. PMID:22294976

  2. A Review of Surface Water Quality Models

    PubMed Central

    Li, Shibei; Jia, Peng; Qi, Changjun; Ding, Feng

    2013-01-01

    Surface water quality models can be useful tools to simulate and predict the levels, distributions, and risks of chemical pollutants in a given water body. The modeling results produced under different pollution scenarios are very important components of environmental impact assessment and can provide a basis and technical support for environmental management agencies to make sound decisions. The validity of model results affects both the scientific soundness of approved construction projects and the effectiveness of pollution control measures. We reviewed the development of surface water quality models across three stages and analyzed the suitability, precision, and methods of different models. Standardization of water quality models can help environmental management agencies ensure consistency in the application of water quality models for regulatory purposes. We summarize the status of standardization of these models in developed countries and put forward available measures for standardizing surface water quality models, especially in developing countries. PMID:23853533

  3. Measurement of spin correlation between top and antitop quarks produced in $$p\\bar{p}$$ collisions at $$\\sqrt{s} = 1.96$$ TeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abazov, Victor Mukhamedovich

    Here, we present a measurement of the correlation between the spins of t and tbar quarks produced in proton-antiproton collisions at the Tevatron Collider at a center-of-mass energy of 1.96 TeV. We apply a matrix element technique to dilepton and single-lepton+jets final states in data accumulated with the D0 detector that correspond to an integrated luminosity of 9.7 fb$$^{-1}$$. The measured value of the correlation coefficient in the off-diagonal basis, $$O_{off} = 0.89 \pm 0.22$$ (stat + syst), is in agreement with the standard model prediction, and represents evidence for a top-antitop quark spin correlation differing from zero at a level of 4.2 standard deviations.

  4. Measurement of spin correlation between top and antitop quarks produced in $$p\\bar{p}$$ collisions at $$\\sqrt{s} = 1.96$$ TeV

    DOE PAGES

    Abazov, Victor Mukhamedovich

    2016-03-25

    Here, we present a measurement of the correlation between the spins of t and tbar quarks produced in proton-antiproton collisions at the Tevatron Collider at a center-of-mass energy of 1.96 TeV. We apply a matrix element technique to dilepton and single-lepton+jets final states in data accumulated with the D0 detector that correspond to an integrated luminosity of 9.7 fb$$^{-1}$$. The measured value of the correlation coefficient in the off-diagonal basis, $$O_{off} = 0.89 \pm 0.22$$ (stat + syst), is in agreement with the standard model prediction, and represents evidence for a top-antitop quark spin correlation differing from zero at a level of 4.2 standard deviations.

  5. µ-XRF Studies on the Colour Brilliance in Ancient Wool Carpets

    PubMed Central

    Meyer, Markus; Borca, Camelia N.; Huthwelker, Thomas; Bieber, Manfred; Meßlinger, Karl; Fink, Rainer H.

    2017-01-01

    Many handmade ancient and recent oriental wool carpets show an outstanding brilliance and persistence of colour that is not achieved by common industrial dyeing procedures. Anthropologists have suggested that fermentation of the wool prior to dyeing is the key technique for achieving this high dyeing quality. By means of μ-XRF elemental mapping of mordant metals we corroborate this view and show deep and homogeneous penetration of colourants into fermented wool fibres. Furthermore, we apply this technique to show that the fermentation process cannot be investigated in ancient specimens by standard methods, owing to the lack of intact cuticle layers. This finding suggests a broad range of further investigations that will contribute to a deeper understanding of the development of traditional dyeing techniques. Spectroscopic studies add information on the oxidation states of the metal ions within the respective mordant-dye complexes and suggest a partial charge transfer as the basis for a significant colour change when Fe mordants are used. PMID:29109824

  6. A comparison of VLSI architecture of finite field multipliers using dual, normal or standard basis

    NASA Technical Reports Server (NTRS)

    Hsu, I. S.; Truong, T. K.; Shao, H. M.; Deutsch, L. J.; Reed, I. S.

    1987-01-01

    Three different finite field multipliers are presented: (1) a dual basis multiplier due to Berlekamp; (2) a Massey-Omura normal basis multiplier; and (3) the Scott-Tavares-Peppard standard basis multiplier. These algorithms were chosen because each has distinct features that make it most suitable for particular applications. Finally, they are implemented on silicon chips with nitride metal oxide semiconductor technology so that the multiplier most desirable for very large scale integration implementations can readily be ascertained.
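
For readers unfamiliar with the third architecture, a standard (polynomial) basis multiplication over GF(2^m) can be sketched in software as a shift-and-add loop with modular reduction. The field and modulus below (the AES field GF(2^8) with polynomial x^8 + x^4 + x^3 + x + 1) are an assumed example, not values taken from the paper:

```python
# Software sketch of a standard-basis GF(2^m) multiplier: elements are bit
# vectors of polynomial coefficients, addition is XOR, and multiplication is
# carry-less shift-and-add with reduction by an irreducible polynomial.

def gf_mul(a: int, b: int, mod: int = 0x11B, m: int = 8) -> int:
    """Multiply a and b in GF(2^m) in the standard (polynomial) basis."""
    result = 0
    for _ in range(m):
        if b & 1:
            result ^= a          # add (XOR) the current partial product
        b >>= 1
        a <<= 1
        if a & (1 << m):         # degree overflow: reduce modulo the polynomial
            a ^= mod
    return result

# Worked example from FIPS-197: {57} * {83} = {c1} in the AES field.
print(hex(gf_mul(0x57, 0x83)))  # 0xc1
```

In hardware, the same recurrence maps to an array of AND/XOR gates; the dual and normal basis variants trade that regular structure for cheaper constants or free squaring, which is why each suits different applications.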

  7. Determining the status quo of infection prevention and control standards in the hospitals of iran: a case study in 23 hospitals.

    PubMed

    Shojaee, Jalil; Moosazadeh, Mahmood

    2014-02-01

    Applying Prevention and Control of Infection (PCI) standards in hospitals reduces probable risks to patients, staff, and visitors; it also increases efficiency and ultimately improves hospital productivity. The current study aimed to determine the status quo of international PCI standards in hospitals located in the north of Iran. This cross-sectional study was conducted in 23 hospitals. The data collection tool was a questionnaire with confirmed validity and reliability. In this regard, 260 managers, section supervisors, and infection control nurses participated in the study on a census basis. SPSS software version 16 was employed to analyze the data through descriptive and analytical statistics. Among the studied hospitals, 18 were public. Hospitals met 77.2% of leadership and programming standards, 80.8% of program focus, 67.4% of isolation methods, 88.2% of hand hygiene and protection techniques, 78.8% of patient safety and quality improvement, and 90.3% of personnel training, with an average of 78.7% for the status quo of PCI standards overall. This study revealed that PCI standards were observed to a significant extent in the studied hospitals and that the necessary conditions exist for full deployment of nosocomial infection surveillance.

  8. Reducing Time and Increasing Sensitivity in Sample Preparation for Adherent Mammalian Cell Metabolomics

    PubMed Central

    Lorenz, Matthew A.; Burant, Charles F.; Kennedy, Robert T.

    2011-01-01

    A simple, fast, and reproducible sample preparation procedure was developed for relative quantification of metabolites in adherent mammalian cells using the clonal β-cell line INS-1 as a model sample. The method was developed by evaluating the effect of different sample preparation procedures on high-performance liquid chromatography-mass spectrometry quantification of 27 metabolites involved in glycolysis and the tricarboxylic acid cycle on a directed basis, as well as for all detectable chromatographic features on an undirected basis. We demonstrate that a rapid water rinse step prior to quenching of metabolism reduces components that suppress electrospray ionization, thereby increasing signal for 26 of 27 targeted metabolites and increasing the total number of detected features from 237 to 452 with no detectable change in metabolite content. A novel quenching technique is employed in which liquid nitrogen is added directly to the culture dish, allowing samples to be stored at −80 °C for at least 7 d before extraction. Separation of the quenching and extraction steps provides the benefit of increased experimental convenience and sample stability while maintaining metabolite content similar to techniques that employ simultaneous quenching and extraction with cold organic solvent. The extraction solvent 9:1 methanol:chloroform was found to provide superior performance over acetonitrile, ethanol, and methanol with respect to metabolite recovery and extract stability. Maximal recovery was achieved using a single rapid (~1 min) extraction step. The utility of this rapid preparation method (~5 min) was demonstrated through precise metabolite measurements (11% average relative standard deviation without internal standards) associated with step changes in glucose concentration that evoke insulin secretion in the clonal β-cell line INS-1. PMID:21456517

  9. 7 CFR 810.103 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD ADMINISTRATION (FEDERAL GRAIN INSPECTION SERVICE), DEPARTMENT OF AGRICULTURE OFFICIAL UNITED STATES STANDARDS FOR GRAIN General Provisions Principles Governing the Application of Standards § 810.103 Basis of...

  10. Ratio of sequential chromatograms for quantitative analysis and peak deconvolution: Application to standard addition method and process monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.

    1990-08-01

    This paper describes a new technique for data analysis in chromatography based on taking the point-by-point ratio of sequential chromatograms that have been baseline corrected. This ratio chromatogram provides a robust means for the identification and quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and used to calculate a ratio value equal to the ratio of the analyte's concentrations in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and of an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
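
The core idea can be sketched as follows; the synthetic peaks, window, and signal-proportional weighting are illustrative assumptions, not details from the paper:

```python
# Sketch of a ratio chromatogram: point-by-point ratio of two baseline-
# corrected chromatograms, summarized by a weighted average over the
# pure-elution window of one analyte.
import math

def gaussian_peak(n, center, width, height):
    """A noiseless Gaussian elution peak on an n-point time axis."""
    return [height * math.exp(-0.5 * ((i - center) / width) ** 2) for i in range(n)]

n = 200
run1 = gaussian_peak(n, center=80, width=6.0, height=1.0)   # first injection
run2 = gaussian_peak(n, center=80, width=6.0, height=2.0)   # analyte doubled

# Point-by-point ratio over a pure-elution window of the analyte.
window = range(70, 91)
ratios = [run2[i] / run1[i] for i in window]

# Weight each point by its signal (higher signal -> higher signal-to-noise),
# a simple stand-in for the paper's variance weighting.
weights = [run1[i] for i in window]
ratio_value = sum(r * w for r, w in zip(ratios, weights)) / sum(weights)
print(round(ratio_value, 3))  # 2.0: concentration doubled between injections
```

An interferent appearing in the second injection would show up as a region where the ratio departs from this constant value, which is what makes coeluting contaminants visible.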

  11. Task-based evaluation of segmentation algorithms for diffusion-weighted MRI without using a gold standard

    PubMed Central

    Jha, Abhinav K.; Kupinski, Matthew A.; Rodríguez, Jeffrey J.; Stephen, Renu M.; Stopeck, Alison T.

    2012-01-01

    In many studies, the estimation of the apparent diffusion coefficient (ADC) of lesions in visceral organs in diffusion-weighted (DW) magnetic resonance images requires an accurate lesion-segmentation algorithm. To evaluate these lesion-segmentation algorithms, region-overlap measures are used currently. However, the end task from the DW images is accurate ADC estimation, and the region-overlap measures do not evaluate the segmentation algorithms on this task. Moreover, these measures rely on the existence of gold-standard segmentation of the lesion, which is typically unavailable. In this paper, we study the problem of task-based evaluation of segmentation algorithms in DW imaging in the absence of a gold standard. We first show that using manual segmentations instead of gold-standard segmentations for this task-based evaluation is unreliable. We then propose a method to compare the segmentation algorithms that does not require gold-standard or manual segmentation results. The no-gold-standard method estimates the bias and the variance of the error between the true ADC values and the ADC values estimated using the automated segmentation algorithm. The method can be used to rank the segmentation algorithms on the basis of both accuracy and precision. We also propose consistency checks for this evaluation technique. PMID:22713231

  12. The role of continuity in residual-based variational multiscale modeling of turbulence

    NASA Astrophysics Data System (ADS)

    Akkerman, I.; Bazilevs, Y.; Calo, V. M.; Hughes, T. J. R.; Hulshoff, S.

    2008-02-01

    This paper examines the role of continuity of the basis in the computation of turbulent flows. We compare standard finite elements and non-uniform rational B-splines (NURBS) discretizations that are employed in Isogeometric Analysis (Hughes et al. in Comput Methods Appl Mech Eng, 194:4135-4195, 2005). We make use of quadratic discretizations that are C0-continuous across element boundaries in standard finite elements, and C1-continuous in the case of NURBS. The variational multiscale residual-based method (Bazilevs in Isogeometric analysis of turbulence and fluid-structure interaction, PhD thesis, ICES, UT Austin, 2006; Bazilevs et al. in Comput Methods Appl Mech Eng, submitted, 2007; Calo in Residual-based multiscale turbulence modeling: finite volume simulation of bypass transition. PhD thesis, Department of Civil and Environmental Engineering, Stanford University, 2004; Hughes et al. in proceedings of the XXI international congress of theoretical and applied mechanics (IUTAM), Kluwer, 2004; Scovazzi in Multiscale methods in science and engineering, PhD thesis, Department of Mechanical Engineering, Stanford University, 2004) is employed as a turbulence modeling technique. We find that C1-continuous discretizations outperform their C0-continuous counterparts on a per-degree-of-freedom basis. We also find that the effect of continuity is greater for higher Reynolds number flows.

  13. Assessment of multireference approaches to explicitly correlated full configuration interaction quantum Monte Carlo

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kersten, J. A. F., E-mail: jennifer.kersten@cantab.net; Alavi, Ali, E-mail: a.alavi@fkf.mpg.de; Max Planck Institute for Solid State Research, Heisenbergstraße 1, 70569 Stuttgart

    2016-08-07

    The Full Configuration Interaction Quantum Monte Carlo (FCIQMC) method has proved able to provide near-exact solutions to the electronic Schrödinger equation within a finite orbital basis set, without relying on an expansion about a reference state. However, a drawback of the approach is that, being based on an expansion in Slater determinants, the FCIQMC method suffers from a basis set incompleteness error that decays very slowly with the size of the employed single-particle basis. FCIQMC results obtained in a small basis set can be improved significantly with explicitly correlated techniques. Here, we present a study that assesses and compares two contrasting "universal" explicitly correlated approaches that fit into the FCIQMC framework: the [2]_R12 method of Kong and Valeev [J. Chem. Phys. 135, 214105 (2011)] and the explicitly correlated canonical transcorrelation approach of Yanai and Shiozaki [J. Chem. Phys. 136, 084107 (2012)]. The former is an a posteriori internally contracted perturbative approach, while the latter transforms the Hamiltonian prior to the FCIQMC simulation. These comparisons are made across the 55 molecules of the G1 standard set. We found that both methods consistently reduce the basis set incompleteness, reducing the error in atomization energies in small basis sets from 28 mE_h to 3-4 mE_h. While many of the conclusions hold in general for any combination of multireference approaches with these methodologies, we also consider FCIQMC-specific advantages of each approach.

  14. SIEST-A-RT: a study of vacancy diffusion in crystalline silicon using a local-basis first-principle (SIESTA) activation technique (ART).

    NASA Astrophysics Data System (ADS)

    El Mellouhi, Fedwa; Mousseau, Normand; Ordejón, Pablo

    2003-03-01

    We report a first-principles study of vacancy-induced self-diffusion in crystalline silicon. Our simulations are performed on supercells containing 63 and 215 atoms. We generate the diffusion paths using the activation-relaxation technique (ART) [1], which can efficiently sample the energy landscape of complex systems. The forces and energy are evaluated using SIESTA [2], a self-consistent density functional method using standard norm-conserving pseudopotentials and a flexible numerical linear-combination-of-atomic-orbitals basis set. Combining these two methods allows us to identify diffusion paths that would not be reachable with this degree of accuracy using other methods. After a full relaxation of the neutral vacancy, we proceed to search for local diffusion paths. We identify various mechanisms, such as the formation of the fourfold-coordinated defect and the recombination of dangling bonds by the WWW process. The diffusion of the vacancy proceeds by hops to the first nearest neighbor with an energy barrier of 0.69 eV. This work is funded in part by NSERC and NATEQ. NM is a Cottrell Scholar of the Research Corporation. [1] G. T. Barkema and N. Mousseau, Event-based relaxation of continuous disordered systems, Phys. Rev. Lett. 77, 4358 (1996); N. Mousseau and G. T. Barkema, Traveling through potential energy landscapes of disordered materials: ART, Phys. Rev. E 57, 2419 (1998). [2] D. Sánchez-Portal, P. Ordejón, E. Artacho and J. M. Soler, Density functional method for very large systems with LCAO basis sets, Int. J. Quant. Chem. 65, 453 (1997).

  15. Caesarean Section: Could Different Transverse Abdominal Incision Techniques Influence Postpartum Pain and Subsequent Quality of Life? A Systematic Review

    PubMed Central

    Gizzo, Salvatore; Andrisani, Alessandra; Noventa, Marco; Di Gangi, Stefania; Quaranta, Michela; Cosmi, Erich; D’Antona, Donato; Nardelli, Giovanni Battista; Ambrosini, Guido

    2015-01-01

    The choice of the type of abdominal incision performed in caesarean delivery is made chiefly on the basis of the individual surgeon’s experience and preference. A general consensus on the most appropriate surgical technique has not yet been reached. The aim of this systematic review of the literature is to compare the two most commonly used transverse abdominal incisions for caesarean delivery, the Pfannenstiel incision and the modified Joel-Cohen incision, in terms of acute and chronic post-surgical pain and their subsequent influence in terms of quality of life. Electronic database searches formed the basis of the literature search and the following databases were searched in the time frame between January 1997 and December 2013: MEDLINE, EMBASE Sciencedirect and the Cochrane Library. Key search terms included: “acute pain”, “chronic pain”, “Pfannenstiel incision”, “Misgav-Ladach”, “Joel Cohen incision”, in combination with “Caesarean Section”, “abdominal incision”, “numbness”, “neuropathic pain” and “nerve entrapment”. Data on 4771 patients who underwent caesarean section (CS) was collected with regards to the relation between surgical techniques and postoperative outcomes defined as acute or chronic pain and future pregnancy desire. The Misgav-Ladach incision was associated with a significant advantage in terms of reduction of post-surgical acute and chronic pain. It was indicated as the optimal technique in view of its characteristic of reducing lower pelvic discomfort and pain, thus improving quality of life and future fertility desire. Further studies which are not subject to important bias like pre-existing chronic pain, non-standardized analgesia administration, variable length of skin incision and previous abdominal surgery are required. PMID:25646621

  16. Caesarean section: could different transverse abdominal incision techniques influence postpartum pain and subsequent quality of life? A systematic review.

    PubMed

    Gizzo, Salvatore; Andrisani, Alessandra; Noventa, Marco; Di Gangi, Stefania; Quaranta, Michela; Cosmi, Erich; D'Antona, Donato; Nardelli, Giovanni Battista; Ambrosini, Guido

    2015-01-01

    The choice of the type of abdominal incision performed in caesarean delivery is made chiefly on the basis of the individual surgeon's experience and preference. A general consensus on the most appropriate surgical technique has not yet been reached. The aim of this systematic review of the literature is to compare the two most commonly used transverse abdominal incisions for caesarean delivery, the Pfannenstiel incision and the modified Joel-Cohen incision, in terms of acute and chronic post-surgical pain and their subsequent influence in terms of quality of life. Electronic database searches formed the basis of the literature search and the following databases were searched in the time frame between January 1997 and December 2013: MEDLINE, EMBASE Sciencedirect and the Cochrane Library. Key search terms included: "acute pain", "chronic pain", "Pfannenstiel incision", "Misgav-Ladach", "Joel Cohen incision", in combination with "Caesarean Section", "abdominal incision", "numbness", "neuropathic pain" and "nerve entrapment". Data on 4771 patients who underwent caesarean section (CS) was collected with regards to the relation between surgical techniques and postoperative outcomes defined as acute or chronic pain and future pregnancy desire. The Misgav-Ladach incision was associated with a significant advantage in terms of reduction of post-surgical acute and chronic pain. It was indicated as the optimal technique in view of its characteristic of reducing lower pelvic discomfort and pain, thus improving quality of life and future fertility desire. Further studies which are not subject to important bias like pre-existing chronic pain, non-standardized analgesia administration, variable length of skin incision and previous abdominal surgery are required.

  17. Improved 206Pb/238U microprobe geochronology by the monitoring of a trace-element-related matrix effect; SHRIMP, ID-TIMS, ELA-ICP-MS and oxygen isotope documentation for a series of zircon standards

    USGS Publications Warehouse

    Black, L.P.; Kamo, S.L.; Allen, C.M.; Davis, D.W.; Aleinikoff, J.N.; Valley, J.W.; Mundil, R.; Campbell, I.H.; Korsch, R.J.; Williams, I.S.; Foudoulis, C.

    2004-01-01

    Precise isotope dilution-thermal ionisation mass spectrometry (ID-TIMS) documentation is given for two new Palaeozoic zircon standards (TEMORA 2 and R33). These data, in combination with results for previously documented standards (AS3, SL13, QGNG and TEMORA 1), provide the basis for a detailed investigation of inconsistencies in 206Pb/238U ages measured by microprobe. Although these ages are normally consistent between any two standards, their relative age offsets are often different from those established by ID-TIMS. This is true for both sensitive high-resolution ion-microprobe (SHRIMP) and excimer laser ablation-inductively coupled plasma-mass spectrometry (ELA-ICP-MS) dating, although the age offsets are in the opposite sense for the two techniques. Various factors have been investigated for possible correlations with age bias, in an attempt to resolve why the accuracy of the method is worse than the indicated precision. Crystallographic orientation, position on the grain-mount and oxygen isotopic composition are unrelated to the bias. There are, however, striking correlations between the 206Pb/238U age offsets and P, Sm and, most particularly, Nd abundances in the zircons. Although these are not believed to be the primary cause of this apparent matrix effect, they indicate that ionisation of 206Pb/238U is influenced, at least in part, by a combination of trace elements. Nd is sufficiently representative of the controlling trace elements that it provides a quantitative means of correcting for the microprobe age bias. This approach has the potential to reduce age biases associated with different techniques, different instrumentation and different standards within and between laboratories. Crown Copyright © 2004 Published by Elsevier B.V. All rights reserved.

  18. Oncoplastic round block technique has comparable operative parameters as standard wide local excision: a matched case-control study.

    PubMed

    Lim, Geok-Hoon; Allen, John Carson; Ng, Ruey Pyng

    2017-08-01

    Although oncoplastic breast surgery can resect larger tumors with lower re-excision rates compared to standard wide local excision (sWLE), criticisms of oncoplastic surgery include a longer, albeit well-concealed, scar, longer operating time and hospital stay, and an increased risk of complications. The round block technique has been reported to be very suitable for patients with relatively small breasts and minimal ptosis. We aim to determine whether the round block technique results in operative parameters comparable with sWLE. Breast cancer patients who underwent a round block procedure from 1st May 2014 to 31st January 2016 were included in the study. These patients were then matched for the type of axillary procedure, on a one-to-one basis, with breast cancer patients who had undergone sWLE from 1st August 2011 to 31st January 2016. The operative parameters of the 2 groups were compared. 22 patients were included in the study. Patient demographics and histologic parameters were similar in the 2 groups. No complications were reported in either group. The mean operating time was 122 and 114 minutes in the round block and sWLE groups, respectively (P=0.64). Length of stay was similar in the 2 groups (P=0.11). Round block patients had better cosmesis and lower re-excision rates. A higher rate of recurrence was observed in the sWLE group. The round block technique has operative parameters comparable to sWLE with no evidence of increased complications. The lower re-excision rate and better cosmesis observed in the round block patients suggest that the round block technique is not only comparable in general, but may have advantages over sWLE in selected cases.

  19. Guidelines for imaging retinoblastoma: imaging principles and MRI standardization.

    PubMed

    de Graaf, Pim; Göricke, Sophia; Rodjan, Firazia; Galluzzi, Paolo; Maeder, Philippe; Castelijns, Jonas A; Brisse, Hervé J

    2012-01-01

    Retinoblastoma is the most common intraocular tumor in children. The diagnosis is usually established by the ophthalmologist on the basis of fundoscopy and US. Together with US, high-resolution MRI has emerged as an important imaging modality for pretreatment assessment, i.e. for diagnostic confirmation, detection of local tumor extent, detection of associated developmental malformation of the brain and detection of associated intracranial primitive neuroectodermal tumor (trilateral retinoblastoma). Minimum requirements for pretreatment diagnostic evaluation of retinoblastoma or mimicking lesions are presented, based on consensus among members of the European Retinoblastoma Imaging Collaboration (ERIC). The most appropriate techniques for imaging in a child with leukocoria are reviewed. CT is no longer recommended. Implementation of a standardized MRI protocol for retinoblastoma in clinical practice may benefit children worldwide, especially those with hereditary retinoblastoma, since a decreased use of CT reduces the exposure to ionizing radiation.

  20. Improved Bat Algorithm Applied to Multilevel Image Thresholding

    PubMed Central

    2014-01-01

    Multilevel image thresholding is a very important image processing technique that is used as a basis for image segmentation and further higher-level processing. However, the computational time required for an exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adapted one of the latest swarm intelligence algorithms, the bat algorithm, to the multilevel image thresholding problem. Testing on standard benchmark images shows that the bat algorithm is comparable with other state-of-the-art algorithms. We then improved the standard bat algorithm with modifications that add elements from differential evolution and from the artificial bee colony algorithm. Our proposed improved bat algorithm proved to be better than five other state-of-the-art algorithms, improving the quality of results in all cases and significantly improving convergence speed. PMID:25165733
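
The optimization problem the bat algorithm addresses can be sketched with Otsu's between-class variance objective, here maximized by exhaustive search on a toy histogram (an illustrative stand-in: the paper's contribution is replacing this exhaustive search, which is exponential in the number of thresholds, with a metaheuristic over the same kind of objective):

```python
# Otsu-style multilevel thresholding: choose thresholds that maximize the
# between-class variance of the image histogram.
from itertools import combinations

def between_class_variance(hist, thresholds):
    """Sum over classes of class weight * (class mean - global mean)^2."""
    total = sum(hist)
    global_mean = sum(i * h for i, h in enumerate(hist)) / total
    bounds = [0, *thresholds, len(hist)]
    var = 0.0
    for lo, hi in zip(bounds, bounds[1:]):
        mass = sum(hist[lo:hi])
        if mass == 0:
            continue
        mean = sum(i * hist[i] for i in range(lo, hi)) / mass
        var += (mass / total) * (mean - global_mean) ** 2
    return var

# Toy 16-level histogram with three well-separated modes.
hist = [0, 9, 8, 0, 0, 0, 7, 9, 6, 0, 0, 0, 0, 8, 9, 0]

# Exhaustive search over all pairs of thresholds (feasible only for tiny cases).
best = max(combinations(range(1, len(hist)), 2),
           key=lambda t: between_class_variance(hist, t))
print(best)  # two thresholds falling in the gaps between the three modes
```

A metaheuristic such as the bat algorithm explores the same threshold space, evaluating this objective for each candidate position, but visits far fewer candidates than the exhaustive search.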

  1. The effect of ionizing radiation on microbiological decontamination of medical herbs and biologically active compounds

    NASA Astrophysics Data System (ADS)

    Migdal, W.; Owczarczyk, B.; Kedzia, B.; Holderna-Kedzia, E.; Segiet-Kujawa, E.

    1998-06-01

    Several thousand tons of medical herbs are produced annually by the pharmaceutical industry in Poland. This product should be of the highest quality and microbial purity. Chemical methods of decontamination have recently been recognized as less safe, so the irradiation technique was chosen to replace them. A national program on the application of irradiation to the decontamination of medical herbs is now in progress at the Institute of Nuclear Chemistry and Technology. The purpose of the program is to develop, on the basis of research work, the facility standards and technological instructions indispensable for the practice of radiation technology.

  2. Non-destructive diagnostics of irradiated materials using neutron scattering from pulsed neutron sources

    NASA Astrophysics Data System (ADS)

    Korenev, Sergey; Sikolenko, Vadim

    2004-09-01

    The advantage of neutron-scattering studies over standard X-ray techniques is the high penetration of neutrons, which allows volume effects to be studied. High-resolution neutron-scattering instrumentation allows the parameters of the lattice structure to be measured with high precision. We suggest the use of neutron scattering from pulsed neutron sources for the analysis of materials irradiated with pulsed high-current electron and ion beams. The results of preliminary tests of this method on Ni foils, studied by neutron diffraction at the IBR-2 (Pulsed Fast Reactor at the Joint Institute for Nuclear Research), are presented.

  3. Design methodology analysis: design and operational energy studies in a new high-rise office building. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1983-02-01

    Work on energy consumption in a large office building is reported, including the following tasks: (1) evaluating and testing the effectiveness of the existing ASHRAE 90-75 and 90-80 standards; (2) evaluating the effectiveness of the BEPS; (3) evaluating the effectiveness of some envelope and lighting design variables towards achieving the BEPS budgets; and (4) comparing the computer energy analysis technique, DOE-2.1, with manual calculation procedures. These tasks are the initial activities in the energy analysis of the Park Plaza Building and will serve as the basis for further understanding the results of ongoing data collection and analysis.

  4. Moulding techniques in lipstick manufacture: a comparative evaluation.

    PubMed

    Dweck, A C; Burnham, C A

    1980-06-01

    This paper examines two methods of lipstick bulk manufacture: one via a direct method and the other via stock concentrates. The paper continues with a comparison of two manufactured bulks moulded in three different ways: first by split moulding, secondly by Rotamoulding, and finally by Ejectoret moulding. Full consideration is paid to the time, labour and cost standards of each approach, and the resultant mouldings are examined using some novel physical testing methods. The results of these tests are statistically analysed. Finally, on the basis of the gathered data and photomicrographical work, a theoretical lipstick structure is proposed by which the results may be explained.

  5. Monitoring scale scores over time via quality control charts, model-based approaches, and time series techniques.

    PubMed

    Lee, Yi-Hsuan; von Davier, Alina A

    2013-07-01

    Maintaining a stable score scale over time is critical for all standardized educational assessments. Traditional quality control tools and approaches for assessing scale drift either require special equating designs, or may be too time-consuming to be considered on a regular basis with an operational test that has a short time window between an administration and its score reporting. Thus, the traditional methods are not sufficient to catch unusual testing outcomes in a timely manner. This paper presents a new approach for score monitoring and assessment of scale drift. It involves quality control charts, model-based approaches, and time series techniques to accommodate the following needs of monitoring scale scores: continuous monitoring, adjustment of customary variations, identification of abrupt shifts, and assessment of autocorrelation. Performance of the methodologies is evaluated using manipulated data based on real responses from 71 administrations of a large-scale high-stakes language assessment.
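    As an illustration of the quality-control-chart component, here is a minimal Shewhart-style sketch (hypothetical scores, baseline window, and control limits, not the paper's operational procedure) that flags administrations whose mean scale score leaves a baseline band of three standard deviations:

```python
import statistics

def shewhart_flags(scores, baseline_n=30, k=3.0):
    """Flag administrations whose scale score falls outside
    baseline mean +/- k * baseline standard deviation."""
    base = scores[:baseline_n]
    mu = statistics.mean(base)
    sigma = statistics.stdev(base)
    return [i for i, s in enumerate(scores[baseline_n:], start=baseline_n)
            if abs(s - mu) > k * sigma]

# 45 stable administrations (small customary variation), then an abrupt shift
scores = [500 + (-1) ** i * 2 for i in range(45)] + [520, 521, 519]
print(shewhart_flags(scores))   # → [45, 46, 47]
```

A chart like this catches abrupt shifts; the autocorrelation and gradual-drift aspects the paper mentions would additionally need time-series models fit to the same sequence.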

  6. Reed Solomon codes for error control in byte organized computer memory systems

    NASA Technical Reports Server (NTRS)

    Lin, S.; Costello, D. J., Jr.

    1984-01-01

    A problem in designing semiconductor memories is to provide some measure of error control without requiring excessive coding overhead or decoding time. In LSI and VLSI technology, memories are often organized on a multiple-bit (or byte) per chip basis. For example, some 256K-bit DRAMs are organized as 32K x 8 bit-bytes. Byte-oriented codes such as Reed-Solomon (RS) codes can provide efficient low-overhead error control for such memories. However, the standard iterative algorithm for decoding RS codes is too slow for these applications. Some special decoding techniques for extended single- and double-error-correcting RS codes which are capable of high-speed operation are presented. These techniques are designed to find the error locations and the error values directly from the syndrome without having to use the iterative algorithm to find the error locator polynomial.
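    The idea of reading the error location and value directly off the syndrome can be illustrated for the single-error case. The toy below works over the small prime field GF(13) rather than the byte-oriented GF(2^m) fields the paper targets: for one error of value v at position i, the two syndromes satisfy S1 = v·α^i and S2 = v·α^(2i), so α^i = S2/S1 and v = S1²/S2, with no iterative locator-polynomial step.

```python
P = 13        # toy prime field GF(13); the paper's codes use byte fields GF(2^m)
ALPHA = 2     # primitive element mod 13

def syndromes(r):
    """S_j = r(ALPHA^j) for j = 1, 2; both are zero iff r is a valid codeword."""
    return [sum(c * pow(ALPHA, j * i, P) for i, c in enumerate(r)) % P
            for j in (1, 2)]

def correct_single_error(r):
    """Locate and fix one symbol error directly from the two syndromes."""
    s1, s2 = syndromes(r)
    if s1 == 0 and s2 == 0:
        return list(r)                          # no error detected
    loc = s2 * pow(s1, P - 2, P) % P            # ALPHA^i, the error location
    val = s1 * s1 * pow(s2, P - 2, P) % P       # v, the error value
    i = next(k for k in range(len(r)) if pow(ALPHA, k, P) == loc)
    fixed = list(r)
    fixed[i] = (fixed[i] - val) % P
    return fixed

received = [0] * 8       # all-zero codeword (trivially valid) ...
received[3] = 5          # ... with a single injected symbol error
print(correct_single_error(received))   # → [0, 0, 0, 0, 0, 0, 0, 0]
```

Modular inverses are taken with Fermat's little theorem (`pow(x, P - 2, P)`); a hardware decoder would use field tables instead.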

  7. The Doghouse Plot: History, Construction Techniques, and Application

    NASA Astrophysics Data System (ADS)

    Wilson, John Robert

    The Doghouse Plot visually represents an aircraft's performance during combined turn-climb maneuvers. The Doghouse Plot completely describes the turn-climb capability of an aircraft; a single plot demonstrates the relationship between climb performance, turn rate, turn radius, stall margin, and bank angle. Using NASA legacy codes, Empirical Drag Estimation Technique (EDET) and Numerical Propulsion System Simulation (NPSS), it is possible to reverse engineer sufficient basis data for commercial and military aircraft to construct Doghouse Plots. Engineers and operators can then use these to assess their aircraft's full performance envelope. The insight gained from these plots can broaden the understanding of an aircraft's performance and, in turn, broaden the operational scope of some aircraft that would otherwise be limited by the simplifications found in their Airplane Flight Manuals (AFM). More importantly, these plots can build on the current standards of obstacle avoidance and expose risks in operation.
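    The quantities a Doghouse Plot ties together follow from the steady level-turn equations: load factor n = 1/cos φ, turn rate ω = g·tan φ / V, turn radius R = V²/(g·tan φ), and a stall speed that grows as √n. A small sketch using these standard flight-mechanics relations (hypothetical aircraft numbers; not the EDET/NPSS tooling the thesis uses):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def level_turn(v, bank_deg, v_stall_1g):
    """Steady level-turn quantities (SI units) that a Doghouse Plot relates."""
    phi = math.radians(bank_deg)
    n = 1.0 / math.cos(phi)                      # load factor
    rate = G * math.tan(phi) / v                 # turn rate, rad/s
    radius = v ** 2 / (G * math.tan(phi))        # turn radius, m
    v_stall_turn = v_stall_1g * math.sqrt(n)     # stall speed rises with sqrt(n)
    return {"n": n,
            "rate_deg_s": math.degrees(rate),
            "radius_m": radius,
            "stall_margin_ms": v - v_stall_turn}

# Hypothetical light aircraft: 80 m/s at 45 degrees of bank, 55 m/s 1-g stall speed
perf = level_turn(v=80.0, bank_deg=45.0, v_stall_1g=55.0)
```

Sweeping bank angle (or turn rate) at a fixed speed and plotting these outputs against each other produces one curve of the doghouse envelope.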

  8. [Achilles tendon rupture : Current diagnostic and therapeutic standards].

    PubMed

    Hertel, G; Götz, J; Grifka, J; Willers, J

    2016-08-01

    Greater life expectancy and increased activity levels in the population result in an increase in degenerative diseases, such as Achilles tendon ruptures. The medical history and physical examination are the methods of choice to diagnose Achilles tendon ruptures. Ultrasound and radiography represent reasonable extended diagnostic procedures. In order to decide on the medical indications for the therapy concept, the advantages and disadvantages of conservative and surgical treatment options have to be weighed up on an individual basis. There are explicit contraindications for both treatment options. For the surgical treatment concept, open suture techniques, minimally invasive methods and reconstructive procedures are available. The postoperative management of the patient is as important as the choice of surgical technique. With the correct medical indications and supervision of the patient it is possible to achieve extremely satisfying results with both conservative and surgical treatment options.

  9. Decoding of DBEC-TBED Reed-Solomon codes. [Double-Byte-Error-Correcting, Triple-Byte-Error-Detecting

    NASA Technical Reports Server (NTRS)

    Deng, Robert H.; Costello, Daniel J., Jr.

    1987-01-01

    A problem in designing semiconductor memories is to provide some measure of error control without requiring excessive coding overhead or decoding time. In LSI and VLSI technology, memories are often organized on a multiple bit (or byte) per chip basis. For example, some 256 K bit DRAM's are organized in 32 K x 8 bit-bytes. Byte-oriented codes such as Reed-Solomon (RS) codes can provide efficient low overhead error control for such memories. However, the standard iterative algorithm for decoding RS codes is too slow for these applications. The paper presents a special decoding technique for double-byte-error-correcting, triple-byte-error-detecting RS codes which is capable of high-speed operation. This technique is designed to find the error locations and the error values directly from the syndrome without having to use the iterative algorithm to find the error locator polynomial.

  10. Measuring bio-oil upgrade intermediates and corrosive species with polarity-matched analytical approaches

    DOE PAGES

    Connatser, Raynella M.; Lewis, Sr., Samuel Arthur; Keiser, James R.; ...

    2014-10-03

    Integrating biofuels with conventional petroleum products requires improvements in processing to increase blendability with existing fuels. This work demonstrates analysis techniques for more hydrophilic bio-oil liquids that give improved quantitative and qualitative description of the total acid content and organic acid profiles. To protect infrastructure from damage and reduce the cost associated with upgrading, accurate determination of acid content and representative chemical compound analysis are central imperatives to assessing both the corrosivity and the progress toward removing oxygen and acidity in processed biomass liquids. Established techniques form an ample basis for bio-liquids evaluation. However, early in the upgrading process, the unique physical phases and varied hydrophilicity of many pyrolysis liquids can render analytical methods originally designed for use in petroleum-derived oils inadequate. In this work, the water solubility of the organic acids present in bio-oils is exploited in a novel extraction and titration technique followed by analysis on the water-based capillary electrophoresis (CE) platform. The modification of ASTM D664, the standard for Total Acid Number (TAN), to include aqueous carrier solvents improves the utility of that approach for quantifying acid content in hydrophilic bio-oils. Termed AMTAN (modified Total Acid Number), this technique offers 1.2% relative standard deviation and dynamic range comparable to the conventional ASTM method. Furthermore, the results of corrosion product evaluations using several different sources of real bio-oil are discussed in the context of the unique AMTAN and CE analytical approaches developed to facilitate those measurements.

  11. Assessment of the interlaboratory variability and robustness of JAK2V617F mutation assays: A study involving a consortium of 19 Italian laboratories

    PubMed Central

    Perricone, Margherita; Palandri, Francesca; Ottaviani, Emanuela; Angelini, Mario; Bagli, Laura; Bellesia, Enrica; Donati, Meris; Gemmati, Donato; Zucchini, Patrizia; Mancini, Stefania; Marchica, Valentina; Trubini, Serena; Matteis, Giovanna De; Zacomo, Silvia Di; Favarato, Mosè; Fioroni, Annamaria; Bolzonella, Caterina; Maccari, Giorgia; Navaglia, Filippo; Gatti, Daniela; Toffolatti, Luisa; Orlandi, Linda; Laloux, Vèronique; Manfrini, Marco; Galieni, Piero; Giannini, Barbara; Tieghi, Alessia; Barulli, Sara; Serino, Maria Luisa; Maccaferri, Monica; Scortechini, Anna Rita; Giuliani, Nicola; Vallisa, Daniele; Bonifacio, Massimiliano; Accorsi, Patrizia; Salbe, Cristina; Fazio, Vinicio; Gusella, Milena; Toffoletti, Eleonora; Salvucci, Marzia; Svaldi, Mirija; Gherlinzoni, Filippo; Cassavia, Francesca; Orsini, Francesco; Martinelli, Giovanni

    2017-01-01

    To date, a variety of techniques for the detection of JAK2V617F are used across different laboratories, with substantial differences in specificity and sensitivity. Therefore, to provide reliable and comparable results, the standardization of molecular techniques is mandatory. A network of 19 centers was established to 1) evaluate the inter- and intra-laboratory variability in JAK2V617F quantification, 2) identify the most robust assay for the standardization of the molecular test and 3) allow consistent interpretation of individual patient analysis results. The study was conceived in 3 different rounds, in which all centers had to blindly test DNA samples with different JAK2V617F allele burden (AB) using both quantitative and qualitative assays. The positivity of samples with an AB < 1% was not detected by qualitative assays. Conversely, laboratories performing the quantitative approach were able to determine the expected JAK2V617F AB. Quantitative results were reliable across all mutation loads with moderate variability at low AB (0.1 and 1%; CV = 0.46 and 0.77, respectively). Remarkably, all laboratories clearly distinguished between the 0.1 and 1% mutated samples. In conclusion, a qualitative approach is not sensitive enough to detect the JAK2V617F mutation, especially at low AB. In contrast, the ipsogen JAK2 MutaQuant CE-IVD kit provided efficient and sensitive quantification of all mutation loads. This study sets the basis for the standardization of molecular techniques for JAK2V617F determination, which will require the employment of approved operating procedures and the use of certified standards, such as the recent WHO 1st International Reference Panel for Genomic JAK2V617F. PMID:28427233

  12. Assessment of the interlaboratory variability and robustness of JAK2V617F mutation assays: A study involving a consortium of 19 Italian laboratories.

    PubMed

    Perricone, Margherita; Palandri, Francesca; Ottaviani, Emanuela; Angelini, Mario; Bagli, Laura; Bellesia, Enrica; Donati, Meris; Gemmati, Donato; Zucchini, Patrizia; Mancini, Stefania; Marchica, Valentina; Trubini, Serena; De Matteis, Giovanna; Di Zacomo, Silvia; Favarato, Mosè; Fioroni, Annamaria; Bolzonella, Caterina; Maccari, Giorgia; Navaglia, Filippo; Gatti, Daniela; Toffolatti, Luisa; Orlandi, Linda; Laloux, Vèronique; Manfrini, Marco; Galieni, Piero; Giannini, Barbara; Tieghi, Alessia; Barulli, Sara; Serino, Maria Luisa; Maccaferri, Monica; Scortechini, Anna Rita; Giuliani, Nicola; Vallisa, Daniele; Bonifacio, Massimiliano; Accorsi, Patrizia; Salbe, Cristina; Fazio, Vinicio; Gusella, Milena; Toffoletti, Eleonora; Salvucci, Marzia; Svaldi, Mirija; Gherlinzoni, Filippo; Cassavia, Francesca; Orsini, Francesco; Martinelli, Giovanni

    2017-05-16

    To date, a variety of techniques for the detection of JAK2V617F are used across different laboratories, with substantial differences in specificity and sensitivity. Therefore, to provide reliable and comparable results, the standardization of molecular techniques is mandatory. A network of 19 centers was established to 1) evaluate the inter- and intra-laboratory variability in JAK2V617F quantification, 2) identify the most robust assay for the standardization of the molecular test and 3) allow consistent interpretation of individual patient analysis results. The study was conceived in 3 different rounds, in which all centers had to blindly test DNA samples with different JAK2V617F allele burden (AB) using both quantitative and qualitative assays. The positivity of samples with an AB < 1% was not detected by qualitative assays. Conversely, laboratories performing the quantitative approach were able to determine the expected JAK2V617F AB. Quantitative results were reliable across all mutation loads with moderate variability at low AB (0.1 and 1%; CV = 0.46 and 0.77, respectively). Remarkably, all laboratories clearly distinguished between the 0.1 and 1% mutated samples. In conclusion, a qualitative approach is not sensitive enough to detect the JAK2V617F mutation, especially at low AB. In contrast, the ipsogen JAK2 MutaQuant CE-IVD kit provided efficient and sensitive quantification of all mutation loads. This study sets the basis for the standardization of molecular techniques for JAK2V617F determination, which will require the employment of approved operating procedures and the use of certified standards, such as the recent WHO 1st International Reference Panel for Genomic JAK2V617F.

  13. High-level ab initio enthalpies of formation of 2,5-dimethylfuran, 2-methylfuran, and furan.

    PubMed

    Feller, David; Simmie, John M

    2012-11-29

    A high-level ab initio thermochemical technique, known as the Feller-Petersen-Dixon method, is used to calculate the total atomization energies and hence the enthalpies of formation of 2,5-dimethylfuran, 2-methylfuran, and furan itself as a means of rationalizing significant discrepancies in the literature. In order to avoid extremely large standard coupled cluster theory calculations, the explicitly correlated CCSD(T)-F12b variation was used with basis sets up to cc-pVQZ-F12. After extrapolating to the complete basis set limit and applying corrections for core/valence, scalar relativistic, and higher order effects, the final Δ(f)H° (298.15 K) values, with the available experimental values in parentheses are furan -34.8 ± 3 (-34.7 ± 0.8), 2-methylfuran -80.3 ± 5 (-76.4 ± 1.2), and 2,5-dimethylfuran -124.6 ± 6 (-128.1 ± 1.1) kJ mol(-1). The theoretical results exhibit a compelling internal consistency.
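    The complete-basis-set extrapolation step can be sketched with the common two-point 1/l³ formula, E_CBS = (n³·E_n − m³·E_m)/(n³ − m³), where m and n are consecutive basis-set cardinal numbers. The paper's explicitly correlated F12 calculations use scheme-specific parameters, so this is only illustrative, with hypothetical energies:

```python
def cbs_two_point(e_m, e_n, m, n):
    """Two-point 1/l^3 complete-basis-set extrapolation: assumes
    E(l) = E_CBS + A / l**3 for cardinal numbers l = m, n (m < n)."""
    return (n ** 3 * e_n - m ** 3 * e_m) / (n ** 3 - m ** 3)

# Hypothetical correlation energies (hartree) at triple- and quadruple-zeta
e_tz, e_qz = -0.95000, -0.97000
e_cbs = cbs_two_point(e_tz, e_qz, 3, 4)   # lies below both finite-basis values
```

For data that exactly follow E(l) = E_CBS + A/l³, the formula recovers E_CBS exactly; the core/valence, relativistic, and higher-order corrections the paper describes are then added on top of the extrapolated value.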

  14. Radial basis function neural networks in non-destructive determination of compound aspirin tablets on NIR spectroscopy.

    PubMed

    Dou, Ying; Mi, Hong; Zhao, Lingzhi; Ren, Yuqiu; Ren, Yulin

    2006-09-01

    Radial basis function (RBF) networks, the second most popular type of artificial neural network (ANN), have been applied to the quantitative analysis of drugs over the last decade. In this paper, two components (aspirin and phenacetin) were simultaneously determined in compound aspirin tablets by using near-infrared (NIR) spectroscopy and RBF networks. The total database was randomly divided into a training set (50) and a testing set (17). Different preprocessing methods (standard normal variate (SNV), multiplicative scatter correction (MSC), first-derivative and second-derivative) were applied to the two sets of NIR spectra of compound aspirin tablets with different concentrations of the two active components and compared with each other. The RBF learning algorithm adopted the nearest-neighbor clustering algorithm (NNCA), and model selection used a cross-validation technique. Results show that using RBF networks to quantitatively analyze tablets is reliable, and the best RBF model was obtained from first-derivative spectra.
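    The core of an RBF network is simple enough to sketch: inputs pass through Gaussian basis functions centered at chosen points, and the output layer is fit by linear least squares on those activations. A minimal NumPy sketch on a hypothetical 1-D calibration problem (centers picked naively every few samples rather than by the paper's NNCA clustering; data and gamma are invented):

```python
import numpy as np

def rbf_design(X, centers, gamma):
    """Gaussian hidden-layer activations: phi[i, j] = exp(-gamma * ||x_i - c_j||^2)."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def fit_rbf(X, y, centers, gamma):
    """Output-layer weights by linear least squares on the activations."""
    Phi = rbf_design(X, centers, gamma)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

# Toy 1-D problem standing in for spectra vs. concentration
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 50)[:, None]
y = np.sin(2 * np.pi * X[:, 0]) + 0.01 * rng.standard_normal(50)

gamma = 25.0
centers = X[::5]                       # naive center choice, one in five samples
w = fit_rbf(X, y, centers, gamma)
y_hat = rbf_design(X, centers, gamma) @ w
```

Because only the output weights are learned, training reduces to one linear solve once the centers and widths are fixed, which is why center selection (clustering, in the paper) matters so much.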

  15. [The German Disease Management Guideline Asthma: methods and development process].

    PubMed

    Kopp, Ina; Lelgemann, Monika; Ollenschläger, Günter

    2006-01-01

    The German National Program for Disease Management Guidelines, which is being operated under the auspices of the German Medical Association (GMA), the Association of the Scientific Medical Societies (AWMF) and the National Association of Statutory Health Insurance Physicians (NASHIP), provides a conceptual basis for the disease management of prioritized healthcare aspects. The main objective of the program is to establish consensus of the medical professions on key recommendations covering all sectors of healthcare provision and facilitating the coordination of care for the individual patient through time and across interfaces. Within the scope of this program, the Scientific Medical Societies concerned with the prevention, diagnosis, treatment and rehabilitation of asthma in children, adolescents and adults have reached consensus on the core contents for a National Disease Management Guideline for Asthma. This consensus was reached by applying formal techniques and on the basis of the adaptation of recommendations from existing guidelines with high quality standards in methodology and reporting, and information from evidence reports.

  16. Stress Management Apps With Regard to Emotion-Focused Coping and Behavior Change Techniques: A Content Analysis

    PubMed Central

    Hoffmann, Alexandra; Bleser, Gabriele

    2017-01-01

    Background Chronic stress has been shown to be associated with disease. This link is not only direct but also indirect through harmful health behavior such as smoking or changing eating habits. The recent mHealth trend offers a new and promising approach to support the adoption and maintenance of appropriate stress management techniques. However, only few studies have dealt with the inclusion of evidence-based content within stress management apps for mobile phones. Objective The aim of this study was to evaluate stress management apps on the basis of a new taxonomy of effective emotion-focused stress management techniques and an established taxonomy of behavior change techniques. Methods Two trained and independent raters evaluated 62 free apps found in Google Play with regard to 26 behavior change and 15 emotion-focused stress management techniques in October 2015. Results The apps included an average of 4.3 behavior change techniques (SD 4.2) and 2.8 emotion-focused stress management techniques (SD 2.6). The behavior change technique score and stress management technique score were highly correlated (r=.82, P=.01). Conclusions The broad variation of different stress management strategies found in this sample of apps goes in line with those found in conventional stress management interventions and self-help literature. Moreover, this study provided a first step toward more detailed and standardized taxonomies, which can be used to investigate evidence-based content in stress management interventions and enable greater comparability between different intervention types. PMID:28232299

  17. Strength evaluation of prosthetic check sockets, copolymer sockets, and definitive laminated sockets.

    PubMed

    Gerschutz, Maria J; Haynes, Michael L; Nixon, Derek; Colvin, James M

    2012-01-01

    A prosthesis encounters loading through forces and torques exerted by the person with amputation. International Organization for Standardization (ISO) standard 10328 was designed to test most lower-limb prosthetic components. However, this standard does not include prosthetic sockets. We measured static failure loads of prosthetic sockets using a modified ISO 10328 and then compared them with the criteria set by this standard for other components. Check socket (CS) strengths were influenced by thickness, material choice, and fabrication method. Copolymer socket (CP) strengths depended on thickness and fabrication methods. A majority of the CSs and all of the CPs failed to pass the ISO 10328 ductile loading criterion. In contrast, the strengths of definitive laminated sockets (DLs) were influenced more by construction material and technique. A majority of the DLs failed to pass the ISO 10328 brittle loading criterion. Analyzing prosthetic sockets from a variety of facilities demonstrated that socket performance varies considerably between and within facilities. The results from this article provide a foundation for understanding the quality of prosthetic sockets, some insight into possible routes for improving the current care delivered to patients, and a comparative basis for future technology.

  18. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    PubMed Central

    Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial domain, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help reduce the risk of making bad decisions in the decision-making process. PMID:26977450

  19. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network.

    PubMed

    Falat, Lukas; Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with the application of quantitative soft computing prediction models in the financial domain, as reliable and accurate prediction models can be very helpful in the management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. The authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate a genetic algorithm as an optimizing technique for adapting the parameters of the ANN, which is then compared with standard backpropagation and backpropagation combined with the K-means clustering algorithm. Finally, the authors find that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can help reduce the risk of making bad decisions in the decision-making process.

  20. Approximate techniques of structural reanalysis

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Lowder, H. E.

    1974-01-01

    A study is made of two approximate techniques for structural reanalysis. These include Taylor series expansions for response variables in terms of design variables and the reduced-basis method. In addition, modifications to these techniques are proposed to overcome some of their major drawbacks. The modifications include a rational approach to the selection of the reduced-basis vectors and the use of Taylor series approximation in an iterative process. For the reduced basis a normalized set of vectors is chosen which consists of the original analyzed design and the first-order sensitivity analysis vectors. The use of the Taylor series approximation as a first (initial) estimate in an iterative process, can lead to significant improvements in accuracy, even with one iteration cycle. Therefore, the range of applicability of the reanalysis technique can be extended. Numerical examples are presented which demonstrate the gain in accuracy obtained by using the proposed modification techniques, for a wide range of variations in the design variables.
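    The reduced-basis idea can be sketched on a hypothetical 3-DOF linear system: the basis collects the analyzed design's solution u0 and the first-order sensitivity du/dd = -K⁻¹(dK/dd)u0, and the modified design's equations are projected (Galerkin) onto that basis, so reanalysis solves a 2x2 system instead of the full one. The matrices below are invented for illustration:

```python
import numpy as np

K0 = np.array([[ 4.0, -1.0,  0.0],
               [-1.0,  4.0, -1.0],
               [ 0.0, -1.0,  4.0]])
K1 = np.diag([1.0, 2.0, 1.0])        # dK/dd for this hypothetical parameterization

def stiffness(d):
    """Hypothetical stiffness matrix K(d) = K0 + d*K1 (symmetric positive definite)."""
    return K0 + d * K1

f = np.array([1.0, 0.0, 1.0])
d0, dd = 1.0, 0.5                     # analyzed design and a design change

# Reduced basis: original solution plus first-order sensitivity vector
u0 = np.linalg.solve(stiffness(d0), f)
du = np.linalg.solve(stiffness(d0), -K1 @ u0)   # du/dd = -K^{-1} (dK/dd) u0
T = np.column_stack([u0, du])

# Galerkin projection of the modified design's equations onto the basis
K = stiffness(d0 + dd)
q = np.linalg.solve(T.T @ K @ T, T.T @ f)
u_rb = T @ q                                    # reduced-basis estimate
u_exact = np.linalg.solve(K, f)                 # full reanalysis, for comparison
```

The Taylor-series variant mentioned in the abstract corresponds to the fixed combination u0 + dd*du; letting the Galerkin projection choose the coefficients q is what makes the reduced basis more accurate over larger design changes.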

  1. Laryngoscope illuminance in a tertiary children's hospital: implications for quality laryngoscopy.

    PubMed

    Volsky, Peter G; Murphy, Michael K; Darrow, David H

    2014-07-01

    Laryngoscopes are used by otolaryngologists in a variety of hospital emergency and critical care settings. However, only rarely have quality-related aspects of laryngoscope function and application been studied. To compare the illuminance of laryngoscopes commonly used in a hospital setting to established standards and to assess the potential effects of maintenance practices on laryngoscope illuminance. Observational study of laryngoscope light output and cross-sectional survey of individuals charged with laryngoscope maintenance in a tertiary care children's hospital. Illuminance was chosen as the unit of measurement (lux). Laryngoscopes in the operating room, emergency department, and pediatric intensive care unit were tested according to a standard technique. Illuminance standards for laryngoscopes, published by the International Organization for Standardization (ISO) (500 lux) and in the medical literature (867 lux) were used as benchmarks. Mean laryngoscope illuminance by type of laryngoscope and light source and percentage of laryngoscopes with illuminance below established standards as well as nonfunctioning units. Maintenance practices were evaluated as a secondary outcome. A total of 319 laryngoscopes were tested; 283 were incandescent bulb units used by anesthesiologists, emergency physicians, and intensivists and 36 were xenon light units used by otolaryngologists. Mean (SD) illuminance was 1330 (1160) lux in the incandescent group and 16,600 (13,000) lux in the xenon group (P < .001). Substandard illuminance was observed only in the incandescent group, in 29% to 43% of laryngoscopes; 5% of the incandescent group did not turn on at all. Maintenance of laryngoscopes was performed on a reactive rather than a preventive basis. At our facility, approximately one-third of incandescent laryngoscopes exhibited substandard light output. On the basis of these findings, our hospital has converted all of its incandescent laryngoscopes to light-emitting diode (LED) devices. Such changes, as well as the institution of a quality-control program including scheduled laryngoscope inspection and battery and bulb replacement for incandescent laryngoscopes, may reduce adverse events associated with poor-quality direct laryngoscopy.

  2. 49 CFR 525.7 - Basis for petition.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION EXEMPTIONS FROM AVERAGE FUEL ECONOMY STANDARDS § 525.7 Basis for... comply with that average fuel economy standard; and (4) Anticipated consumer demand in the United States... these lubricants, explain the reasons for not so doing. (f) For each affected model year, a fuel economy...

  3. 7 CFR 75.8 - Basis of service.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 3 2014-01-01 2014-01-01 false Basis of service. 75.8 Section 75.8 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG...

  4. 7 CFR 75.8 - Basis of service.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Basis of service. 75.8 Section 75.8 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG...

  5. 7 CFR 75.8 - Basis of service.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Basis of service. 75.8 Section 75.8 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG...

  6. 7 CFR 75.8 - Basis of service.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Basis of service. 75.8 Section 75.8 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG...

  7. 7 CFR 75.8 - Basis of service.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Basis of service. 75.8 Section 75.8 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG...

  8. 38 CFR 21.4204 - Periodic certifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... eligible persons. An eligible person enrolled in a course which leads to a standard college degree, excepting eligible persons pursuing the course on a less than half-time basis, must verify each month his or... standard college degree, if a school organized on a term, quarter, or semester basis has reported...

  9. 40 CFR 465.20 - Applicability; description of the galvanized basis material subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... galvanized basis material subcategory. 465.20 Section 465.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS COIL COATING POINT SOURCE CATEGORY Galvanized Basis Material Subcategory § 465.20 Applicability; description of the galvanized basis material...

  10. 40 CFR 465.30 - Applicability; description of the aluminum basis material subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... aluminum basis material subcategory. 465.30 Section 465.30 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS COIL COATING POINT SOURCE CATEGORY Aluminum Basis Material Subcategory § 465.30 Applicability; description of the aluminum basis material...

  11. New methods and results for quantification of lightning-aircraft electrodynamics

    NASA Technical Reports Server (NTRS)

    Pitts, Felix L.; Lee, Larry D.; Perala, Rodney A.; Rudolph, Terence H.

    1987-01-01

The NASA F-106 collected data on the rates of change of electromagnetic parameters on the aircraft surface during over 700 direct lightning strikes while penetrating thunderstorms at altitudes from 15,000 to 40,000 ft (4,570 to 12,190 m). These in situ measurements provided the basis for the first statistical quantification of the lightning electromagnetic threat to aircraft appropriate for determining indirect lightning effects on aircraft. These data are used to update previous lightning criteria and standards developed over the years from ground-based measurements. The proposed standards will be the first which reflect actual aircraft responses measured at flight altitudes. Nonparametric maximum likelihood estimates of the distribution of the peak electromagnetic rates of change for consideration in the new standards are obtained based on peak recorder data for multiple-strike flights. The linear and nonlinear modeling techniques developed provide means to interpret and understand the direct-strike electromagnetic data acquired on the F-106. The reasonable results obtained with the models, compared with measured responses, provide increased confidence that the models may be credibly applied to other aircraft.
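For complete (uncensored) observations, the nonparametric maximum likelihood estimate of a distribution is simply the empirical CDF. A minimal sketch of that estimator, using synthetic, illustrative peak values rather than the F-106 data:

```python
import numpy as np

def ecdf(samples):
    """Empirical CDF: the nonparametric maximum-likelihood estimate of a
    distribution from complete (uncensored) observations."""
    x = np.sort(np.asarray(samples, float))
    probs = np.arange(1, x.size + 1) / x.size
    return x, probs

# Illustrative peak rate-of-change values (arbitrary units), not F-106 data.
rng = np.random.default_rng(7)
peaks = rng.lognormal(mean=1.0, sigma=0.5, size=500)

x, p = ecdf(peaks)
# Probability that a peak exceeds a threshold of 5 (exceedance = 1 - CDF):
exceed = 1.0 - np.interp(5.0, x, p)
print(f"P(peak > 5) ~ {exceed:.3f}")
```

With censored data (e.g., peaks that saturate a recorder), the nonparametric MLE generalizes to estimators of the Kaplan-Meier type rather than this plain ECDF.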

  12. Laser-Based Propagation of Human iPS and ES Cells Generates Reproducible Cultures with Enhanced Differentiation Potential

    PubMed Central

    Hohenstein Elliott, Kristi A.; Peterson, Cory; Soundararajan, Anuradha; Kan, Natalia; Nelson, Brandon; Spiering, Sean; Mercola, Mark; Bright, Gary R.

    2012-01-01

    Proper maintenance of stem cells is essential for successful utilization of ESCs/iPSCs as tools in developmental and drug discovery studies and in regenerative medicine. Standardization is critical for all future applications of stem cells and necessary to fully understand their potential. This study reports a novel approach for the efficient, consistent expansion of human ESCs and iPSCs using laser sectioning, instead of mechanical devices or enzymes, to divide cultures into defined size clumps for propagation. Laser-mediated propagation maintained the pluripotency, quality, and genetic stability of ESCs/iPSCs and led to enhanced differentiation potential. This approach removes the variability associated with ESC/iPSC propagation, significantly reduces the expertise, labor, and time associated with manual passaging techniques and provides the basis for scalable delivery of standardized ESC/iPSC lines. Adoption of standardized protocols would allow researchers to understand the role of genetics, environment, and/or procedural effects on stem cells and would ensure reproducible production of stem cell cultures for use in clinical/therapeutic applications. PMID:22701128

  13. 40 CFR 466.10 - Applicability; description of the steel basis material.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... steel basis material. 466.10 Section 466.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) PORCELAIN ENAMELING POINT SOURCE CATEGORY Steel Basis Material Subcategory § 466.10 Applicability; description of the steel basis material. This subpart...

  14. 40 CFR 465.10 - Applicability; description of the steel basis material subcategory.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... steel basis material subcategory. 465.10 Section 465.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS COIL COATING POINT SOURCE CATEGORY Steel Basis Material Subcategory § 465.10 Applicability; description of the steel basis material subcategory...

  15. 40 CFR 465.10 - Applicability; description of the steel basis material subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... steel basis material subcategory. 465.10 Section 465.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS COIL COATING POINT SOURCE CATEGORY Steel Basis Material Subcategory § 465.10 Applicability; description of the steel basis material subcategory...

  16. 40 CFR 466.10 - Applicability; description of the steel basis material.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... steel basis material. 466.10 Section 466.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY Steel Basis Material Subcategory § 466.10 Applicability; description of the steel basis material. This subpart applies to...

  17. 40 CFR 466.10 - Applicability; description of the steel basis material.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... steel basis material. 466.10 Section 466.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) PORCELAIN ENAMELING POINT SOURCE CATEGORY Steel Basis Material Subcategory § 466.10 Applicability; description of the steel basis material. This subpart...

  18. 40 CFR 466.10 - Applicability; description of the steel basis material.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... steel basis material. 466.10 Section 466.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) PORCELAIN ENAMELING POINT SOURCE CATEGORY Steel Basis Material Subcategory § 466.10 Applicability; description of the steel basis material. This subpart...

  19. 40 CFR 466.10 - Applicability; description of the steel basis material.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... steel basis material. 466.10 Section 466.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY Steel Basis Material Subcategory § 466.10 Applicability; description of the steel basis material. This subpart applies to...

  20. From Data to Knowledge – Promising Analytical Tools and Techniques for Capture and Reuse of Corporate Knowledge and to Aid in the State Evaluation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danielson, Gary R.; Augustenborg, Elsa C.; Beck, Andrew E.

    2010-10-29

The IAEA is challenged with limited availability of human resources for inspection and data analysis while proliferation threats increase. PNNL has a variety of IT solutions and techniques (at varying levels of maturity and development) that take raw data closer to useful knowledge, thereby assisting with and standardizing the analytical processes. This paper highlights some PNNL tools and techniques which are applicable to the international safeguards community, including: • Intelligent in-situ triage of data prior to reliable transmission to an analysis center, resulting in the transmission of smaller and more relevant data sets • Capture of expert knowledge in re-usable search strings tailored to specific mission outcomes • Image-based searching fused with text-based searching • Use of gaming to discover unexpected proliferation scenarios • Process modeling (e.g. Physical Model) as the basis for an information integration portal, which links to data storage locations along with analyst annotations, categorizations, geographic data, search strings and visualization outputs.

  1. Novel Calibration Technique for a Coulometric Evolved Vapor Analyzer for Measuring Water Content of Materials

    NASA Astrophysics Data System (ADS)

    Bell, S. A.; Miao, P.; Carroll, P. A.

    2018-04-01

    Evolved vapor coulometry is a measurement technique that selectively detects water and is used to measure water content of materials. The basis of the measurement is the quantitative electrolysis of evaporated water entrained in a carrier gas stream. Although this measurement has a fundamental principle—based on Faraday's law which directly relates electrolysis current to amount of substance electrolyzed—in practice it requires calibration. Commonly, reference materials of known water content are used, but the variety of these is limited, and they are not always available for suitable values, materials, with SI traceability, or with well-characterized uncertainty. In this paper, we report development of an alternative calibration approach using as a reference the water content of humid gas of defined dew point traceable to the SI via national humidity standards. The increased information available through this new type of calibration reveals a variation of the instrument performance across its range not visible using the conventional approach. The significance of this is discussed along with details of the calibration technique, example results, and an uncertainty evaluation.
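The Faraday's-law relation the abstract mentions can be made concrete: the integrated electrolysis charge fixes the amount of water, assuming the usual two electrons transferred per electrolyzed H2O molecule. A hedged worked example with illustrative numbers:

```python
# Sketch of the Faraday's-law relation underlying evolved-vapor coulometry.
# The current profile below is illustrative, not instrument data.
FARADAY = 96485.332  # C/mol
M_WATER = 18.015     # g/mol
Z = 2                # electrons transferred per electrolyzed H2O molecule

def water_mass_from_charge(charge_coulombs: float) -> float:
    """Mass of water (g) electrolyzed by a given integrated charge."""
    moles = charge_coulombs / (Z * FARADAY)
    return moles * M_WATER

# Example: a constant 10 mA electrolysis current integrated over 300 s.
charge = 0.010 * 300  # = 3 C
mass_g = water_mass_from_charge(charge)
print(f"{mass_g * 1e6:.1f} micrograms of water")  # ~280 micrograms
```

In practice, as the abstract notes, the instrument still requires calibration, because vapor transport, cell efficiency, and background effects depart from this ideal relation.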

  2. Atmospheric optical calibration system

    DOEpatents

    Hulstrom, Roland L.; Cannon, Theodore W.

    1988-01-01

    An atmospheric optical calibration system is provided to compare actual atmospheric optical conditions to standard atmospheric optical conditions on the basis of aerosol optical depth, relative air mass, and diffuse horizontal skylight to global horizontal photon flux ratio. An indicator can show the extent to which the actual conditions vary from standard conditions. Aerosol scattering and absorption properties, diffuse horizontal skylight to global horizontal photon flux ratio, and precipitable water vapor determined on a real-time basis for optical and pressure measurements are also used to generate a computer spectral model and for correcting actual performance response of a photovoltaic device to standard atmospheric optical condition response on a real-time basis as the device is being tested in actual outdoor conditions.

  3. 26 CFR 301.6334-3 - Determination of exempt amount.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... means an amount equal to— (1) The sum of— (i) The standard deduction (including additional standard... income other than on a weekly basis, the amount payable to that individual during any applicable pay... a regular weekly basis. (2) Specific pay periods other than weekly. In the case of wages, salary, or...

  4. Measurement of total ultrasonic power using thermal expansion and change in buoyancy of an absorbing target

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubey, P. K., E-mail: premkdubey@gmail.com; Kumar, Yudhisther; Gupta, Reeta

    2014-05-15

The Radiation Force Balance (RFB) technique is well established and most widely used for the measurement of total ultrasonic power radiated by ultrasonic transducer. The technique is used as a primary standard for calibration of ultrasonic transducers with relatively fair uncertainty in the low power (below 1 W) regime. In this technique, uncertainty comparatively increases in the range of few watts wherein the effects such as thermal heating of the target, cavitations, and acoustic streaming dominate. In addition, error in the measurement of ultrasonic power is also caused due to movement of absorber at relatively high radiated force which occurs at high power level. In this article a new technique is proposed which does not measure the balance output during transducer energized state as done in RFB. It utilizes the change in buoyancy of the absorbing target due to local thermal heating. The linear thermal expansion of the target changes the apparent mass in water due to buoyancy change. This forms the basis for the measurement of ultrasonic power particularly in watts range. The proposed method comparatively reduces uncertainty caused by various ultrasonic effects that occur at high power such as overshoot due to momentum of target at higher radiated force. The functionality of the technique has been tested and compared with the existing internationally recommended RFB technique.

  5. Measurement of total ultrasonic power using thermal expansion and change in buoyancy of an absorbing target

    NASA Astrophysics Data System (ADS)

    Dubey, P. K.; Kumar, Yudhisther; Gupta, Reeta; Jain, Anshul; Gohiya, Chandrashekhar

    2014-05-01

    The Radiation Force Balance (RFB) technique is well established and most widely used for the measurement of total ultrasonic power radiated by ultrasonic transducer. The technique is used as a primary standard for calibration of ultrasonic transducers with relatively fair uncertainty in the low power (below 1 W) regime. In this technique, uncertainty comparatively increases in the range of few watts wherein the effects such as thermal heating of the target, cavitations, and acoustic streaming dominate. In addition, error in the measurement of ultrasonic power is also caused due to movement of absorber at relatively high radiated force which occurs at high power level. In this article a new technique is proposed which does not measure the balance output during transducer energized state as done in RFB. It utilizes the change in buoyancy of the absorbing target due to local thermal heating. The linear thermal expansion of the target changes the apparent mass in water due to buoyancy change. This forms the basis for the measurement of ultrasonic power particularly in watts range. The proposed method comparatively reduces uncertainty caused by various ultrasonic effects that occur at high power such as overshoot due to momentum of target at higher radiated force. The functionality of the technique has been tested and compared with the existing internationally recommended RFB technique.

  6. Galerkin v. discrete-optimal projection in nonlinear model reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlberg, Kevin Thomas; Barone, Matthew Franklin; Antil, Harbir

Discrete-optimal model-reduction techniques such as the Gauss-Newton with Approximated Tensors (GNAT) method have shown promise, as they have generated stable, accurate solutions for large-scale turbulent, compressible flow problems where standard Galerkin techniques have failed. However, there has been limited comparative analysis of the two approaches. This is due in part to difficulties arising from the fact that Galerkin techniques perform projection at the time-continuous level, while discrete-optimal techniques do so at the time-discrete level. This work provides a detailed theoretical and experimental comparison of the two techniques for two common classes of time integrators: linear multistep schemes and Runge-Kutta schemes. We present a number of new findings, including conditions under which the discrete-optimal ROM has a time-continuous representation, conditions under which the two techniques are equivalent, and time-discrete error bounds for the two approaches. Perhaps most surprisingly, we demonstrate both theoretically and experimentally that decreasing the time step does not necessarily decrease the error for the discrete-optimal ROM; instead, the time step should be 'matched' to the spectral content of the reduced basis. In numerical experiments carried out on a turbulent compressible-flow problem with over one million unknowns, we show that increasing the time step to an intermediate value decreases both the error and the simulation time of the discrete-optimal reduced-order model by an order of magnitude.

  7. A technique for measuring vertically and horizontally polarized microwave brightness temperatures using electronic polarization-basis rotation

    NASA Technical Reports Server (NTRS)

    Gasiewski, Albin J.

    1992-01-01

    This technique for electronically rotating the polarization basis of an orthogonal-linear polarization radiometer is based on the measurement of the first three feedhorn Stokes parameters, along with the subsequent transformation of this measured Stokes vector into a rotated coordinate frame. The technique requires an accurate measurement of the cross-correlation between the two orthogonal feedhorn modes, for which an innovative polarized calibration load was developed. The experimental portion of this investigation consisted of a proof of concept demonstration of the technique of electronic polarization basis rotation (EPBR) using a ground based 90-GHz dual orthogonal-linear polarization radiometer. Practical calibration algorithms for ground-, aircraft-, and space-based instruments were identified and tested. The theoretical effort consisted of radiative transfer modeling using the planar-stratified numerical model described in Gasiewski and Staelin (1990).
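The coordinate-frame transformation at the heart of EPBR can be sketched from the standard Stokes rotation for the first three parameters in brightness-temperature units. The paper's actual calibration chain is more involved; the values below are purely illustrative:

```python
import math

def rotate_basis(tv: float, th: float, u: float, phi_deg: float):
    """Brightness temperatures (K) in a polarization basis rotated by phi.

    Standard rotation of the first three Stokes parameters; u is the
    third (cross-correlation) Stokes parameter in temperature units.
    """
    phi = math.radians(phi_deg)
    c2, s2 = math.cos(phi) ** 2, math.sin(phi) ** 2
    s2phi = math.sin(2 * phi)
    tv_rot = tv * c2 + th * s2 + 0.5 * u * s2phi
    th_rot = tv * s2 + th * c2 - 0.5 * u * s2phi
    return tv_rot, th_rot

# Sanity check: a 90-degree rotation swaps the two linear channels.
print(rotate_basis(250.0, 230.0, 10.0, 90.0))
```

Note the role of the third Stokes parameter: without an accurate measurement of the cross-correlation term u (hence the polarized calibration load described above), the rotated channels cannot be recovered.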

  8. Satisfiability of logic programming based on radial basis function neural networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamadneh, Nawaf; Sathasivam, Saratha; Tilahun, Surafel Luleseged

    2014-07-10

In this paper, we propose a new technique to test the satisfiability of propositional logic programming and the quantified Boolean formula problem in radial basis function neural networks. For this purpose, we built radial basis function neural networks to represent propositional logic with exactly three variables in each clause. We used the prey-predator algorithm to calculate the output weights of the neural networks, while the K-means clustering algorithm is used to determine the hidden parameters (the centers and the widths). The mean of the sum squared error function is used to measure the activity of the two algorithms. We applied the developed technique with recurrent radial basis function neural networks to represent quantified Boolean formulas. The new technique can be applied to solve many applications such as electronic circuits and NP-complete problems.

  9. Reduced basis technique for evaluating the sensitivity coefficients of the nonlinear tire response

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Tanner, John A.; Peters, Jeanne M.

    1992-01-01

    An efficient reduced-basis technique is proposed for calculating the sensitivity of nonlinear tire response to variations in the design variables. The tire is modeled using a 2-D, moderate rotation, laminated anisotropic shell theory, including the effects of variation in material and geometric parameters. The vector of structural response and its first-order and second-order sensitivity coefficients are each expressed as a linear combination of a small number of basis vectors. The effectiveness of the basis vectors used in approximating the sensitivity coefficients is demonstrated by a numerical example involving the Space Shuttle nose-gear tire, which is subjected to uniform inflation pressure.
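The core idea, expressing the response vector as a linear combination of a small number of basis vectors, can be sketched with a POD-style basis obtained from an SVD of snapshot data. The snapshot matrix below is random and purely illustrative, not a tire model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Full-order "response" snapshots: n degrees of freedom, m samples.
n, m, k = 200, 20, 5
snapshots = rng.standard_normal((n, m))

# Reduced basis: leading k left singular vectors of the snapshot matrix.
basis, _, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = basis[:, :k]

# Express a new response vector as a linear combination of basis vectors.
response = snapshots @ rng.standard_normal(m)
coeffs = basis.T @ response   # orthonormal basis, so projection is a dot product
approx = basis @ coeffs

rel_err = np.linalg.norm(response - approx) / np.linalg.norm(response)
print(f"relative projection error with {k} basis vectors: {rel_err:.3f}")
```

In the paper, the same expansion is applied not only to the response but also to its first- and second-order sensitivity coefficients, so that the expensive full-order system is solved only for the small set of basis vectors.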

  10. Detection of infragenual arterial disease using noncontrast-enhanced MR angiography in patients with diabetes

    PubMed Central

    Liu, Xin; Zhang, Na; Fan, Zhaoyang; Feng, Fei; Yang, Qi; Zheng, Hairong; Liu, Pengcheng; Li, Debiao

    2013-01-01

Purpose To evaluate the diagnostic performance of a newly developed noncontrast-enhanced MR angiography (NCE-MRA) technique using flow-sensitive dephasing (FSD) prepared steady-state free precession (SSFP) for detecting calf arterial disease in patients with diabetes. Materials and Methods Forty-five patients with diabetes who underwent routine CE-MRA of the lower extremities were recruited for NCE-MRA at the calf on a 1.5T MR system. Image quality, evaluated on a four-point scale, and diagnostic performance for detecting more than 50% arterial stenosis were statistically analyzed, using CE-MRA as the standard of reference. Results A total of 264 calf arterial segments were obtained in the 45 patients (88 legs). The percentage of diagnostic arterial segments was 98% for both NCE- and CE-MRA. The image quality, SNR, and CNR were 3.3, 177, and 138 for NCE-MRA and 3.5, 103, and 99 for CE-MRA, respectively. The average sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of NCE-MRA were 97%, 96%, 90%, 99%, and 96%, respectively, on a per-segment basis and 90%, 84%, 82%, 91%, and 87%, respectively, on a per-patient basis. Conclusion The NCE-MRA technique demonstrates adequate image quality in the delineation of calf arteries and diagnostic performance consistent with CE-MRA for detecting significant stenosis in patients with diabetes. PMID:24925770

  11. HPLC fingerprint analysis combined with chemometrics for pattern recognition of ginger.

    PubMed

    Feng, Xu; Kong, Weijun; Wei, Jianhe; Ou-Yang, Zhen; Yang, Meihua

    2014-03-01

Ginger, the fresh rhizome of Zingiber officinale Rosc. (Zingiberaceae), has been used worldwide; however, for a long time there has been no internationally approved standard for its quality control. The objective was to establish an efficacious combined method and pattern-recognition technique for the quality control of ginger. A simple, accurate and reliable method based on high-performance liquid chromatography with photodiode array (HPLC-PDA) detection was developed for establishing the chemical fingerprints of 10 batches of ginger from different markets in China. The method was validated in terms of precision, reproducibility and stability, and the relative standard deviations were all less than 1.57%. On the basis of this method, the fingerprints of 10 batches of ginger samples were obtained, which showed 16 common peaks. Coupled with similarity evaluation software, the similarities between each fingerprint of the sample and the simulative mean chromatogram were in the range of 0.998-1.000. Then, the chemometric techniques, including similarity analysis, hierarchical clustering analysis and principal component analysis, were applied to classify the ginger samples. Consistent results were obtained to show that ginger samples could be successfully classified into two groups. This study revealed that the HPLC-PDA method was simple, sensitive and reliable for fingerprint analysis and, moreover, for pattern recognition and quality control of ginger.
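The similarity-evaluation step can be illustrated with a correlation-style (cosine) similarity between fingerprint vectors; the peak areas below are synthetic, not the published ginger data:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two chromatographic fingerprint vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Illustrative fingerprints: areas of the 16 common peaks for each batch.
rng = np.random.default_rng(1)
reference = rng.uniform(1, 10, 16)             # simulative mean chromatogram
batch = reference * rng.normal(1.0, 0.02, 16)  # a batch with ~2% variation

print(f"similarity to reference: {cosine_similarity(reference, batch):.4f}")
```

A batch whose peak pattern closely tracks the simulative mean chromatogram scores near 1.0, consistent with the 0.998-1.000 range reported; hierarchical clustering and PCA then group batches by these fingerprint vectors.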

  12. Development of an Uncertainty Model for the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Walter, Joel A.; Lawrence, William R.; Elder, David W.; Treece, Michael D.

    2010-01-01

    This paper introduces an uncertainty model being developed for the National Transonic Facility (NTF). The model uses a Monte Carlo technique to propagate standard uncertainties of measured values through the NTF data reduction equations to calculate the combined uncertainties of the key aerodynamic force and moment coefficients and freestream properties. The uncertainty propagation approach to assessing data variability is compared with ongoing data quality assessment activities at the NTF, notably check standard testing using statistical process control (SPC) techniques. It is shown that the two approaches are complementary and both are necessary tools for data quality assessment and improvement activities. The SPC approach is the final arbiter of variability in a facility. Its result encompasses variation due to people, processes, test equipment, and test article. The uncertainty propagation approach is limited mainly to the data reduction process. However, it is useful because it helps to assess the causes of variability seen in the data and consequently provides a basis for improvement. For example, it is shown that Mach number random uncertainty is dominated by static pressure variation over most of the dynamic pressure range tested. However, the random uncertainty in the drag coefficient is generally dominated by axial and normal force uncertainty with much less contribution from freestream conditions.
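The Monte Carlo propagation approach can be sketched for one quantity mentioned above: Mach number computed from total and static pressure via the isentropic relation. The pressures and standard uncertainties below are assumed for illustration and are not NTF values:

```python
import numpy as np

GAMMA = 1.4  # ratio of specific heats for air

def mach(p_total, p_static):
    """Subsonic Mach number from total and static pressure (isentropic flow)."""
    return np.sqrt(2.0 / (GAMMA - 1.0)
                   * ((p_total / p_static) ** ((GAMMA - 1.0) / GAMMA) - 1.0))

rng = np.random.default_rng(42)
n = 100_000
# Illustrative measured values (Pa) with assumed standard uncertainties.
p_total = rng.normal(101_325.0, 30.0, n)
p_static = rng.normal(79_500.0, 60.0, n)  # larger static-pressure scatter

samples = mach(p_total, p_static)
print(f"M = {samples.mean():.4f} +/- {samples.std(ddof=1):.5f}")
```

Re-running with the static-pressure uncertainty set to zero shows how much of the Mach scatter it drives, which is exactly the kind of dominance analysis the paper uses to attribute variability to individual measurements.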

  13. The far-infrared view on the distant Universe

    NASA Astrophysics Data System (ADS)

    Elbaz, David

    2015-08-01

I will review what we have learnt about distant galaxies from far-infrared surveys and present new ways to identify z>2 highly star-forming galaxies, often missed by standard techniques such as LBGs, that may represent the missing progenitors of passive z~2 galaxies. I will also discuss inconsistencies between SFR indicators that can be linked to the starburstiness and compactness of star-forming galaxies. Based on these results we will discuss the evidence for and against the existence of a SFR-M* main sequence up to z=4. The impact of the spatial distribution of star formation and its evolution with redshift will be discussed on the basis of newly obtained ALMA data.

  14. Computer-assisted enzyme immunoassays and simplified immunofluorescence assays: applications for the diagnostic laboratory and the veterinarian's office.

    PubMed

    Jacobson, R H; Downing, D R; Lynch, T J

    1982-11-15

    A computer-assisted enzyme-linked immunosorbent assay (ELISA) system, based on kinetics of the reaction between substrate and enzyme molecules, was developed for testing large numbers of sera in laboratory applications. Systematic and random errors associated with conventional ELISA technique were identified leading to results formulated on a statistically validated, objective, and standardized basis. In a parallel development, an inexpensive system for field and veterinary office applications contained many of the qualities of the computer-assisted ELISA. This system uses a fluorogenic indicator (rather than the enzyme-substrate interaction) in a rapid test (15 to 20 minutes' duration) which promises broad application in serodiagnosis.

  15. Antioxidative properties of hydroxycinnamic acid derivatives and a phenylpropanoid glycoside. A pulse radiolysis study

    NASA Astrophysics Data System (ADS)

    Lin, Weizhen; Navaratnam, Suppiah; Yao, Side; Lin, Nianyun

    1998-10-01

Spectral and redox properties of the phenoxyl radicals from hydroxycinnamic acid derivatives and one selected component of phenylpropanoid glycosides, verbascoside, were studied using pulse radiolysis techniques. On the basis of the pH dependence of the phenoxyl radical absorptions, the pKa values for deprotonation of the sinapic acid radical and the ferulic acid radical are 4.9 and 5.2, respectively. The rate constants of one-electron oxidation of these antioxidants by the azide radical and the bromide radical ion were determined at pH 7. The redox potentials of these antioxidants were determined as 0.59-0.71 V vs NHE at pH 7, with 4-methoxyphenol and resorcinol as reference standards.

  16. Analytic integration of real-virtual counterterms in NNLO jet cross sections I

    NASA Astrophysics Data System (ADS)

    Aglietti, Ugo; Del Duca, Vittorio; Duhr, Claude; Somogyi, Gábor; Trócsányi, Zoltán

    2008-09-01

    We present analytic evaluations of some integrals needed to give explicitly the integrated real-virtual counterterms, based on a recently proposed subtraction scheme for next-to-next-to-leading order (NNLO) jet cross sections. After an algebraic reduction of the integrals, integration-by-parts identities are used for the reduction to master integrals and for the computation of the master integrals themselves by means of differential equations. The results are written in terms of one- and two-dimensional harmonic polylogarithms, once an extension of the standard basis is made. We expect that the techniques described here will be useful in computing other integrals emerging in calculations in perturbative quantum field theories.

  17. Microelectromechanical systems(MEMS): Launching Research Concepts into the Marketplace

    NASA Astrophysics Data System (ADS)

    Arney, Susanne

    1999-04-01

    More than a decade following the demonstration of the first spinning micromotors and microgears, the field of microelectromechanical systems (MEMS) has burgeoned on a worldwide basis. Integrated circuit design, fabrication, and packaging techniques have provided the foundation for the growth of an increasingly mature MEMS infrastructure which spans numerous topics of research as well as industrial application. The remarkable proliferation of MEMS concepts into such contrasting arenas of application as automotive sensors, biology, optical and wireless telecommunications, displays, printing, and physics experiments will be described. Challenges to commercialization of research prototypes will be discussed with emphasis on the development of design, fabrication, packaging, reliability and standards which fundamentally enable the application of MEMS to a highly diversified marketplace.

  18. The use of deep convective clouds to uniformly calibrate the next generation of geostationary reflective solar imagers

    NASA Astrophysics Data System (ADS)

    Doelling, David R.; Bhatt, Rajendra; Haney, Conor O.; Gopalan, Arun; Scarino, Benjamin R.

    2017-09-01

The new 3rd generation geostationary (GEO) imagers will have many of the same NPP-VIIRS imager spectral bands, thereby offering the opportunity to apply the VIIRS cloud, aerosol, and land use retrieval algorithms to the new GEO imager measurements. Climate quality retrievals require multi-channel calibrated radiances that are stable over time. The deep convective cloud calibration technique (DCCT) is a large-ensemble statistical technique that assumes that the DCC reflectance is stable over time. Because DCC are found in sufficient numbers across all GEO domains, they provide a uniform calibration stability evaluation across the GEO constellation. The baseline DCCT has been successful in calibrating visible and near-infrared channels. However, for shortwave infrared (SWIR) channels the DCCT is not as effective for monitoring radiometric stability. In this paper, the DCCT is optimized as a function of wavelength. For SWIR bands, the greatest reduction of the DCC response trend standard error was achieved through deseasonalization. This is effective because the DCC reflectance exhibits small regional seasonal cycles that can be characterized on a monthly basis. On the other hand, the inter-annual variability in DCC response was found to be extremely small. The Met-9 0.65-μm channel DCC response was found to have a 3% seasonal cycle. Deseasonalization reduced the trend standard error from 1% to 0.4%. For the NPP-VIIRS SWIR bands, deseasonalization reduced the trend standard error by more than half. All VIIRS SWIR band trend standard errors were less than 1%. The DCCT should be able to monitor the stability of all GEO imager solar reflective bands across the tropical domain with the same uniform accuracy.
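Deseasonalization by removing a monthly climatology can be sketched as follows; the series is synthetic with a pure seasonal cycle and no trend, so the residual scatter collapses to zero:

```python
import numpy as np

def deseasonalize(values, months):
    """Remove a monthly climatology: subtract each calendar month's mean,
    adding back the overall mean so the long-term level is preserved."""
    values = np.asarray(values, float)
    months = np.asarray(months)
    out = values.copy()
    for m in np.unique(months):
        sel = months == m
        out[sel] += values.mean() - values[sel].mean()
    return out

# Illustrative monthly DCC responses: a 3% seasonal cycle about a flat level.
t = np.arange(60)          # five years of monthly records
months = t % 12
series = 100.0 + 3.0 * np.sin(2 * np.pi * t / 12)

flat = deseasonalize(series, months)
print(f"std before: {series.std():.2f}, after: {flat.std():.2f}")
```

On real records the residual after deseasonalization contains any genuine calibration drift plus noise, which is why the trend standard error, rather than the raw scatter, is the quantity the paper reports.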

  19. Estimating propagation velocity through a surface acoustic wave sensor

    DOEpatents

    Xu, Wenyuan; Huizinga, John S.

    2010-03-16

    Techniques are described for estimating the propagation velocity through a surface acoustic wave sensor. In particular, techniques that measure and exploit a proper segment of the phase-frequency response of the sensor are described for use as a basis of bacterial detection. As described, velocity estimation based on a proper segment of the phase-frequency response has advantages over conventional techniques that use phase shift as the basis for detection.
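    As a rough illustration of the idea (not the patented procedure; the path length, frequency band, and velocity below are made-up values), the propagation velocity of a delay line can be recovered from the slope of a linear segment of its phase-frequency response, since the phase varies as φ(f) = 2πfL/v plus an offset:

```python
import numpy as np

L = 0.01          # m, assumed acoustic path length (illustrative)
v_true = 3980.0   # m/s, a plausible SAW velocity (illustrative)

# Toy phase-frequency response over a "proper" (linear) segment, with small noise
f = np.linspace(95e6, 105e6, 50)
phi = 2 * np.pi * f * L / v_true + 0.3 \
      + np.random.default_rng(4).normal(0, 1e-3, f.size)

# Velocity from the fitted slope dphi/df = 2*pi*L/v
slope = np.polyfit(f, phi, 1)[0]
v_est = 2 * np.pi * L / slope
print(abs(v_est - v_true) / v_true < 1e-3)
```

    The slope-based estimate is insensitive to the constant phase offset, which is one reason a segment of the phase response can be more robust than a single phase-shift reading.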

  20. Microfabricated multijunction thermal converters

    NASA Astrophysics Data System (ADS)

    Wunsch, Thomas Franzen

    2001-12-01

    In order to develop improved standards for the measurement of ac voltages and currents, a new thin-film fabrication technique for the multijunction thermal converter has been developed. The ability of a thermal converter to relate an rms ac voltage or current to a dc value is characterized by a quantity called `ac-dc difference' that is ideally zero. The best devices produced using the new techniques have ac-dc differences below 1 × 10⁻⁶ in the range of frequencies from 20 Hz to 10 kHz and below 7.5 × 10⁻⁶ in the range of frequencies from 20 kHz to 300 kHz. This is a reduction of two orders of magnitude in the lower frequency range and one order of magnitude in the higher frequency range over devices produced at the National Institute of Standards and Technology in 1996. The performance achieved is competitive with the best techniques in the world for ac measurements and additional evaluation is therefore warranted to determine the suitability of the devices for use as national standards that form the legal basis for traceable rms voltage measurements of time varying waveforms in the United States. The construction of the new devices is based on thin-film fabrication of a heated wire supported by a thermally isolated thin-film membrane. The membrane is produced utilizing a reactive ion plasma etch. A photoresist lift-off technique is used to pattern the metal thin-film layers that form the heater and the multijunction thermocouple circuit. The etching and lift-off allow the device to be produced without wet chemical etches that are time consuming and impede the investigation of structures with differing materials. These techniques result in an approach to fabrication that is simple, inexpensive, and free from the manual construction techniques used in the fabrication of conventional single and multijunction thermoelements. Thermal, thermoelectric, and electrical models have been developed to facilitate designs that reduce the low-frequency error. At high frequencies, from 300 kHz to 1 MHz, the performance of the device is degraded by a capacitive coupling effect that produces an ac-dc difference of approximately -90 × 10⁻⁶ at 1 MHz. A model is developed that explains this behavior. The model shows that an improvement in performance in the high-frequency range is possible through the use of very high or very low resistivity silicon substrates.

  1. Atmospheric optical calibration system

    DOEpatents

    Hulstrom, R.L.; Cannon, T.W.

    1988-10-25

    An atmospheric optical calibration system is provided to compare actual atmospheric optical conditions to standard atmospheric optical conditions on the basis of aerosol optical depth, relative air mass, and diffuse horizontal skylight to global horizontal photon flux ratio. An indicator can show the extent to which the actual conditions vary from standard conditions. Aerosol scattering and absorption properties, diffuse horizontal skylight to global horizontal photon flux ratio, and precipitable water vapor determined on a real-time basis for optical and pressure measurements are also used to generate a computer spectral model and for correcting actual performance response of a photovoltaic device to standard atmospheric optical condition response on a real-time basis as the device is being tested in actual outdoor conditions. 7 figs.

  2. Atmospheric optical calibration system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hulstrom, R.L.; Cannon, T.W.

    1988-10-25

    An atmospheric optical calibration system is provided to compare actual atmospheric optical conditions to standard atmospheric optical conditions on the basis of aerosol optical depth, relative air mass, and diffuse horizontal skylight to global horizontal photon flux ratio. An indicator can show the extent to which the actual conditions vary from standard conditions. Aerosol scattering and absorption properties, diffuse horizontal skylight to global horizontal photon flux ratio, and precipitable water vapor determined on a real-time basis for optical and pressure measurements are also used to generate a computer spectral model and for correcting actual performance response of a photovoltaic device to standard atmospheric optical condition response on a real-time basis as the device is being tested in actual outdoor conditions. 7 figs.

  3. Bayesian image reconstruction - The pixon and optimal image modeling

    NASA Technical Reports Server (NTRS)

    Pina, R. K.; Puetter, R. C.

    1993-01-01

    In this paper we describe the optimal image model, maximum residual likelihood method (OptMRL) for image reconstruction. OptMRL is a Bayesian image reconstruction technique for removing point-spread function blurring. OptMRL uses both a goodness-of-fit criterion (GOF) and an 'image prior', i.e., a function which quantifies the a priori probability of the image. Unlike standard maximum entropy methods, which typically reconstruct the image on the data pixel grid, OptMRL varies the image model in order to find the optimal functional basis with which to represent the image. We show how an optimal basis for image representation can be selected and in doing so, develop the concept of the 'pixon' which is a generalized image cell from which this basis is constructed. By allowing both the image and the image representation to be variable, the OptMRL method greatly increases the volume of solution space over which the image is optimized. Hence the likelihood of the final reconstructed image is greatly increased. For the goodness-of-fit criterion, OptMRL uses the maximum residual likelihood probability distribution introduced previously by Pina and Puetter (1992). This GOF probability distribution, which is based on the spatial autocorrelation of the residuals, has the advantage that it ensures spatially uncorrelated image reconstruction residuals.

  4. Stress Management Apps With Regard to Emotion-Focused Coping and Behavior Change Techniques: A Content Analysis.

    PubMed

    Christmann, Corinna Anna; Hoffmann, Alexandra; Bleser, Gabriele

    2017-02-23

    Chronic stress has been shown to be associated with disease. This link is not only direct but also indirect through harmful health behavior such as smoking or changing eating habits. The recent mHealth trend offers a new and promising approach to support the adoption and maintenance of appropriate stress management techniques. However, only a few studies have dealt with the inclusion of evidence-based content within stress management apps for mobile phones. The aim of this study was to evaluate stress management apps on the basis of a new taxonomy of effective emotion-focused stress management techniques and an established taxonomy of behavior change techniques. Two trained and independent raters evaluated 62 free apps found in Google Play with regard to 26 behavior change and 15 emotion-focused stress management techniques in October 2015. The apps included an average of 4.3 behavior change techniques (SD 4.2) and 2.8 emotion-focused stress management techniques (SD 2.6). The behavior change technique score and stress management technique score were highly correlated (r=.82, P=.01). The broad variation of stress management strategies found in this sample of apps is in line with that found in conventional stress management interventions and self-help literature. Moreover, this study provided a first step toward more detailed and standardized taxonomies, which can be used to investigate evidence-based content in stress management interventions and enable greater comparability between different intervention types. ©Corinna Anna Christmann, Alexandra Hoffmann, Gabriele Bleser. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 23.02.2017.

  5. Chopped random-basis quantum optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caneva, Tommaso; Calarco, Tommaso; Montangero, Simone

    2011-08-15

    In this work, we describe in detail the chopped random basis (CRAB) optimal control technique recently introduced to optimize time-dependent density matrix renormalization group simulations [P. Doria, T. Calarco, and S. Montangero, Phys. Rev. Lett. 106, 190501 (2011)]. Here, we study the efficiency of this control technique in optimizing different quantum processes and we show that in the considered cases we obtain results equivalent to those obtained via different optimal control methods while using less resources. We propose the CRAB optimization as a general and versatile optimal control technique.
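    The core CRAB idea, expanding the control around a guess pulse in a truncated Fourier basis with randomized frequencies and optimizing the few expansion coefficients with a derivative-free search, can be sketched on a toy pulse-matching cost. This is only a schematic illustration of the parametrization, not the DMRG-coupled quantum optimization of the paper; the guess pulse, frequencies, and cost are all made up.

```python
import numpy as np

rng = np.random.default_rng(1)
T, n_t, n_c = 1.0, 200, 4              # horizon, time grid, number of basis functions
t = np.linspace(0, T, n_t)
r = rng.uniform(-0.5, 0.5, n_c)        # randomized ("chopped random") frequency shifts
freqs = 2 * np.pi * (np.arange(1, n_c + 1) + r) / T

def pulse(c):
    """CRAB ansatz: flat guess pulse times a truncated randomized Fourier correction."""
    a, b = c[:n_c], c[n_c:]
    corr = 1 + sum(a[k] * np.sin(freqs[k] * t) + b[k] * np.cos(freqs[k] * t)
                   for k in range(n_c))
    return np.ones_like(t) * corr

c_true = rng.normal(0, 0.3, 2 * n_c)
target = pulse(c_true)                 # toy "optimal" control to recover

def cost(c):
    return float(np.mean((pulse(c) - target) ** 2))

# Derivative-free pattern search over the few CRAB coefficients
c, step = np.zeros(2 * n_c), 0.5
best = cost(c)
while step > 1e-4:
    improved = False
    for i in range(c.size):
        for s in (+step, -step):
            trial = c.copy(); trial[i] += s
            f = cost(trial)
            if f < best:
                c, best, improved = trial, f, True
    if not improved:
        step *= 0.5
print(best)
```

    The point of the ansatz is that the search space collapses from a full time-discretized control to a handful of coefficients, which is what makes gradient-free optimizers practical.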

  6. Using a pruned, nondirect product basis in conjunction with the multi-configuration time-dependent Hartree (MCTDH) method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wodraszka, Robert, E-mail: Robert.Wodraszka@chem.queensu.ca; Carrington, Tucker, E-mail: Tucker.Carrington@queensu.ca

    In this paper, we propose a pruned, nondirect product multi-configuration time dependent Hartree (MCTDH) method for solving the Schrödinger equation. MCTDH uses optimized 1D basis functions, called single particle functions, but the size of the standard direct product MCTDH basis scales exponentially with D, the number of coordinates. We compare the pruned approach to standard MCTDH calculations for basis sizes small enough that the latter are possible and demonstrate that pruning the basis reduces the CPU cost of computing vibrational energy levels of acetonitrile (D = 12) by more than two orders of magnitude. Using the pruned method, it is possible to do calculations with larger bases, for which the cost of standard MCTDH calculations is prohibitive. Pruning the basis complicates the evaluation of matrix-vector products. In this paper, they are done term by term for a sum-of-products Hamiltonian. When no attempt is made to exploit the fact that matrices representing some of the factors of a term are identity matrices, one needs only to carefully constrain indices. In this paper, we develop new ideas that make it possible to further reduce the CPU time by exploiting identity matrices.
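    To see why pruning helps, it is enough to count basis functions. A common pruning condition keeps only those direct-product indices (n₁, …, n_D) whose sum does not exceed a bound b. The sketch below is generic counting with made-up D, n_max, and b (not the acetonitrile basis of the paper):

```python
from itertools import product
from math import comb

D, nmax, b = 6, 9, 9                      # 6 coordinates, 10 basis functions each
full = (nmax + 1) ** D                    # standard direct-product basis size
# Pruned basis: keep only index tuples with n1 + ... + nD <= b
pruned = sum(1 for idx in product(range(nmax + 1), repeat=D) if sum(idx) <= b)

print(full, pruned)  # → 1000000 5005, i.e. C(b + D, D); over two orders of magnitude smaller
```

    With b = nmax the individual caps are never active, so the pruned count equals the number of nonnegative integer solutions of Σ nᵢ ≤ b, which is C(b + D, D); the exponential direct-product growth is replaced by polynomial growth.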

  7. Operational Soil Moisture Retrieval Techniques: Theoretical Comparisons in the Context of Improving the NASA Standard Approach

    NASA Astrophysics Data System (ADS)

    Mladenova, I. E.; Jackson, T. J.; Bindlish, R.; Njoku, E. G.; Chan, S.; Cosh, M. H.

    2012-12-01

    We are currently evaluating potential improvements to the standard NASA global soil moisture product derived using observations acquired from the Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E). A major component of this effort is a thorough review of the theoretical basis of available passive-based soil moisture retrieval algorithms suitable for operational implementation. Several agencies provide routine soil moisture products. Our research focuses on five well-established techniques that are capable of carrying out global retrieval using the same AMSR-E data set as the NASA approach (i.e., X-band brightness temperature data). In general, most passive-based algorithms include two major components: radiative transfer modeling, which provides the smooth-surface reflectivity properties of the soil surface, and a complex dielectric constant model of the soil-water mixture. These two components are related through the Fresnel reflectivity equations. Furthermore, the land surface temperature, vegetation, roughness, and soil properties need to be adequately accounted for in the radiative transfer and dielectric modeling. All of the available approaches we have examined follow the general data processing flow described above; however, the actual solutions as well as the final products can be very different. This is primarily a result of the assumptions, the number of sensor variables utilized, the selected ancillary data sets, and the approaches used to account for the effect of the additional geophysical variables impacting the measured signal. The operational NASA AMSR-E-based retrievals have been shown to have a dampened temporal response and sensitivity range. Two possible approaches to addressing these issues are being evaluated: enhancing the theoretical basis of the existing algorithm, if feasible, or directly adjusting the dynamic range of the final soil moisture product. 
Both of these aspects are being actively investigated and will be discussed in our talk. Improving the quality and reliability of the global soil moisture product would result in greater acceptance and utilization in the related applications. USDA is an equal opportunity provider and employer.

  8. Comparison of the sensitivity and specificity of 5 image sets of dual-energy computed tomography for detecting first-pass myocardial perfusion defects compared with positron emission tomography.

    PubMed

    Li, Wenhuan; Zhu, Xiaolian; Li, Jing; Peng, Cheng; Chen, Nan; Qi, Zhigang; Yang, Qi; Gao, Yan; Zhao, Yang; Sun, Kai; Li, Kuncheng

    2014-12-01

    The sensitivity and specificity of 5 different image sets of dual-energy computed tomography (DECT) for the detection of first-pass myocardial perfusion defects have not systematically been compared using positron emission tomography (PET) as a reference standard. Forty-nine consecutive patients with known or strongly suspected coronary artery disease were prospectively enrolled in our study. Cardiac DECT was performed at rest using a second-generation 128-slice dual-source CT. The DECT data were reconstructed to iodine maps, monoenergetic images, 100 kV images, nonlinearly blended images, and linearly blended images by different postprocessing techniques. The myocardial perfusion defects on DECT images were visually assessed by 5 observers using the standard 17-segment model. Diagnostic accuracy of the 5 image sets was assessed using nitrogen-13 ammonia PET as the gold standard. Discrimination was quantified using the area under the receiver operating characteristic curve (AUC), and AUCs were compared using the method of DeLong. The DECT and PET examinations were successfully completed in 30 patients, and a total of 90 territories and 510 segments were analyzed. Cardiac PET revealed myocardial perfusion defects in 56 territories (62%) and 209 segments (41%). The AUCs of iodine maps, monoenergetic images, 100 kV images, nonlinearly blended images, and linearly blended images were 0.986, 0.934, 0.913, 0.881, and 0.871, respectively, on a per-territory basis. These values were 0.922, 0.813, 0.779, 0.763, and 0.728, respectively, on a per-segment basis. DECT iodine maps show high sensitivity and specificity, and are superior to the other DECT image sets for the detection of first-pass myocardial perfusion defects.
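    For readers unfamiliar with the figure of merit, the AUC used here is the probability that a reader scores a territory with a true perfusion defect higher than a normal one. A minimal rank-based version is sketched below on toy confidence scores against a toy reference (not the study data, and without DeLong's test for comparing correlated AUCs):

```python
import numpy as np

def auc(scores, labels):
    """Rank-based AUC: probability a positive case outranks a negative (ties count half)."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum() \
         + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Toy per-territory confidence scores (0-4) for two image sets vs. a PET reference
truth   = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0], bool)
iodine  = np.array([4, 3, 4, 1, 0, 2, 3, 1, 4, 0])
blended = np.array([3, 2, 4, 2, 1, 3, 2, 1, 3, 2])
print(auc(iodine, truth), auc(blended, truth))  # → 1.0 0.8
```

    The cleaner the separation between defect and normal territories, the closer the AUC is to 1, which is the sense in which the iodine maps outperform the blended image sets.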

  9. Seven propositions of the science of improvement: exploring foundations.

    PubMed

    Perla, Rocco J; Provost, Lloyd P; Parry, Gareth J

    2013-01-01

    The phrase "Science of Improvement" or "Improvement Science" is commonly used today by a range of people and professions to mean different things, creating confusion for those trying to learn about improvement. In this article, we briefly define the concepts of improvement and science, and review the history of the consideration of "improvement" as a science. We trace key concepts and ideas in improvement to their philosophical and theoretical foundation with a focus on Deming's System of Profound Knowledge. We suggest that Deming's system has a firm association with many contemporary and historic philosophic and scientific debates and concepts. With reference to these debates and concepts, we identify 7 propositions that provide the scientific and philosophical foundation for the science of improvement. A standard view of the science of improvement, grounded in the philosophical and theoretical basis of the field, does not presently exist. The 7 propositions outlined here demonstrate the value of examining the underpinnings of improvement. This is needed to both advance the field and minimize confusion about what the phrase "science of improvement" represents. We argue that advanced scientists of improvement are those who, like Deming and Shewhart, can integrate ideas, concepts, and models between scientific disciplines for the purpose of developing more robust improvement models, tools, and techniques with a focus on application and problem solving in real world contexts. The epistemological foundations and theoretical basis of the science of improvement and its reasoning methods need to be critically examined to ensure its continued development and relevance. 
If improvement efforts and projects in health care are to be characterized under the canon of science, then health care professionals engaged in quality improvement work would benefit from a standard set of core principles, a standard lexicon, and an understanding of the evolution of the science of improvement.

  10. FT-IR, FT-Raman, UV spectra and DFT calculations on monomeric and dimeric structure of 2-amino-5-bromobenzoic acid.

    PubMed

    Karabacak, Mehmet; Cinar, Mehmet

    2012-02-01

    In this work, the molecular conformation, vibrational, and electronic transition analysis of 2-amino-5-bromobenzoic acid (2A5BrBA) was presented for the ground state using experimental techniques (FT-IR, FT-Raman and UV) and density functional theory (DFT) with the B3LYP exchange-correlation functional and the 6-311++G(d,p) basis set. FT-IR and FT-Raman spectra were recorded in the regions of 400-4000 cm(-1) and 50-4000 cm(-1), respectively. There are four conformers, C1, C2, C3 and C4, for this molecule. The geometrical parameters, energies, and wavenumbers were obtained for all four conformers. The computational results diagnose the most stable conformer of 2A5BrBA as the C1 form. The complete assignments of fundamental vibrations were performed on the basis of the total energy distribution (TED) of the vibrational modes, calculated with the scaled quantum mechanics (SQM) method. Raman activities calculated by the DFT method were converted to the corresponding Raman intensities using Raman scattering theory. The UV spectra of the investigated compound were recorded in the region of 200-400 nm for ethanol and water solutions. The electronic properties were evaluated with the help of time-dependent DFT (TD-DFT), and the results were compared with experimental observations. The thermodynamic properties of the studied compound at different temperatures were calculated, revealing the correlations between standard heat capacity, standard entropy, standard enthalpy changes, and temperature. The calculated geometric parameters, vibrational wavenumbers, and electronic transitions were compared with the observed data and found to be in good agreement. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Application of artificial neural networks for conformity analysis of fuel performed with an optical fiber sensor

    NASA Astrophysics Data System (ADS)

    Possetti, Gustavo Rafael Collere; Coradin, Francelli Klemba; Côcco, Lílian Cristina; Yamamoto, Carlos Itsuo; de Arruda, Lucia Valéria Ramos; Falate, Rosane; Muller, Marcia; Fabris, José Luís

    2008-04-01

    Liquid fuel quality control is an important issue that brings benefits for the State, for consumers, and for the environment. Conformity analysis, especially for gasoline, demands a rigorous sampling technique among gas stations and other economic agencies, followed by a series of standard physicochemical tests. Such procedures are commonly expensive and time demanding and, moreover, a specialist is often required to carry out the tasks. These drawbacks make the development of alternative analysis tools an important research field. The fuel refractive index, readily measured with prospective optical fiber sensors that operate as transducers with singular properties, is an additional parameter that can aid conformity analysis. When this parameter is correlated with the sample density, it becomes possible to determine conformity zones that cannot be analytically defined. This work presents an application of artificial neural networks based on radial basis functions to determine these zones. A set of 45 gasoline samples, collected in several gas stations and previously analyzed according to the rules of Agência Nacional do Petróleo, Gás Natural e Biocombustíveis, a Brazilian regulatory agency, constituted the database used to build two neural networks. The input variables of the first network are the samples' refractive indices, measured with an Abbe refractometer, and their densities, measured with a digital densimeter. For the second network, the input variables included, besides the sample densities, the wavelength response of a long-period grating to the sample refractive indices. The grating was written in an optical fiber using the point-to-point technique by submitting the fiber to consecutive electrical arcs from a splice machine. 
    The output variables of both radial basis function networks represent the conformity status of each sample, according to reports of tests carried out following the American Society for Testing and Materials and/or Brazilian Association of Technical Rules standards. A subset of 35 samples, randomly chosen from the database, was used to design and calibrate (train) both networks. The two network topologies (number of radial-basis-function neurons in the hidden layer and function radius) were chosen to minimize the root mean square error. The subset composed of the other 10 samples was used to validate the final network architectures. The results demonstrate that both networks achieve good predictive capability.
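    A minimal radial-basis-function network of the kind described can be sketched as follows. The (refractive index, density) pairs, the conformity rule, and the topology below are synthetic stand-ins; the paper's networks were trained on ANP-analyzed gasoline samples and tuned on a held-out validation set.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic (refractive index, density) pairs and a toy linear conformity rule
n = 60
X = rng.uniform([1.40, 0.72], [1.46, 0.78], size=(n, 2))
y = ((X[:, 0] - 1.43) * 50 + (X[:, 1] - 0.75) * 40 > 0).astype(float)

def rbf_design(X, centers, radius):
    """Gaussian activations of the hidden RBF layer."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * radius ** 2))

centers, radius = X[::3], 0.02                 # 20 hidden neurons, fixed function radius
Phi = rbf_design(X, centers, radius)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)    # linear output layer by least squares

pred = rbf_design(X, centers, radius) @ w > 0.5
acc = float((pred == y.astype(bool)).mean())
print(acc)
```

    In the paper, the number of hidden neurons and the function radius were themselves chosen to minimize the root mean square error on the 35 training samples, with the remaining 10 samples reserved for validation.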

  12. Automated Assessment of Child Vocalization Development Using LENA.

    PubMed

    Richards, Jeffrey A; Xu, Dongxin; Gilkerson, Jill; Yapanel, Umit; Gray, Sharmistha; Paul, Terrance

    2017-07-12

    To produce a novel, efficient measure of children's expressive vocal development on the basis of automatic vocalization assessment (AVA), child vocalizations were automatically identified and extracted from audio recordings using Language Environment Analysis (LENA) System technology. Assessment was based on full-day audio recordings collected in a child's unrestricted, natural language environment. AVA estimates were derived using automatic speech recognition modeling techniques to categorize and quantify the sounds in child vocalizations (e.g., protophones and phonemes). These were expressed as phone and biphone frequencies, reduced to principal components, and input to age-based multiple linear regression models to predict independently collected criterion expressive-language scores. From these models, we generated vocal development AVA estimates as age-standardized scores and development age estimates. AVA estimates demonstrated strong statistical reliability and validity when compared with standard criterion expressive language assessments. Automated analysis of child vocalizations extracted from full-day recordings in natural settings offers a novel and efficient means to assess children's expressive vocal development. More research remains to identify specific mechanisms of operation.
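    The modeling chain described (frequency features reduced to principal components, then regressed against a criterion score) can be sketched generically. The data below are synthetic low-rank "phone frequency" features for a single age band, not LENA output, and the dimensions are made up.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in: 80 recordings, 30 phone/biphone frequency features that
# share k latent dimensions, and a criterion score driven by those latents
n, p, k = 80, 30, 5
latent = rng.normal(0, 1, (n, k))
F = latent @ rng.normal(0, 1, (k, p)) + rng.normal(0, 0.3, (n, p))
score = latent @ rng.normal(0, 1, k) + rng.normal(0, 0.2, n)

# Reduce the features to k principal components via SVD on centered data
Fc = F - F.mean(0)
Z = Fc @ np.linalg.svd(Fc, full_matrices=False)[2][:k].T

# Multiple linear regression of the criterion score on the components
A = np.hstack([Z, np.ones((n, 1))])
beta, *_ = np.linalg.lstsq(A, score, rcond=None)
pred = A @ beta

r = float(np.corrcoef(pred, score)[0, 1])
print(r)   # the components carry most of the criterion variance
```

    Age standardization in the described system then converts the model prediction into a z-like score relative to same-age peers, a step omitted from this single-age sketch.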

  13. Temperature-dependent microindentation data of an epoxy composition in the glassy region

    NASA Astrophysics Data System (ADS)

    Minster, Jiří; Králík, Vlastimil

    2015-02-01

    The short-term instrumented microindentation technique was applied to assess the influence of temperature in the glassy region on the time-dependent mechanical properties of an average epoxy resin mix near its native state. Linear viscoelasticity theory, with the assumption of a time-independent Poisson's ratio, forms the basis for processing the experimental results. A sharp standard Berkovich indenter was used to measure the local mechanical properties at temperatures of 20, 24, 28, and 35 °C. The short-term viscoelastic compliance histories were described by the Kohlrausch-Williams-Watts double exponential function. The findings suggest that depth-sensing indentation data of thermorheologically simple materials measured at different temperatures in the glassy region can also be used, through time-temperature superposition, to accurately extract viscoelastic response functions. This statement is supported by comparison of the viscoelastic compliance master curve of the tested material with data derived from standard macro creep measurements under pressure on the material in a conformable state.
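    The time-temperature superposition step amounts to a horizontal log-time shift that overlays compliance curves measured at different temperatures onto a master curve. The sketch below uses an illustrative power-law compliance with made-up parameters, not the measured epoxy data, and recovers the shift factor by a simple grid search:

```python
import numpy as np

# Toy thermorheologically simple compliance: a log-time shift overlays the curves
def compliance(t, log_aT):
    return 0.5 + 0.1 * (t * 10.0 ** log_aT) ** 0.3   # power-law stand-in for KWW

t = np.logspace(-1, 2, 50)                # s, short-term indentation window
J_ref = compliance(t, 0.0)                # reference temperature (e.g., 20 °C)
J_hot = compliance(t, 1.2)                # higher temperature: the material "runs faster"

# Recover log10(aT) by overlaying the shifted reference curve on the hot data
candidates = np.linspace(0, 2, 401)
errs = [np.mean((compliance(t, s) - J_hot) ** 2) for s in candidates]
log_aT = candidates[int(np.argmin(errs))]
print(round(log_aT, 2))  # → 1.2, the shift used to generate the "hot" curve
```

    With shift factors known at each temperature, short-term measurements taken over a modest time window can be spliced into a master curve spanning many decades of time.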

  14. Quantifying and characterizing proanthocyanidins in cranberries in relation to urinary tract health.

    PubMed

    Krueger, Christian G; Reed, Jess D; Feliciano, Rodrigo P; Howell, Amy B

    2013-05-01

    The "A-type" proanthocyanidins in cranberry fruit (Vaccinium macrocarpon Ait.) are bioactive components associated with prevention of urinary tract infections (UTI). Cranberry juice, fruit (fresh and dried), functional foods, and cranberry dietary supplements are promoted for prevention of UTI and for maintenance of urinary tract health (UTH), on the basis of their content of cranberry proanthocyanidins (c-PAC) with "A-type" interflavan bonds. With increasing consumer use of cranberries for maintenance of UTH and an expanding number of commercial cranberry products of different types, the availability of unified methods for measuring levels of c-PAC is important. This review discusses quantitative and qualitative analysis of c-PAC with "A-type" interflavan bonds in relation to their biological activity for UTI prevention. The integrity (including authenticity, standardization, efficacy, and safety) of cranberry fruit, juices, and dietary supplements may now be measured by using recent advances in mass spectrometry, liquid chromatography, production of c-PAC standards, and improved simple quantitative techniques.

  15. NIST Role in Advancing Innovation

    NASA Astrophysics Data System (ADS)

    Semerjian, Hratch

    2006-03-01

    According to the National Innovation Initiative, a report of the Council on Competitiveness, innovation will be the single most important factor in determining America's success through the 21st century. NIST's mission is to promote U.S. innovation and industrial competitiveness by advancing measurement science, standards, and technology -- in ways that enhance economic security and improve the quality of life for all Americans. NIST innovations in measurement science and technology often become the basis for new industrial capabilities. Several examples of such developments will be discussed, including the development of techniques for manipulation and measurement of biomolecules, which may become the building blocks for molecular electronics; expansion of the frontiers of quantum theory to develop the field of quantum computing and communication; development of atomic-scale measurement capabilities for future nano- and molecular-scale electronic devices; development of a lab-on-a-chip that can detect within seconds trace amounts of toxic chemicals in water, or can be used for rapid DNA analysis; and standards to facilitate supply chain interoperability.

  16. Uncertainty Evaluation of the New Setup for Measurement of Water-Vapor Permeation Rate by a Dew-Point Sensor

    NASA Astrophysics Data System (ADS)

    Hudoklin, D.; Šetina, J.; Drnovšek, J.

    2012-09-01

    The measurement of the water-vapor permeation rate (WVPR) through materials is very important in many industrial applications such as the development of new fabrics and construction materials, in the semiconductor industry, packaging, vacuum techniques, etc. The demand for this kind of measurement is growing considerably, and thus many different methods for measuring the WVPR have been developed and standardized within numerous national and international standards. However, comparison of the existing methods shows a low level of mutual agreement. The objective of this paper is to demonstrate the necessary uncertainty evaluation for WVPR measurements, so as to provide a basis for the development of a corresponding reference measurement standard. This paper presents a specially developed measurement setup, which employs a precision dew-point sensor for WVPR measurements on specimens of different shapes. The paper also presents a physical model which tries to account for both dynamic and quasi-static methods, the common types of WVPR measurements referred to in standards and scientific publications. An uncertainty evaluation carried out according to the ISO/IEC guide to the expression of uncertainty in measurement (GUM) shows the relative expanded (k = 2) uncertainty to be 3.0 % for a WVPR of 6.71 mg·h⁻¹ (corresponding to a permeance of 30.4 mg·m⁻²·day⁻¹·hPa⁻¹).
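    The GUM machinery behind the quoted expanded uncertainty can be illustrated with a deliberately simplified two-input budget. The WVPR = W/t model and all numbers below are made up for illustration; the paper's budget has more components and a more detailed physical model.

```python
import math

# Illustrative budget: WVPR derived from collected water mass W over elapsed time t
W, u_W = 6.71, 0.08      # mg, value and standard uncertainty of the water mass
t, u_t = 1.00, 0.005     # h, value and standard uncertainty of the timing

wvpr = W / t
# Sensitivity coefficients from the partial derivatives of W/t
c_W, c_t = 1 / t, -W / t**2
# Combined standard uncertainty (uncorrelated inputs), then expanded with k = 2
u_c = math.sqrt((c_W * u_W) ** 2 + (c_t * u_t) ** 2)
U = 2 * u_c
rel = 100 * U / wvpr
print(round(rel, 1))  # → 2.6, the relative expanded (k = 2) uncertainty in % for this toy budget
```

    In a full GUM evaluation each influence quantity (dew-point sensor calibration, flow, temperature, leakage, adsorption) contributes its own sensitivity-weighted term to the same root-sum-square.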

  17. 7 CFR 42.105 - Basis for selection of sample.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE COMMODITY STANDARDS AND STANDARD CONTAINER REGULATIONS STANDARDS FOR CONDITION OF FOOD CONTAINERS Procedures for Stationary Lot Sampling and Inspection...

  18. Candidate substances for space bioprocessing methodology and data specification for benefit evaluation

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Analytical and quantitative economic techniques are applied to the evaluation of the economic benefits of a wide range of substances for space bioprocessing. On the basis of expected clinical applications, as well as the size of the patient population that could be affected by those applications, eight substances are recommended for further benefit evaluation. Results show that a transitional probability methodology can be used to model at least one clinical application for each of these substances. In each recommended case, the disease and its therapy are sufficiently well understood and documented, and the statistical data are available to operate the model and produce estimates of the impact of new therapy systems on the cost of treatment, morbidity, and mortality. Using the morbidity and mortality information produced by the model, a standard economic technique called the Value of Human Capital is applied to estimate the social welfare benefits that could be attributable to the new therapy systems.

  19. Measurement of tracer gas distributions using an open-path FTIR system coupled with computed tomography

    NASA Astrophysics Data System (ADS)

    Drescher, Anushka C.; Yost, Michael G.; Park, Doo Y.; Levine, Steven P.; Gadgil, Ashok J.; Fischer, Marc L.; Nazaroff, William W.

    1995-05-01

    Optical remote sensing and iterative computed tomography (CT) can be combined to measure the spatial distribution of gaseous pollutant concentrations in a plane. We have conducted chamber experiments to test this combination of techniques using an Open Path Fourier Transform Infrared Spectrometer (OP-FTIR) and a standard algebraic reconstruction technique (ART). ART was found to converge to solutions that showed excellent agreement with the ray integral concentrations measured by the FTIR but were inconsistent with simultaneously gathered point sample concentration measurements. A new CT method was developed based on (a) the superposition of bivariate Gaussians to model the concentration distribution and (b) a simulated annealing minimization routine to find the parameters of the Gaussians that resulted in the best fit to the ray integral concentration data. This new method, named smooth basis function minimization (SBFM) generated reconstructions that agreed well, both qualitatively and quantitatively, with the concentration profiles generated from point sampling. We present one set of illustrative experimental data to compare the performance of ART and SBFM.
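    The ART step is a Kaczmarz-type sweep over the ray equations: the current estimate is projected onto the hyperplane defined by each measured ray integral in turn. A minimal version on a 2 × 2 pixel grid with five rays (a toy geometry, not the chamber setup, and noise-free data) converges to the field that reproduces the ray integrals:

```python
import numpy as np

# Ray geometry on a 2x2 pixel grid: two row rays, two column rays, one diagonal
A = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 0, 1]], dtype=float)
x_true = np.array([3.0, 1.0, 2.0, 4.0])   # toy concentration field (one value per pixel)
b = A @ x_true                            # ray-integral concentrations (noise-free)

# ART (Kaczmarz): project the estimate onto each ray equation in turn
x = np.zeros(4)
for sweep in range(500):
    for a_i, b_i in zip(A, b):
        x += (b_i - a_i @ x) / (a_i @ a_i) * a_i

print(np.allclose(x, x_true))  # → True
```

    With noisy, inconsistent data ART instead cycles near a compromise solution, which is one motivation for smooth parametric alternatives such as the Gaussian-basis SBFM method developed in this work.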

  20. Arthroscopic trans-osseous rotator cuff repair

    PubMed Central

    Chillemi, Claudio; Mantovani, Matteo

    2017-01-01

    Summary Background: Mechanical factors are at the basis of any tendon healing process, with pressure being an aspect able to positively influence it. For this reason transosseous rotator cuff repair represents the gold standard procedure for patients affected by a cuff tear, maximizing the tendon footprint contact area and reducing motion at the tendon-to-bone interface. Methods: The Authors present an all-arthroscopic suture bridge-like transosseous repair with the preparation of a single transosseous tunnel performed with a precise dedicated instrument (Compasso®) and one implant (Elite-SPK®), using only 3 suture wires. In addition, this technique permits accurate preparation of the bony side of the lesion without risks or complications such as anchor pull-out and greater tuberosity osteolysis. Conclusions: Even if this technique seems less demanding, arthroscopic transosseous repair is still an advanced procedure and should be performed only by well-prepared arthroscopic shoulder surgeons. Level of evidence: V. PMID:28717607

  1. Delayed entanglement echo for individual control of a large number of nuclear spins

    PubMed Central

    Wang, Zhen-Yu; Casanova, Jorge; Plenio, Martin B.

    2017-01-01

    Methods to selectively detect and manipulate nuclear spins by single electrons of solid-state defects play a central role for quantum information processing and nanoscale nuclear magnetic resonance (NMR). However, with standard techniques, no more than eight nuclear spins have been resolved by a single defect centre. Here we develop a method that significantly improves the ability to detect, address and manipulate nuclear spins unambiguously and individually in a broad frequency band, using a nitrogen-vacancy (NV) centre as model system. On the basis of delayed entanglement control, a technique combining microwave and radio frequency fields, our method allows one to selectively perform robust high-fidelity entangling gates between hardly resolved nuclear spins and the NV electron. Long-lived qubit memories can be naturally incorporated into our method for improved performance. The application of our ideas will increase the number of useful register qubits accessible to a defect centre and improve the signal of nanoscale NMR. PMID:28256508

  2. Delayed entanglement echo for individual control of a large number of nuclear spins.

    PubMed

    Wang, Zhen-Yu; Casanova, Jorge; Plenio, Martin B

    2017-03-03

    Methods to selectively detect and manipulate nuclear spins by single electrons of solid-state defects play a central role for quantum information processing and nanoscale nuclear magnetic resonance (NMR). However, with standard techniques, no more than eight nuclear spins have been resolved by a single defect centre. Here we develop a method that significantly improves the ability to detect, address and manipulate nuclear spins unambiguously and individually in a broad frequency band, using a nitrogen-vacancy (NV) centre as model system. On the basis of delayed entanglement control, a technique combining microwave and radio frequency fields, our method allows one to selectively perform robust high-fidelity entangling gates between hardly resolved nuclear spins and the NV electron. Long-lived qubit memories can be naturally incorporated into our method for improved performance. The application of our ideas will increase the number of useful register qubits accessible to a defect centre and improve the signal of nanoscale NMR.

  3. [CT pulmonary density mapping: surgical utility].

    PubMed

    Gavezzoli, D; Caputo, P; Manelli, A; Zuccon, W; Faccini, M; Bonandrini, L

    2002-04-01

    The present paper considers the technique of CT scan maps of pulmonary isodensity, examining lung density differences as a function of the type of disease and considering their significance for the purposes of refined, useful diagnosis in a surgical context. Methods: The method was used to examine 3 groups of subjects selected on a clinical/anamnestic basis and a further group already admitted for surgery. For each patient we obtained 2 thoracic density scans, during the phases of maximum inspiration and expiration. On each scan we constructed 50 isodensity maps, the equivalent of more than 2500 measurements: the preliminary standard was represented by 100 wide windows to produce total "illumination" of the pulmonary fields. The isodensity windows were then codified differently. Subsequently, the density scans were analysed with the technique of scalar decomposition. The CT scan maps of lung isodensity proved useful for certain lung diseases in which early diagnosis, the topographic extent of the pathology and the refined definition of the pathological picture provide important solutions as regards the indication and planning of surgical treatment and the evaluation of operative risk and prognosis. We consider the technique rapid, simple and inexpensive, and able to supply detailed information on the lung parenchyma such that it can be used not only as a routine technique but also in emergencies.

  4. Localized basis functions and other computational improvements in variational nonorthogonal basis function methods for quantum mechanical scattering problems involving chemical reactions

    NASA Technical Reports Server (NTRS)

    Schwenke, David W.; Truhlar, Donald G.

    1990-01-01

    The Generalized Newton Variational Principle for 3D quantum mechanical reactive scattering is briefly reviewed. Then three techniques are described which improve the efficiency of the computations. First, the fact that the Hamiltonian is Hermitian is used to reduce the number of integrals computed, and then the properties of localized basis functions are exploited in order to eliminate redundant work in the integral evaluation. A new type of localized basis function with desirable properties is suggested. It is shown how partitioned matrices can be used with localized basis functions to reduce the amount of work required to handle the complex boundary conditions. The new techniques do not introduce any approximations into the calculations, so they may be used to obtain converged solutions of the Schroedinger equation.

  5. Response Time Analysis and Test of Protection System Instrument Channels for APR1400 and OPR1000

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Chang Jae; Han, Seung; Yun, Jae Hee

    2015-07-01

    Safety limits are required to maintain the integrity of physical barriers designed to prevent the uncontrolled release of radioactive materials in nuclear power plants. The safety analysis establishes two critical constraints: an analytical limit in terms of a measured or calculated variable, and a specific time after the analytical limit is reached to begin protective action. In keeping with the nuclear regulations and industry standards, satisfying these two requirements ensures that the safety limit will not be exceeded during the design basis event, either an anticipated operational occurrence or a postulated accident. Various studies on the setpoint determination methodology for safety-related instrumentation have been actively performed to ensure that the requirement of the analytical limit is satisfied. In particular, the protection setpoint methodology for the advanced power reactor 1400 (APR1400) and the optimized power reactor 1000 (OPR1000) has recently been developed to cover both the design basis event and the beyond design basis event. The developed setpoint methodology has also been quantitatively validated using specific computer programs and setpoint calculations. However, the safety of nuclear power plants cannot be fully guaranteed by satisfying the requirement of the analytical limit alone. In spite of the response time verification requirements of nuclear regulations and industry standards, studies on a systematically integrated methodology for response time evaluation are hard to find. In the cases of APR1400 and OPR1000, the response time analysis for the plant protection system is partially included in the setpoint calculation, and the response time test is performed separately via a specific plant procedure. The test technique has the drawback that it is difficult to demonstrate the completeness of the timing test.
The analysis technique also has the drawback of yielding extreme response times that are not actually possible. Thus, the establishment of a systematic response time evaluation methodology is needed to justify conformance to the response time requirement used in the safety analysis. This paper proposes a response time evaluation methodology for APR1400 and OPR1000 using a combined analysis and test technique to confirm that the plant protection system can meet the analytical response time assumed in the safety analysis. In addition, the results of the quantitative evaluation performed for APR1400 and OPR1000 are presented in this paper. The proposed response time analysis technique consists of defining the response time requirement, determining the critical signal path for the trip parameter, allocating an individual response time to each component on the signal path, and analyzing the total response time for the trip parameter; it demonstrates that the total analyzed response time does not exceed the response time requirement. The proposed response time test technique is composed of defining the response time requirement, determining the critical signal path for the trip parameter, determining the test method for each component on the signal path, and performing the response time test; it demonstrates that the total test result does not exceed the response time requirement. The total response time should be tested in a single test that covers from the sensor to the final actuation device on the instrument channel. When the total channel is not tested in a single test, separate tests on groups of components or single components covering the total instrument channel shall be combined to verify the total channel response. For APR1400 and OPR1000, the ramp test technique is used for the pressure and differential pressure transmitters, and the step function test technique is applied to the signal processing equipment and final actuation device.
As a result, it can be demonstrated that the response time requirement is satisfied by the combined analysis and test technique. Therefore, the proposed methodology plays a crucial role in guaranteeing the safety of nuclear power plants by systematically satisfying one of the two critical requirements from the safety analysis. (authors)
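As a rough illustration of the analysis step (allocating a response time to each component on the critical signal path and comparing the total to the requirement), a minimal sketch with hypothetical component names and times:

```python
# Hedged sketch of the response time analysis step. Component names, allocated
# times and the requirement are hypothetical, not APR1400/OPR1000 design values.

def total_response_time(allocations):
    """Sum the response times allocated to each component on the signal path."""
    return sum(t for _, t in allocations)

path = [
    ("pressure transmitter", 0.45),        # seconds, hypothetical allocation
    ("signal processing equipment", 0.30),
    ("final actuation device", 0.25),
]
requirement = 1.2  # hypothetical analytical response time from the safety analysis

analyzed = total_response_time(path)
meets_requirement = analyzed <= requirement
```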

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hinkebein, Thomas E.

    The intrusion of gas into oils stored within the SPR has been examined. When oil is stored in domal salts, gases intrude into the stored oil from the surrounding salt. Aspects of the mechanism of gas intrusion have been examined. In all cases, this gas intrusion results in increases in the oil vapor pressure. Data gathered from 1993 to August 2002 are presented to show the resultant increases in bubble-point pressure on a cavern-by-cavern as well as on a stream basis. The measurement techniques are presented with particular emphasis on the TVP 95. Data analysis methods are presented to show the methods required to obtain recombined cavern oil compositions. Gas-oil ratios are also computed from the data and are presented on a cavern-by-cavern and stream basis. The observed increases in bubble-point pressure and gas-oil ratio are further statistically analyzed to allow data interpretation. Emissions plume modeling is used to determine adherence to state air regulations. Gas intrusion is observed to be variable among the sites and within each dome. Gas intrusions at Bryan Mound and Big Hill have resulted in the largest increases in bubble-point pressure for the Strategic Petroleum Reserve (SPR). The streams at Bayou Choctaw and West Hackberry show minimal bubble-point pressure increases. Emissions plume modeling of oil storage tanks, using the state-mandated ISCST code, showed that virtually no gas may be released when H2S standards are considered. DOE plans to scavenge H2S to comply with the very tight standards on this gas. With the assumption of scavenging, benzene releases become the next most controlling factor. Model results show that a GOR of 0.6 SCF/BBL may yield emissions that are within standards. Employing the benzene gas release standard will significantly improve oil deliverability. New plume modeling using the computational fluid dynamics code FLUENT is addressing limitations of the state-mandated ISCST model.

  7. Toward Joint Hypothesis-Tests Seismic Event Screening Analysis: Ms|mb and Event Depth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dale; Selby, Neil

    2012-08-14

    Well-established theory can be used to combine single-phenomenology hypothesis tests into a multi-phenomenology event screening hypothesis test (Fisher's and Tippett's tests). The commonly used standard error in the Ms:mb event screening hypothesis test is not fully consistent with the physical basis. The improved standard error agrees better with the physical basis, correctly partitions error to include model error as a component of variance, and correctly reduces station noise variance through network averaging. For the 2009 DPRK test, the commonly used standard error 'rejects' H0 even with a better scaling slope ({beta} = 1, Selby et al.), whereas the improved standard error 'fails to reject' H0.
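Fisher's method, one of the two combination tests named above, can be sketched as follows; the p-values are illustrative, not values from the DPRK analysis:

```python
import math

# Hedged sketch of Fisher's method for combining independent p-values from
# single-phenomenology tests: X = -2 * sum(ln p_i) follows a chi-square
# distribution with 2k degrees of freedom under H0.

def fisher_statistic(pvalues):
    return -2.0 * sum(math.log(p) for p in pvalues)

def chi2_sf_even_dof(x, dof):
    """Chi-square survival function for even dof = 2k, using the closed
    form exp(-x/2) * sum_{i<k} (x/2)^i / i! (avoids needing scipy)."""
    k = dof // 2
    term, total = 1.0, 0.0
    for i in range(k):
        total += term
        term *= (x / 2.0) / (i + 1)
    return math.exp(-x / 2.0) * total

# Hypothetical single-phenomenology p-values (Ms:mb screen and depth screen).
p_ms_mb, p_depth = 0.04, 0.10
X = fisher_statistic([p_ms_mb, p_depth])
combined_p = chi2_sf_even_dof(X, dof=4)   # joint screening p-value, ~0.026
```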

  8. Simulant Basis for the Standard High Solids Vessel Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Reid A.; Fiskum, Sandra K.; Suffield, Sarah R.

    The Waste Treatment and Immobilization Plant (WTP) is working to develop a Standard High Solids Vessel Design (SHSVD) process vessel. To support testing of this new design, WTP engineering staff requested that a Newtonian simulant and a non-Newtonian simulant be developed that would represent the Most Adverse Design Conditions (in development) with respect to mixing performance as specified by WTP. The majority of the simulant requirements are specified in 24590-PTF-RPT-PE-16-001, Rev. 0. The first step in this process is to develop the basis for these simulants. This document describes the basis for the properties of these two simulant types. The simulant recipes that meet this basis will be provided in a subsequent document.

  9. Using sparse regularization for multi-resolution tomography of the ionosphere

    NASA Astrophysics Data System (ADS)

    Panicciari, T.; Smith, N. D.; Mitchell, C. N.; Da Dalt, F.; Spencer, P. S. J.

    2015-10-01

    Computerized ionospheric tomography (CIT) is a technique for reconstructing the state of the ionosphere, in terms of electron content, from a set of slant total electron content (STEC) measurements; it is usually formulated as an inverse problem. In this experiment the measurements come from the phase of the GPS signal and are therefore affected by bias, so the STEC cannot be considered in absolute terms but only in relative terms. Measurements are collected from receivers that are not evenly distributed in space, and together with limitations such as the angle and density of the observations, this causes instability in the inversion. Furthermore, the ionosphere is a dynamic medium whose processes change continuously in time and space. This can limit the accuracy with which CIT resolves the structures and processes that describe the ionosphere. Some inversion techniques are based on ℓ2 minimization algorithms (i.e. Tikhonov regularization); a standard approach of this kind, using spherical harmonics, is implemented here as a reference against which to compare the new method. A new approach is proposed for CIT that permits sparsity in the reconstruction coefficients by using wavelet basis functions, chosen for their compact representation. The ℓ1 minimization is selected because it can exploit the localization property of wavelets to optimize the result under an uneven distribution of observations. It is also illustrated how the inter-frequency biases on the STEC are calibrated within the inversion, and this is used as a way of evaluating the accuracy of the method. The technique is demonstrated using a simulation, showing the advantage of ℓ1 minimization over ℓ2 minimization in estimating the coefficients. This is particularly true for an uneven observation geometry, and especially for multi-resolution CIT.
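One standard way to carry out an ℓ1 minimization of the kind described above is iterative soft thresholding (ISTA); this is a generic sketch under assumed toy data, not the authors' CIT solver or wavelet basis:

```python
# Hedged sketch: iterative soft thresholding (ISTA) for the l1-regularised
# least-squares problem min 0.5*||A c - y||^2 + lam*||c||_1, which promotes
# sparse coefficient vectors. The 2x2 system below is illustrative only.

def soft(v, t):
    """Soft-thresholding operator, the proximal map of the l1 norm."""
    return max(v - t, 0.0) if v > 0 else min(v + t, 0.0)

def ista(A, y, lam=0.1, step=0.2, iters=500):
    n = len(A[0])
    c = [0.0] * n
    for _ in range(iters):
        # Gradient of the quadratic term: A^T (A c - y)
        r = [sum(A[i][j] * c[j] for j in range(n)) - y[i] for i in range(len(A))]
        g = [sum(A[i][j] * r[i] for i in range(len(A))) for j in range(n)]
        # Gradient step on the quadratic, then shrink toward zero (l1 prox).
        c = [soft(c[j] - step * g[j], step * lam) for j in range(n)]
    return c

# Small system whose l1-regularised solution is sparse: c converges to [0.9, 0].
A = [[1.0, 0.5],
     [0.0, 1.0]]
y = [1.0, 0.0]
c = ista(A, y)
```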

  10. Big data driven cycle time parallel prediction for production planning in wafer manufacturing

    NASA Astrophysics Data System (ADS)

    Wang, Junliang; Yang, Jungang; Zhang, Jie; Wang, Xiaoxi; Zhang, Wenjun Chris

    2018-07-01

    Cycle time forecasting (CTF) is one of the most crucial issues for production planning to maintain high delivery reliability in semiconductor wafer fabrication systems (SWFS). This paper proposes a novel data-intensive cycle time (CT) prediction system with parallel computing to rapidly forecast the CT of wafer lots with large datasets. First, a density peak based radial basis function network (DP-RBFN) is designed to forecast the CT with the diverse and agglomerative CT data. Second, a network learning method based on a clustering technique is proposed to determine the density peaks. Third, a parallel computing approach for network training is proposed in order to speed up the training process with large-scale CT data. Finally, an experiment on an SWFS is presented, which demonstrates that the proposed CTF system can not only speed up the training process of the model but also outperform CTF methods based on the radial basis function network, the back-propagation network and multivariate regression in terms of mean absolute deviation and standard deviation.
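The prediction step of a radial basis function network of the kind described above can be sketched as follows; the centres, widths and weights are toy values, not parameters fitted to wafer-fab cycle times (in the paper the centres come from a density-peak clustering step):

```python
import math

# Hedged sketch: forward pass of a radial basis function (RBF) network with
# Gaussian basis functions. Centres, widths and weights are illustrative.

def rbf_predict(x, centres, widths, weights, bias=0.0):
    """y = bias + sum_k w_k * exp(-(x - c_k)^2 / (2 s_k^2))"""
    return bias + sum(
        w * math.exp(-((x - c) ** 2) / (2.0 * s * s))
        for c, s, w in zip(centres, widths, weights)
    )

centres = [0.0, 1.0]   # toy cluster centres (density peaks in the paper)
widths = [0.5, 0.5]
weights = [2.0, -1.0]

y0 = rbf_predict(0.0, centres, widths, weights)   # 2 - exp(-2) ~ 1.8647
```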

  11. Molecular structure and spectroscopic characterization of Carbamazepine with experimental techniques and DFT quantum chemical calculations

    NASA Astrophysics Data System (ADS)

    Suhasini, M.; Sailatha, E.; Gunasekaran, S.; Ramkumaar, G. R.

    2015-04-01

    A systematic vibrational spectroscopic assignment and analysis of Carbamazepine has been carried out using FT-IR, FT-Raman and UV spectral data. The vibrational analysis was aided by electronic structure calculations - ab initio (RHF) and hybrid density functional methods (B3LYP) - performed with the standard 6-31G(d,p) basis set. Molecular equilibrium geometries, electronic energies, natural bond order analysis, harmonic vibrational frequencies and IR intensities have been computed. A detailed interpretation of the vibrational spectra of the molecule has been made on the basis of the Potential Energy Distribution (PED) calculated with the VEDA program. The UV-visible spectrum of the compound was also recorded, and electronic properties such as the HOMO and LUMO energies and λmax were determined by the HF/6-311++G(d,p) Time-Dependent method. The thermodynamic functions of the title molecule were also computed using the RHF and DFT methods. Restricted Hartree-Fock and density functional theory-based nuclear magnetic resonance (NMR) calculations were also performed and used for assigning the 13C and 1H NMR chemical shifts of Carbamazepine.

  12. 7 CFR 54.1005 - Basis of service.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Basis of service. 54.1005 Section 54.1005 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards... and Poultry Products § 54.1005 Basis of service. (a) Certification of Sanitary Design and Fabrication...

  13. 7 CFR 54.1005 - Basis of service.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 3 2012-01-01 2012-01-01 false Basis of service. 54.1005 Section 54.1005 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards... and Poultry Products § 54.1005 Basis of service. (a) Certification of Sanitary Design and Fabrication...

  14. Adaptive technique for matching the spectral response in skin lesions' images

    NASA Astrophysics Data System (ADS)

    Pavlova, P.; Borisova, E.; Pavlova, E.; Avramov, L.

    2015-03-01

    The suggested technique is a subsequent stage of data extraction from diffuse reflectance spectra and images of diseased tissue, with the final aim of skin cancer diagnostics. Our previous work allowed us to extract patterns for some types of skin cancer as a ratio between spectra obtained from healthy and diseased tissue in the 380 - 780 nm region. The authenticity of the patterns depends on the point tested within the area of the lesion, so the resulting diagnosis can be assigned only with some probability. In this work, two adaptations are implemented to localize the pixels of the lesion image whose reflectance spectrum corresponds to a pattern. The first adapts the standard to the individual patient; the second translates the white point basis of the spectrum to the relative white point of the image. Since the reflectance spectra and the image pixels refer to different white points, a correction of the compared colours is needed. The latter is done using a standard method for chromatic adaptation. The technique follows the steps below: -Calculating the colorimetric XYZ parameters for the initial white point, fixed by the reflectance spectrum from healthy tissue; -Calculating the XYZ parameters for the distant white point on the basis of an image of non-diseased tissue; -Transforming the XYZ parameters for the test spectrum by the obtained matrix; -Finding the RGB values of the XYZ parameters for the test spectrum according to sRGB. Finally, the pixels of the lesion image whose colour corresponds to the test spectrum and a particular diagnostic pattern are marked with a specific colour.
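The chromatic adaptation step can be sketched in its simplest (von Kries-style) form, scaling XYZ components by the ratio of the two white points; published methods such as the Bradford transform instead work in a cone-response space, and the white points below are illustrative, not values from the paper:

```python
# Hedged sketch: diagonal (von Kries-style) chromatic adaptation directly in
# XYZ, mapping a colour seen under the spectrum's white point to its appearance
# under the image's white point. White points and test colour are illustrative.

def adapt_xyz(xyz, white_src, white_dst):
    return tuple(v * wd / ws for v, ws, wd in zip(xyz, white_src, white_dst))

white_spectrum = (95.0, 100.0, 105.0)  # white point from the healthy-tissue spectrum
white_image = (96.4, 100.0, 82.5)      # white point estimated from the image

xyz_test = (40.0, 50.0, 60.0)
xyz_adapted = adapt_xyz(xyz_test, white_spectrum, white_image)
```

A full pipeline would then convert `xyz_adapted` to sRGB to compare against the image pixels.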

  15. Galerkin v. least-squares Petrov–Galerkin projection in nonlinear model reduction

    DOE PAGES

    Carlberg, Kevin Thomas; Barone, Matthew F.; Antil, Harbir

    2016-10-20

    Least-squares Petrov–Galerkin (LSPG) model-reduction techniques such as the Gauss–Newton with Approximated Tensors (GNAT) method have shown promise, as they have generated stable, accurate solutions for large-scale turbulent, compressible flow problems where standard Galerkin techniques have failed. However, there has been limited comparative analysis of the two approaches. This is due in part to difficulties arising from the fact that Galerkin techniques perform optimal projection associated with residual minimization at the time-continuous level, while LSPG techniques do so at the time-discrete level. This work provides a detailed theoretical and computational comparison of the two techniques for two common classes of time integrators: linear multistep schemes and Runge–Kutta schemes. We present a number of new findings, including conditions under which the LSPG ROM has a time-continuous representation, conditions under which the two techniques are equivalent, and time-discrete error bounds for the two approaches. Perhaps most surprisingly, we demonstrate both theoretically and computationally that decreasing the time step does not necessarily decrease the error for the LSPG ROM; instead, the time step should be 'matched' to the spectral content of the reduced basis. In numerical experiments carried out on a turbulent compressible-flow problem with over one million unknowns, we show that increasing the time step to an intermediate value decreases both the error and the simulation time of the LSPG reduced-order model by an order of magnitude.

  16. Galerkin v. least-squares Petrov–Galerkin projection in nonlinear model reduction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlberg, Kevin Thomas; Barone, Matthew F.; Antil, Harbir

    Least-squares Petrov–Galerkin (LSPG) model-reduction techniques such as the Gauss–Newton with Approximated Tensors (GNAT) method have shown promise, as they have generated stable, accurate solutions for large-scale turbulent, compressible flow problems where standard Galerkin techniques have failed. However, there has been limited comparative analysis of the two approaches. This is due in part to difficulties arising from the fact that Galerkin techniques perform optimal projection associated with residual minimization at the time-continuous level, while LSPG techniques do so at the time-discrete level. This work provides a detailed theoretical and computational comparison of the two techniques for two common classes of time integrators: linear multistep schemes and Runge–Kutta schemes. We present a number of new findings, including conditions under which the LSPG ROM has a time-continuous representation, conditions under which the two techniques are equivalent, and time-discrete error bounds for the two approaches. Perhaps most surprisingly, we demonstrate both theoretically and computationally that decreasing the time step does not necessarily decrease the error for the LSPG ROM; instead, the time step should be 'matched' to the spectral content of the reduced basis. In numerical experiments carried out on a turbulent compressible-flow problem with over one million unknowns, we show that increasing the time step to an intermediate value decreases both the error and the simulation time of the LSPG reduced-order model by an order of magnitude.

  17. Math: Basic Skills Content Standards

    ERIC Educational Resources Information Center

    CASAS - Comprehensive Adult Student Assessment Systems (NJ1), 2008

    2008-01-01

    This document presents content standards tables for math. [CASAS content standards tables are designed for educators at national, state and local levels to inform the alignment of content standards, instruction and assessment. The Content Standards along with the CASAS Competencies form the basis of the CASAS integrated assessment and curriculum…

  18. 40 CFR 466.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... pretreatment standards the following equivalent mass standards are provided. (1) There shall be no discharge of... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Pretreatment standards for existing...) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) PORCELAIN ENAMELING POINT SOURCE CATEGORY Cast Iron Basis...

  19. 40 CFR 466.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... pretreatment standards the following equivalent mass standards are provided. (1) There shall be no discharge of... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for existing...) EFFLUENT GUIDELINES AND STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY Cast Iron Basis Material...

  20. 40 CFR 466.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... pretreatment standards the following equivalent mass standards are provided. (1) There shall be no discharge of... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Pretreatment standards for existing...) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) PORCELAIN ENAMELING POINT SOURCE CATEGORY Cast Iron Basis...

  1. 40 CFR 466.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... pretreatment standards the following equivalent mass standards are provided. (1) There shall be no discharge of... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Pretreatment standards for existing...) EFFLUENT GUIDELINES AND STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY Cast Iron Basis Material...

  2. 40 CFR 466.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... pretreatment standards the following equivalent mass standards are provided. (1) There shall be no discharge of... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Pretreatment standards for existing...) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) PORCELAIN ENAMELING POINT SOURCE CATEGORY Cast Iron Basis...

  3. Mesoscale, Radiometrically Referenced, Multi-Temporal Hyperspectral Data for Co2 Leak Detection by Locating Spatial Variation of Biophysically Relevant Parameters

    NASA Astrophysics Data System (ADS)

    McCann, Cooper Patrick

    Low-cost flight-based hyperspectral imaging systems have the potential to provide valuable information for ecosystem and environmental studies as well as aid in land management and land health monitoring. This thesis describes (1) a bootstrap method of producing mesoscale, radiometrically-referenced hyperspectral data using the Landsat surface reflectance (LaSRC) data product as a reference target, (2) biophysically relevant basis functions to model the reflectance spectra, (3) an unsupervised classification technique based on natural histogram splitting of these biophysically relevant parameters, and (4) local and multi-temporal anomaly detection. The bootstrap method extends standard processing techniques to remove uneven illumination conditions between flight passes, allowing the creation of radiometrically self-consistent data. Through selective spectral and spatial resampling, LaSRC data is used as a radiometric reference target. Advantages of the bootstrap method include the need for minimal site access, no ancillary instrumentation, and automated data processing. Data from a flight on 06/02/2016 are compared with concurrently collected ground-based reflectance spectra as a means of validation, achieving an average error of 2.74%. Fitting reflectance spectra using basis functions based on biophysically relevant spectral features allows both noise and data reduction while shifting information from spectral bands to biophysical features. Histogram splitting is used to determine a clustering based on natural splittings of these fit parameters. The Indian Pines reference data enabled comparison of the efficacy of this technique with established techniques. The splitting technique is shown to be an improvement over the ISODATA clustering technique, with an overall accuracy of 34.3/19.0% before merging and 40.9/39.2% after merging.
This improvement is also seen in the kappa statistic, which before/after merging is 24.8/30.5 for the histogram splitting technique compared to 15.8/28.5 for ISODATA. Three hyperspectral flights over the Kevin Dome area, covering 1843 ha, acquired 06/21/2014, 06/24/2015 and 06/26/2016, are examined with different methods of anomaly detection. Detection of anomalies within a single data set is examined to determine, on a local scale, areas that are significantly different from the surrounding area. Additionally, the detection and identification of persistent and non-persistent anomalies across multiple data sets was investigated.
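The core idea behind histogram splitting can be sketched as follows; the one-dimensional histogram and two-class split are simplifications of the multi-parameter procedure in the thesis:

```python
# Hedged sketch of natural histogram splitting: cut a 1-D parameter histogram
# at the deepest valley between its two highest peaks, so bins on either side
# of the cut form two natural classes. The histogram values are illustrative.

def split_at_valley(hist):
    """Return the index of the lowest bin lying between the two highest
    interior local maxima (a natural split point)."""
    peaks = [i for i in range(1, len(hist) - 1)
             if hist[i] >= hist[i - 1] and hist[i] >= hist[i + 1]]
    top = sorted(peaks, key=lambda i: hist[i], reverse=True)[:2]
    lo, hi = min(top), max(top)
    return min(range(lo + 1, hi), key=lambda i: hist[i])

# Bimodal toy histogram: peaks at bins 2 and 7, deepest valley at bin 5.
hist = [1, 5, 9, 6, 2, 1, 3, 8, 7, 2]
split = split_at_valley(hist)   # bins 0..5 -> class A, bins 6..9 -> class B
```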

  4. Implementation of the US EPA (United States Environmental Protection Agency) Regional Oxidant Modeling System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novak, J.H.

    1984-05-01

    Model design, implementation and quality assurance procedures can have a significant impact on the effectiveness and long-term utility of any modeling approach. The Regional Oxidant Modeling System (ROMS) is exceptionally complex because it treats all chemical and physical processes thought to affect ozone concentration on a regional scale. Thus, to effectively illustrate useful design and implementation techniques, this paper describes the general modeling framework which forms the basis of the ROMS. This framework is flexible enough to allow straightforward update or replacement of the chemical kinetics mechanism and/or any theoretical formulations of the physical processes. Use of the Jackson Structured Programming (JSP) method to implement this modeling framework has not only increased programmer productivity and the quality of the resulting programs, but also has provided standardized program design, dynamic documentation, and easily maintainable and transportable code. A summary of the JSP method is presented to encourage modelers to pursue this technique in their own model development efforts. In addition, since data preparation is such an integral part of a successful modeling system, the ROMS processor network is described with emphasis on the internal quality control techniques.

  5. Amplitude modulation detection by human listeners in sound fields.

    PubMed

    Zahorik, Pavel; Kim, Duck O; Kuwada, Shigeyuki; Anderson, Paul W; Brandewie, Eugene; Srinivasan, Nirmal

    2011-10-01

    The temporal modulation transfer function (TMTF) approach allows techniques from linear systems analysis to be used to predict how the auditory system will respond to arbitrary patterns of amplitude modulation (AM). Although this approach forms the basis for a standard method of predicting speech intelligibility based on estimates of the acoustical modulation transfer function (MTF) between source and receiver, human sensitivity to AM as characterized by the TMTF has not been extensively studied under realistic listening conditions, such as in reverberant sound fields. Here, TMTFs (octave bands from 2 - 512 Hz) were obtained in 3 listening conditions simulated using virtual auditory space techniques: diotic, anechoic sound field, reverberant room sound field. TMTFs were then related to acoustical MTFs estimated using two different methods in each of the listening conditions. Both diotic and anechoic data were found to be in good agreement with classic results, but AM thresholds in the reverberant room were lower than predictions based on acoustical MTFs. This result suggests that simple linear systems techniques may not be appropriate for predicting TMTFs from acoustical MTFs in reverberant sound fields, and may be suggestive of mechanisms that functionally enhance modulation during reverberant listening.
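
Acoustical MTFs of the kind related to TMTFs above are commonly estimated with Schroeder's formula, a normalized Fourier transform of the squared room impulse response; a minimal sketch using an idealized exponential decay (the sampling rate and RT60 here are illustrative assumptions, not values from this study):

```python
import numpy as np

def modulation_transfer(h, fs, f_mod):
    """Schroeder acoustical MTF of impulse response h (sample rate fs, Hz)
    evaluated at modulation frequency f_mod (Hz)."""
    t = np.arange(len(h)) / fs
    h2 = np.asarray(h, dtype=float) ** 2
    # |Fourier transform of the squared impulse response|, energy-normalized.
    return np.abs(np.sum(h2 * np.exp(-2j * np.pi * f_mod * t))) / np.sum(h2)

# Idealized exponentially decaying impulse response (assumed RT60 = 0.5 s).
fs = 8000
t = np.arange(int(0.5 * fs)) / fs
h = np.exp(-6.91 * t / 0.5)  # amplitude falls 60 dB over one RT60
print(modulation_transfer(h, fs, 4.0))  # ≈ 0.74 for this idealized decay
```

An MTF near 1 means modulation is preserved; reverberation lowers it at higher modulation frequencies.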

  6. Total transverse rupture of the duodenum after blunt abdominal trauma.

    PubMed

    Pirozzi, Cesare; Di Marco, Carluccio; Loponte, Margherita; Savino, Grazia

    2014-05-11

Complete transverse rupture of the duodenum as an isolated lesion in blunt trauma can be considered exceptional. The aim of this report is to discuss diagnostic procedures and surgical options in such an infrequent presentation. We report on a 37-year-old man who had a total transverse rupture of the duodenum after blunt abdominal trauma. Diagnosis was suspected after contrast-enhanced CT scanning and confirmed at laparotomy; the duodenal rupture was repaired by an end-to-end duodeno-duodenal anastomosis after a Kocher maneuver. The patient had a fast and complete recovery. A high index of suspicion is necessary for timely diagnosis. Multidetector contrast-enhanced CT is the gold standard for that aim. Surgical management must be tailored on an individual basis, since many techniques are available for both reconstruction and duodenal decompression. The Kocher maneuver is essential for complete inspection of the pancreaticoduodenal block and for appropriate reconstruction. Management of isolated duodenal rupture can be difficult. Contrast-enhanced CT scanning is essential for timely diagnosis. Primary repair can be achieved by an end-to-end duodenal anastomosis after a Kocher maneuver, although alternative techniques are available for tailored solutions. Complex duodenal decompression techniques are not mandatory.

  7. The effect of sampling techniques used in the multiconfigurational Ehrenfest method

    NASA Astrophysics Data System (ADS)

    Symonds, C.; Kattirtzi, J. A.; Shalashilin, D. V.

    2018-05-01

    In this paper, we compare and contrast basis set sampling techniques recently developed for use in the ab initio multiple cloning method, a direct dynamics extension to the multiconfigurational Ehrenfest approach, used recently for the quantum simulation of ultrafast photochemistry. We demonstrate that simultaneous use of basis set cloning and basis function trains can produce results which are converged to the exact quantum result. To demonstrate this, we employ these sampling methods in simulations of quantum dynamics in the spin boson model with a broad range of parameters and compare the results to accurate benchmarks.

  8. The effect of sampling techniques used in the multiconfigurational Ehrenfest method.

    PubMed

    Symonds, C; Kattirtzi, J A; Shalashilin, D V

    2018-05-14

    In this paper, we compare and contrast basis set sampling techniques recently developed for use in the ab initio multiple cloning method, a direct dynamics extension to the multiconfigurational Ehrenfest approach, used recently for the quantum simulation of ultrafast photochemistry. We demonstrate that simultaneous use of basis set cloning and basis function trains can produce results which are converged to the exact quantum result. To demonstrate this, we employ these sampling methods in simulations of quantum dynamics in the spin boson model with a broad range of parameters and compare the results to accurate benchmarks.

  9. Synthetic mesh in the surgical repair of pelvic organ prolapse: current status and future directions.

    PubMed

    Keys, Tristan; Campeau, Lysanne; Badlani, Gopal

    2012-08-01

    In light of the recent Food and Drug Administration public health notification regarding complications associated with transvaginally placed mesh for pelvic organ prolapse (POP) repair, we review recent literature to evaluate current outcomes and complication data, analyze the clinical need for mesh on the basis of genetic and biochemical etiologies of POP, and investigate trends of mesh use via an American Urological Association member survey. Mesh-based techniques show better anatomic results than traditional repair of anterior POP, but subjective outcomes are equivalent. Further research and Level I evidence are required before mesh-based repair of POP can be standardized. Adequate surgical training and patient selection should decrease complication rates. Published by Elsevier Inc.

  10. Design Development Test and Evaluation (DDT and E) Considerations for Safe and Reliable Human Rated Spacecraft Systems

    NASA Technical Reports Server (NTRS)

    Miller, James; Leggett, Jay; Kramer-White, Julie

    2008-01-01

    A team directed by the NASA Engineering and Safety Center (NESC) collected methodologies for how best to develop safe and reliable human rated systems and how to identify the drivers that provide the basis for assessing safety and reliability. The team also identified techniques, methodologies, and best practices to assure that NASA can develop safe and reliable human rated systems. The results are drawn from a wide variety of resources, from experts involved with the space program since its inception to the best-practices espoused in contemporary engineering doctrine. This report focuses on safety and reliability considerations and does not duplicate or update any existing references. Neither does it intend to replace existing standards and policy.

  11. Coniferous forest classification and inventory using Landsat and digital terrain data

    NASA Technical Reports Server (NTRS)

    Franklin, J.; Logan, T. L.; Woodcock, C. E.; Strahler, A. H.

    1986-01-01

    Machine-processing techniques were used in a Forest Classification and Inventory System (FOCIS) procedure to extract and process tonal, textural, and terrain information from registered Landsat multispectral and digital terrain data. Using FOCIS as a basis for stratified sampling, the softwood timber volumes of the Klamath National Forest and Eldorado National Forest were estimated within standard errors of 4.8 and 4.0 percent, respectively. The accuracy of these large-area inventories is comparable to the accuracy yielded by use of conventional timber inventory methods, but, because of automation, the FOCIS inventories are more rapid (9-12 months compared to 2-3 years for conventional manual photointerpretation, map compilation and drafting, field sampling, and data processing) and are less costly.
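
The stratified-sampling estimate behind such inventory standard errors weights per-stratum sample means by stratum sizes; a minimal sketch with hypothetical strata, not FOCIS data:

```python
import math

def stratified_total(strata):
    """Stratified estimate of a population total and its standard error.
    strata: list of (N_h, sample_values) pairs, where N_h is the stratum size."""
    total, var = 0.0, 0.0
    for N, y in strata:
        n = len(y)
        mean = sum(y) / n
        s2 = sum((v - mean) ** 2 for v in y) / (n - 1)  # sample variance
        total += N * mean
        # Variance of N_h * ybar_h with finite-population correction.
        var += N ** 2 * (1 - n / N) * s2 / n
    return total, math.sqrt(var)

# Hypothetical strata (stratum sizes and per-plot volumes), illustration only.
strata = [(120, [3.1, 2.8, 3.5, 3.0]), (80, [5.2, 4.9, 5.6])]
est, se = stratified_total(strata)
print(est, se, 100 * se / est)  # total, SE, and relative SE in percent
```

Stratifying on map classes (as FOCIS does) shrinks the within-stratum variances s2 and hence the standard error of the total.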

  12. Lightning Pin Injection Test: MOSFETS in "ON" State

    NASA Technical Reports Server (NTRS)

    Ely, Jay J.; Nguyen, Truong X.; Szatkowski, George N.; Koppen, Sandra V.; Mielnik, John J.; Vaughan, Roger K.; Saha, Sankalita; Wysocki, Philip F.; Celaya, Jose R.

    2011-01-01

The test objective was to evaluate MOSFETs for induced fault modes caused by pin-injecting a standard lightning waveform into them while operating. Lightning pin-injection testing was performed at NASA LaRC. Subsequent fault-mode and aging studies were performed by NASA ARC researchers using the Aging and Characterization Platform for semiconductor components. This report documents the test process and results to provide a basis for subsequent lightning tests. The ultimate IVHM (Integrated Vehicle Health Management) goal is to apply prognostic and health management algorithms using the features extracted during aging to allow calculation of expected remaining useful life. A survey of damage assessment techniques based upon inspection is provided, and includes data for optical microscope and X-ray inspection. Preliminary damage assessments based upon electrical parameters are also provided.

  13. Ion bipolar junction transistors

    PubMed Central

    Tybrandt, Klas; Larsson, Karin C.; Richter-Dahlfors, Agneta; Berggren, Magnus

    2010-01-01

    Dynamic control of chemical microenvironments is essential for continued development in numerous fields of life sciences. Such control could be achieved with active chemical circuits for delivery of ions and biomolecules. As the basis for such circuitry, we report a solid-state ion bipolar junction transistor (IBJT) based on conducting polymers and thin films of anion- and cation-selective membranes. The IBJT is the ionic analogue to the conventional semiconductor BJT and is manufactured using standard microfabrication techniques. Transistor characteristics along with a model describing the principle of operation, in which an anionic base current amplifies a cationic collector current, are presented. By employing the IBJT as a bioelectronic circuit element for delivery of the neurotransmitter acetylcholine, its efficacy in modulating neuronal cell signaling is demonstrated. PMID:20479274

  14. 45 CFR 164.102 - Statutory basis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Statutory basis. 164.102 Section 164.102 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS SECURITY AND PRIVACY General Provisions § 164.102 Statutory basis. The provisions of this part are adopted...

  15. 45 CFR 164.102 - Statutory basis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Statutory basis. 164.102 Section 164.102 Public Welfare Department of Health and Human Services ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS SECURITY AND PRIVACY General Provisions § 164.102 Statutory basis. The provisions of this part are adopted...

  16. 45 CFR 164.102 - Statutory basis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Statutory basis. 164.102 Section 164.102 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED REQUIREMENTS SECURITY AND PRIVACY General Provisions § 164.102 Statutory basis. The provisions of this part are adopted...

  17. 16 CFR 1203.30 - Purpose, basis, and scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 1203.30 Commercial Practices CONSUMER PRODUCT SAFETY COMMISSION CONSUMER PRODUCT SAFETY ACT REGULATIONS SAFETY STANDARD FOR BICYCLE HELMETS Certification § 1203.30 Purpose, basis, and scope. (a) Purpose. The... of compliance in the form specified. (b) Basis. Section 14(a)(1) of the Consumer Product Safety Act...

  18. A Radial Basis Function Approach to Financial Time Series Analysis

    DTIC Science & Technology

    1993-12-01

including efficient methods for parameter estimation and pruning, a pointwise prediction error estimator, and a methodology for controlling the "data...collection of practical techniques to address these issues for a modeling methodology, Radial Basis Function networks. These techniques include efficient... methodology often then amounts to a careful consideration of the interplay between model complexity and reliability. These will be recurrent themes
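
A Radial Basis Function network of the kind this report applies to time series fits its output weights by linear least squares once centers and widths are chosen; a minimal 1-D regression sketch (the centers, width, and toy data are illustrative assumptions):

```python
import numpy as np

def rbf_design(x, centers, width):
    """Design matrix of Gaussian bumps (one per center) plus a bias column."""
    Phi = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))
    return np.hstack([Phi, np.ones((len(x), 1))])

def fit_rbf(x, y, centers, width):
    """Least-squares fit of the linear output weights."""
    w, *_ = np.linalg.lstsq(rbf_design(x, centers, width), y, rcond=None)
    return w

def predict_rbf(x, centers, width, w):
    return rbf_design(x, centers, width) @ w

# Toy 1-D regression problem, for illustration only.
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(x.size)
centers = np.linspace(0, 1, 10)
w = fit_rbf(x, y, centers, width=0.1)
resid = y - predict_rbf(x, centers, width=0.1, w=w)
print(np.sqrt(np.mean(resid ** 2)))  # RMS residual near the noise level
```

Model complexity here is the number of centers; pruning centers trades fit quality for reliability, the interplay the report highlights.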

  19. Evolvable Neuronal Paths: A Novel Basis for Information and Search in the Brain

    PubMed Central

    Fernando, Chrisantha; Vasas, Vera; Szathmáry, Eörs; Husbands, Phil

    2011-01-01

We propose a previously unrecognized kind of informational entity in the brain that is capable of acting as the basis for unlimited hereditary variation in neuronal networks. This unit is a path of activity through a network of neurons, analogous to a path taken through a hidden Markov model. To prove in principle the capabilities of this new kind of informational substrate, we show how a population of paths can be used as the hereditary material for a neuronally implemented genetic algorithm (the Swiss-army knife of black-box optimization techniques), which we have proposed elsewhere could operate at somatic timescales in the brain. We compare this to the same genetic algorithm that uses a standard ‘genetic’ informational substrate, i.e. non-overlapping discrete genotypes, on a range of optimization problems. A path evolution algorithm (PEA) is defined as any algorithm that implements natural selection of paths in a network substrate. A PEA is a previously unrecognized type of natural selection that is well suited for implementation by biological neuronal networks with structural plasticity. The important similarities and differences between a standard genetic algorithm and a PEA are considered. Whilst most experiments are conducted on an abstract network model, at the conclusion of the paper a slightly more realistic neuronal implementation of a PEA is outlined based on Izhikevich spiking neurons. Finally, experimental predictions are made for the identification of such informational paths in the brain. PMID:21887266
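
The standard genetic algorithm used as the comparison baseline above, operating on non-overlapping discrete genotypes, can be sketched minimally on the OneMax problem (all parameter values here are illustrative assumptions, not those of the paper):

```python
import random

def onemax_ga(n_bits=20, pop_size=30, generations=60, p_mut=0.05, seed=0):
    """Minimal generational GA: tournament selection, one-point crossover,
    and per-bit mutation on OneMax (fitness = number of 1 bits)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    fitness = lambda g: sum(g)
    for _ in range(generations):
        def pick():  # tournament of two
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # bit flips
            nxt.append(child)
        pop = nxt
    return max(map(fitness, pop))

print(onemax_ga())
```

A path evolution algorithm replaces the discrete genotypes here with overlapping paths through a shared network while keeping the selection loop.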

  20. Environmental quality assessment of groundwater resources in Al Jabal Al Akhdar, Sultanate of Oman

    NASA Astrophysics Data System (ADS)

    Al-Kalbani, Mohammed Saif; Price, Martin F.; Ahmed, Mushtaque; Abahussain, Asma; O'Higgins, Timothy

    2017-11-01

The research was conducted to assess the quality of groundwater resources of Al Jabal Al Akhdar, Oman. Eleven drinking water sources were sampled during the summer and winter seasons of 2012-2013 to evaluate their physico-chemical quality indicators and to assess their suitability for drinking and other domestic purposes. Sample collection, handling, and processing followed the standard methods recommended by APHA, and samples were analyzed in quality-assured laboratories using appropriate analytical methods and instrumental techniques. The results show that the quality parameters in all drinking water resources are within the permissible limits set by Omani and WHO standards, and the drinking water quality index is good or medium based on the NSF-WQI classification criteria, indicating suitability for human consumption. There is an indication of high nitrate concentrations in some groundwater wells, which requires further investigation and a monitoring program conducted on a regular basis to ensure a good-quality water supply for the residents of the mountain. The trilinear Piper diagram shows that most of the drinking water resources of the study area fall in the field of the calcium-bicarbonate type, with some of the magnesium-bicarbonate type, indicating that most of the major ions are natural in origin due to the geology of the region. This study is a first step towards providing indicators of the groundwater quality of this fragile mountain ecosystem, which will be the basis for future planning decisions on corrective demand-management measures to protect the groundwater resources of Al Jabal Al Akhdar.
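
The NSF-WQI classification referenced above is commonly computed as a weighted arithmetic mean of per-parameter sub-index (Q) values read from rating curves; a sketch in which the weights are the commonly quoted NSF values for the nine standard parameters and the Q values are invented for illustration:

```python
def nsf_wqi(sub_indices, weights):
    """Weighted-arithmetic NSF water quality index.
    sub_indices: per-parameter Q values (0-100) from rating curves;
    weights: relative parameter weights (normalized to their sum)."""
    wqi = sum(q * w for q, w in zip(sub_indices, weights)) / sum(weights)
    if wqi > 90: label = "excellent"
    elif wqi > 70: label = "good"
    elif wqi > 50: label = "medium"
    elif wqi > 25: label = "bad"
    else: label = "very bad"
    return wqi, label

# Commonly quoted NSF weights for: DO, fecal coliform, pH, BOD,
# temperature change, total phosphate, nitrate, turbidity, total solids.
w = [0.17, 0.16, 0.11, 0.11, 0.10, 0.10, 0.10, 0.08, 0.07]
q = [85, 90, 88, 70, 80, 75, 72, 78, 82]  # hypothetical Q values
print(nsf_wqi(q, w))
```

The class boundaries used here follow the usual NSF-WQI banding (excellent/good/medium/bad/very bad).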

  1. Who is that masked educator? Deconstructing the teaching and learning processes of an innovative humanistic simulation technique.

    PubMed

    McAllister, Margaret; Searl, Kerry Reid; Davis, Susan

    2013-12-01

Simulation learning in nursing has long made use of mannequins, standardized actors and role play to allow students the opportunity to practice technical body-care skills and interventions. Even though numerous strategies have been developed to mimic or amplify clinical situations, a common problem that is difficult to overcome in even the most well-executed simulation experiences is that students may realize the setting is artificial and fail to fully engage, remember or apply the learning. Another problem is that students may learn technical competence but remain uncertain about communicating with the person. Since communication capabilities are imperative in human service work, simulation learning that only achieves technical competence in students is not fully effective for the needs of nursing education. Furthermore, while simulation learning is a burgeoning space for innovative practices, it has been criticized for the absence of a basis in theory. It is within this context that an innovative simulation learning experience named "Mask-Ed (KRS simulation)" has been deconstructed and its active learning components examined. Establishing a theoretical basis for creative teaching and learning practices provides an understanding of how, why and when simulation learning has been effective, and it may help to distinguish aspects of the experience that could be improved. Three conceptual theoretical fields help explain the power of this simulation technique: Vygotskian sociocultural learning theory, applied theatre and embodiment. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. 7 CFR 653.3 - Adaptation of technical standards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 6 2012-01-01 2012-01-01 false Adaptation of technical standards. 653.3 Section 653.3..., DEPARTMENT OF AGRICULTURE SUPPORT ACTIVITIES TECHNICAL STANDARDS § 653.3 Adaptation of technical standards. Technical standards and criteria developed on a national basis may require special adaptation to meet local...

  3. 7 CFR 653.3 - Adaptation of technical standards.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 6 2014-01-01 2014-01-01 false Adaptation of technical standards. 653.3 Section 653.3..., DEPARTMENT OF AGRICULTURE SUPPORT ACTIVITIES TECHNICAL STANDARDS § 653.3 Adaptation of technical standards. Technical standards and criteria developed on a national basis may require special adaptation to meet local...

  4. 7 CFR 653.3 - Adaptation of technical standards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 6 2011-01-01 2011-01-01 false Adaptation of technical standards. 653.3 Section 653.3..., DEPARTMENT OF AGRICULTURE SUPPORT ACTIVITIES TECHNICAL STANDARDS § 653.3 Adaptation of technical standards. Technical standards and criteria developed on a national basis may require special adaptation to meet local...

  5. 7 CFR 653.3 - Adaptation of technical standards.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 6 2013-01-01 2013-01-01 false Adaptation of technical standards. 653.3 Section 653.3..., DEPARTMENT OF AGRICULTURE SUPPORT ACTIVITIES TECHNICAL STANDARDS § 653.3 Adaptation of technical standards. Technical standards and criteria developed on a national basis may require special adaptation to meet local...

  6. 7 CFR 51.1440 - Application of standards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... STANDARDS) United States Standards for Grades of Shelled Pecans Application of Standards § 51.1440 Application of standards. The grade of a lot of shelled pecans shall be determined on the basis of a composite... container or number of containers in which the pecans are obviously of a quality or size materially...

  7. Discovery and identification of quality markers of Chinese medicine based on pharmacokinetic analysis.

    PubMed

    He, Jun; Feng, Xinchi; Wang, Kai; Liu, Changxiao; Qiu, Feng

    2018-02-28

Quality control of Chinese medicine (CM) is an effective measure to ensure the safety and efficacy of CM in clinical practice, and it is also a key factor restricting the modernization of CM. Various chemical components exist in CM, and the determination of several chemical components is at present the main approach to quality control for the vast majority of CMs. However, many of the components determined lack not only specificity but also biological activity. This is bound to greatly reduce the actual value of CM quality standards. Professor Changxiao Liu proposed the "quality marker" (Q-marker) concept to ensure standardization and rationalization of the quality control of CM. As is well known, CMs are taken orally in most cases and can be extensively metabolized in vivo. Both prototype components and their metabolites could constitute the actual therapeutic material basis. Pharmacokinetic studies can help elucidate the actual therapeutic material basis, which is closely related to the identification of Q-markers. Therefore, a new strategy for Q-markers was proposed based on the pharmacokinetic analysis of CM, hoping to provide some ideas for the discovery and identification of Q-markers. The relationship between pharmacokinetic studies and the identification of Q-markers is demonstrated in this review and a new strategy is proposed. Starting from the pharmacokinetic analysis, reverse tracing of the prototype active components and the potential prodrugs in CM is conducted first, and the therapeutic material basis is identified as Q-markers. Then, modern analytical techniques and methods are applied to obtain comprehensive quality control of these constituents. Several CMs, including Ginkgo biloba, ginseng, Periplocae Cortex, Mori Cortex, Bupleuri Radix, and Scutellariae Radix, are listed as examples to clarify how the new strategy could be applied. Pharmacokinetic studies play an important role in elucidating the therapeutic material basis of CM and identifying Q-markers, and they should be taken into account during the investigation of Q-markers. Copyright © 2018 Elsevier GmbH. All rights reserved.

  8. Isolated glenohumeral range of motion, excluding side-to-side difference in humeral retroversion, in asymptomatic high-school baseball players.

    PubMed

    Mihata, Teruhisa; Takeda, Atsushi; Kawakami, Takeshi; Itami, Yasuo; Watanabe, Chisato; Doi, Munekazu; Neo, Masashi

    2016-06-01

Glenohumeral range of motion is correlated with shoulder capsular condition and is thus considered to be predictive of shoulder pathology. However, in throwing athletes, a side-to-side difference in humeral retroversion makes it difficult to evaluate capsular condition on the basis of glenohumeral range of motion measured with the conventional technique. The purpose of this study was to measure isolated glenohumeral rotation, excluding side-to-side differences in humeral retroversion, in asymptomatic high-school baseball players. A total of 195 high-school baseball players (52 pitchers and 143 position players; median age, 16 years) and 20 high-school non-throwing athletes (median age, 16 years) without any shoulder symptoms were enrolled in this study. Glenohumeral external and internal rotations were measured using both a conventional technique and our ultrasound-assisted technique. In the latter, neutral rotation was standardized on the basis of the ultrasonographically visualized location of the bicipital groove, to exclude side-to-side differences in humeral retroversion from the calculated rotation angle. Intra- and inter-observer agreements of rotational measurements were evaluated using intra-class correlation coefficients (ICCs). Isolated glenohumeral rotation measurements, excluding side-to-side differences in humeral retroversion, demonstrated excellent intra-observer (ICC > 0.89) and inter-observer (ICC > 0.78) agreement. Isolated glenohumeral internal rotation was significantly less in the dominant shoulder than in the non-dominant shoulder in asymptomatic baseball players (P < 0.001). Isolated glenohumeral external rotation in baseball players was significantly greater than in non-throwing athletes (P < 0.05). In the baseball players, humeral torsion in the dominant shoulder was significantly greater than in the non-dominant shoulder (P < 0.001), indicating that the retroversion angle was greater in dominant shoulders. Isolated glenohumeral external and internal rotations can be measured with high intra- and inter-observer reliability with the exclusion of side-to-side differences in humeral retroversion. Capsular and muscular changes in the throwing shoulder may be better evaluated using our ultrasound-assisted technique. Cross-sectional study, Level III.

  9. Standards for discharge measurement with standardized nozzles and orifices

    NASA Technical Reports Server (NTRS)

    1940-01-01

The following standards give the standardized forms of two throttling devices, the standard nozzle and the standard orifice, and enable them to be used in circular pipes without calibration. The definitions of the standards are applicable in principle to the calibration and use of nonstandardized throttling devices, such as the venturi tube. The standards are likewise valid as a basis for discharge measurements in the German acceptance standards.

  10. Standardized pivot shift test improves measurement accuracy.

    PubMed

    Hoshino, Yuichi; Araujo, Paulo; Ahlden, Mattias; Moore, Charity G; Kuroda, Ryosuke; Zaffagnini, Stefano; Karlsson, Jon; Fu, Freddie H; Musahl, Volker

    2012-04-01

The variability of the pivot shift test techniques greatly interferes with achieving a quantitative and generally comparable measurement. The purpose of this study was to compare the variation of the quantitative pivot shift measurements with different surgeons' preferred techniques to a standardized technique. The hypothesis was that standardizing the pivot shift test would improve consistency in the quantitative evaluation when compared with surgeon-specific techniques. A whole lower body cadaveric specimen was prepared to have a low-grade pivot shift on one side and high-grade pivot shift on the other side. Twelve expert surgeons performed the pivot shift test using (1) their preferred technique and (2) a standardized technique. Electromagnetic tracking was utilized to measure anterior tibial translation and acceleration of the reduction during the pivot shift test. The variation of the measurement was compared between the surgeons' preferred technique and the standardized technique. The anterior tibial translation during pivot shift test was similar between using surgeons' preferred technique (left 24.0 ± 4.3 mm; right 15.5 ± 3.8 mm) and using standardized technique (left 25.1 ± 3.2 mm; right 15.6 ± 4.0 mm; n.s.). However, the variation in acceleration was significantly smaller with the standardized technique (left 3.0 ± 1.3 mm/s²; right 2.5 ± 0.7 mm/s²) compared with the surgeons' preferred technique (left 4.3 ± 3.3 mm/s²; right 3.4 ± 2.3 mm/s²; both P < 0.01). Standardizing the pivot shift test maneuver provides a more consistent quantitative evaluation and may be helpful in designing future multicenter clinical outcome trials. Diagnostic study, Level I.

  11. Defense Infrastructure: Improved Guidance Needed for Estimating Alternatively Financed Project Liabilities

    DTIC Science & Technology

    2013-04-01

    auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for...our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and...majority of the project financing is obtained from financial institutions in the form of construction loans or military housing bonds. The servicemembers

  12. 7 CFR 56.4 - Basis of grading service.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG PRODUCTS INSPECTION ACT (CONTINUED) VOLUNTARY GRADING OF SHELL EGGS Grading of Shell Eggs General § 56.4 Basis of grading service. (a) Any grading service in... the basis of the “United States Standards, Grades, and Weight Classes for Egg Shells.” However...

  13. 7 CFR 56.4 - Basis of grading service.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG PRODUCTS INSPECTION ACT (CONTINUED) VOLUNTARY GRADING OF SHELL EGGS Grading of Shell Eggs General § 56.4 Basis of grading service. (a) Any grading service in... the basis of the “United States Standards, Grades, and Weight Classes for Egg Shells.” However...

  14. 5 CFR 339.206 - Disqualification on the basis of medical history.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... history. 339.206 Section 339.206 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE... Disqualification on the basis of medical history. A candidate may not be disqualified for any position solely on the basis of medical history. For positions with medical standards or physical requirements, or...

  15. 5 CFR 339.206 - Disqualification on the basis of medical history.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... history. 339.206 Section 339.206 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE... Disqualification on the basis of medical history. A candidate may not be disqualified for any position solely on the basis of medical history. For positions with medical standards or physical requirements, or...

  16. 5 CFR 339.206 - Disqualification on the basis of medical history.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... history. 339.206 Section 339.206 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE... Disqualification on the basis of medical history. A candidate may not be disqualified for any position solely on the basis of medical history. For positions with medical standards or physical requirements, or...

  17. 5 CFR 339.206 - Disqualification on the basis of medical history.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... history. 339.206 Section 339.206 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE... Disqualification on the basis of medical history. A candidate may not be disqualified for any position solely on the basis of medical history. For positions with medical standards or physical requirements, or...

  18. 5 CFR 339.206 - Disqualification on the basis of medical history.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... history. 339.206 Section 339.206 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE... Disqualification on the basis of medical history. A candidate may not be disqualified for any position solely on the basis of medical history. For positions with medical standards or physical requirements, or...

  19. Comparison of a new noncoplanar intensity-modulated radiation therapy technique for craniospinal irradiation with 3 coplanar techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Anders T., E-mail: andehans@rm.dk; Lukacova, Slavka; Lassen-Ramshad, Yasmin

    2015-01-01

When the standard conformal x-ray technique for craniospinal irradiation is used, it is a challenge to achieve satisfactory dose coverage of the target, including the area of the cribriform plate, while sparing organs at risk. We present a new intensity-modulated radiation therapy (IMRT), noncoplanar technique for delivering irradiation to the cranial part and compare it with 3 other techniques and previously published results. A total of 13 patients who had previously received craniospinal irradiation with the standard conformal x-ray technique were reviewed. New treatment plans were generated for each patient using the noncoplanar IMRT-based technique, a coplanar IMRT-based technique, and a coplanar volumetric-modulated arc therapy (VMAT) technique. Dosimetry data for all patients were compared with the corresponding data from the conventional treatment plans. The new noncoplanar IMRT technique substantially reduced the mean dose to organs at risk compared with the standard radiation technique. The 2 other coplanar techniques also reduced the mean dose to some of the critical organs. However, this reduction was not as substantial as the reduction obtained by the noncoplanar technique. Furthermore, compared with the standard technique, the IMRT techniques reduced the total calculated radiation dose delivered to the normal tissue, whereas the VMAT technique increased this dose. Additionally, the coverage of the target was significantly improved by the noncoplanar IMRT technique. Compared with the standard technique, the coplanar IMRT and VMAT techniques did not improve the coverage of the target significantly. All the new planning techniques increased the number of monitor units (MU) used (the noncoplanar IMRT technique by 99%, the coplanar IMRT technique by 122%, and the VMAT technique by 26%), causing concern for leakage radiation. The noncoplanar IMRT technique covered the target better and decreased doses to organs at risk compared with the other techniques. All the new techniques increased the number of MU compared with the standard technique.

  20. Demonstration of landfill gas enhancement techniques in landfill simulators

    NASA Astrophysics Data System (ADS)

    Walsh, J. J.; Vogt, W. G.

    1982-02-01

Various techniques to enhance gas production in sanitary landfills were applied in landfill simulators. These techniques include (1) accelerated moisture addition, (2) leachate recycling, (3) buffer addition, (4) nutrient addition, and (5) combinations of the above. Results are compiled through ongoing operation and monitoring of sixteen landfill simulators. These test cells contain about 380 kg of municipal solid waste. Quantities of buffer and nutrient materials were placed in selected cells at the time of loading. Water is added to all test cells on a monthly basis; leachate is withdrawn from all cells (and recycled on selected cells), also on a monthly basis. Daily monitoring of gas volumes and refuse temperatures is performed. Gas and leachate samples are collected and analyzed on a monthly basis. Leachate and gas quality and quantity results are presented for the first 18 months of operation.

  1. Comparison of Preloaded Bougie versus Standard Bougie Technique for Endotracheal Intubation in a Cadaveric Model.

    PubMed

    Baker, Jay B; Maskell, Kevin F; Matlock, Aaron G; Walsh, Ryan M; Skinner, Carl G

    2015-07-01

We compared intubating with a preloaded bougie (PB) against the standard bougie technique in terms of success rates, time to successful intubation and provider preference on a cadaveric airway model. In this prospective, crossover study, healthcare providers intubated a cadaver using the PB technique and the standard bougie technique. Participants were randomly assigned to start with either technique. Following standardized training and practice, procedural success and time for each technique were recorded for each participant. Subsequently, participants were asked to rate their perceived ease of intubation on a visual analogue scale of 1 to 10 (1=difficult and 10=easy) and to select which technique they preferred. Forty-seven participants with variable intubating experience were enrolled at an emergency medicine intern airway course. The success rate of all groups for both techniques was equal (95.7%). The range of times to completion for the standard bougie technique was 16.0-70.2 seconds, with a mean time of 29.7 seconds. The range of times to completion for the PB technique was 15.7-110.9 seconds, with a mean time of 29.4 seconds. There was a non-significant difference of 0.3 seconds (95% confidence interval -2.8 to 3.4 seconds) between the two techniques. Participants rated the relative ease of intubation as 7.3/10 for the standard technique and 7.6/10 for the preloaded technique (p=0.53, 95% confidence interval of the difference -0.97 to 0.50). Thirty of 47 participants subjectively preferred the PB technique (p=0.039). There was no significant difference in success or time to intubation between the standard bougie and PB techniques. The majority of participants in this study preferred the PB technique. Until a clear and clinically significant difference is found between these techniques, emergency airway operators should feel confident in using the technique with which they are most comfortable.
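The study's headline numbers (a small mean difference whose 95% confidence interval straddles zero) follow from a standard paired-difference calculation for a crossover design. A minimal sketch with hypothetical per-participant timings (the abstract does not reproduce the raw data, so these values are illustrative only):

```python
import math
import statistics

# Hypothetical paired intubation times in seconds, for illustration only;
# the study's raw per-participant data are not given in the abstract.
standard = [28.1, 31.5, 25.9, 35.2, 29.0, 27.4, 33.8, 26.6]
preloaded = [27.5, 30.9, 26.8, 33.0, 29.9, 26.1, 34.2, 25.8]

diffs = [s - p for s, p in zip(standard, preloaded)]
mean_diff = statistics.mean(diffs)
sd_diff = statistics.stdev(diffs)
n = len(diffs)
se = sd_diff / math.sqrt(n)

# Two-sided 95% CI using the t critical value for n - 1 = 7 df
t_crit = 2.365
ci = (mean_diff - t_crit * se, mean_diff + t_crit * se)
print(f"mean difference = {mean_diff:.2f} s, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}) s")
```

As in the study, a confidence interval that contains zero indicates no statistically significant timing difference between the two techniques.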

  2. RECOMMENDED FOUNDATION FILL MATERIALS CONSTRUCTION STANDARD OF THE FLORIDA RADON RESEARCH PROGRAM

    EPA Science Inventory

    The report summarizes the technical basis for a recommended foundation fill materials standard for new construction houses in Florida. he radon-control construction standard was developed by the Florida Radon Research Program (FRRP). ill material standards are formulated for: (1)...

  3. Thermodynamic properties of rhamnolipid micellization and adsorption.

    PubMed

    Mańko, Diana; Zdziennicka, Anna; Jańczuk, Bronisław

    2014-07-01

Measurements of the surface tension, density, viscosity and conductivity of aqueous solutions of rhamnolipid at natural and controlled pH were made at 293 K. On the basis of the obtained results, the critical micelle concentration of rhamnolipid and its Gibbs surface excess concentration at the water-air interface were determined. The maximal surface excess concentration was considered in light of the size of the rhamnolipid molecule. Next, the standard Gibbs free energy of rhamnolipid adsorption at this interface was determined on the basis of different approaches to this energy. The standard free energy of adsorption was also deduced from the surface tension of n-hexane and the water-n-hexane interfacial tension. The standard free energy obtained in this way was close to the values determined by using the Langmuir, Szyszkowski, Aronson and Rosen, Gu and Zhu, as well as modified Gamboa and Olea equations. The standard free energy of rhamnolipid adsorption at the water-air interface was compared to its standard free energy of micellization, which was determined from the Phillips equation taking into account the degree of rhamnolipid dissociation in the micelles. Copyright © 2014 Elsevier B.V. All rights reserved.
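For reference, the determinations described above rest on standard thermodynamic relations. A sketch of the textbook forms (the paper's exact fitting equations are not reproduced in the abstract):

```latex
% Gibbs surface excess from the surface tension isotherm (dilute solution):
\Gamma_{\max} = -\frac{1}{RT}\left(\frac{\mathrm{d}\gamma}{\mathrm{d}\ln c}\right)_{T}

% Standard free energy of micellization for an ionic surfactant with
% degree of dissociation \alpha in the micelles (mole-fraction scale):
\Delta G^{0}_{\mathrm{mic}} = (2-\alpha)\,RT\,\ln x_{\mathrm{CMC}}
```

Here $\gamma$ is the surface tension, $c$ the surfactant concentration, and $x_{\mathrm{CMC}}$ the critical micelle concentration expressed as a mole fraction; the factor $(2-\alpha)$ accounts for counterion binding when the surfactant is partially dissociated in the micelles.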

  4. Optimization of auxiliary basis sets for the LEDO expansion and a projection technique for LEDO-DFT.

    PubMed

    Götz, Andreas W; Kollmar, Christian; Hess, Bernd A

    2005-09-01

We present a systematic procedure for the optimization of the expansion basis for the limited expansion of diatomic overlap density functional theory (LEDO-DFT) and report on optimized auxiliary orbitals for the Ahlrichs split valence plus polarization basis set (SVP) for the elements H, Li--F, and Na--Cl. A new method to deal with near-linear dependences in the LEDO expansion basis is introduced, which greatly reduces the computational effort of LEDO-DFT calculations. Numerical results for a test set of small molecules demonstrate the accuracy of electronic energies, structural parameters, dipole moments, and harmonic frequencies. For larger molecular systems the numerical errors introduced by the LEDO approximation can lead to an uncontrollable behavior of the self-consistent field (SCF) process. A projection technique suggested by Löwdin is presented in the framework of LEDO-DFT, which guarantees SCF convergence. Numerical results on some critical test molecules suggest the general applicability of the auxiliary orbitals presented in combination with this projection technique. Timing results indicate that LEDO-DFT is competitive with conventional density fitting methods. (c) 2005 Wiley Periodicals, Inc.

  5. POD/MAC-Based Modal Basis Selection for a Reduced Order Nonlinear Response Analysis

    NASA Technical Reports Server (NTRS)

    Rizzi, Stephen A.; Przekop, Adam

    2007-01-01

    A feasibility study was conducted to explore the applicability of a POD/MAC basis selection technique to a nonlinear structural response analysis. For the case studied the application of the POD/MAC technique resulted in a substantial improvement of the reduced order simulation when compared to a classic approach utilizing only low frequency modes present in the excitation bandwidth. Further studies are aimed to expand application of the presented technique to more complex structures including non-planar and two-dimensional configurations. For non-planar structures the separation of different displacement components may not be necessary or desirable.
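The MAC part of the POD/MAC selection is a simple normalized-projection measure used to match candidate basis vectors against reference modes. A minimal pure-Python sketch with hypothetical mode-shape vectors (illustrative only; the study's structural models are not reproduced here):

```python
def mac(phi_a, phi_b):
    """Modal Assurance Criterion between two real mode-shape vectors:
    MAC = |phi_a . phi_b|^2 / ((phi_a . phi_a)(phi_b . phi_b)).
    Equals 1.0 for parallel vectors and 0.0 for orthogonal ones."""
    dot_ab = sum(a * b for a, b in zip(phi_a, phi_b))
    dot_aa = sum(a * a for a in phi_a)
    dot_bb = sum(b * b for b in phi_b)
    return dot_ab * dot_ab / (dot_aa * dot_bb)

# Illustrative: a POD vector nearly parallel to one mode, orthogonal to another
pod_vec = [1.0, 2.0, 1.0]
mode_1 = [1.1, 1.9, 1.0]   # similar shape -> MAC near 1
mode_2 = [1.0, 0.0, -1.0]  # orthogonal shape -> MAC = 0
print(mac(pod_vec, mode_1), mac(pod_vec, mode_2))
```

In a POD/MAC basis selection, candidate vectors (e.g. POD vectors extracted from response snapshots) with high MAC against the retained modes flag which modes participate in the response and should enter the reduced-order basis.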

  6. 40 CFR 465.15 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CFR part 403 and achieve the following pretreatment standards for new sources. The mass of wastewater... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Pretreatment standards for new sources... GUIDELINES AND STANDARDS (CONTINUED) COIL COATING POINT SOURCE CATEGORY Steel Basis Material Subcategory...

  7. 40 CFR 465.25 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 CFR part 403 and achieve the following pretreatment standards for new sources. The mass of... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Pretreatment standards for new sources... GUIDELINES AND STANDARDS (CONTINUED) COIL COATING POINT SOURCE CATEGORY Galvanized Basis Material Subcategory...

  8. 40 CFR 465.35 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 CFR part 403 and achieve the following pretreatment standards for new sources. The mass of... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Pretreatment standards for new sources... GUIDELINES AND STANDARDS (CONTINUED) COIL COATING POINT SOURCE CATEGORY Aluminum Basis Material Subcategory...

  9. Distributed Compressive Sensing

    DTIC Science & Technology

    2009-01-01

    example, smooth signals are sparse in the Fourier basis, and piecewise smooth signals are sparse in a wavelet basis [8]; the commercial coding standards MP3...including wavelets [8], Gabor bases [8], curvelets [35], etc., are widely used for representation and compression of natural signals, images, and...spikes and the sine waves of a Fourier basis, or the Fourier basis and wavelets . Signals that are sparsely represented in frames or unions of bases can

  10. Practical utilization of recombinant AAV vector reference standards: focus on vector genomes titration by free ITR qPCR.

    PubMed

    D'Costa, Susan; Blouin, Veronique; Broucque, Frederic; Penaud-Budloo, Magalie; François, Achille; Perez, Irene C; Le Bec, Christine; Moullier, Philippe; Snyder, Richard O; Ayuso, Eduard

    2016-01-01

Clinical trials using recombinant adeno-associated virus (rAAV) vectors have demonstrated efficacy and a good safety profile. Although the field is advancing quickly, vector analytics and harmonization of dosage units are still a limitation for commercialization. AAV reference standard materials (RSMs) can help ensure product safety by controlling the consistency of assays used to characterize rAAV stocks. The most widely utilized unit of vector dosing is based on the encapsidated vector genome. Quantitative polymerase chain reaction (qPCR) is now the most common method to titer vector genomes (vg); however, significant inter- and intralaboratory variations have been documented using this technique. Here, RSMs and rAAV stocks were titered on the basis of an inverted terminal repeat (ITR) sequence-specific qPCR, and we found an artificial increase in vg titers using a widely utilized approach. The PCR error was introduced by using single-cut linearized plasmid as the standard curve. This bias was eliminated using plasmid standards linearized just outside the ITR region on each end to facilitate the melting of the palindromic ITR sequences during PCR. This new "free-ITR" qPCR delivers vg titers that are consistent with titers obtained with transgene-specific qPCR and could be used to normalize in-house product-specific AAV vector standards and controls to the rAAV RSMs. The free-ITR method, including well-characterized controls, will help to calibrate doses to compare preclinical and clinical data in the field.
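qPCR titration of this kind rests on a linear standard curve of threshold cycle (Ct) versus log10 copy number, from which amplification efficiency and unknown titers are back-calculated. A minimal sketch of the arithmetic with illustrative numbers (not the paper's data):

```python
import math

# Hypothetical standard curve: serial dilutions of a linearized plasmid standard
log_copies = [7.0, 6.0, 5.0, 4.0, 3.0]       # log10 vector genomes per reaction
ct_values  = [14.2, 17.6, 21.0, 24.4, 27.8]  # measured threshold cycles

# Ordinary least-squares fit: Ct = slope * log10(copies) + intercept
n = len(log_copies)
mean_x = sum(log_copies) / n
mean_y = sum(ct_values) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(log_copies, ct_values)) \
        / sum((x - mean_x) ** 2 for x in log_copies)
intercept = mean_y - slope * mean_x

# Amplification efficiency; perfect doubling corresponds to a slope of about -3.32
efficiency = 10 ** (-1.0 / slope) - 1.0

# Back-calculate an unknown sample's copies per reaction from its Ct
ct_unknown = 19.3
vg_per_reaction = 10 ** ((ct_unknown - intercept) / slope)
print(f"slope={slope:.2f}, efficiency={efficiency:.1%}, vg={vg_per_reaction:.3g}")
```

The bias described in the abstract enters through the standard itself: if the palindromic ITRs in a single-cut linearized plasmid do not melt fully, the measured Ct values of the standards shift and the fitted curve misreports every titer derived from it.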

  11. TRIBAL WATER QUALITY STANDARDS WORKSHOP

    EPA Science Inventory

    Water quality standards are the foundation for water management actions. They provide the basis for regulating discharges of pollutants to surface waters, and provide a target for restoration of degraded waters. Water quality standards identify and protect uses of the water bod...

  12. Data Standards for Omics Data: The Basis of Data Sharing and Reuse

    PubMed Central

    Chervitz, Stephen A.; Deutsch, Eric W.; Field, Dawn; Parkinson, Helen; Quackenbush, John; Rocca-Serra, Phillipe; Sansone, Susanna-Assunta; Stoeckert, Christian J.; Taylor, Chris F.; Taylor, Ronald; Ball, Catherine A.

    2014-01-01

    To facilitate sharing of Omics data, many groups of scientists have been working to establish the relevant data standards. The main components of data sharing standards are experiment description standards, data exchange standards, terminology standards, and experiment execution standards. Here we provide a survey of existing and emerging standards that are intended to assist the free and open exchange of large-format data. PMID:21370078

13. HbA1c: clinical and biological agreement for standardization of assay methods. Report by the experts of ALFEDIAM (Association de Langue Française pour l'Etude du Diabète et des Maladies Métaboliques) and SFBC (Société Française de Biologie Clinique).

    PubMed

    Gillery, P; Bordas-Fonfrède, M; Chapelle, J P; Drouin, P; Hue, G; Lévy-Marchal, C; Périer, C; Sélam, J L; Slama, G; Thivolet, C; Vialettes, B

    1999-09-01

Glycohaemoglobin, and particularly haemoglobin A1c (HbA1c), assays have been used for many years to retrospectively evaluate the glycaemic control of diabetic patients. Cut-off values have been established for deciding on treatment modifications. The techniques used in laboratories, however, exhibit varying quality, and not all of them are yet standardized. The consequence is an under-utilization of this test, especially in non-hospital practice. In this context, working groups of the Société Française de Biologie Clinique (SFBC), the Association de Langue Française pour l'Etude du Diabète et des Maladies Métaboliques (ALFEDIAM) and the Société Française d'Endocrinologie (SFE) have met together in order to analyze the national status and to propose practical recommendations for implementing a standardization process on the basis of international experiences. It is recommended to express results exclusively as HbA1c percentage, using methods standardized and certified by comparison to reference methods such as those using Diabetes Control and Complications Trial (DCCT) values. Simultaneously, contacts have been established with manufacturers, and the realisation of periodic quality control surveys was encouraged.

  14. Determining Remaining Useful Life of Aging Cables in Nuclear Power Plants – Interim Study FY13

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simmons, Kevin L.; Fifield, Leonard S.; Westman, Matthew P.

    2013-09-27

The most important criterion for cable performance is its ability to withstand a design-basis accident. With nearly 1000 km of power, control, instrumentation, and other cables typically found in a nuclear power plant (NPP), it would be a significant undertaking to inspect all of the cables. Degradation of the cable jacket, electrical insulation, and other cable components is a key issue that is likely to affect the ability of the currently installed cables to operate safely and reliably for another 20 to 40 years beyond the initial operating life. The development of one or more nondestructive evaluation (NDE) techniques and supporting models that could assist in determining the remaining life expectancy of cables or their current degradation state would be of significant interest. The ability to nondestructively determine material and electrical properties of cable jackets and insulation without disturbing the cables or connections has been deemed essential. Currently, the only technique accepted by industry to measure cable elasticity (the gold standard for determining cable insulation degradation) is the indentation measurement. All other NDE techniques are used to find flaws in the cable and do not provide information to determine the current health or life expectancy. There is no single NDE technique that can satisfy all of the requirements needed for making a life-expectancy determination, but a wide range of methods have been evaluated for use in NPPs as part of a continuous evaluation program. The commonly used methods are indentation and visual inspection, but these are only suitable for easily accessible cables. Several NDE methodologies using electrical techniques are in use today for flaw detection, but none can predict the life of a cable. There are, however, several physical and chemical property changes in cable insulation as a result of thermal and radiation damage.
In principle, these properties may be targets for advanced NDE methods to provide early warning of aging and degradation. Examples of such key indicators include changes in chemical structure, mechanical modulus, and dielectric permittivity. While some of these indicators are the basis of currently used technologies, there is a need to increase the volume of cable that may be inspected with a single measurement and, if possible, to develop techniques for in-situ inspection (i.e., while the cable is in operation). This is the focus of the present report.

  15. FAIR exempting separate T (1) measurement (FAIREST): a novel technique for online quantitative perfusion imaging and multi-contrast fMRI.

    PubMed

    Lai, S; Wang, J; Jahng, G H

    2001-01-01

A new pulse sequence, dubbed FAIR exempting separate T(1) measurement (FAIREST), in which a slice-selective saturation recovery acquisition is added to the standard FAIR (flow-sensitive alternating inversion recovery) scheme, was developed for quantitative perfusion imaging and multi-contrast fMRI. The technique allows for clean separation between, and thus simultaneous assessment of, BOLD and perfusion effects, while quantitative cerebral blood flow (CBF) and tissue T(1) values are monitored online. Online CBF maps were obtained using the FAIREST technique, and the measured CBF values were consistent with the off-line CBF maps obtained using the FAIR technique in combination with a separate sequence for T(1) measurement. Finger-tapping activation studies were carried out to demonstrate the applicability of the FAIREST technique in a typical fMRI setting for multi-contrast fMRI. The relative CBF and BOLD changes induced by finger tapping were 75.1 +/- 18.3% and 1.8 +/- 0.4%, respectively, and the relative oxygen consumption rate change was 2.5 +/- 7.7%. The results from correlating the T(1) maps with the activation images on a pixel-by-pixel basis show that the mean T(1) value of the CBF activation pixels is close to the T(1) of gray matter, while the mean T(1) value of the BOLD activation pixels is close to the T(1) range of blood and cerebrospinal fluid. Copyright 2001 John Wiley & Sons, Ltd.
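For context, FAIR-type perfusion quantification conventionally relates the control-minus-label difference signal to CBF through a single-compartment kinetic model. A sketch of the textbook form (not necessarily the exact expression used in this paper):

```latex
% Difference signal at inversion time TI in a FAIR experiment:
\Delta M(TI) \approx \frac{2\,M_{0}\,f\,TI}{\lambda}\,e^{-TI/T_{1b}}
% Solving for perfusion:
f \approx \frac{\lambda\,\Delta M(TI)}{2\,M_{0}\,TI}\,e^{TI/T_{1b}}
```

Here $f$ is CBF, $\lambda$ the blood-brain partition coefficient, $M_{0}$ the equilibrium tissue magnetization, and $T_{1b}$ the longitudinal relaxation time of blood; the tissue T(1) maps that FAIREST acquires online enter analogous corrections without a separate measurement.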

  16. Decision trees in epidemiological research.

    PubMed

    Venkatasubramaniam, Ashwini; Wolfson, Julian; Mitchell, Nathan; Barnes, Timothy; JaKa, Meghan; French, Simone

    2017-01-01

    In many studies, it is of interest to identify population subgroups that are relatively homogeneous with respect to an outcome. The nature of these subgroups can provide insight into effect mechanisms and suggest targets for tailored interventions. However, identifying relevant subgroups can be challenging with standard statistical methods. We review the literature on decision trees, a family of techniques for partitioning the population, on the basis of covariates, into distinct subgroups who share similar values of an outcome variable. We compare two decision tree methods, the popular Classification and Regression tree (CART) technique and the newer Conditional Inference tree (CTree) technique, assessing their performance in a simulation study and using data from the Box Lunch Study, a randomized controlled trial of a portion size intervention. Both CART and CTree identify homogeneous population subgroups and offer improved prediction accuracy relative to regression-based approaches when subgroups are truly present in the data. An important distinction between CART and CTree is that the latter uses a formal statistical hypothesis testing framework in building decision trees, which simplifies the process of identifying and interpreting the final tree model. We also introduce a novel way to visualize the subgroups defined by decision trees. Our novel graphical visualization provides a more scientifically meaningful characterization of the subgroups identified by decision trees. Decision trees are a useful tool for identifying homogeneous subgroups defined by combinations of individual characteristics. While all decision tree techniques generate subgroups, we advocate the use of the newer CTree technique due to its simplicity and ease of interpretation.
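The recursive-partitioning idea behind CART-style trees can be sketched in a few lines: at each node, choose the covariate threshold that most reduces within-subgroup outcome variance, then recurse. A minimal pure-Python illustration of the split search on one covariate (hypothetical data; a real analysis would use rpart/partykit or scikit-learn):

```python
def best_split(xs, ys):
    """Find the threshold on a single covariate that minimizes the summed
    squared error (SSE) of the two resulting subgroups, CART-style."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best = (None, float("inf"))
    for t in sorted(set(xs))[1:]:            # candidate thresholds
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        cost = sse(left) + sse(right)
        if cost < best[1]:
            best = (t, cost)
    return best

# Illustrative: the outcome jumps when the covariate crosses 5,
# so the SSE-minimizing split should separate the two subgroups.
xs = [1, 2, 3, 4, 6, 7, 8, 9]
ys = [1.0, 1.1, 0.9, 1.0, 5.0, 5.2, 4.9, 5.1]
threshold, cost = best_split(xs, ys)
print(threshold, cost)
```

CTree differs precisely at this step: instead of exhaustively minimizing a loss (which biases selection toward covariates with many candidate splits), it picks the covariate via a permutation-test p-value, which also supplies a principled stopping rule.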

  17. 40 CFR 465.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for existing...) EFFLUENT GUIDELINES AND STANDARDS COIL COATING POINT SOURCE CATEGORY Galvanized Basis Material Subcategory § 465.24 Pretreatment standards for existing sources. Except as provided in 40 CFR 403.7 and 403.13, any...

  18. 40 CFR 465.24 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Pretreatment standards for existing...) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) COIL COATING POINT SOURCE CATEGORY Galvanized Basis Material Subcategory § 465.24 Pretreatment standards for existing sources. Except as provided in 40 CFR 403.7 and 403...

  19. 40 CFR 465.34 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Pretreatment standards for existing...) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) COIL COATING POINT SOURCE CATEGORY Aluminum Basis Material Subcategory § 465.34 Pretreatment standards for existing sources. Except as provided in 40 CFR 403.7 and 403...

  20. 40 CFR 465.14 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Pretreatment standards for existing...) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) COIL COATING POINT SOURCE CATEGORY Steel Basis Material Subcategory § 465.14 Pretreatment standards for existing sources. Except as provided in 40 CFR 403.7 and 403...

  1. 40 CFR 465.34 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for existing...) EFFLUENT GUIDELINES AND STANDARDS COIL COATING POINT SOURCE CATEGORY Aluminum Basis Material Subcategory § 465.34 Pretreatment standards for existing sources. Except as provided in 40 CFR 403.7 and 403.13, any...

  2. 40 CFR 465.25 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and achieve the following pretreatment standards for new sources. The mass of wastewater pollutants in... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for new sources... GUIDELINES AND STANDARDS COIL COATING POINT SOURCE CATEGORY Galvanized Basis Material Subcategory § 465.25...

  3. 40 CFR 465.14 - Pretreatment standards for existing sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for existing...) EFFLUENT GUIDELINES AND STANDARDS COIL COATING POINT SOURCE CATEGORY Steel Basis Material Subcategory § 465.14 Pretreatment standards for existing sources. Except as provided in 40 CFR 403.7 and 403.13, any...

  4. 40 CFR 465.15 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... achieve the following pretreatment standards for new sources. The mass of wastewater pollutants in coil... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for new sources... GUIDELINES AND STANDARDS COIL COATING POINT SOURCE CATEGORY Steel Basis Material Subcategory § 465.15...

  5. 40 CFR 465.35 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and achieve the following pretreatment standards for new sources. The mass of wastewater pollutants in... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for new sources... GUIDELINES AND STANDARDS COIL COATING POINT SOURCE CATEGORY Aluminum Basis Material Subcategory § 465.35...

  6. Crystal Growth of ZnSe and Related Ternary Compound Semiconductors by Vapor Transport in Low Gravity

    NASA Technical Reports Server (NTRS)

    Su, Ching-Hua; Ramachandran, N.

    2013-01-01

Crystals of ZnSe and related ternary compounds, such as ZnSeS and ZnSeTe, will be grown by physical vapor transport in the Material Science Research Rack (MSRR) on the International Space Station (ISS). The objective of the project is to determine the relative contributions of gravity-driven fluid flows to the compositional distribution, incorporation of impurities and defects, and deviation from stoichiometry observed in vapor-transport-grown crystals, which result from buoyancy-driven convection and growth-interface fluctuations caused by irregular fluid flows on Earth. The investigation consists of extensive ground-based experimental and theoretical research efforts and concurrent flight experimentation. The objectives of the ground-based studies are to (1) obtain the experimental data and conduct the analyses required to define the optimum growth parameters for the flight experiments, (2) perfect various characterization techniques to establish the standard procedure for material characterization, (3) quantitatively establish the characteristics of the crystals grown on Earth as a basis for subsequent comparative evaluations of the crystals grown in a low-gravity environment, and (4) develop the theoretical and analytical methods required for such evaluations. ZnSe and related ternary compounds have been grown by the vapor transport technique with real-time, in-situ, non-invasive monitoring techniques. The grown crystals have been characterized extensively by various techniques to correlate the crystal properties with the growth conditions.

  7. Minimally invasive extravesical ureteral reimplantation for vesicoureteral reflux.

    PubMed

    Chen, Hsiao-Wen; Lin, Ghi-Jen; Lai, Ching-Horng; Chu, Sheng-Hsien; Chuang, Cheng-Keng

    2002-04-01

We designed a new extravesical ureteral reimplantation technique with a minimally invasive approach from skin to ureterovesical junction with less perivesical tissue manipulation to avoid extensive bladder denervation. Between July 1996 and December 2000, 37 boys and 52 girls 1.2 to 10.8 years old (mean +/- standard deviation: 3.8 +/- 2.5 years) (113 ureters) were treated with minimally invasive extravesical ureteral reimplantation. Vesicoureteral reflux was graded I to V in 8, 12, 43, 29 and 21 cases, respectively. The technique involves an approximately 10 to 15 mm incision passing through the small triangular gap of the aponeurosis of the external abdominal oblique muscle and transversalis fascia to the point of the ureterovesical junction. The surgical field was exposed with mini-retractors, and fine dissecting instruments were used to avoid unnecessary tissue manipulation. At postoperative followup 1 patient had persistent grade II reflux and 2 had moderate hydronephrosis and hydroureter, which resolved after 18 months. No patient returned due to voiding inefficiency or for pain control after discharge from the outpatient setting. This new technique can be easily used for vesicoureteral reflux with the advantages of simple intervention for surgeons, especially those with inguinal herniorrhaphy and antireflux surgery experience, and less wound discomfort for patients. The whole procedure can be performed on an outpatient basis. However, the decision to use this technique should be based on individual consideration.

  8. Designing for Compressive Sensing: Compressive Art, Camouflage, Fonts, and Quick Response Codes

    DTIC Science & Technology

    2018-01-01

an example where the signal is non-sparse in the standard basis, but sparse in the discrete cosine basis. The top plot shows the signal from the...previous example, now used as sparse discrete cosine transform (DCT) coefficients. The next plot shows the non-sparse signal in the standard...Romberg JK, Tao T. Stable signal recovery from incomplete and inaccurate measurements. Commun Pure Appl Math. 2006;59(8):1207-1223. 3. Donoho DL
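The sparsity claim in this record, namely a signal dense in the standard basis but sparse in the DCT basis, is easy to demonstrate directly. A minimal pure-Python sketch using an unnormalized DCT-II (illustrative; production code would use a library FFT-based DCT):

```python
import math

N = 64
k1, k2 = 3, 17  # frequencies of the two cosine components

# Time-domain signal: dense in the standard basis (nearly every sample nonzero)
x = [math.cos(math.pi * (n + 0.5) * k1 / N)
     + 0.5 * math.cos(math.pi * (n + 0.5) * k2 / N)
     for n in range(N)]

def dct2(sig):
    """Unnormalized DCT-II: X[k] = sum_n sig[n] * cos(pi * (n + 0.5) * k / N)."""
    N = len(sig)
    return [sum(v * math.cos(math.pi * (n + 0.5) * k / N)
                for n, v in enumerate(sig))
            for k in range(N)]

X = dct2(x)
significant = [k for k, c in enumerate(X) if abs(c) > 1e-6]
print(significant)  # only k1 and k2 carry energy: the signal is 2-sparse in DCT
```

By the orthogonality of the DCT-II basis vectors, all coefficients vanish (up to round-off) except those at k1 and k2, which is exactly the representation compressive sensing exploits: measure few random projections, then recover the handful of significant coefficients.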

  9. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time intensive in comparison to other techniques. Direct analysis in real time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as the rapidity of analysis and minimal sample preparation afford a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested that the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize the intact paint chips from the vehicles to ascertain whether the linear temperature gradient provided additional discriminatory information. All the paint samples could be discriminated based on the distinctive thermal desorption plots afforded by this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool for the forensic paint examiner.

  10. BASIS FOR PRIMARY AIR QUALITY CRITERIA AND STANDARDS

    EPA Science Inventory

    The Environmental Criteria and Assessment Office and the Office of Air Quality Planning and Standards are charged with responsibility for reviewing and assessing air quality criteria and air quality standards, respectively. Since adoption of the 1977 Clean Air Act Amendments, the...

  11. 7 CFR 810.803 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD ADMINISTRATION (FEDERAL GRAIN INSPECTION SERVICE), DEPARTMENT OF AGRICULTURE OFFICIAL UNITED STATES STANDARDS FOR GRAIN United States Standards for Mixed Grain Principles Governing the Application of Standards § 810...

  12. 7 CFR 810.1003 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD ADMINISTRATION (FEDERAL GRAIN INSPECTION SERVICE), DEPARTMENT OF AGRICULTURE OFFICIAL UNITED STATES STANDARDS FOR GRAIN United States Standards for Oats Principles Governing the Application of Standards § 810.1003...

  13. 7 CFR 810.203 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD ADMINISTRATION (FEDERAL GRAIN INSPECTION SERVICE), DEPARTMENT OF AGRICULTURE OFFICIAL UNITED STATES STANDARDS FOR GRAIN United States Standards for Barley Principles Governing the Application of Standards § 810.203...

  14. 7 CFR 810.1603 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD ADMINISTRATION (FEDERAL GRAIN INSPECTION SERVICE), DEPARTMENT OF AGRICULTURE OFFICIAL UNITED STATES STANDARDS FOR GRAIN United States Standards for Soybeans Principles Governing the Application of Standards § 810.1603...

  15. 7 CFR 810.603 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD ADMINISTRATION (FEDERAL GRAIN INSPECTION SERVICE), DEPARTMENT OF AGRICULTURE OFFICIAL UNITED STATES STANDARDS FOR GRAIN United States Standards for Flaxseed Principles Governing the Application of Standards § 810.603...

  16. 40 CFR 63.11623 - What are the testing requirements?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of the cyclone, dry basis, corrected to standard conditions, g/min; MOUTLET = Mass of particulate... PROGRAMS (CONTINUED) NATIONAL EMISSION STANDARDS FOR HAZARDOUS AIR POLLUTANTS FOR SOURCE CATEGORIES (CONTINUED) National Emission Standards for Hazardous Air Pollutants for Area Sources: Prepared Feeds...

  17. Concept and numerical simulations of a reactive anti-fragment armour layer

    NASA Astrophysics Data System (ADS)

    Hušek, Martin; Kala, Jiří; Král, Petr; Hokeš, Filip

    2017-07-01

    The contribution describes the concept and numerical simulation of a ballistic protective layer which is able to actively resist projectiles or smaller colliding fragments flying at high speed. The principle of the layer was designed on the basis of the action/reaction system of reactive armour which is used for the protection of armoured vehicles. As the designed ballistic layer consists of steel plates simultaneously combined with explosive material - primary explosive and secondary explosive - the technique of coupling the Finite Element Method with Smoothed Particle Hydrodynamics was used for the simulations. Certain standard situations which the ballistic layer should resist were simulated. The contribution describes the principles for the successful execution of numerical simulations, their results, and an evaluation of the functionality of the ballistic layer.

  18. Metrologies for quantitative nanomechanical testing and quality control in semiconductor manufacturing

    NASA Astrophysics Data System (ADS)

    Pratt, Jon R.; Kramar, John A.; Newell, David B.; Smith, Douglas T.

    2005-05-01

    If nanomechanical testing is to evolve into a tool for process and quality control in semiconductor fabrication, great advances in throughput, repeatability, and accuracy of the associated instruments and measurements will be required. A recent grant awarded by the NIST Advanced Technology Program seeks to address the throughput issue by developing a high-speed AFM-based platform for quantitative nanomechanical measurements. The following paper speaks to the issue of quantitative accuracy by presenting an overview of various standards and techniques under development at NIST and other national metrology institutes (NMIs) that can provide a metrological basis for nanomechanical testing. The infrastructure we describe places firm emphasis on traceability to the International System of Units, paving the way for truly quantitative, rather than qualitative, physical property testing.

  19. Wearable PWV technologies to measure Blood Pressure: eliminating brachial cuffs.

    PubMed

    Solá, J; Proença, M; Chételat, O

    2013-01-01

    The clinical demand for technologies to monitor Blood Pressure (BP) in ambulatory scenarios with minimal use of inflation cuffs is strong: the new generation of BP monitors is expected to be not only accurate, but also non-occlusive. In this paper we review recent advances in the use of so-called Pulse Wave Velocity (PWV) technologies to estimate BP on a beat-by-beat basis. After introducing the working principle and underlying methodological limitations, two implementation examples are provided. Pilot studies have demonstrated that novel PWV-based BP monitors achieve accuracy scores falling within the limits of the British Hypertension Society (BHS) Grade A standard. The reported techniques pave the way towards ambulatory-compliant, continuous and non-occlusive BP monitoring devices, in which the use of inflation cuffs is drastically reduced.

  20. Playing biology's name game: identifying protein names in scientific text.

    PubMed

    Hanisch, Daniel; Fluck, Juliane; Mevissen, Heinz-Theodor; Zimmer, Ralf

    2003-01-01

    A growing body of work is devoted to the extraction of protein or gene interaction information from the scientific literature. Yet, the basis for most extraction algorithms, i.e. the specific and sensitive recognition of protein and gene names and their numerous synonyms, has not been adequately addressed. Here we describe the construction of a comprehensive general purpose name dictionary and an accompanying automatic curation procedure based on a simple token model of protein names. We designed an efficient search algorithm to analyze all abstracts in MEDLINE in a reasonable amount of time on standard computers. The parameters of our method are optimized using machine learning techniques. Used in conjunction, these ingredients lead to good search performance. A supplementary web page is available at http://cartan.gmd.de/ProMiner/.

  1. 40 CFR 63.1205 - What are the standards for hazardous waste burning lightweight aggregate kilns that are effective...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... rolling average, dry basis, corrected to 7 percent oxygen, and reported as propane; (6) Hydrochloric acid... hydrochloric acid equivalents, dry basis and corrected to 7 percent oxygen; and (7) Particulate matter in... average, dry basis, corrected to 7 percent oxygen, and reported as propane; (6) Hydrochloric acid and...

  2. 40 CFR 63.1205 - What are the standards for hazardous waste burning lightweight aggregate kilns that are effective...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... rolling average, dry basis, corrected to 7 percent oxygen, and reported as propane; (6) Hydrochloric acid... hydrochloric acid equivalents, dry basis and corrected to 7 percent oxygen; and (7) Particulate matter in... average, dry basis, corrected to 7 percent oxygen, and reported as propane; (6) Hydrochloric acid and...

  3. 40 CFR 63.1205 - What are the standards for hazardous waste burning lightweight aggregate kilns that are effective...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... rolling average, dry basis, corrected to 7 percent oxygen, and reported as propane; (6) Hydrochloric acid... hydrochloric acid equivalents, dry basis and corrected to 7 percent oxygen; and (7) Particulate matter in... average, dry basis, corrected to 7 percent oxygen, and reported as propane; (6) Hydrochloric acid and...

  4. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perkó, Zoltán, E-mail: Z.Perko@tudelft.nl; Gilli, Luca, E-mail: Gilli@nrg.eu; Lathouwers, Danny, E-mail: D.Lathouwers@tudelft.nl

    2014-03-01

    The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods, such as first order perturbation theory or Monte Carlo sampling, Polynomial Chaos Expansion (PCE) has been given growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be computationally significantly cheaper, while at the same time it retains a similar accuracy to the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to a further reduction in computational time, since the high order grids necessary for accurately estimating the near-zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover, the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation, due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These show consistently good performance, both in terms of the accuracy of the resulting PC representation of quantities and the computational costs associated with constructing the sparse PCE. Basis adaptivity also seems to make the employment of PC techniques possible for problems with a higher number of input parameters (15-20), alleviating a well known limitation of the traditional approach. The prospect of larger scale applicability and the simplicity of implementation make such adaptive PC algorithms particularly appealing for the sensitivity and uncertainty analysis of complex systems and legacy codes.
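The non-intrusive spectral projection idea at the core of this work can be illustrated in one dimension, a far simpler setting than the paper's adaptive, multi-dimensional FANISP algorithm. For a response f(X) with X ~ N(0, 1), the PC coefficients over probabilists' Hermite polynomials are estimated by Gauss-Hermite quadrature; f = exp is chosen here only because its exact coefficients are known:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

f = np.exp                    # toy model response, input X ~ N(0, 1)
order = 8                     # PCE truncation order

# Probabilists' Gauss-Hermite quadrature nodes/weights for the projections.
x, w = hermegauss(order + 4)
w = w / np.sqrt(2.0 * np.pi)  # normalise so the weights sum to 1

# Non-intrusive spectral projection: a_k = E[f(X) He_k(X)] / k!
fx = f(x)
coeffs = np.array([
    np.sum(w * fx * hermeval(x, np.eye(order + 1)[k])) / math.factorial(k)
    for k in range(order + 1)
])

# Statistics of f(X) follow directly from the PC coefficients.
mean = coeffs[0]
var = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print(mean, var)  # exact values: e^0.5 ~ 1.6487 and e^2 - e ~ 4.6708
```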

  5. Correlation consistent basis sets for actinides. I. The Th and U atoms.

    PubMed

    Peterson, Kirk A

    2015-02-21

    New correlation consistent basis sets based on both pseudopotential (PP) and all-electron Douglas-Kroll-Hess (DKH) Hamiltonians have been developed from double- to quadruple-zeta quality for the actinide atoms thorium and uranium. Sets for valence electron correlation (5f6s6p6d), cc-pVnZ-PP and cc-pVnZ-DK3, as well as outer-core correlation (valence + 5s5p5d), cc-pwCVnZ-PP and cc-pwCVnZ-DK3, are reported (n = D, T, Q). The -PP sets are constructed in conjunction with small-core, 60-electron PPs, while the -DK3 sets utilized the 3rd-order Douglas-Kroll-Hess scalar relativistic Hamiltonian. Both series of basis sets show systematic convergence towards the complete basis set limit, both at the Hartree-Fock and correlated levels of theory, making them amenable to standard basis set extrapolation techniques. To assess the utility of the new basis sets, extensive coupled cluster composite thermochemistry calculations of ThFn (n = 2-4), ThO2, and UFn (n = 4-6) have been carried out. After accurately accounting for valence and outer-core correlation, spin-orbit coupling, and even Lamb shift effects, the final 298 K atomization enthalpies of ThF4, ThF3, ThF2, and ThO2 are all within their experimental uncertainties. Bond dissociation energies of ThF4 and ThF3, as well as UF6 and UF5, were similarly accurate. The derived enthalpies of formation for these species also showed a very satisfactory agreement with experiment, demonstrating that the new basis sets allow for the use of accurate composite schemes just as in molecular systems composed only of lighter atoms. The differences between the PP and DK3 approaches were found to increase with the change in formal oxidation state on the actinide atom, approaching 5-6 kcal/mol for the atomization enthalpies of ThF4 and ThO2. The DKH3 atomization energy of ThO2 was calculated to be smaller than the DKH2 value by ∼1 kcal/mol.
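The "standard basis set extrapolation techniques" mentioned can be as simple as a two-point fit of E(n) = E_CBS + A/n^3 across zeta levels, a Helgaker-style form commonly used for correlation energies. A minimal sketch with made-up placeholder energies, not values from the paper:

```python
# Two-point complete-basis-set (CBS) extrapolation, assuming the
# E(n) = E_CBS + A / n**3 convergence form for correlation energies.
def cbs_two_point(e_small, e_large, n_small, n_large):
    """Solve E(n) = E_CBS + A / n**3 for E_CBS from two zeta levels."""
    a = (e_small - e_large) / (n_small ** -3 - n_large ** -3)
    return e_large - a * n_large ** -3

# Hypothetical triple-zeta / quadruple-zeta correlation energies (hartree).
e_tz, e_qz = -1.2345, -1.2501
e_cbs = cbs_two_point(e_tz, e_qz, 3, 4)
print(f"CBS estimate: {e_cbs:.4f} hartree")
```

The extrapolated value lies below the quadruple-zeta energy, as expected for a monotonically converging series.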

  6. Swedish snus and the GothiaTek® standard

    PubMed Central

    2011-01-01

    Some smokeless tobacco products, such as Swedish snus, are today considered to be associated with substantially fewer health hazards than cigarettes. This risk differential has contributed to the scientific debate about the possibilities of harm reduction within the tobacco area. Although current manufacturing methods for snus build on those that were introduced more than a century ago, the low levels of unwanted substances in modern Swedish snus are largely due to improvements in production techniques and selection of raw materials in combination with several programs for quality assurance and quality control. These measures have been successively introduced during the past 30-40 years. In the late 1990s they formed the basis for a voluntary quality standard for Swedish snus named GothiaTek®. In recent years the standard has been accepted by the members of the trade organization European Smokeless Tobacco Council (ESTOC) so it has now evolved into an industrial standard for all smokeless tobacco products in Europe. The initial impetus for the mentioned changes of the production was quality problems related to microbial activity and formation of ammonia and nitrite in the finished products. Other contributing factors were that snus came under the jurisdiction of the Swedish Food Act in 1971, and concerns that emerged in the 1960s and 1970s about health effects of tobacco, and the significance of agrochemical residues and other potential toxicants in food stuffs. This paper summarizes the historical development of the manufacture of Swedish snus, describes the chemical composition of modern snus, and gives the background and rationale for the GothiaTek® standard, including the selection of constituents for which the standard sets limits. The paper also discusses the potential future of this voluntary standard in relation to current discussions about tobacco harm reduction and regulatory science in tobacco control. PMID:21575206

  7. Sympathetic Cooling of Lattice Atoms by a Bose-Einstein Condensate

    DTIC Science & Technology

    2010-08-13

    average out to zero net change in momentum. This type of cooling is the basis for techniques such as Zeeman slowing and magneto-optical traps. On a more basic level, an excited...cause stimulated emission of a second excitation. A quantitative explanation requires the use of the density fluctuation operator. This operator

  8. Note: Photopyroelectric measurement of thermal effusivity of transparent liquids by a method free of fitting procedures.

    PubMed

    Ivanov, R; Marín, E; Villa, J; Aguilar, C Hernández; Pacheco, A Domínguez; Garrido, S Hernández

    2016-02-01

    In a recent paper published in this journal [R. Ivanov et al., Rev. Sci. Instrum. 86, 064902 (2015)], a methodology free of fitting procedures for determining the thermal effusivity of liquids using the electropyroelectric technique was reported. Here the same measurement principle is extended to the well-known photopyroelectric technique. The theoretical and experimental basis of the method is presented and its usefulness is demonstrated with measurements on test samples.

  9. The evolving role of stereotactic radiosurgery and stereotactic radiation therapy for patients with spine tumors.

    PubMed

    Rock, Jack P; Ryu, Samuel; Yin, Fang-Fang; Schreiber, Faye; Abdulhak, Muwaffak

    2004-01-01

    Traditional management strategies for patients with spinal tumors have undergone considerable changes during the last 15 years. Significant improvements in digital imaging, computer processing, and treatment planning have provided the basis for the application of stereotactic techniques, now the standard of care for intracranial pathology, to spinal pathology. In addition, certain of these improvements have also allowed us to progress from frame-based to frameless systems which now act to accurately assure the delivery of high doses of radiation to a precisely defined target volume while sparing injury to adjacent normal tissues. In this article we will describe the evolution from yesterday's standards for radiation therapy to the current state of the art for the treatment of patients with spinal tumors. This presentation will include a discussion of radiation dosing and toxicity, the overall process of extracranial radiation delivery, and the current state of the art regarding Cyberknife, Novalis, and tomotherapy. Additional discussion relating current research protocols and future directions for the management of benign tumors of the spine will also be presented.

  10. F-106 data summary and model results relative to threat criteria and protection design analysis

    NASA Technical Reports Server (NTRS)

    Pitts, F. L.; Finelli, G. B.; Perala, R. A.; Rudolph, T. H.

    1986-01-01

    The NASA F-106 has acquired considerable data on the rates-of-change of electromagnetic parameters on the aircraft surface during 690 direct lightning strikes while penetrating thunderstorms at altitudes ranging from 15,000 to 40,000 feet. These in-situ measurements have provided the basis for the first statistical quantification of the lightning electromagnetic threat to aircraft appropriate for determining lightning indirect effects on aircraft. The data are presently being used in updating previous lightning criteria and standards developed over the years from ground-based measurements. The new lightning standards will, therefore, be the first which reflect actual aircraft responses measured at flight altitudes. The modeling technique developed to interpret and understand the direct strike electromagnetic data acquired on the F-106 provides a means to model the interaction of the lightning channel with the F-106. The reasonable results obtained with the model, compared to measured responses, yield confidence that the model may be credibly applied to other aircraft types and used in the prediction of internal coupling effects in the design of lightning protection for new aircraft.

  11. Improving integrity of on-line grammage measurement with traceable basic calibration.

    PubMed

    Kangasrääsiö, Juha

    2010-07-01

    The automatic control of grammage (basis weight) in paper and board production is based upon on-line grammage measurement. Furthermore, the automatic control of other quality variables, such as moisture, ash content and coat weight, may rely on the grammage measurement. The integrity of Kr-85-based on-line grammage measurement systems was studied by performing basic calibrations with traceably calibrated plastic reference standards. The calibrations were performed according to the EN ISO/IEC 17025 standard, which is a requirement for calibration laboratories. The observed relative measurement errors were 3.3% in the first-time calibrations at the 95% confidence level. With the traceable basic calibration method, however, these errors can be reduced to under 0.5%, thus improving the integrity of on-line grammage measurements. A standardised algorithm, based on the experience from the performed calibrations, is also proposed to ease the adjustment of the different grammage measurement systems. The calibration technique can basically be applied to all beta-radiation-based grammage measurements. 2010 ISA. Published by Elsevier Ltd. All rights reserved.
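A beta-transmission gauge of this kind follows, to a good approximation, an exponential attenuation law I = I0 * exp(-mu * w), so a basic calibration against reference standards of known grammage reduces to a linear fit of ln(I) versus w. A minimal sketch under that assumption, with synthetic numbers (the paper's actual calibration data and algorithm are not reproduced):

```python
import numpy as np

# Traceably calibrated reference standards (grammage w in g/m^2) and the
# detector intensities measured through them (synthetic data generated
# with assumed true values I0 = 1000 counts and mu = 0.004 m^2/g).
w_ref = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
i_meas = 1000.0 * np.exp(-0.004 * w_ref)

# Fit ln(I) = ln(I0) - mu * w to recover the calibration constants.
slope, intercept = np.polyfit(w_ref, np.log(i_meas), 1)
mu, i0 = -slope, np.exp(intercept)

def grammage(i):
    """Convert a measured intensity to grammage using the calibration."""
    return -np.log(i / i0) / mu

print(grammage(1000.0 * np.exp(-0.004 * 300.0)))  # recovers ~300 g/m^2
```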

  12. Comparison of the acetyl bromide spectrophotometric method with other analytical lignin methods for determining lignin concentration in forage samples.

    PubMed

    Fukushima, Romualdo S; Hatfield, Ronald D

    2004-06-16

    Present analytical methods to quantify lignin in herbaceous plants are not totally satisfactory. A spectrophotometric method, acetyl bromide soluble lignin (ABSL), has been employed to determine lignin concentration in a range of plant materials. In this work, lignin extracted with acidic dioxane was used to develop standard curves and to calculate the derived linear regression equation (slope equals absorptivity value or extinction coefficient) for determining the lignin concentration of respective cell wall samples. This procedure yielded lignin values that were different from those obtained with Klason lignin, acid detergent acid insoluble lignin, or permanganate lignin procedures. Correlations with in vitro dry matter or cell wall digestibility of samples were highest with data from the spectrophotometric technique. The ABSL method employing as standard lignin extracted with acidic dioxane has the potential to be employed as an analytical method to determine lignin concentration in a range of forage materials. It may be useful in developing a quick and easy method to predict in vitro digestibility on the basis of the total lignin content of a sample.

  13. A procedure for estimating Bacillus cereus spores in soil and stream-sediment samples - A potential exploration technique

    USGS Publications Warehouse

    Watterson, J.R.

    1985-01-01

    The presence of bacterial spores of the Bacillus cereus group in soils and stream sediments appears to be a sensitive indicator of several types of concealed mineral deposits, including vein-type gold deposits. The B. cereus assay is rapid, inexpensive, and inherently reproducible. The test, currently under investigation for its potential in mineral exploration, is recommended for use on a research basis. Among the aerobic spore-forming bacilli, only B. cereus and closely related strains produce an opaque zone in egg-yolk emulsion agar. This characteristic, also known as the Nagler or lecitho-vitellin reaction, has long been used to rapidly identify and estimate presumptive B. cereus. The test is here adapted to permit rapid estimation of B. cereus spores in soil and stream-sediment samples. Relative standard deviation was 10.3% on counts obtained from two 40-replicate pour-plate determinations. As many as 40 samples per day can be processed. Enough procedural detail is included to permit investigation of the test in conventional geochemical laboratories using standard microbiological safety precautions. © 1985.
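The quoted relative standard deviation is simply the sample standard deviation expressed as a percentage of the mean of the replicate counts. A minimal illustration with made-up pour-plate counts, not the study's data:

```python
import statistics

counts = [112, 98, 105, 121, 95, 108, 117, 101]  # spores per plate (synthetic)
mean = statistics.mean(counts)
rsd = 100 * statistics.stdev(counts) / mean      # relative standard deviation, %
print(f"mean = {mean:.1f}, RSD = {rsd:.1f}%")
```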

  14. IAC Standardized Reporting of Breast Fine-Needle Aspiration Biopsy Cytology.

    PubMed

    Field, Andrew S; Schmitt, Fernando; Vielh, Philippe

    2017-01-01

    There have been many changes in the roles of fine-needle aspiration biopsy (FNAB) and core needle biopsy (CNB) in the diagnostic workup of breast lesions in routine breast clinics and in mammographic breast screening programs, as well as changes in the management algorithms utilized and the treatments available, since the NCI consensus on breast FNAB cytology in 1996. A standardized approach will improve training and performance of FNAB and smear-making techniques, and structured reporting will improve the quality and reproducibility of reports across departments, cities and countries, providing a basis for quality assurance measures, improving patient care and facilitating research. Linking cytology reports to management algorithms will increase clinicians' use of FNAB cytology and, where appropriate, CNB, and enhance the use of ancillary tests for prognostic testing. The IAC recognizes that the local medical infrastructure and resources for breast imaging, biopsy and treatment will influence the diagnostic workup and management of breast disease, but best practice guidelines should be established and modified as required. © 2016 S. Karger AG, Basel.

  15. Immunotherapy: a new standard of care in thoracic malignancies? A summary of the European Respiratory Society research seminar of the Thoracic Oncology Assembly.

    PubMed

    Costantini, Adrien; Grynovska, Marta; Lucibello, Francesca; Moisés, Jorge; Pagès, Franck; Tsao, Ming S; Shepherd, Frances A; Bouchaab, Hasna; Garassino, Marina; Aerts, Joachim G J V; Mazières, Julien; Mondini, Michele; Berghmans, Thierry; Meert, Anne-Pascale; Cadranel, Jacques

    2018-02-01

    In May 2017, the second European Respiratory Society research seminar of the Thoracic Oncology Assembly entitled "Immunotherapy, a new standard of care in thoracic malignancies?" was held in Paris, France. This seminar provided an opportunity to review the basis of antitumour immunity and to explain how immune checkpoint inhibitors (ICIs) work. The main therapeutic trials that have resulted in marketing authorisations for use of ICIs in lung cancer were reported. A particular focus was on the toxicity of these new molecules in relation to their immune-related adverse events. The need for biological selection, currently based on immunohistochemistry testing to identify the tumour expression of programmed death ligand (PD-L)1, was stressed, as was the need to harmonise PD-L1 testing and techniques. Sessions were also dedicated to the combination of ICIs and radiotherapy and the place of ICIs in non-small cell lung cancer with oncogenic addictions. Finally, an important presentation was dedicated to the future of antitumour vaccination and of all ongoing trials in thoracic oncology. Copyright ©ERS 2018.

  16. A random variance model for detection of differential gene expression in small microarray experiments.

    PubMed

    Wright, George W; Simon, Richard M

    2003-12-12

    Microarray techniques provide a valuable way of characterizing the molecular nature of disease. Unfortunately expense and limited specimen availability often lead to studies with small sample sizes. This makes accurate estimation of variability difficult, since variance estimates made on a gene by gene basis will have few degrees of freedom, and the assumption that all genes share equal variance is unlikely to be true. We propose a model by which the within gene variances are drawn from an inverse gamma distribution, whose parameters are estimated across all genes. This results in a test statistic that is a minor variation of those used in standard linear models. We demonstrate that the model assumptions are valid on experimental data, and that the model has more power than standard tests to pick up large changes in expression, while not increasing the rate of false positives. This method is incorporated into BRB-ArrayTools version 3.0 (http://linus.nci.nih.gov/BRB-ArrayTools.html). ftp://linus.nci.nih.gov/pub/techreport/RVM_supplement.pdf
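The shrinkage idea in the abstract can be sketched as follows: with an inverse-gamma distribution over the per-gene variances, each gene's variance estimate is pulled toward the shared distribution, effectively adding 2a degrees of freedom to the usual estimator. The hyperparameters and data below are synthetic assumptions for illustration; the actual method fits a and b across all genes:

```python
import numpy as np

rng = np.random.default_rng(1)
n_genes, n_samples = 1000, 4   # small-sample microarray scenario
a, b = 3.0, 2.0                # inverse-gamma hyperparameters (assumed fitted)

# Synthetic per-gene variances drawn from InvGamma(a, b), and the usual
# sample-variance estimates with (n_samples - 1) degrees of freedom.
true_var = 1.0 / rng.gamma(a, 1.0 / b, size=n_genes)
df = n_samples - 1
s2 = true_var * rng.chisquare(df, size=n_genes) / df

# Shrunken variance: combine each gene's own df with 2a "prior" df.
s2_shrunk = (df * s2 + 2.0 * b) / (df + 2.0 * a)

# Shrinkage stabilises the estimates: their spread drops markedly.
print(np.var(s2), np.var(s2_shrunk))
```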

  17. Diagnostic Performance of a Novel Coronary CT Angiography Algorithm: Prospective Multicenter Validation of an Intracycle CT Motion Correction Algorithm for Diagnostic Accuracy.

    PubMed

    Andreini, Daniele; Lin, Fay Y; Rizvi, Asim; Cho, Iksung; Heo, Ran; Pontone, Gianluca; Bartorelli, Antonio L; Mushtaq, Saima; Villines, Todd C; Carrascosa, Patricia; Choi, Byoung Wook; Bloom, Stephen; Wei, Han; Xing, Yan; Gebow, Dan; Gransar, Heidi; Chang, Hyuk-Jae; Leipsic, Jonathon; Min, James K

    2018-06-01

    Motion artifact can reduce the diagnostic accuracy of coronary CT angiography (CCTA) for coronary artery disease (CAD). The purpose of this study was to compare the diagnostic performance of an algorithm dedicated to correcting coronary motion artifact with the performance of standard reconstruction methods in a prospective international multicenter study. Patients referred for clinically indicated invasive coronary angiography (ICA) for suspected CAD prospectively underwent an investigational CCTA examination free from heart rate-lowering medications before they underwent ICA. Blinded core laboratory interpretations of motion-corrected and standard reconstructions for obstructive CAD (≥ 50% stenosis) were compared with ICA findings. Segments unevaluable owing to artifact were considered obstructive. The primary endpoint was per-subject diagnostic accuracy of the intracycle motion correction algorithm for obstructive CAD found at ICA. Among 230 patients who underwent CCTA with the motion correction algorithm and standard reconstruction, 92 (40.0%) had obstructive CAD on the basis of ICA findings. At a mean heart rate of 68.0 ± 11.7 beats/min, the motion correction algorithm reduced the number of nondiagnostic scans compared with standard reconstruction (20.4% vs 34.8%; p < 0.001). Diagnostic accuracy for obstructive CAD with the motion correction algorithm (62%; 95% CI, 56-68%) was not significantly different from that of standard reconstruction on a per-subject basis (59%; 95% CI, 53-66%; p = 0.28) but was superior on a per-vessel basis: 77% (95% CI, 74-80%) versus 72% (95% CI, 69-75%) (p = 0.02). The motion correction algorithm was superior in subgroups of patients with severely obstructive (≥ 70%) stenosis, heart rate ≥ 70 beats/min, and vessels in the atrioventricular groove. The motion correction algorithm studied reduces artifacts and improves diagnostic performance for obstructive CAD on a per-vessel basis and in selected subgroups on a per-subject basis.

  18. Development of New Open-Shell Perturbation and Coupled-Cluster Theories Based on Symmetric Spin Orbitals

    NASA Technical Reports Server (NTRS)

    Lee, Timothy J.; Arnold, James O. (Technical Monitor)

    1994-01-01

    A new spin orbital basis is employed in the development of efficient open-shell coupled-cluster and perturbation theories that are based on a restricted Hartree-Fock (RHF) reference function. The spin orbital basis differs from the standard one in the spin functions that are associated with the singly occupied spatial orbital. The occupied orbital (in the spin orbital basis) is assigned the δ+ = (α + β)/√2 spin function, while the unoccupied orbital is assigned the δ- = (α - β)/√2 spin function. The doubly occupied and unoccupied orbitals (in the reference function) are assigned the standard α and β spin functions. The coupled-cluster and perturbation theory wave functions based on this set of "symmetric spin orbitals" exhibit much more symmetry than those based on the standard spin orbital basis. This, together with interacting space arguments, leads to a dramatic reduction in the computational cost for both coupled-cluster and perturbation theory. Additionally, perturbation theory based on "symmetric spin orbitals" obeys Brillouin's theorem provided that spin and spatial excitations are both considered. Other properties of the coupled-cluster and perturbation theory wave functions and models will be discussed.

  19. Risk management in the North sea offshore industry: History, status and challenges

    NASA Astrophysics Data System (ADS)

    Smith, E. J.

    1995-10-01

    There have been major changes in the UK and Norwegian offshore safety regimes in the last decade. On the basis of accumulated experience (including some major accidents), there has been a move away from a rigid, prescriptive approach to setting safety standards; it is now recognised that a more flexible, "goal-setting" approach is better suited to achieving cost-effective solutions to offshore safety. In order to adapt to this approach, offshore operators are increasingly using Quantitative Risk Assessment (QRA) techniques as part of their risk management programmes. Structured risk assessment can be used at all stages of a project life-cycle. In the design stages (concept and detailed design), these techniques are valuable tools in ensuring that money is wisely spent on safety-related systems. In the operational stage, QRA can aid the development of procedures. High quality Safety Management Systems (SMSs), covering issues such as training, inspection, and emergency planning, are crucial to maintain "as-designed" levels of safety and reliability. Audits of SMSs should be carried out all through the operational phase to ensure that risky conditions do not accumulate.

  20. Assessing the Impact of Observations on Numerical Weather Forecasts Using the Adjoint Method

    NASA Technical Reports Server (NTRS)

    Gelaro, Ronald

    2012-01-01

    The adjoint of a data assimilation system provides a flexible and efficient tool for estimating observation impacts on short-range weather forecasts. The impacts of any or all observations can be estimated simultaneously based on a single execution of the adjoint system. The results can be easily aggregated according to data type, location, channel, etc., making this technique especially attractive for examining the impacts of new hyper-spectral satellite instruments and for conducting regular, even near-real-time, monitoring of the entire observing system. This talk provides a general overview of the adjoint method, including the theoretical basis and practical implementation of the technique. Results are presented from the adjoint-based observation impact monitoring tool in NASA's GEOS-5 global atmospheric data assimilation and forecast system. When performed in conjunction with standard observing system experiments (OSEs), the adjoint results reveal both redundancies and dependencies between observing system impacts as observations are added or removed from the assimilation system. Understanding these dependencies may be important for optimizing the use of the current observational network and defining requirements for future observing systems.

  1. Cloud-based adaptive exon prediction for DNA analysis.

    PubMed

    Putluri, Srinivasareddy; Zia Ur Rahman, Md; Fathima, Shaik Yasmeen

    2018-02-01

    Cloud computing offers significant research and economic benefits to healthcare organisations, and cloud services provide a safe place for storing and managing large amounts of sensitive genomic data. Under the conventional flow of gene information, sequencing laboratories send raw and inferred information via the Internet to several sequence libraries. DNA sequence storage costs can be minimised by use of a cloud service. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. Accurate identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques have been found promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using the variable normalised least mean square algorithm and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of the various AEPs is done based on measures such as sensitivity, specificity and precision, using standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database.
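    The normalised LMS building block behind such adaptive exon predictors can be sketched as follows. This is a generic NLMS filter, not the authors' AEP: the numeric mapping of bases, the filter order and the step size are illustrative assumptions, and a noiseless period-3 signal stands in for the three-base periodicity.

```python
import numpy as np

def nlms_filter(x, d, order=8, mu=0.5, eps=1e-8):
    """Normalised LMS adaptive filter.

    x : input sequence (e.g. a numeric mapping of DNA bases)
    d : desired sequence the filter tries to track
    Returns the final weights and the per-sample squared error.
    """
    w = np.zeros(order)
    err = np.zeros(len(x))
    for n in range(order, len(x)):
        u = x[n - order:n][::-1]            # most recent samples first
        y = w @ u                           # filter output
        e = d[n] - y                        # prediction error
        w += mu * e * u / (eps + u @ u)     # normalised weight update
        err[n] = e * e
    return w, err

# Toy check: one-step-ahead prediction of a noiseless period-3 signal,
# mimicking the three-base periodicity that the AEPs lock onto.
x = np.sin(2 * np.pi * np.arange(600) / 3)
w, err = nlms_filter(x[:-1], x[1:])
assert err[-50:].mean() < 1e-4              # error collapses once adapted
```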

  2. A spatial-temporal system for dynamic cadastral management.

    PubMed

    Nan, Liu; Renyi, Liu; Guangliang, Zhu; Jiong, Xie

    2006-03-01

    A practical spatio-temporal database (STDB) technique for dynamic urban land management is presented. One of the STDB models, the expanded model of Base State with Amendments (BSA), is selected as the basis for developing the dynamic cadastral management technique. Two approaches, Section Fast Indexing (SFI) and Storage Factors of Variable Granularity (SFVG), are used to improve the efficiency of the BSA model. Both spatial graphic data and attribute data are stored, through a succinct engine, in a standard relational database management system (RDBMS) for the actual implementation of the BSA model. The spatio-temporal database is divided into three interdependent sub-databases: the present DB, the history DB and the procedures-tracing DB. The efficiency of database operation is improved by the database connection in the bottom layer of Microsoft SQL Server. The spatio-temporal system can be provided at low cost while satisfying the basic needs of urban land management in China. The approaches presented in this paper may also be of significance to countries where land patterns change frequently or to agencies where financial resources are limited.
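    The Base State with Amendments idea can be sketched in a few lines: keep one full snapshot plus a timestamped log of changes, and reconstruct any past state by replaying amendments up to the query time. This is a minimal illustration only; the class and field names (parcel IDs, owner attributes) are hypothetical and the paper's SFI/SFVG indexing optimizations are omitted.

```python
from copy import deepcopy

class BSARegister:
    """Minimal Base State with Amendments (BSA) store (illustrative)."""

    def __init__(self, base_time, base_state):
        self.base_time = base_time
        self.base_state = base_state      # {parcel_id: attributes} snapshot
        self.amendments = []              # [(time, parcel_id, attributes)]

    def amend(self, time, parcel_id, attributes):
        self.amendments.append((time, parcel_id, attributes))

    def state_at(self, query_time):
        """Base state plus all amendments up to and including query_time."""
        state = deepcopy(self.base_state)
        for t, pid, attrs in sorted(self.amendments, key=lambda a: a[0]):
            if t <= query_time:
                state[pid] = attrs
        return state

reg = BSARegister(2000, {"P1": {"owner": "A"}, "P2": {"owner": "B"}})
reg.amend(2003, "P1", {"owner": "C"})
reg.amend(2005, "P2", {"owner": "D"})
assert reg.state_at(2004)["P1"]["owner"] == "C"   # amendment applied
assert reg.state_at(2004)["P2"]["owner"] == "B"   # later change not yet visible
```

    The trade-off the paper's optimizations address is visible here: reconstruction cost grows with the amendment log, which is why section indexing and variable-granularity storage factors matter.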

  3. Superior labrum anterior to posterior tears and glenohumeral instability.

    PubMed

    Virk, Mandeep S; Arciero, Robert A

    2013-01-01

    Cadaver experiments and clinical studies suggest that the superior labrum-biceps complex plays a role in glenohumeral stability. Superior labrum anterior to posterior (SLAP) tears can be present in acute and recurrent glenohumeral dislocations and contribute to glenohumeral instability. Isolated SLAP tears can cause instability, especially in throwing athletes. Diagnosing a SLAP tear on the basis of the clinical examination alone is difficult because of nonspecific history and physical examination findings and the presence of coexisting intra-articular lesions. Magnetic resonance arthrography is the imaging study of choice for diagnosing SLAP tears; however, arthroscopy remains the gold standard for diagnosis. Arthroscopy is the preferred technique for the repair of a type II SLAP tear and its variant types (V through X) in acute glenohumeral dislocations and instability in younger populations. Clinical outcome studies report a low recurrence of glenohumeral instability after the arthroscopic repair of a SLAP tear in addition to a Bankart repair. Long-term follow-up studies and further advances in arthroscopic fixation techniques will allow a better understanding and improvement in outcomes in patients with SLAP tears associated with glenohumeral instability.

  4. Noninvasive prostate cancer screening based on serum surface-enhanced Raman spectroscopy and support vector machine

    NASA Astrophysics Data System (ADS)

    Li, Shaoxin; Zhang, Yanjiao; Xu, Junfa; Li, Linfang; Zeng, Qiuyao; Lin, Lin; Guo, Zhouyi; Liu, Zhiming; Xiong, Honglian; Liu, Songhao

    2014-09-01

    This study aims to present a noninvasive prostate cancer screening method using serum surface-enhanced Raman scattering (SERS) and support vector machine (SVM) techniques on peripheral blood samples. SERS measurements are performed with silver nanoparticles on serum samples from 93 prostate cancer patients and 68 healthy volunteers. Three types of kernel functions, including linear, polynomial, and Gaussian radial basis function (RBF), are employed to build SVM diagnostic models for classifying the measured SERS spectra. To comparably evaluate the performance of the SVM classification models, the standard multivariate statistical analysis method of principal component analysis (PCA) is also applied to classify the same datasets. The study results show that for the RBF kernel SVM diagnostic model, a diagnostic accuracy of 98.1% is acquired, which is superior to the 91.3% obtained from the PCA method. The receiver operating characteristic curves of the diagnostic models further confirm these results. This study demonstrates that label-free serum SERS analysis combined with an SVM diagnostic algorithm has great potential for noninvasive prostate cancer screening.
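    The RBF-kernel classification step can be illustrated with a small stand-in. The sketch below uses kernel ridge classification with a Gaussian RBF kernel in place of the paper's SVM (same kernel, but a closed-form solver so the example needs only numpy); the two-dimensional synthetic clusters stand in for spectral features, and gamma and the ridge parameter are arbitrary choices.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||a_i - b_j||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(1)
X_pos = rng.normal(+1.0, 0.4, size=(40, 2))   # stand-in "patient" features
X_neg = rng.normal(-1.0, 0.4, size=(40, 2))   # stand-in "healthy" features
X = np.vstack([X_pos, X_neg])
y = np.hstack([np.ones(40), -np.ones(40)])    # labels +/-1

# Kernel ridge classifier: alpha = (K + lam*I)^-1 y, predict sign(K_new @ alpha).
lam = 1e-3
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_new):
    return np.sign(rbf_kernel(X_new, X) @ alpha)

acc = (predict(X) == y).mean()
assert acc > 0.95
```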

  5. Thermal performance - Rangewood Villas. Field monitoring of various conservation construction techniques in the hot-humid area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-06-01

    This report, prepared by researchers at the Florida Solar Energy Center, describes data acquired over a complete year of comprehensive thermal performance monitoring. The construction details of the house and instrumentation system are clearly documented. Rangewood Villas in Cocoa, Florida, is an innovative townhouse project that incorporates several energy efficient construction techniques developed at FSEC, including vent skin roofs and walls utilizing radiant barriers to substantially lower heat gain through radiant transfer of solar energy. The computer simulation model selected as the basis for data acquisition parameters is the Thermal Analysis Research Program (TARP). The TARP model does not contain humidity correlations, which are very important in predicting thermal performance in the warm humid area. These correlations are developed for enhancement of the TARP model through extensive relative humidity measurements in various zones, and enthalpy measurements of the heat pump. The data acquisition system devised for this program provides a standard instrumentation system which can be adapted by others working in the hot humid area and interested in developing comparative performance data.

  6. Finite-size scaling and integer-spin Heisenberg chains

    NASA Astrophysics Data System (ADS)

    Bonner, Jill C.; Müller, Gerhard

    1984-03-01

    Finite-size scaling (phenomenological renormalization) techniques are trusted and widely applied in low-dimensional magnetism and, particularly, in lattice gauge field theory. Recently, investigations have begun which subject the theoretical basis to systematic and intensive scrutiny to determine the validity of finite-size scaling in a variety of situations. The 2D ANNNI model is an example of a situation where finite-size scaling methods encounter difficulty, related to the occurrence of a disorder line (one-dimensional line). A second example concerns the behavior of the spin-1/2 antiferromagnetic XXZ model where the T=0 critical behavior is exactly known and features an essential singularity at the isotropic Heisenberg point. Standard finite-size scaling techniques do not convincingly reproduce the exact phase behavior and this is attributable to the essential singularity. The point is relevant in connection with a finite-size scaling analysis of a spin-one antiferromagnetic XXZ model, which claims to support a conjecture by Haldane that the T=0 phase behavior of integer-spin Heisenberg chains is significantly different from that of half-integer-spin Heisenberg chains.

  7. Error control for reliable digital data transmission and storage systems

    NASA Technical Reports Server (NTRS)

    Costello, D. J., Jr.; Deng, R. H.

    1985-01-01

    A problem in designing semiconductor memories is to provide some measure of error control without requiring excessive coding overhead or decoding time. In LSI and VLSI technology, memories are often organized on a multiple bit (or byte) per chip basis. For example, some 256K-bit DRAMs are organized in 32K x 8 bit-bytes. Byte-oriented codes such as Reed-Solomon (RS) codes can provide efficient low-overhead error control for such memories. However, the standard iterative algorithm for decoding RS codes is too slow for these applications. In this paper we present some special decoding techniques for extended single- and double-error-correcting RS codes which are capable of high-speed operation. These techniques are designed to find the error locations and the error values directly from the syndrome, without having to use the iterative algorithm to find the error locator polynomial. Two codes are considered: (1) a d_min = 4 single-byte-error-correcting (SBEC), double-byte-error-detecting (DBED) RS code; and (2) a d_min = 6 double-byte-error-correcting (DBEC), triple-byte-error-detecting (TBED) RS code.
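    The "directly from the syndrome" idea can be shown on a simplified single-symbol-error-correcting code over GF(256), not the exact codes of the paper: with two check symbols chosen so that both syndromes of a codeword vanish, a single byte error e at position i gives S0 = e and S1 = e·α^i, so the location is log(S1/S0) and the value is S0, with no iterative error-locator step.

```python
# Minimal GF(256) arithmetic (primitive polynomial 0x11d) and a toy
# single-byte-error-correcting code, illustrative of direct syndrome decoding.
EXP, LOG = [0] * 512, [0] * 256
x = 1
for i in range(255):
    EXP[i] = x
    LOG[x] = i
    x <<= 1
    if x & 0x100:
        x ^= 0x11d
for i in range(255, 512):
    EXP[i] = EXP[i - 255]            # wrap-around for easy exponent sums

def gf_mul(a, b):
    return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

def syndromes(word):
    s0, s1 = 0, 0
    for j, c in enumerate(word):
        s0 ^= c                      # S0 = sum of symbols
        s1 ^= gf_mul(c, EXP[j])      # S1 = sum of symbols * alpha^j
    return s0, s1

def encode(data):
    """Append two check symbols so both syndromes of the codeword are zero."""
    k = len(data)
    a, b = 0, 0
    for j, c in enumerate(data):
        a ^= c
        b ^= gf_mul(c, EXP[j])
    denom = EXP[k] ^ EXP[k + 1]                       # alpha^k + alpha^(k+1)
    inv = EXP[255 - LOG[denom]]                       # 1 / denom
    p2 = gf_mul(b ^ gf_mul(a, EXP[k]), inv)
    p1 = a ^ p2
    return data + [p1, p2]

code = encode([10, 20, 30, 40])
assert syndromes(code) == (0, 0)

# Inject a single-byte error and correct it directly from the syndromes.
recv = code[:]
recv[2] ^= 0x5a
s0, s1 = syndromes(recv)
loc = (LOG[s1] - LOG[s0]) % 255      # error position = log(S1 / S0)
recv[loc] ^= s0                      # error value = S0
assert recv == code
```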

  8. 42 CFR 498.100 - Basis, timing, and authority for reopening an ALJ or Board decision.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION APPEALS PROCEDURES FOR... Law Judges or the Departmental Appeals Board § 498.100 Basis, timing, and authority for reopening an...

  9. 40 CFR 60.53a - Standard for municipal waste combustor organics.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Municipal Waste Combustors for Which Construction is Commenced After December 20, 1989 and on or Before... exceed 30 nanograms per dry standard cubic meter (12 grains per billion dry standard cubic feet), corrected to 7 percent oxygen (dry basis). ...

  10. Optimization of Turbine Blade Design for Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Shyy, Wei

    1998-01-01

    To facilitate design optimization of turbine blade shape for reusable launch vehicles, appropriate techniques need to be developed to process and estimate the characteristics of the design variables and the response of the output with respect to variations of the design variables. The purpose of this report is to offer insight into developing appropriate techniques for supporting such design and optimization needs. Neural network and polynomial-based techniques are applied to process aerodynamic data obtained from computational simulations of flows around a two-dimensional airfoil and a generic three-dimensional wing/blade. For the two-dimensional airfoil, a two-layered radial-basis network is designed and trained, and the performances of two different design functions for radial-basis networks are compared: one based on an accuracy requirement, the other on a limit on the network size. While the number of neurons needed to satisfactorily reproduce the information depends on the size of the data, the neural network technique is shown to be more accurate for large data sets (up to 765 simulations have been used) than the polynomial-based response surface method. For the three-dimensional wing/blade case, smaller aerodynamic data sets (between 9 and 25 simulations) are considered, and both the neural network and the polynomial-based response surface techniques improve their performance as the data size increases. It is found that, while the relative performance of two different network types, a radial-basis network and a back-propagation network, depends on the number of input data, the number of iterations required for the radial-basis network is less than that for the back-propagation network.
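    A two-layered radial-basis network of the kind described reduces, for fixed centres, to Gaussian hidden units followed by a linear least-squares fit of the output weights. The sketch below is generic, not the report's implementation: a smooth one-dimensional test function stands in for the aerodynamic data, and the centre count and unit width are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x)                  # stand-in aerodynamic response

centres = np.linspace(0.0, 1.0, 12)        # network size fixed in advance
width = 0.1
# Hidden layer: one Gaussian radial-basis unit per centre.
Phi = np.exp(-((x[:, None] - centres[None, :]) ** 2) / (2 * width ** 2))
# Output layer: linear weights fitted by least squares.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

y_hat = Phi @ w
rmse = np.sqrt(np.mean((y_hat - y) ** 2))
assert rmse < 1e-2                          # smooth target is reproduced well
```

    The two design criteria mentioned in the abstract correspond to either growing `centres` until `rmse` meets an accuracy target, or fixing the number of centres and accepting the resulting error.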

  11. Theoretical and Empirical Underpinnings of the What Works Clearinghouse Attrition Standard for Randomized Controlled Trials

    ERIC Educational Resources Information Center

    Deke, John; Chiang, Hanley

    2014-01-01

    Meeting the What Works Clearinghouse (WWC) attrition standard (or one of the attrition standards based on the WWC standard) is now an important consideration for researchers conducting studies that could potentially be reviewed by the WWC (or other evidence reviews). Understanding the basis of this standard is valuable for anyone seeking to meet…

  12. Molecular structure and spectroscopic characterization of Carbamazepine with experimental techniques and DFT quantum chemical calculations.

    PubMed

    Suhasini, M; Sailatha, E; Gunasekaran, S; Ramkumaar, G R

    2015-04-15

    A systematic vibrational spectroscopic assignment and analysis of Carbamazepine has been carried out by using FT-IR, FT-Raman and UV spectral data. The vibrational analysis was aided by electronic structure calculations - ab initio (RHF) and hybrid density functional methods (B3LYP) performed with the standard basis set 6-31G(d,p). Molecular equilibrium geometries, electronic energies, natural bond order analysis, harmonic vibrational frequencies and IR intensities have been computed. A detailed interpretation of the vibrational spectra of the molecule has been made on the basis of the calculated Potential Energy Distribution (PED) by the VEDA program. The UV-visible spectrum of the compound was also recorded, and the electronic properties, such as HOMO and LUMO energies and λmax, were determined by the HF/6-311++G(d,p) Time-Dependent method. The thermodynamic functions of the title molecule were also computed using the RHF and DFT methods. A restricted Hartree-Fock and density functional theory-based nuclear magnetic resonance (NMR) calculation procedure was also performed, and it was used for assigning the (13)C and (1)H NMR chemical shifts of Carbamazepine. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Determination of trace elements and their concentrations in clay balls: problem of geophagia practice in Ghana.

    PubMed

    Arhin, Emmanuel; Zango, Musah S

    2017-02-01

    Ten samples of 100 g weight were subsampled from 1400 g of clay balls, and the contained trace element levels were determined by the X-ray fluorescence technique. The trace element results for the clay balls were calibrated using certified reference materials "MAJMON" and "BH-1." The results showed elevated concentrations, with different concentration levels across the regions, particularly for arsenic, chromium, cobalt, Cs, Zr and La. These trace elements contained in the clay balls are known to be hazardous to human health. Hence the relatively high concentrations of these trace elements in clay balls from the three regions, namely Ashanti, Upper East and Volta, which are widely sold in markets in Ghana, could have a negative health impact on consumers if the balls are eaten at 70 g per day or more on a regular basis. On this basis, having demonstrated the uneven and elevated values, the study recommends an investigation to establish a safe range for trace element concentrations in the clay balls. Standardized safe ranges of trace elements would make the practice safer for the people who ingest clay balls in Ghana.

  14. A parametric model order reduction technique for poroelastic finite element models.

    PubMed

    Lappano, Ettore; Polanz, Markus; Desmet, Wim; Mundo, Domenico

    2017-10-01

    This research presents a parametric model order reduction approach for vibro-acoustic problems in the frequency domain of systems containing poroelastic materials (PEM). The method is applied to the Finite Element (FE) discretization of the weak u-p integral formulation based on the Biot-Allard theory and makes use of reduced basis (RB) methods typically employed for parametric problems. The parametric reduction is obtained rewriting the Biot-Allard FE equations for poroelastic materials using an affine representation of the frequency (therefore allowing for RB methods) and projecting the frequency-dependent PEM system on a global reduced order basis generated with the proper orthogonal decomposition instead of standard modal approaches. This has proven to be better suited to describe the nonlinear frequency dependence and the strong coupling introduced by damping. The methodology presented is tested on two three-dimensional systems: in the first experiment, the surface impedance of a PEM layer sample is calculated and compared with results of the literature; in the second, the reduced order model of a multilayer system coupled to an air cavity is assessed and the results are compared to those of the reference FE model.
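    The proper-orthogonal-decomposition step described above can be sketched generically: collect frequency-response snapshots in a matrix, take its SVD, and keep the leading left singular vectors as the global reduced basis. A synthetic rank-r snapshot matrix stands in for the Biot-Allard FE data; the dimensions are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, r = 200, 30, 5
modes = rng.normal(size=(n, r))            # r underlying spatial patterns
coeffs = rng.normal(size=(r, m))
snapshots = modes @ coeffs                 # stand-in frequency-sweep responses

# POD: left singular vectors of the snapshot matrix form the reduced basis,
# used here instead of a standard modal (eigenvector) basis.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
V = U[:, :r]                               # global reduced-order basis (n x r)

# Since the data are exactly rank r, projecting onto the basis and back
# reproduces every snapshot; in practice r is read off the singular values.
recon = V @ (V.T @ snapshots)
assert np.allclose(recon, snapshots)
assert s[r] / s[0] < 1e-10                 # sharp drop after the r-th value
```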

  15. Custom instruction set NIOS-based OFDM processor for FPGAs

    NASA Astrophysics Data System (ADS)

    Meyer-Bäse, Uwe; Sunkara, Divya; Castillo, Encarnacion; Garcia, Antonio

    2006-05-01

    Orthogonal frequency-division multiplexing (OFDM), a spread spectrum technique sometimes also called multi-carrier or discrete multi-tone modulation, is used in bandwidth-efficient communication systems in the presence of channel distortion. The benefits of OFDM are high spectral efficiency, resiliency to RF interference, and lower multi-path distortion. OFDM is the basis for the European digital audio broadcasting (DAB) standard, the global asymmetric digital subscriber line (ADSL) standard, and the IEEE 802.11 5.8 GHz band standard, with ongoing development in wireless local area networks. The modulator and demodulator in an OFDM system can be implemented by use of a parallel bank of filters based on the discrete Fourier transform (DFT); when the number of subchannels is large (e.g. K > 25), the OFDM system is efficiently implemented by use of the fast Fourier transform (FFT) to compute the DFT. We have developed a custom FPGA-based Altera NIOS system to increase performance, programmability, and low power operation in mobile wireless systems. The overall gain observed for a 1024-point FFT ranges between a factor of 3 and 16, depending on the multiplier used by the NIOS processor. A careful optimization described in the appendix yields a performance gain of up to 77% when compared with our preliminary results.
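    The FFT-based modulator/demodulator structure can be sketched as a round trip: QPSK symbols on K subcarriers go through an inverse FFT (modulation), gain a cyclic prefix, and are recovered by a forward FFT (demodulation). This is the textbook baseband scheme over an ideal channel, not the NIOS hardware implementation; K and the prefix length are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
K, cp = 64, 16                                   # subcarriers, cyclic prefix
bits = rng.integers(0, 2, size=2 * K)
# Map bit pairs to unit-energy QPSK constellation points.
qpsk = ((2 * bits[0::2] - 1) + 1j * (2 * bits[1::2] - 1)) / np.sqrt(2)

tx = np.fft.ifft(qpsk)                           # multi-carrier modulator
tx = np.concatenate([tx[-cp:], tx])              # prepend cyclic prefix

rx = np.fft.fft(tx[cp:])                         # demodulator (ideal channel)
bits_hat = np.empty_like(bits)
bits_hat[0::2] = (rx.real > 0).astype(bits.dtype)
bits_hat[1::2] = (rx.imag > 0).astype(bits.dtype)
assert np.array_equal(bits_hat, bits)            # all bits recovered exactly
```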

  16. Calibrating airborne measurements of airspeed, pressure and temperature using a Doppler laser air-motion sensor

    NASA Astrophysics Data System (ADS)

    Cooper, W. A.; Spuler, S. M.; Spowart, M.; Lenschow, D. H.; Friesen, R. B.

    2014-03-01

    A new laser air-motion sensor measures the true airspeed with an uncertainty of less than 0.1 m s-1 (standard error) and so reduces uncertainty in the measured component of the relative wind along the longitudinal axis of the aircraft to about the same level. The calculated pressure expected from that airspeed at the inlet of a pitot tube then provides a basis for calibrating the measurements of dynamic and static pressure, reducing standard-error uncertainty in those measurements to less than 0.3 hPa and the precision applicable to steady flight conditions to about 0.1 hPa. These improved measurements of pressure, combined with high-resolution measurements of geometric altitude from the Global Positioning System, then indicate (via integrations of the hydrostatic equation during climbs and descents) that the offset and uncertainty in temperature measurement for one research aircraft are +0.3 ± 0.3 °C. For airspeed, pressure and temperature these are significant reductions in uncertainty vs. those obtained from calibrations using standard techniques. Finally, it is shown that the new laser air-motion sensor, combined with parametrized fits to correction factors for the measured dynamic and ambient pressure, provides a measurement of temperature that is independent of any other temperature sensor.
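    The hydrostatic check described above rests on the hypsometric relation: integrating the hydrostatic equation over a layer with mean temperature T gives dz = (R_d·T/g)·ln(p_bottom/p_top), so precise pressures plus GPS altitude imply a layer temperature that can expose a sensor offset. The numbers below are an illustrative dry-air example, not data from the paper.

```python
import math

R_d = 287.05   # J kg^-1 K^-1, dry-air gas constant
g = 9.80665    # m s^-2

def layer_thickness(p_bottom, p_top, t_mean):
    """Hypsometric equation: thickness of a layer of mean temperature t_mean."""
    return R_d * t_mean / g * math.log(p_bottom / p_top)

def implied_temperature(p_bottom, p_top, dz):
    """Invert the relation: layer-mean temperature from pressures and dz."""
    return g * dz / (R_d * math.log(p_bottom / p_top))

dz_true = layer_thickness(900.0, 700.0, 270.0)      # ~2 km layer
t_back = implied_temperature(900.0, 700.0, dz_true)
assert abs(t_back - 270.0) < 1e-9                   # relation inverts exactly

# A +0.3 K temperature offset shifts the implied thickness by only a few
# metres, which is why sub-hPa pressure and GPS altitude are needed.
dz_biased = layer_thickness(900.0, 700.0, 270.3)
assert 0.0 < dz_biased - dz_true < 25.0
```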

  17. Physico-chemical characterization, density functional theory (DFT) studies and Hirshfeld surface analysis of a new organic optical material: 1H-benzo[d]imidazol-3-ium-2,4,6-trinitrobenzene-1,3 bis(olate)

    NASA Astrophysics Data System (ADS)

    Dhamodharan, P.; Sathya, K.; Dhandapani, M.

    2017-10-01

    A novel organic crystal, 1H-benzo[d]imidazol-3-ium-2,4,6-trinitrobenzene-1,3 bis(olate) (BITB), was synthesized. Single crystals of BITB were harvested by the solution growth-slow evaporation technique. 1H and 13C NMR spectroscopic techniques were utilized to confirm the presence of the various types of carbons and protons in BITB. Single crystal XRD confirms that BITB crystallizes in the monoclinic system with space group P21/n. The suitability of this material for optical applications was assessed by optical absorption, transmittance, reflectance and refractive index spectroscopic techniques. The Gaussian 09 program at the B3LYP/6-311++G(d,p) level of basis set was used for the optimization of the molecular structure of BITB. The large first-order hyperpolarizability value of BITB is due to the intensive hydrogen bond network in the crystal; the value is 15 times greater than that of urea, a reference standard. Computation of frontier molecular orbitals and the electrostatic potential surface helped to understand the electron density and reactive sites in BITB. The material was thermally stable up to 220 °C. Hirshfeld surface analysis was performed to quantify the covalent and non-covalent interactions.

  18. An analysis of fracture trace patterns in areas of flat-lying sedimentary rocks for the detection of buried geologic structure. [Kansas and Texas

    NASA Technical Reports Server (NTRS)

    Podwysocki, M. H.

    1974-01-01

    Two study areas in a cratonic platform underlain by flat-lying sedimentary rocks were analyzed to determine if a quantitative relationship exists between fracture trace patterns and their frequency distributions and subsurface structural closures which might contain petroleum. Fracture trace lengths and frequency (number of fracture traces per unit area) were analyzed by trend surface analysis and length frequency distributions also were compared to a standard Gaussian distribution. Composite rose diagrams of fracture traces were analyzed using a multivariate analysis method which grouped or clustered the rose diagrams and their respective areas on the basis of the behavior of the rays of the rose diagram. Analysis indicates that the lengths of fracture traces are log-normally distributed according to the mapping technique used. Fracture trace frequency appeared higher on the flanks of active structures and lower around passive reef structures. Fracture trace log-mean lengths were shorter over several types of structures, perhaps due to increased fracturing and subsequent erosion. Analysis of rose diagrams using a multivariate technique indicated lithology as the primary control for the lower grouping levels. Groupings at higher levels indicated that areas overlying active structures may be isolated from their neighbors by this technique while passive structures showed no differences which could be isolated.
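    The log-normality test of trace lengths amounts to checking that log-transformed lengths match a Gaussian. The sketch below uses synthetic lengths in place of the mapped data and compares empirical moments of the logs with their Gaussian values; the distribution parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic fracture-trace lengths drawn from a log-normal distribution.
lengths = rng.lognormal(mean=1.0, sigma=0.5, size=5000)

logs = np.log(lengths)
z = (logs - logs.mean()) / logs.std()
skew = np.mean(z ** 3)           # Gaussian: 0
kurt = np.mean(z ** 4)           # Gaussian: 3

assert abs(logs.mean() - 1.0) < 0.05   # log-mean recovered
assert abs(logs.std() - 0.5) < 0.05    # log-std recovered
assert abs(skew) < 0.15
assert abs(kurt - 3.0) < 0.3
```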

  19. Efficient multiparty quantum-secret-sharing schemes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao Li; Deng Fuguo; Key Laboratory for Quantum Information and Measurements, MOE, Beijing 100084

    In this work, we generalize the quantum-secret-sharing scheme of Hillery, Buzek, and Berthiaume [Phys. Rev. A 59, 1829 (1999)] to arbitrarily many parties. Explicit expressions for the shared secret bit are given. It is shown that in the Hillery-Buzek-Berthiaume quantum-secret-sharing scheme the secret information is shared in the parity of binary strings formed by the measured outcomes of the participants. In addition, we have increased the efficiency of the quantum-secret-sharing scheme by generalizing two techniques from quantum key distribution. The favored-measuring-basis quantum-secret-sharing scheme is developed from the Lo-Chau-Ardehali technique [H. K. Lo, H. F. Chau, and M. Ardehali, e-print quant-ph/0011056], where all the participants choose their measuring basis asymmetrically, and the measuring-basis-encrypted quantum-secret-sharing scheme is developed from the Hwang-Koh-Han technique [W. Y. Hwang, I. G. Koh, and Y. D. Han, Phys. Lett. A 244, 489 (1998)], where all participants choose their measuring basis according to a control key. Both schemes are asymptotically 100% efficient; hence nearly all the Greenberger-Horne-Zeilinger states in a quantum-secret-sharing process are used to generate shared secret information.
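    The parity structure mentioned above can be verified with a small state-vector simulation (a generic illustration, not the paper's protocol): when every party measures a GHZ state in the X basis, the XOR of their outcome bits is fixed, so the secret lives in the parity of the joint bit string.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 3                                            # number of parties

# GHZ state (|00..0> + |11..1>) / sqrt(2) as a state vector.
psi = np.zeros(2 ** N)
psi[0] = psi[-1] = 1 / np.sqrt(2)

# Measuring every qubit in the X basis == applying a Hadamard to each qubit,
# then measuring in the computational basis.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
U = H
for _ in range(N - 1):
    U = np.kron(U, H)
probs = np.abs(U @ psi) ** 2

# Sample many joint outcomes; every sampled bit string has even parity,
# i.e. the XOR of the parties' bits is deterministic.
samples = rng.choice(2 ** N, size=1000, p=probs)
parities = [bin(int(s)).count("1") % 2 for s in samples]
assert all(p == 0 for p in parities)
```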

  20. A Demonstration of the Molecular Basis of Sickle-Cell Anemia.

    ERIC Educational Resources Information Center

    Fox, Marty; Gaynor, John J.

    1996-01-01

    Describes a demonstration that permits the separation of different hemoglobin molecules within two to three hours. Introduces students to the powerful technique of gel electrophoresis and illustrates the molecular basis of sickle-cell anemia. (JRH)

  1. 40 CFR 465.23 - New source performance standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false New source performance standards. 465.23 Section 465.23 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS COIL COATING POINT SOURCE CATEGORY Galvanized Basis Material Subcategory § 465.23...

  2. A trapped mercury 199 ion frequency standard

    NASA Technical Reports Server (NTRS)

    Cutler, L. S.; Giffard, R. P.; Mcguire, M. D.

    1982-01-01

    Mercury 199 ions confined in an RF quadrupole trap and optically pumped by mercury 202 ion resonance light are investigated as the basis for a high performance frequency standard with commercial possibilities. Results achieved and estimates of the potential performance of such a standard are given.

  3. 40 CFR 466.35 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for new sources. 466.35 Section 466.35 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY Aluminum Basis Material Subcategory § 466...

  4. 40 CFR 466.43 - New source performance standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 30 2011-07-01 2011-07-01 false New source performance standards. 466.43 Section 466.43 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY Copper Basis Material Subcategory § 466.43...

  5. 40 CFR 466.43 - New source performance standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false New source performance standards. 466.43 Section 466.43 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY Copper Basis Material Subcategory § 466.43...

  6. 40 CFR 466.23 - New source performance standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false New source performance standards. 466.23 Section 466.23 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY Cast Iron Basis Material Subcategory § 466...

  7. The Random-Map Technique: Enhancing Mind-Mapping with a Conceptual Combination Technique to Foster Creative Potential

    ERIC Educational Resources Information Center

    Malycha, Charlotte P.; Maier, Günter W.

    2017-01-01

    Although creativity techniques are highly recommended in working environments, their effects have been scarcely investigated. Two cognitive processes are often considered to foster creative potential and are, therefore, taken as a basis for creativity techniques: knowledge activation and conceptual combination. In this study, both processes were…

  8. GASB's Basis of Accounting Project.

    ERIC Educational Resources Information Center

    Kovlak, Daniel L.

    1986-01-01

    In July 1984, the Governmental Accounting Standards Board began its "Measurement Focus/Basis of Accounting" project, which addresses measurement issues and revenue and expenditure recognition problems involving governmental funds. This article explains the project's background, alternatives discussed by the board, and tentative…

  9. 45 CFR 170.100 - Statutory basis and purpose.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ....100 Public Welfare Department of Health and Human Services HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY General Provisions § 170.100 Statutory basis and purpose. The...

  10. 45 CFR 170.100 - Statutory basis and purpose.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ....100 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY General Provisions § 170.100 Statutory basis and purpose. The...

  11. 45 CFR 170.100 - Statutory basis and purpose.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ....100 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY General Provisions § 170.100 Statutory basis and purpose. The...

  12. 45 CFR 170.100 - Statutory basis and purpose.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ....100 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY General Provisions § 170.100 Statutory basis and purpose. The...

  13. 45 CFR 170.100 - Statutory basis and purpose.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ....100 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES HEALTH INFORMATION TECHNOLOGY HEALTH INFORMATION TECHNOLOGY STANDARDS, IMPLEMENTATION SPECIFICATIONS, AND CERTIFICATION CRITERIA AND CERTIFICATION PROGRAMS FOR HEALTH INFORMATION TECHNOLOGY General Provisions § 170.100 Statutory basis and purpose. The...

  14. Jacobian projection reduced-order models for dynamic systems with contact nonlinearities

    NASA Astrophysics Data System (ADS)

    Gastaldi, Chiara; Zucca, Stefano; Epureanu, Bogdan I.

    2018-02-01

    In structural dynamics, the prediction of the response of systems with localized nonlinearities, such as friction dampers, is of particular interest. This task becomes especially cumbersome when high-resolution finite element models are used. While state-of-the-art techniques such as Craig-Bampton component mode synthesis are employed to generate reduced-order models, the interface (nonlinear) degrees of freedom must still be solved in full. For this reason, a new generation of specialized techniques capable of reducing linear and nonlinear degrees of freedom alike is emerging. This paper proposes a new technique that exploits spatial correlations in the dynamics to compute a reduction basis. The basis is composed of a set of vectors obtained from the Jacobian of the contact forces with respect to nodal displacements. These basis vectors correspond to specifically chosen boundary conditions at the contacts over one cycle of vibration. The technique is shown to be effective in the reduction of several models studied using multiple harmonics with a coupled static solution. In addition, this paper addresses another challenge common to all reduction techniques: it presents and validates a novel a posteriori error estimate capable of evaluating the quality of the reduced-order solution without involving a comparison with the full-order solution.
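
    The reduction-basis construction described in the abstract can be sketched generically: sample columns of the contact-force Jacobian under chosen boundary conditions, then orthonormalize them into a projection basis. A minimal numpy sketch, in which the model size, the sampled Jacobian columns, and the stiffness matrix are random illustrative stand-ins rather than the authors' actual procedure or data:

```python
import numpy as np

rng = np.random.default_rng(0)

n_dof = 200        # full-order degrees of freedom (illustrative size)
n_samples = 12     # Jacobian columns sampled over one vibration cycle

# Stand-in for sampled columns of the contact-force Jacobian
# d f_contact / d u, evaluated at several chosen contact boundary
# conditions (random data here, for illustration only).
J_samples = rng.standard_normal((n_dof, n_samples))

# Orthonormalize the sampled vectors with an SVD and truncate
# negligible singular values to obtain the reduction basis Phi.
U, s, _ = np.linalg.svd(J_samples, full_matrices=False)
Phi = U[:, s > 1e-10 * s[0]]

# Project a (stand-in, symmetric positive definite) stiffness matrix
# onto the reduced space.
A = rng.standard_normal((n_dof, n_dof))
K = A @ A.T + n_dof * np.eye(n_dof)
K_red = Phi.T @ K @ Phi

print(Phi.shape, K_red.shape)  # → (200, 12) (12, 12)
```

The reduced operator K_red is then what a harmonic-balance or time-marching solver would work with in place of the full-order matrices.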

  15. 48 CFR 9904.401-50 - Techniques for application.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... 9904.401-50 Section 9904.401-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.401-50 Techniques for application. (a) The standard...

  16. Curve fitting and modeling with splines using statistical variable selection techniques

    NASA Technical Reports Server (NTRS)

    Smith, P. L.

    1982-01-01

    The successful application of statistical variable selection techniques to fit splines is demonstrated. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs, using the B-spline basis, were developed. The program for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.
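
    Backward elimination of knots, as described above, can be sketched with scipy's least-squares B-splines; the synthetic data, candidate knot grid, and crude SSE-ratio stopping rule below are illustrative assumptions (the report itself uses formal statistical variable selection tests):

```python
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(x.size)

def sse(knots):
    """Sum of squared residuals of a cubic least-squares spline
    with the given interior knots."""
    spl = LSQUnivariateSpline(x, y, knots, k=3)
    return float(np.sum((spl(x) - y) ** 2))

# Backward elimination: repeatedly drop the interior knot whose removal
# increases the SSE the least, until any further removal would inflate
# the SSE by more than the (crude, assumed) factor `tol`.
knots = list(np.linspace(0.1, 0.9, 9))   # generous candidate knot set
tol = 1.5

while len(knots) > 1:
    base = sse(knots)
    trials = [sse(knots[:i] + knots[i + 1:]) for i in range(len(knots))]
    best = int(np.argmin(trials))
    if trials[best] > tol * base:
        break
    knots.pop(best)

print(len(knots), "interior knots kept")
```

A production version would replace the SSE-ratio threshold with the F-statistics used in standard backward-elimination variable selection.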

  17. Ionospheric propagation correction modeling for satellite altimeters

    NASA Technical Reports Server (NTRS)

    Nesterczuk, G.

    1981-01-01

    The theoretical basis and available accuracy verifications were reviewed and compared for two ionospheric correction procedures: one based on a global ionospheric model driven by solar flux, and one in which the electron content measured along one path (using Faraday rotation measurements) is mapped into corrections for a hemisphere. For these two techniques, the RMS errors in correcting satellite altimeter data (at 14 GHz) are estimated to be 12 cm and 3 cm, respectively. On the basis of global accuracy and reliability after implementation, the solar flux model is recommended.
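
    The magnitude of such corrections follows from the standard first-order ionospheric group-delay formula, delta_r = 40.3 * TEC / f**2 (meters, with TEC in electrons/m^2 and f in Hz); the TEC value in this sketch is an assumed, illustrative column content:

```python
def ionospheric_range_delay(tec_el_per_m2: float, freq_hz: float) -> float:
    """First-order ionospheric range delay in meters:
    delta_r = 40.3 * TEC / f**2."""
    return 40.3 * tec_el_per_m2 / freq_hz**2

tec = 1e17   # 10 TECU, an assumed moderate daytime column content
f = 14e9     # the 14 GHz altimeter frequency from the abstract

delay_m = ionospheric_range_delay(tec, f)
print(f"{delay_m * 100:.1f} cm")  # → 2.1 cm
```

Raw delays scale linearly with TEC, so disturbed-ionosphere conditions (well above 10 TECU) produce decimeter-level delays even at 14 GHz, which is why correction procedures like those compared here matter for altimetry.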

  18. Fitting multidimensional splines using statistical variable selection techniques

    NASA Technical Reports Server (NTRS)

    Smith, P. L.

    1982-01-01

    This report demonstrates the successful application of statistical variable selection techniques to fit splines. Major emphasis is given to knot selection, but order determination is also discussed. Two FORTRAN backward elimination programs using the B-spline basis were developed, and the one for knot elimination is compared in detail with two other spline-fitting methods and several statistical software packages. An example is also given for the two-variable case using a tensor product basis, with a theoretical discussion of the difficulties of their use.

  19. Patient selection, echocardiographic screening and treatment strategies for interventional tricuspid repair using the edge-to-edge repair technique.

    PubMed

    Hausleiter, Jörg; Braun, Daniel; Orban, Mathias; Latib, Azeem; Lurz, Philipp; Boekstegers, Peter; von Bardeleben, Ralph Stephan; Kowalski, Marek; Hahn, Rebecca T; Maisano, Francesco; Hagl, Christian; Massberg, Steffen; Nabauer, Michael

    2018-04-24

    Severe tricuspid regurgitation (TR) has long been neglected despite its well-known association with mortality. While surgical mortality rates remain high in isolated tricuspid valve surgery, interventional TR repair is rapidly evolving as an alternative to cardiac surgery in selected patients at high surgical risk. Currently, interventional edge-to-edge repair is the most frequently applied technique for TR repair, even though the device was not developed for this particular indication. Owing to the inherent differences between tricuspid and mitral valve anatomy and pathology, percutaneous repair of the tricuspid valve is challenging for a variety of reasons, including the complexity and variability of tricuspid valve anatomy, echocardiographic visibility of the valve leaflets, and device steering to the tricuspid valve. Furthermore, it remains to be clarified which patients are suitable for percutaneous tricuspid repair and which features predict a successful procedure. On the basis of the available experience, we describe criteria for patient selection including morphological valve features, a standardized process for echocardiographic screening, and a strategy for clip placement. These criteria will help to achieve standardization of valve assessment and the procedural approach, and to further develop interventional tricuspid valve repair using either currently available devices or dedicated tricuspid edge-to-edge repair devices in the future. In summary, this manuscript provides guidance for patient selection and echocardiographic screening when considering edge-to-edge repair for severe TR.

  20. Sentinel node mapping for gastric cancer: a prospective multicenter trial in Japan.

    PubMed

    Kitagawa, Yuko; Takeuchi, Hiroya; Takagi, Yu; Natsugoe, Shoji; Terashima, Masanori; Murakami, Nozomu; Fujimura, Takashi; Tsujimoto, Hironori; Hayashi, Hideki; Yoshimizu, Nobunari; Takagane, Akinori; Mohri, Yasuhiko; Nabeshima, Kazuhito; Uenosono, Yoshikazu; Kinami, Shinichi; Sakamoto, Junichi; Morita, Satoshi; Aikou, Takashi; Miwa, Koichi; Kitajima, Masaki

    2013-10-10

    Complicated gastric lymphatic drainage potentially undermines the utility of sentinel node (SN) biopsy in patients with gastric cancer. Encouraged by several favorable single-institution reports, we conducted a multicenter, single-arm, phase II study of SN mapping that used a standardized dual tracer endoscopic injection technique. Patients with previously untreated cT1 or cT2 gastric adenocarcinomas < 4 cm in gross diameter were eligible for inclusion in this study. SN mapping was performed by using a standardized dual tracer endoscopic injection technique. Following biopsy of the identified SNs, mandatory comprehensive D2 or modified D2 gastrectomy was performed according to current Japanese Gastric Cancer Association guidelines. Among 433 patients who gave preoperative consent, 397 were deemed eligible on the basis of surgical findings. SN biopsy was performed in all patients, and the SN detection rate was 97.5% (387 of 397). Of 57 patients with lymph node metastasis by conventional hematoxylin and eosin staining, 93% (53 of 57) had positive SNs, and the accuracy of nodal evaluation for metastasis was 99% (383 of 387). Only four false-negative SN biopsies were observed, and pathologic analysis revealed that three of those biopsies were pT2 or tumors > 4 cm. We observed no serious adverse effects related to endoscopic tracer injection or the SN mapping procedure. The endoscopic dual tracer method for SN biopsy was confirmed as safe and effective when applied to the superficial, relatively small gastric adenocarcinomas included in this study.

  1. Evaluation of purity with its uncertainty value in high purity lead stick by conventional and electro-gravimetric methods

    PubMed Central

    2013-01-01

    Background A conventional gravimetry and electro-gravimetry study has been carried out for the precise and accurate determination of the purity of lead (Pb) in a high-purity lead stick and for the preparation of a reference standard. Reference materials are standards containing a known amount of an analyte and provide a reference value to determine unknown concentrations or to calibrate analytical instruments. A stock solution of approximately 2 kg was prepared by dissolving approximately 2 g of Pb stick in 5% ultra-pure nitric acid. From the stock solution, five replicates of approximately 50 g were taken for purity determination by each method. The Pb was determined as PbSO4 by conventional gravimetry and as PbO2 by electro-gravimetry. The percentage purity of the metallic Pb was calculated accordingly from the PbSO4 and PbO2 masses. Results On the basis of the experimental observations, the purity of Pb by conventional gravimetry and electro-gravimetry was found to be 99.98 ± 0.24 and 99.97 ± 0.27 g/100 g, respectively, and on the basis of the Pb purity the concentrations of the reference standard solutions were found to be 1000.88 ± 2.44 and 1000.81 ± 2.68 mg kg-1, respectively, at the 95% confidence level (k = 2). The uncertainty evaluation in the Pb determination was carried out following the EURACHEM/GUM guidelines. The final analytical results, with quantified uncertainty, give a measure of the confidence level of the concerned laboratory. Conclusions Gravimetry is the most reliable technique in comparison with titrimetry and instrumental methods, and its results are directly traceable to SI units. Gravimetric analysis, if the methods are followed carefully, provides exceedingly precise analysis. In classical gravimetry the major uncertainties are due to repeatability, but in electro-gravimetry several other factors also affect the final results. PMID:23800080

  2. Evaluation of purity with its uncertainty value in high purity lead stick by conventional and electro-gravimetric methods.

    PubMed

    Singh, Nahar; Singh, Niranjan; Tripathy, S Swarupa; Soni, Daya; Singh, Khem; Gupta, Prabhat K

    2013-06-26

    A conventional gravimetry and electro-gravimetry study has been carried out for the precise and accurate determination of the purity of lead (Pb) in a high-purity lead stick and for the preparation of a reference standard. Reference materials are standards containing a known amount of an analyte and provide a reference value to determine unknown concentrations or to calibrate analytical instruments. A stock solution of approximately 2 kg was prepared by dissolving approximately 2 g of Pb stick in 5% ultra-pure nitric acid. From the stock solution, five replicates of approximately 50 g were taken for purity determination by each method. The Pb was determined as PbSO4 by conventional gravimetry and as PbO2 by electro-gravimetry. The percentage purity of the metallic Pb was calculated accordingly from the PbSO4 and PbO2 masses. On the basis of the experimental observations, the purity of Pb by conventional gravimetry and electro-gravimetry was found to be 99.98 ± 0.24 and 99.97 ± 0.27 g/100 g, respectively, and on the basis of the Pb purity the concentrations of the reference standard solutions were found to be 1000.88 ± 2.44 and 1000.81 ± 2.68 mg kg-1, respectively, at the 95% confidence level (k = 2). The uncertainty evaluation in the Pb determination was carried out following the EURACHEM/GUM guidelines. The final analytical results, with quantified uncertainty, give a measure of the confidence level of the concerned laboratory. Gravimetry is the most reliable technique in comparison with titrimetry and instrumental methods, and its results are directly traceable to SI units. Gravimetric analysis, if the methods are followed carefully, provides exceedingly precise analysis. In classical gravimetry the major uncertainties are due to repeatability, but in electro-gravimetry several other factors also affect the final results.
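
    The gravimetric-factor arithmetic behind such purity figures is simple to reproduce; the aliquot and precipitate masses in this sketch are hypothetical, chosen only to illustrate the PbSO4 route:

```python
# Molar masses in g/mol (standard atomic weights).
M_PB = 207.2
M_PBSO4 = 303.26

def purity_from_pbso4(m_precipitate_g: float, m_pb_taken_g: float) -> float:
    """Percent purity (g/100 g) of the Pb sample, from the mass of
    PbSO4 precipitate recovered from a known mass of sample."""
    gravimetric_factor = M_PB / M_PBSO4   # ≈ 0.6832 g Pb per g PbSO4
    return 100.0 * m_precipitate_g * gravimetric_factor / m_pb_taken_g

# Hypothetical aliquot: 0.050000 g Pb taken, 0.073170 g PbSO4 recovered.
purity = purity_from_pbso4(0.073170, 0.050000)
print(f"{purity:.2f} g/100 g")  # → 99.99 g/100 g
```

The PbO2 electro-gravimetric route is analogous, with the factor M(Pb)/M(PbO2) in place of M(Pb)/M(PbSO4).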

  3. 77 FR 11001 - Small Business Size Standards: Health Care and Social Assistance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-24

    ... the changes in the Federal contracting marketplace and industry structure. The last time SBA conducted... size standards at one time, SBA is reviewing size standards on a Sector by Sector basis. A NAICS Sector... nonmanufacturer rule. See 13 CFR 121.406(b). These long-standing anchor size standards have stood the test of time...

  4. 43 CFR 3809.202 - Under what conditions will BLM defer to State regulation of operations?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... standards on a provision-by-provision basis to determine— (i) Whether non-numerical State standards are functionally equivalent to BLM counterparts; and (ii) Whether numerical State standards are the same as corresponding numerical BLM standards, except that State review and approval time frames do not have to be the...

  5. 43 CFR 3809.202 - Under what conditions will BLM defer to State regulation of operations?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... standards on a provision-by-provision basis to determine— (i) Whether non-numerical State standards are functionally equivalent to BLM counterparts; and (ii) Whether numerical State standards are the same as corresponding numerical BLM standards, except that State review and approval time frames do not have to be the...

  6. 43 CFR 3809.202 - Under what conditions will BLM defer to State regulation of operations?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... standards on a provision-by-provision basis to determine— (i) Whether non-numerical State standards are functionally equivalent to BLM counterparts; and (ii) Whether numerical State standards are the same as corresponding numerical BLM standards, except that State review and approval time frames do not have to be the...

  7. 43 CFR 3809.202 - Under what conditions will BLM defer to State regulation of operations?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... standards on a provision-by-provision basis to determine— (i) Whether non-numerical State standards are functionally equivalent to BLM counterparts; and (ii) Whether numerical State standards are the same as corresponding numerical BLM standards, except that State review and approval time frames do not have to be the...

  8. 48 CFR 9904.413-50 - Techniques for application.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... 9904.413-50 Section 9904.413-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.413-50 Techniques for application. (a) Assignment of actuarial gains and losses. (1) In accordance with the provisions of Cost Accounting Standard 9904.412...

  9. National Standards and Curriculum Reform: A View from the Department of Education.

    ERIC Educational Resources Information Center

    Ravitch, Diane

    1992-01-01

    Successful standards-based reform requires fundamental reconstruction of educational institutions and systems, so that high standards become the basis of curriculum, textbooks and other instructional materials, assessments, graduation requirements, and teacher education, certification, and training. A great failure of our public education system…

  10. 40 CFR 466.45 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Pretreatment standards for new sources. 466.45 Section 466.45 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY Copper Basis Material Subcategory § 466.45...

  11. 40 CFR 466.45 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for new sources. 466.45 Section 466.45 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY Copper Basis Material Subcategory § 466.45...

  12. 40 CFR 466.25 - Pretreatment standards for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for new sources. 466.25 Section 466.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PORCELAIN ENAMELING POINT SOURCE CATEGORY Cast Iron Basis Material Subcategory § 466...

  13. 78 FR 32459 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Reporting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-30

    ... for OMB Review; Comment Request; Reporting and Performance Standards for Workforce Investment Act...) revision titled, ``Reporting and Performance Standards for Workforce Investment Act Indian and Native... collections are the basis of the performance standards system for Workforce Investment Act section 166...

  14. Basis and Purpose Document for the proposed NESHAP for Pharmaceuticals Production

    EPA Pesticide Factsheets

    The Basis and Purpose Document provides background information on, and the rationale for, decisions made by EPA related to the proposed standards for the reduction of hazardous air pollutants (HAP) emitted through the manufacture of pharmaceutical products

  15. Linear-scaling explicitly correlated treatment of solids: periodic local MP2-F12 method.

    PubMed

    Usvyat, Denis

    2013-11-21

    Theory and implementation of the periodic local MP2-F12 method in the 3*A fixed-amplitude ansatz is presented. The method is formulated in the direct space, employing local representation for the occupied, virtual, and auxiliary orbitals in the form of Wannier functions (WFs), projected atomic orbitals (PAOs), and atom-centered Gaussian-type orbitals, respectively. Local approximations are introduced, restricting the list of the explicitly correlated pairs, as well as occupied, virtual, and auxiliary spaces in the strong orthogonality projector to the pair-specific domains on the basis of spatial proximity of respective orbitals. The 4-index two-electron integrals appearing in the formalism are approximated via the direct-space density fitting technique. In this procedure, the fitting orbital spaces are also restricted to local fit-domains surrounding the fitted densities. The formulation of the method and its implementation exploits the translational symmetry and the site-group symmetries of the WFs. Test calculations are performed on LiH crystal. The results show that the periodic LMP2-F12 method substantially accelerates basis set convergence of the total correlation energy, and even more so the correlation energy differences. The resulting energies are quite insensitive to the resolution-of-the-identity domain sizes and the quality of the auxiliary basis sets. The convergence with the orbital domain size is somewhat slower, but still acceptable. Moreover, inclusion of slightly more diffuse functions, than those usually used in the periodic calculations, improves the convergence of the LMP2-F12 correlation energy with respect to both the size of the PAO-domains and the quality of the orbital basis set. At the same time, the essentially diffuse atomic orbitals from standard molecular basis sets, commonly utilized in molecular MP2-F12 calculations, but problematic in the periodic context, are not necessary for LMP2-F12 treatment of crystals.

  16. Electronic Structure and Bonding in Transition Metal Inorganic and Organometallic Complexes: New Basis Sets, Linear Semibridging Carbonyls and Thiocarbonyls, and Oxidative Addition of Molecular Hydrogen to Square-Planar Iridium Complexes.

    NASA Astrophysics Data System (ADS)

    Sargent, Andrew Landman

    Approximate molecular orbital and ab initio quantum chemical techniques are used to investigate the electronic structure, bonding and reactivity of several transition metal inorganic and organometallic complexes. Modest-sized basis sets are developed for the second-row transition metal atoms and are designed for use in geometry optimizations of inorganic and organometallic complexes incorporating these atoms. The basis sets produce optimized equilibrium geometries which are slightly better than those produced with standard 3-21G basis sets, and which are significantly better than those produced with effective core potential basis sets. Linear semibridging carbonyl ligands in heterobimetallic complexes which contain a coordinatively unsaturated late transition metal center are found to accept electron density from, rather than donate electron density to, these centers. Only when the secondary metal center is a coordinatively unsaturated early transition metal center does the semibridging ligand donate electron density to this center. Large holes in the d shell around the metal center are more prominent and prevalent in early than in late transition metal centers, and the importance of filling in these holes outweighs the importance of mitigating the charge imbalance due to the dative metal-metal interaction. Semibridging thiocarbonyl ligands are more effective donors of electron density than the carbonyl ligands since the occupied donor orbitals of pi symmetry are higher in energy. The stereoselectivity of H_2 addition to d^8 square-planar transition metal complexes is controlled by the interactions between the ligands in the plane of addition and the concentrations of electronic charge around the metal center as the complex evolves from a four-coordinate to a six-coordinate species. Electron-withdrawing ligands help stabilize the five-coordinate species while strong electron donor ligands contribute only to the destabilizing repulsive interactions.
The relative thermodynamic stabilities of the final complexes can be predicted based on the relative orientations of the strongest sigma-donor ligands.

  17. Technology and Technique Standards for Camera-Acquired Digital Dermatologic Images: A Systematic Review.

    PubMed

    Quigley, Elizabeth A; Tokay, Barbara A; Jewell, Sarah T; Marchetti, Michael A; Halpern, Allan C

    2015-08-01

    Photographs are invaluable dermatologic diagnostic, management, research, teaching, and documentation tools. Digital Imaging and Communications in Medicine (DICOM) standards exist for many types of digital medical images, but there are no DICOM standards for camera-acquired dermatologic images to date. To identify and describe existing or proposed technology and technique standards for camera-acquired dermatologic images in the scientific literature. Systematic searches of the PubMed, EMBASE, and Cochrane databases were performed in January 2013 using photography and digital imaging, standardization, and medical specialty and medical illustration search terms and augmented by a gray literature search of 14 websites using Google. Two reviewers independently screened titles of 7371 unique publications, followed by 3 sequential full-text reviews, leading to the selection of 49 publications with the most recent (1985-2013) or detailed description of technology or technique standards related to the acquisition or use of images of skin disease (or related conditions). No universally accepted existing technology or technique standards for camera-based digital images in dermatology were identified. Recommendations are summarized for technology imaging standards, including spatial resolution, color resolution, reproduction (magnification) ratios, postacquisition image processing, color calibration, compression, output, archiving and storage, and security during storage and transmission. Recommendations are also summarized for technique imaging standards, including environmental conditions (lighting, background, and camera position), patient pose and standard view sets, and patient consent, privacy, and confidentiality. Proposed standards for specific-use cases in total body photography, teledermatology, and dermoscopy are described. 
The literature is replete with descriptions of obtaining photographs of skin disease, but universal imaging standards have not been developed, validated, and adopted to date. Dermatologic imaging is evolving without defined standards for camera-acquired images, leading to variable image quality and limited exchangeability. The development and adoption of universal technology and technique standards may first emerge in scenarios when image use is most associated with a defined clinical benefit.

  18. Techniques in Marriage and Family Counseling. Volume One. The Family Psychology and Counseling Series.

    ERIC Educational Resources Information Center

    Watts, Richard E., Ed.

    This book is designed to bridge the gap between the reality of professional practice and what is being written about it in professional publications. It is divided into three sections, focusing on the techniques of assessment, transgenerational techniques, and constructivist techniques. Section one argues that assessment is the basis of all…

  19. Evaluation of Programmed Instruction Techniques in Medical Interviewing. Final Report, June 15, 1966 to June 15, 1968.

    ERIC Educational Resources Information Center

    Adler, Leta McKinney; And Others

    Since the medical interview is usually considered to be the basis of all diagnosis and treatment in medicine, this study investigated alternative ways of improving medical interview techniques. To test the hypothesis that the visual (videotape) technique would be more effective than the lecturing or audiotape technique, 12 videotaped interviews…

  20. Emerging Techniques 2: Architectural Programming.

    ERIC Educational Resources Information Center

    Evans, Benjamin H.; Wheeler, C. Herbert, Jr.

    A selected collection of architectural programming techniques has been assembled to aid architects in building design. Several exciting and sophisticated techniques for determining a basis for environmental design have been developed in recent years. These extend to the logic of environmental design and lead to more appropriate and useful…

  1. 50 CFR 600.320 - National Standard 3-Management Units.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... comprehensive approach to fishery management. The geographic scope of the fishery, for planning purposes, should....320 National Standard 3—Management Units. (a) Standard 3. To the extent practicable, an individual... portion of a fishery identified in an FMP as relevant to the FMP's management objectives. (1) Basis. The...

  2. 50 CFR 600.320 - National Standard 3-Management Units.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... comprehensive approach to fishery management. The geographic scope of the fishery, for planning purposes, should....320 National Standard 3—Management Units. (a) Standard 3. To the extent practicable, an individual... portion of a fishery identified in an FMP as relevant to the FMP's management objectives. (1) Basis. The...

  3. 40 CFR 80.41 - Standards and requirements for compliance.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... following standards apply for all reformulated gasoline: (1) The standard for heavy metals, including lead or manganese, on a per-gallon basis, is that reformulated gasoline may contain no heavy metals. The Administrator may waive this prohibition for a heavy metal (other than lead) if the Administrator determines...

  4. 40 CFR 80.41 - Standards and requirements for compliance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... following standards apply for all reformulated gasoline: (1) The standard for heavy metals, including lead or manganese, on a per-gallon basis, is that reformulated gasoline may contain no heavy metals. The Administrator may waive this prohibition for a heavy metal (other than lead) if the Administrator determines...

  5. 40 CFR 80.41 - Standards and requirements for compliance.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... following standards apply for all reformulated gasoline: (1) The standard for heavy metals, including lead or manganese, on a per-gallon basis, is that reformulated gasoline may contain no heavy metals. The Administrator may waive this prohibition for a heavy metal (other than lead) if the Administrator determines...

  6. 40 CFR 80.41 - Standards and requirements for compliance.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... following standards apply for all reformulated gasoline: (1) The standard for heavy metals, including lead or manganese, on a per-gallon basis, is that reformulated gasoline may contain no heavy metals. The Administrator may waive this prohibition for a heavy metal (other than lead) if the Administrator determines...

  7. 40 CFR 80.41 - Standards and requirements for compliance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... following standards apply for all reformulated gasoline: (1) The standard for heavy metals, including lead or manganese, on a per-gallon basis, is that reformulated gasoline may contain no heavy metals. The Administrator may waive this prohibition for a heavy metal (other than lead) if the Administrator determines...

  8. 7 CFR 1783.7 - What is the grant application process?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) Application for Federal Assistance: Standard Form 424; (2) Budget Information—Non-Construction Programs: Standard Form 424A; (3) Assurances—Non-Construction Programs: Standard Form 424B; (4) Evidence of applicant... of this part. (c) The applicant should submit a narrative establishing the basis for any claims that...

  9. 7 CFR 1783.7 - What is the grant application process?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Application for Federal Assistance: Standard Form 424; (2) Budget Information—Non-Construction Programs: Standard Form 424A; (3) Assurances—Non-Construction Programs: Standard Form 424B; (4) Evidence of applicant... of this part. (c) The applicant should submit a narrative establishing the basis for any claims that...

  10. 7 CFR 1783.7 - What is the grant application process?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Application for Federal Assistance: Standard Form 424; (2) Budget Information—Non-Construction Programs: Standard Form 424A; (3) Assurances—Non-Construction Programs: Standard Form 424B; (4) Evidence of applicant... of this part. (c) The applicant should submit a narrative establishing the basis for any claims that...

  11. The Software Maturity Matrix: A Software Performance Metric

    DTIC Science & Technology

    2003-01-28

...are for managing: use them! Unused measurements have the same value as last night's unused hotel room or an empty airline seat. Be prepared to... Organization standard measurements are implicit; organization standard verification is implicit; organization standard SMM training can be the basis of an...

  12. Comparative analysis of economic models in selected solar energy computer programs

    NASA Astrophysics Data System (ADS)

    Powell, J. W.; Barnes, K. A.

    1982-01-01

The economic evaluation models in five computer programs widely used for analyzing solar energy systems (F-CHART 3.0, F-CHART 4.0, SOLCOST, BLAST, and DOE-2) are compared. Differences in analysis techniques and assumptions among the programs are assessed from the point of view of consistency with the Federal requirements for life cycle costing (10 CFR Part 436), effect on predicted economic performance and optimal system size, ease of use, and general applicability to diverse system types and building types. The FEDSOL program, developed by the National Bureau of Standards specifically to meet the Federal life cycle cost requirements, serves as a basis for the comparison. Results of the study are illustrated in test cases of two different types of Federally owned buildings: a single family residence and a low-rise office building.

  13. The Role of Esophageal Hypersensitivity in Functional Esophageal Disorders.

    PubMed

    Farmer, Adam D; Ruffle, James K; Aziz, Qasim

    2017-02-01

The Rome IV diagnostic criteria delineate 5 functional esophageal disorders: functional chest pain, functional heartburn, reflux hypersensitivity, globus, and functional dysphagia. These are a heterogeneous group of disorders which, despite having characteristic symptom profiles attributable to esophageal pathology, fail to demonstrate any structural, motility or inflammatory abnormalities on standard clinical testing. These disorders are associated with a marked reduction in patient quality of life, as well as considerable healthcare resource use. Furthermore, the pathophysiology of these disorders is incompletely understood. In this narrative review we provide the reader with an introductory primer on the structure and function of esophageal perception, including the nociception that forms the basis of the putative mechanisms that may give rise to symptoms in functional esophageal disorders. We also discuss the provocative techniques and outcome measures by which esophageal hypersensitivity can be established.

  14. Results of the EURAMET.RI(II)-S6.I-129 supplementary comparison

    NASA Astrophysics Data System (ADS)

García-Toraño, Eduardo; Altzitzoglou, Timotheos; Auerbach, Pavel; Bé, Marie-Martine; Lourenço, Valérie; Bobin, Christophe; Cassette, Philippe; Dersch, Rainer; Kossert, Karsten; Nähle, Ole; Peyrés, Virginia; Pommé, Stefaan; Rozkov, Andrej; Sanchez-Cabezudo, Anabel; Sochorová, Jana

    2015-01-01

An international comparison of the long-lived gamma-ray emitter 129I has recently been completed. A total of 5 laboratories measured a solution prepared by the Centro de Investigaciones Energéticas, Medioambientales y Tecnológicas (CIEMAT). Aliquots of the master solution were standardized in terms of activity per unit mass by the participant laboratories using 4 different techniques. The results of the comparison can be used as the basis for establishing the equivalence among the laboratories. The main text of this paper appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCRI, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).

  15. Flexibility, Diversity, and Cooperativity: Pillars of Enzyme Catalysis

    PubMed Central

    Hammes, Gordon G.; Benkovic, Stephen J.; Hammes-Schiffer, Sharon

    2011-01-01

    This brief review discusses our current understanding of the molecular basis of enzyme catalysis. A historical development is presented, beginning with steady state kinetics and progressing through modern fast reaction methods, NMR, and single molecule fluorescence techniques. Experimental results are summarized for ribonuclease, aspartate aminotransferase, and especially dihydrofolate reductase (DHFR). Multiple intermediates, multiple conformations, and cooperative conformational changes are shown to be an essential part of virtually all enzyme mechanisms. In the case of DHFR, theoretical investigations have provided detailed information about the movement of atoms within the enzyme-substrate complex as the reaction proceeds along the collective reaction coordinate for hydride transfer. A general mechanism is presented for enzyme catalysis that includes multiple intermediates and a complex, multidimensional standard free energy surface. Protein flexibility, diverse protein conformations, and cooperative conformational changes are important features of this model. PMID:22029278

  16. Finite temperature properties of clusters by replica exchange metadynamics: the water nonamer.

    PubMed

    Zhai, Yingteng; Laio, Alessandro; Tosatti, Erio; Gong, Xin-Gao

    2011-03-02

We introduce an approach for the accurate calculation of thermal properties of classical nanoclusters. On the basis of a recently developed enhanced sampling technique, replica exchange metadynamics, the method yields the true free energy of each relevant cluster structure, directly sampling its basin and measuring its occupancy in full equilibrium. All entropy sources, whether vibrational, rotational, anharmonic, or especially configurational, the latter often forgotten in many cluster studies, are automatically included. For the present demonstration, we choose the water nonamer (H2O)9, an extremely simple cluster which nonetheless displays sufficient complexity and interesting physics in its relevant structure spectrum. Within a standard TIP4P potential description of water, we find that the nonamer's second relevant structure possesses a higher configurational entropy than the first, so that the two free energies surprisingly cross with increasing temperature.
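    The replica-exchange component of this technique rests on the standard Metropolis swap criterion between replicas held at different temperatures. The sketch below is a generic illustration of that acceptance rule only, not the authors' metadynamics code; the function name and interface are hypothetical.

    ```python
    import math
    import random

    def swap_accepted(beta_i, beta_j, energy_i, energy_j, rng=random):
        """Metropolis acceptance rule for exchanging configurations between
        two replicas at inverse temperatures beta_i and beta_j.

        The swap is accepted with probability
        min(1, exp((beta_i - beta_j) * (energy_i - energy_j))).
        """
        delta = (beta_i - beta_j) * (energy_i - energy_j)
        # delta >= 0 means the swap lowers (or preserves) the joint weight's
        # exponent, so it is always accepted; otherwise accept stochastically.
        return delta >= 0 or rng.random() < math.exp(delta)
    ```

    When the colder replica (larger beta) holds the higher-energy configuration, the swap is always accepted; otherwise it is accepted with a Boltzmann-weighted probability, which lets configurations diffuse across temperatures and escape local basins.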

  17. Report of the FELASA Working Group on evaluation of quality systems for animal units.

    PubMed

    Howard, B; van Herck, H; Guillen, J; Bacon, B; Joffe, R; Ritskes-Hoitinga, M

    2004-04-01

    This report compares and considers the merits of existing, internationally available quality management systems suitable for implementation in experimental animal facilities. These are: the Good Laboratory Practice Guidelines, ISO 9000:2000 (International Organization for Standardization) and AAALAC International (Association for Assessment and Accreditation of Laboratory Animal Care International). Good laboratory practice (GLP) is a legal requirement for institutions undertaking non-clinical health and environmental studies for the purpose of registering or licensing for use and which have to be 'GLP-compliant'. GLP guidelines are often only relevant for and obtainable by those institutions. ISO is primarily an external business standard, which provides a management tool to master and optimize a business activity; it aims to implement and enhance 'customer satisfaction'. AAALAC is primarily a peer-reviewed system of accreditation which evaluates the organization and procedures in programmes of animal care and use to ensure the appropriate use of animals, safeguard animal well-being (ensuring state-of-the-art housing, management, procedural techniques, etc.) as well as the management of health and safety of staff. Management needs to determine, on the basis of a facility's specific goals, whether benefits would arise from the introduction of a quality system and, if so, which system is most appropriate. The successful introduction of a quality system confers peer-recognition against an independent standard, thereby providing assurance of standards of animal care and use, improving the quality of animal studies, and contributing to the three Rs-reduction, refinement and replacement.

  18. European union standards for tuberculosis care.

    PubMed

    Migliori, G B; Zellweger, J P; Abubakar, I; Ibraim, E; Caminero, J A; De Vries, G; D'Ambrosio, L; Centis, R; Sotgiu, G; Menegale, O; Kliiman, K; Aksamit, T; Cirillo, D M; Danilovits, M; Dara, M; Dheda, K; Dinh-Xuan, A T; Kluge, H; Lange, C; Leimane, V; Loddenkemper, R; Nicod, L P; Raviglione, M C; Spanevello, A; Thomsen, V Ø; Villar, M; Wanlin, M; Wedzicha, J A; Zumla, A; Blasi, F; Huitric, E; Sandgren, A; Manissero, D

    2012-04-01

    The European Centre for Disease Prevention and Control (ECDC) and the European Respiratory Society (ERS) jointly developed European Union Standards for Tuberculosis Care (ESTC) aimed at providing European Union (EU)-tailored standards for the diagnosis, treatment and prevention of tuberculosis (TB). The International Standards for TB Care (ISTC) were developed in the global context and are not always adapted to the EU setting and practices. The majority of EU countries have the resources and capacity to implement higher standards to further secure quality TB diagnosis, treatment and prevention. On this basis, the ESTC were developed as standards specifically tailored to the EU setting. A panel of 30 international experts, led by a writing group and the ERS and ECDC, identified and developed the 21 ESTC in the areas of diagnosis, treatment, HIV and comorbid conditions, and public health and prevention. The ISTCs formed the basis for the 21 standards, upon which additional EU adaptations and supplements were developed. These patient-centred standards are targeted to clinicians and public health workers, providing an easy-to-use resource, guiding through all required activities to ensure optimal diagnosis, treatment and prevention of TB. These will support EU health programmes to identify and develop optimal procedures for TB care, control and elimination.

  19. Intensity of the Internal Standard Response as the Basis for Reporting a Test Specimen as Negative or Inconclusive

    DTIC Science & Technology

    2007-08-01

Standard report documentation page. Report No. DOT/FAA/AM-07/23. Authors: Ray H. Liu, Chih-Hung Wu, Yi-Jun Chen, Chiung-Dan Chang, Jason G... The authors acknowledge the Office of Aerospace Medicine for sponsoring part of this research.

  20. NASA's Lessons Learned and Technical Standards: A Logical Marriage

    NASA Technical Reports Server (NTRS)

    Gill, Paul; Vaughan, William W.; Garcia, Danny; Weinstein, Richard

    2001-01-01

    Lessons Learned have been the basis for our accomplishments throughout the ages. They have been passed down from father to son, mother to daughter, teacher to pupil, and older to younger worker. Lessons Learned have also been the basis for NASA's accomplishments for more than forty years. Both government and industry have long recognized the need to systematically document and utilize the knowledge gained from past experiences in order to avoid the repetition of failures and mishaps. Lessons Learned have formed the foundation for discoveries, inventions, improvements, textbooks, and Technical Standards.

  1. Federal Guidance Report No. 1: Background Material for the Development of Radiation Protection Standards (Federal Radiation Council)

    EPA Pesticide Factsheets

    This report provides required interim radiation protection recommendations. It includes recommendations for additional research which will provide a firmer basis for the formulation of radiation standards.

  2. Stochastic subset selection for learning with kernel machines.

    PubMed

    Rhinelander, Jason; Liu, Xiaoping P

    2012-06-01

Kernel machines have gained much popularity in applications of machine learning. Support vector machines (SVMs) are a subset of kernel machines and generalize well for classification, regression, and anomaly detection tasks. The training procedure for traditional SVMs involves solving a quadratic programming (QP) problem. The QP problem scales superlinearly in computational effort with the number of training samples and is often used for the offline batch processing of data. Kernel machines operate by retaining a subset of observed data during training. The data vectors contained within this subset are referred to as support vectors (SVs). The work presented in this paper introduces a subset selection method for the use of kernel machines in online, changing environments. Our algorithm works by using a stochastic indexing technique to select a subset of SVs when computing the kernel expansion. The work described here is novel because it separates the selection of kernel basis functions from the training algorithm used. The subset selection algorithm presented here can be used in conjunction with any online training technique. It is important for online kernel machines to be computationally efficient due to the real-time requirements of online environments. Our algorithm is an important contribution because it scales linearly with the number of training samples and is compatible with current training techniques. Our algorithm outperforms standard techniques in terms of computational efficiency and provides increased recognition accuracy in our experiments. We provide results from experiments using both simulated and real-world data sets to verify our algorithm.
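    As a loose illustration of evaluating a kernel expansion over a stochastically indexed subset of support vectors, here is a minimal sketch. It is not the authors' algorithm, and all names are hypothetical; the partial sum is rescaled so that it remains an unbiased estimate of the full expansion.

    ```python
    import numpy as np

    def rbf_kernel(x, y, gamma=1.0):
        """Gaussian RBF kernel between two vectors."""
        return np.exp(-gamma * np.sum((x - y) ** 2))

    def kernel_expansion_stochastic(x, support_vectors, alphas, bias,
                                    subset_size, rng):
        """Estimate f(x) = sum_i alpha_i * K(sv_i, x) + bias using only a
        random subset of the support vectors."""
        n = len(support_vectors)
        idx = rng.choice(n, size=min(subset_size, n), replace=False)
        partial = sum(alphas[i] * rbf_kernel(support_vectors[i], x)
                      for i in idx)
        # Rescale so the subsampled sum is an unbiased estimate of the
        # full kernel expansion.
        return partial * (n / len(idx)) + bias
    ```

    With subset_size equal to the number of SVs this reduces to the exact expansion; smaller subsets trade accuracy for cheaper per-query evaluation.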

  3. Tools for rabies serology to monitor the effectiveness of rabies vaccination in domestic and wild carnivores.

    PubMed

    Servat, A; Wasniewski, M; Cliquet, F

    2006-01-01

Serology remains the only way to monitor the effectiveness of vaccination of humans and animals against rabies. Many techniques for determining the level of rabies antibodies have been described, including seroneutralisation techniques such as tests for fluorescent antibody virus neutralisation (FAVN) and rapid fluorescent focus inhibition (RFFIT), enzyme-linked immunosorbent assay (ELISA), and in-vivo tests (the mouse neutralisation test, MNT). The need to verify the effectiveness of rabies vaccination has become widespread, particularly in the context of international trading of domestic carnivores from infected to rabies-free territories. The standardisation of serological techniques, approval of laboratories and proficiency tests are key concepts to ensure the practicability of such systems. Serological tests for rabies are also often used by laboratories in infected territories to assess the efficacy of campaigns aimed at the eradication of the disease via oral vaccination of wildlife. The adaptation of these methods should provide the means to titrate specific antibodies in dogs during mass parenteral vaccination in countries infected by canine rabies. However, in most cases these serological tests are carried out without any standardised procedure. On the basis of our experience in rabies serology and its harmonisation throughout laboratories worldwide, we propose here an adapted standard technique for the serological monitoring of rabies in wildlife at the European level. Such harmonisation would allow the monitoring of vaccination campaigns to be enhanced by increasing the exchange of epidemiological data, with the ultimate goal being the eradication of rabies in Europe.

  4. 42 CFR 486.1 - Basis and scope.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Basis and scope. 486.1 Section 486.1 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION CONDITIONS FOR COVERAGE OF SPECIALIZED SERVICES FURNISHED BY SUPPLIERS General...

  5. 42 CFR 486.301 - Basis and scope.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Basis and scope. 486.301 Section 486.301 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION CONDITIONS FOR COVERAGE OF SPECIALIZED SERVICES FURNISHED BY SUPPLIERS...

  6. 42 CFR 486.301 - Basis and scope.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Basis and scope. 486.301 Section 486.301 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION CONDITIONS FOR COVERAGE OF SPECIALIZED SERVICES FURNISHED BY SUPPLIERS...

  7. 42 CFR 486.1 - Basis and scope.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Basis and scope. 486.1 Section 486.1 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION CONDITIONS FOR COVERAGE OF SPECIALIZED SERVICES FURNISHED BY SUPPLIERS General...

  8. 7 CFR 868.253 - Basis of determination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... FOR CERTAIN AGRICULTURAL COMMODITIES United States Standards for Brown Rice for Processing Principles... heat, heat-damaged kernels, parboiled kernels in nonparboiled rice, and the special grade Parboiled brown rice for processing shall be on the basis of the brown rice for processing after it has been...

  9. 7 CFR 868.253 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... FOR CERTAIN AGRICULTURAL COMMODITIES United States Standards for Brown Rice for Processing Principles... heat, heat-damaged kernels, parboiled kernels in nonparboiled rice, and the special grade Parboiled brown rice for processing shall be on the basis of the brown rice for processing after it has been...

  10. 45 CFR 1151.18 - Illustrative examples.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... HUMANITIES NATIONAL ENDOWMENT FOR THE ARTS NONDISCRIMINATION ON THE BASIS OF HANDICAP Discrimination... persons are not subjected to discrimination on the basis of handicap either by sub-grantees or by the... handicapped person with experience and expertise equal to qualification standards established by a planning or...

  11. 45 CFR 1151.18 - Illustrative examples.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... HUMANITIES NATIONAL ENDOWMENT FOR THE ARTS NONDISCRIMINATION ON THE BASIS OF HANDICAP Discrimination... persons are not subjected to discrimination on the basis of handicap either by sub-grantees or by the... handicapped person with experience and expertise equal to qualification standards established by a planning or...

  12. 45 CFR 1151.18 - Illustrative examples.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... HUMANITIES NATIONAL ENDOWMENT FOR THE ARTS NONDISCRIMINATION ON THE BASIS OF HANDICAP Discrimination... persons are not subjected to discrimination on the basis of handicap either by sub-grantees or by the... handicapped person with experience and expertise equal to qualification standards established by a planning or...

  13. 45 CFR 1151.18 - Illustrative examples.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... HUMANITIES NATIONAL ENDOWMENT FOR THE ARTS NONDISCRIMINATION ON THE BASIS OF HANDICAP Discrimination... persons are not subjected to discrimination on the basis of handicap either by sub-grantees or by the... handicapped person with experience and expertise equal to qualification standards established by a planning or...

  14. 45 CFR 1151.18 - Illustrative examples.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... HUMANITIES NATIONAL ENDOWMENT FOR THE ARTS NONDISCRIMINATION ON THE BASIS OF HANDICAP Discrimination... persons are not subjected to discrimination on the basis of handicap either by sub-grantees or by the... handicapped person with experience and expertise equal to qualification standards established by a planning or...

  15. 40 CFR 60.56a - Standards for municipal waste combustor operating practices.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

... large MWC plant shall develop and update on a yearly basis a site-specific operating manual that shall... standards under this subpart; (2) Description of basic combustion theory applicable to an MWC unit; (3...

  16. 40 CFR 60.263 - Standard for carbon monoxide.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 6 2010-07-01 2010-07-01 false Standard for carbon monoxide. 60.263... Production Facilities § 60.263 Standard for carbon monoxide. (a) On and after the date on which the... furnace any gases which contain, on a dry basis, 20 or greater volume percent of carbon monoxide...

  17. The National Teaching Standard: Route to Rigor Mortis.

    ERIC Educational Resources Information Center

    McNeil, John D.

    The actions of the National Board for Professional Teaching Standards with regard to national teaching standards and an associated examination are critiqued. The board was established on the basis of a recommendation by an advisory council of the Carnegie Forum on Education and the Economy. The board, which is composed of politicians, business…

  18. Standards of Multimedia Graphic Design in Education

    ERIC Educational Resources Information Center

    Aldalalah, Osamah Ahmad; Ababneh, Ziad Waleed Mohamed

    2015-01-01

This study aims to determine standards of multimedia graphic design in education through an analysis of the theoretical basis and previous studies related to this subject. The study identified a list of standards for multimedia and graphic design, each with a set of indicators through which the quality of multimedia can be evaluated in…

  19. 34 CFR 106.43 - Standards for measuring skill or progress in physical education classes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS OR ACTIVITIES RECEIVING FEDERAL FINANCIAL ASSISTANCE Discrimination on the Basis of Sex in Education Programs or Activities Prohibited § 106.43 Standards for measuring skill or progress in physical... 34 Education 1 2011-07-01 2011-07-01 false Standards for measuring skill or progress in physical...

  20. Texas School Libraries: Standards, Resources, Services, and Students' Performance.

    ERIC Educational Resources Information Center

    Smith, Ester G.

    This study of Texas school libraries had three objectives: examine school library resources, services, and use, on the basis of the School Library Programs: Standards and Guidelines for Texas and determine the need for updating these standards and guidelines so that they better serve communities across the state; determine the impact that school…

  1. Arizona Early Childhood Education Standards.

    ERIC Educational Resources Information Center

    Arizona State Dept. of Education, Phoenix.

    In an effort to provide a sound basis for educational accountability for preschool programs, the Arizona Early Childhood Education (ECE) Standards were developed as a framework for literacy-based programs for 3- and 4-year-olds and to provide parents with a basic understanding of indicators of early learning. These standards, to be adopted by…

  2. 40 CFR 60.263 - Standard for carbon monoxide.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 6 2011-07-01 2011-07-01 false Standard for carbon monoxide. 60.263... Production Facilities § 60.263 Standard for carbon monoxide. (a) On and after the date on which the... furnace any gases which contain, on a dry basis, 20 or greater volume percent of carbon monoxide...

  3. The Next Generation of Science Standards: Implications for Biology Education

    ERIC Educational Resources Information Center

    Bybee, Rodger W.

    2012-01-01

    The release of A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (NRC, 2012) provides the basis for the next generation of science standards. This article first describes that foundation for the life sciences; it then presents a draft standard for natural selection and evolution. Finally, there is a…

  4. Background to the development process, Automated Residential Energy Standard (ARES) in support of proposed interim energy conservation voluntary performance standards for new non-federal residential buildings: Volume 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    This report documents the development and testing of a set of recommendations generated to serve as a primary basis for the Congressionally-mandated residential standard. This report treats only the residential building recommendations.

  5. National Health Care Skill Standards.

    ERIC Educational Resources Information Center

    Far West Lab. for Educational Research and Development, San Francisco, CA.

    This booklet contains draft national health care skill standards that were proposed during the National Health Care Skill Standards Project on the basis of input from more than 1,000 representatives of key constituencies of the health care field. The project objectives and structure are summarized in the introduction. Part 1 examines the need for…

  6. 34 CFR 106.43 - Standards for measuring skill or progress in physical education classes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PROGRAMS OR ACTIVITIES RECEIVING FEDERAL FINANCIAL ASSISTANCE Discrimination on the Basis of Sex in Education Programs or Activities Prohibited § 106.43 Standards for measuring skill or progress in physical... 34 Education 1 2014-07-01 2014-07-01 false Standards for measuring skill or progress in physical...

  7. 34 CFR 106.43 - Standards for measuring skill or progress in physical education classes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... PROGRAMS OR ACTIVITIES RECEIVING FEDERAL FINANCIAL ASSISTANCE Discrimination on the Basis of Sex in Education Programs or Activities Prohibited § 106.43 Standards for measuring skill or progress in physical... 34 Education 1 2012-07-01 2012-07-01 false Standards for measuring skill or progress in physical...

  8. 34 CFR 106.43 - Standards for measuring skill or progress in physical education classes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... PROGRAMS OR ACTIVITIES RECEIVING FEDERAL FINANCIAL ASSISTANCE Discrimination on the Basis of Sex in Education Programs or Activities Prohibited § 106.43 Standards for measuring skill or progress in physical... 34 Education 1 2013-07-01 2013-07-01 false Standards for measuring skill or progress in physical...

  9. Patch-based image reconstruction for PET using prior-image derived dictionaries

    NASA Astrophysics Data System (ADS)

    Tahaei, Marzieh S.; Reader, Andrew J.

    2016-09-01

    In PET image reconstruction, regularization is often needed to reduce the noise in the resulting images. Patch-based image processing techniques have recently been successfully used for regularization in medical image reconstruction through a penalized likelihood framework. Re-parameterization within reconstruction is another powerful regularization technique in which the object in the scanner is re-parameterized using coefficients for spatially-extensive basis vectors. In this work, a method for extracting patch-based basis vectors from the subject’s MR image is proposed. The coefficients for these basis vectors are then estimated using the conventional MLEM algorithm. Furthermore, using the alternating direction method of multipliers, an algorithm for optimizing the Poisson log-likelihood while imposing sparsity on the parameters is also proposed. This novel method is then utilized to find sparse coefficients for the patch-based basis vectors extracted from the MR image. The results indicate the superiority of the proposed methods to patch-based regularization using the penalized likelihood framework.
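    The re-parameterization described above can be sketched with the standard MLEM multiplicative update applied to basis-vector coefficients rather than voxels. The snippet below is an illustrative toy under stated assumptions (dense matrices, no MR-derived dictionary and no ADMM sparsity step; all names are hypothetical): the image is x = B c, and the combined matrix P = A B takes the place of the usual projector.

    ```python
    import numpy as np

    def mlem_coefficients(y, A, B, n_iter=20, eps=1e-12):
        """MLEM estimate of nonnegative coefficients c for an image x = B @ c,
        given Poisson-distributed data y with mean A @ B @ c."""
        P = A @ B                        # combined system matrix
        sens = P.sum(axis=0) + eps       # sensitivity (normalisation) term
        c = np.ones(P.shape[1])          # uniform positive starting point
        for _ in range(n_iter):
            ratio = y / (P @ c + eps)    # measured / predicted counts
            c *= (P.T @ ratio) / sens    # multiplicative MLEM update
        return B @ c, c                  # reconstructed image, coefficients
    ```

    Because the update is multiplicative, coefficients initialized positive remain nonnegative throughout, which is what makes the re-parameterized MLEM iteration well defined.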

  10. UIAGM Ropehandling Techniques.

    ERIC Educational Resources Information Center

    Cloutier, K. Ross

    The Union Internationale des Associations des Guides de Montagne's (UIAGM) rope handling techniques are intended to form the standard for guiding ropework worldwide. These techniques have become the legal standard for instructional institutions and commercial guiding organizations in UIAGM member countries: Austria, Canada, France, Germany, Great…

  11. 45 CFR 2543.84 - Contract Work Hours and Safety Standards Act.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Contract Work Hours and Safety Standards Act. 2543... laborer on the basis of a standard work week of 40 hours. Work in excess of the standard work week is... pay for all hours worked in excess of 40 hours in the work week. Section 107 of the Act is applicable...

  12. 29 CFR 1910.1001 - Asbestos.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... covered by this standard take place. Certified industrial hygienist (CIH) means one certified in the... TWA employee exposures shall be determined on the basis of one or more samples representing full-shift...-minute short-term employee exposures shall be determined on the basis of one or more samples representing...

  13. 7 CFR 868.203 - Basis of determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... FOR CERTAIN AGRICULTURAL COMMODITIES United States Standards for Rough Rice Principles Governing..., heat-damaged kernels, red rice and damaged kernels, chalky kernels, other types, color, and the special grade Parboiled rough rice shall be on the basis of the whole and large broken kernels of milled rice...

  14. 7 CFR 868.203 - Basis of determination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... FOR CERTAIN AGRICULTURAL COMMODITIES United States Standards for Rough Rice Principles Governing..., heat-damaged kernels, red rice and damaged kernels, chalky kernels, other types, color, and the special grade Parboiled rough rice shall be on the basis of the whole and large broken kernels of milled rice...

  15. A Prelinguistic Gestural Universal of Human Communication

    ERIC Educational Resources Information Center

    Liszkowski, Ulf; Brown, Penny; Callaghan, Tara; Takada, Akira; de Vos, Conny

    2012-01-01

    Several cognitive accounts of human communication argue for a language-independent, prelinguistic basis of human communication and language. The current study provides evidence for the universality of a prelinguistic gestural basis for human communication. We used a standardized, semi-natural elicitation procedure in seven very different cultures…

  16. 7 CFR 91.45 - Charges for laboratory services on a contract basis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... basis as will reimburse the Agricultural Marketing Service of the Department for the full cost of... that will reimburse the Agricultural Marketing Service of the Department for the full cost of rendering... MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED...

  17. [Abdominal ultrasound course an introduction to the ultrasound technique. Physical basis. Ultrasound language].

    PubMed

    Segura-Grau, A; Sáez-Fernández, A; Rodríguez-Lorenzo, A; Díaz-Rodríguez, N

    2014-01-01

Ultrasound is a non-invasive, accessible, and versatile diagnostic technique that uses high-frequency ultrasound waves to outline the organs of the human body, with no ionising radiation, in real time and with the capacity to visualize several planes. The high diagnostic yield of the technique, together with its ease of use and the previously mentioned characteristics, has made it a routine method in daily medical practice. It is for this reason that the multidisciplinary character of this technique is being strengthened every day. Performing the technique correctly requires knowledge of the physical basis of ultrasound, the method and the equipment, as well as of human anatomy, in order to have the maximum information possible and to avoid diagnostic errors due to poor interpretation or lack of information. Copyright © 2013 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Published by Elsevier España. All rights reserved.

  18. Novel, customizable scoring functions, parameterized using N-PLS, for structure-based drug discovery.

    PubMed

    Catana, Cornel; Stouten, Pieter F W

    2007-01-01

    The ability to accurately predict biological affinity on the basis of in silico docking to a protein target remains a challenging goal in the CADD arena. Typically, "standard" scoring functions have been employed that use the calculated docking result and a set of empirical parameters to calculate a predicted binding affinity. To improve on this, we are exploring novel strategies for rapidly developing and tuning "customized" scoring functions tailored to a specific need. In the present work, three such customized scoring functions were developed using a set of 129 high-resolution protein-ligand crystal structures with measured Ki values. The functions were parametrized using N-PLS (N-way partial least squares), a multivariate technique well-known in the 3D quantitative structure-activity relationship field. A modest correlation between observed and calculated pKi values using a standard scoring function (r2 = 0.5) could be improved to 0.8 when a customized scoring function was applied. To mimic a more realistic scenario, a second scoring function was developed, not based on crystal structures but exclusively on several binding poses generated with the Flo+ docking program. Finally, a validation study was conducted by generating a third scoring function with 99 randomly selected complexes from the 129 as a training set and predicting pKi values for a test set that comprised the remaining 30 complexes. Training and test set r2 values were 0.77 and 0.78, respectively. These results indicate that, even without direct structural information, predictive customized scoring functions can be developed using N-PLS, and this approach holds significant potential as a general procedure for predicting binding affinity on the basis of in silico docking.

  19. 48 CFR 9905.505-50 - Techniques for application.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... this cost accounting principle does not require that allocation of unallowable costs to final cost.... 9905.505-50 Section 9905.505-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS FOR EDUCATIONAL INSTITUTIONS 9905.505-50 Techniques for...

  20. 48 CFR 9904.403-50 - Techniques for application.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... 9904.403-50 Section 9904.403-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.403-50 Techniques for application. (a)(1) Separate...

  1. Stabilization techniques for unpaved roads.

    DOT National Transportation Integrated Search

    2004-01-01

    This study presents the basis for evaluating promising soil stabilization products using the relatively new technique of deeply mixing chemical additives into unpaved roadbeds. The work is in response to an amendment to House Bill 1400, Item 490, No....

  2. Evaluation of digital radiography practice using exposure index tracking

    PubMed Central

    Zhou, Yifang; Allahverdian, Janet; Nute, Jessica L.; Lee, Christina

    2016-01-01

    Some digital radiography (DR) detectors and software allow for remote download of exam statistics, including image reject status, body part, projection, and exposure index (EI). The ability to have automated data collection from multiple DR units is conducive to a quality control (QC) program monitoring institutional radiographic exposures. We have implemented such a QC program with the goal to identify outliers in machine radiation output and opportunities for improvement in radiation dose levels. We studied the QC records of four digital detectors in greater detail on a monthly basis for one year. Although individual patient entrance skin exposure varied, the radiation dose levels to the detectors were made to be consistent via phototimer recalibration. The exposure data stored on each digital detector were periodically downloaded in a spreadsheet format for analysis. EI median and standard deviation were calculated for each protocol (by body part) and EI histograms were created for torso protocols. When histograms of EI values for different units were compared, we observed differences up to 400 in average EI (representing 60% difference in radiation levels to the detector) between units nominally calibrated to the same EI. We identified distinct components of the EI distributions, which in some cases, had mean EI values 300 apart. Peaks were observed at the current calibrated EI, a previously calibrated EI, and an EI representing computed radiography (CR) techniques. Our findings in this ongoing project have allowed us to make useful interventions, from emphasizing the use of phototimers instead of institutional memory of manual techniques to improvements in our phototimer calibration. We believe that this QC program can be implemented at other sites and can reveal problems with radiation levels in the aggregate that are difficult to identify on a case‐by‐case basis. PACS number(s): 87.59.bf PMID:27929507
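The per-protocol EI statistics described above can be sketched in a few lines of pandas. The column names ("body_part", "EI") and the simulated log are illustrative assumptions, not the export format of any particular DR vendor.

```python
# Sketch: per-protocol exposure-index (EI) summary and drift flagging,
# mirroring the QC program described above. Data are simulated.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Simulated monthly download from one detector: body part + exposure index.
log = pd.DataFrame({
    "body_part": rng.choice(["chest", "abdomen", "pelvis"], size=300),
    "EI": np.round(rng.normal(1400, 120, size=300)),
})

# Median and standard deviation of EI for each protocol (by body part).
summary = log.groupby("body_part")["EI"].agg(["median", "std", "count"])
print(summary)

# Flag protocols whose median EI drifts far from the calibrated target;
# the 300-EI threshold echoes the separation between histogram components
# reported in the study.
target_ei = 1400
drifting = summary[(summary["median"] - target_ei).abs() > 300]
print(drifting)
```

A histogram of `log["EI"]` per unit would expose the multi-peak structure (current calibration, old calibration, legacy CR technique) the authors describe.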

  3. Standardization of Laser Methods and Techniques for Vibration Measurements and Calibrations

    NASA Astrophysics Data System (ADS)

    von Martens, Hans-Jürgen

    2010-05-01

The realization and dissemination of the SI units of motion quantities (vibration and shock) have been based on laser interferometer methods specified in international documentary standards. New and refined laser methods and techniques developed by national metrology institutes and by leading manufacturers in the past two decades have been swiftly specified as standard methods for inclusion in the ISO 16063 series of international documentary standards. A survey of ISO standards for the calibration of vibration and shock transducers demonstrates the extended ranges and improved accuracy (measurement uncertainty) of laser methods and techniques for vibration and shock measurements and calibrations. The first standard for the calibration of laser vibrometers by laser interferometry, or by a reference accelerometer calibrated by laser interferometry (ISO 16063-41), is at the Draft International Standard (DIS) stage and may be issued by the end of 2010. The standard methods with refined techniques proved to achieve wider measurement ranges and smaller measurement uncertainties than those specified in the ISO standards. The applicability of different standardized interferometer methods to vibrations at high frequencies was recently demonstrated up to 347 kHz (acceleration amplitudes up to 350 km/s2). The relative deviations between the amplitude measurement results of the different interferometer methods, applied simultaneously, were less than 1% in all cases.

  4. A Chromosome-Scale Assembly of the Bactrocera cucurbitae Genome Provides Insight to the Genetic Basis of white pupae

    PubMed Central

    Sim, Sheina B.; Geib, Scott M.

    2017-01-01

    Genetic sexing strains (GSS) used in sterile insect technique (SIT) programs are textbook examples of how classical Mendelian genetics can be directly implemented in the management of agricultural insect pests. Although the foundation of traditionally developed GSS are single locus, autosomal recessive traits, their genetic basis are largely unknown. With the advent of modern genomic techniques, the genetic basis of sexing traits in GSS can now be further investigated. This study is the first of its kind to integrate traditional genetic techniques with emerging genomics to characterize a GSS using the tephritid fruit fly pest Bactrocera cucurbitae as a model. These techniques include whole-genome sequencing, the development of a mapping population and linkage map, and quantitative trait analysis. The experiment designed to map the genetic sexing trait in B. cucurbitae, white pupae (wp), also enabled the generation of a chromosome-scale genome assembly by integrating the linkage map with the assembly. Quantitative trait loci analysis revealed SNP loci near position 42 MB on chromosome 3 to be tightly linked to wp. Gene annotation and synteny analysis show a near perfect relationship between chromosomes in B. cucurbitae and Muller elements A–E in Drosophila melanogaster. This chromosome-scale genome assembly is complete, has high contiguity, was generated using a minimal input DNA, and will be used to further characterize the genetic mechanisms underlying wp. Knowledge of the genetic basis of genetic sexing traits can be used to improve SIT in this species and expand it to other economically important Diptera. PMID:28450369

  5. 7 CFR 58.132 - Basis for classification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG PRODUCTS INSPECTION ACT (CONTINUED) GRADING AND INSPECTION...

  6. 7 CFR 58.132 - Basis for classification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG PRODUCTS INSPECTION ACT (CONTINUED) GRADING AND INSPECTION...

  7. 7 CFR 58.132 - Basis for classification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG PRODUCTS INSPECTION ACT (CONTINUED) GRADING AND INSPECTION...

  8. Our Hidden Past: Biology, Part 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Ray; Russell, Liane; Mazur, Peter

    In their new home at "The Mouse House" at Y-12, researchers from ORNL's Biology Division conducted studies that led to standards such as dose rate effects that form the basis for current international standards for radiation exposure in humans.

  9. 42 CFR 495.300 - Basis and purpose.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for adopting, implementing, or upgrading certified EHR technology or for meaningful use of such technology. This subpart also provides enhanced Federal financial participation (FFP) to States to administer...) STANDARDS AND CERTIFICATION STANDARDS FOR THE ELECTRONIC HEALTH RECORD TECHNOLOGY INCENTIVE PROGRAM...

  10. 50 CFR 261.103 - Basis for determination of a U.S. Standard for Grades.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... fillet). (2) Product forms, which describe the types, styles and market forms covered by the standard (e...., bruises, blood spots, bones, black spots, coating defects, 1-inch squares, percent by weight, ratios). (8...

  11. 50 CFR 261.103 - Basis for determination of a U.S. Standard for Grades.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... fillet). (2) Product forms, which describe the types, styles and market forms covered by the standard (e...., bruises, blood spots, bones, black spots, coating defects, 1-inch squares, percent by weight, ratios). (8...

  12. 7 CFR 54.1005 - Basis of service.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED) REGULATIONS AND STANDARDS UNDER THE AGRICULTURAL MARKETING ACT OF 1946 AND THE EGG PRODUCTS INSPECTION ACT (CONTINUED) MEATS, PREPARED MEATS, AND...

  13. [A web-based integrated clinical database for laryngeal cancer].

    PubMed

    E, Qimin; Liu, Jialin; Li, Yong; Liang, Chuanyu

    2014-08-01

To establish an integrated database for laryngeal cancer that provides an information platform for clinical and fundamental research on the disease and meets the needs of both clinical and scientific use. Under the guidance of clinical experts, we constructed a web-based integrated clinical database for laryngeal carcinoma on the basis of clinical data standards, Apache+PHP+MySQL technology, the specialist characteristics of laryngeal cancer, and tumor genetic information. A web-based integrated clinical database for laryngeal carcinoma was developed. The database has a user-friendly interface, and data can be entered and queried conveniently. In addition, the system applies clinical data standards and exchanges information with the existing electronic medical record system to avoid information silos. Furthermore, the database forms integrate the specialist characteristics of laryngeal cancer and tumor genetic information. The database offers comprehensive specialist information, strong expandability, and high technical feasibility, and conforms to the clinical characteristics of the laryngeal cancer specialty. By applying clinical data standards and handling clinical data in structured form, it can better meet the needs of scientific research and facilitate information exchange, and the information collected about tumor patients is highly informative. Users can also access and manipulate the database conveniently and swiftly over the Internet.

  14. Calibrating airborne measurements of airspeed, pressure and temperature using a Doppler laser air-motion sensor

    NASA Astrophysics Data System (ADS)

    Cooper, W. A.; Spuler, S. M.; Spowart, M.; Lenschow, D. H.; Friesen, R. B.

    2014-09-01

    A new laser air-motion sensor measures the true airspeed with a standard uncertainty of less than 0.1 m s-1 and so reduces uncertainty in the measured component of the relative wind along the longitudinal axis of the aircraft to about the same level. The calculated pressure expected from that airspeed at the inlet of a pitot tube then provides a basis for calibrating the measurements of dynamic and static pressure, reducing standard uncertainty in those measurements to less than 0.3 hPa and the precision applicable to steady flight conditions to about 0.1 hPa. These improved measurements of pressure, combined with high-resolution measurements of geometric altitude from the global positioning system, then indicate (via integrations of the hydrostatic equation during climbs and descents) that the offset and uncertainty in temperature measurement for one research aircraft are +0.3 ± 0.3 °C. For airspeed, pressure and temperature, these are significant reductions in uncertainty vs. those obtained from calibrations using standard techniques. Finally, it is shown that although the initial calibration of the measured static and dynamic pressures requires a measured temperature, once calibrated these measured pressures and the measurement of airspeed from the new laser air-motion sensor provide a measurement of temperature that does not depend on any other temperature sensor.
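The closing claim, that calibrated airspeed plus calibrated static and dynamic pressures yield a temperature measurement with no temperature sensor, follows from standard subsonic compressible-flow relations. A minimal sketch, with illustrative (not flight-measured) values:

```python
# Sketch: air temperature from true airspeed and calibrated pressures.
# Uses the standard adiabatic pitot relation; input values are illustrative.
import math

GAMMA = 1.4      # ratio of specific heats for dry air
R_AIR = 287.05   # specific gas constant for dry air, J kg^-1 K^-1

def temperature_from_airspeed(tas, p_static, q_dynamic):
    """Air temperature (K) from true airspeed (m/s) and static and
    dynamic (impact) pressures (any consistent units, e.g. hPa)."""
    # Mach number from the pressure ratio (subsonic, adiabatic flow):
    #   q/p = (1 + 0.2 M^2)^3.5 - 1
    mach_sq = 5.0 * ((q_dynamic / p_static + 1.0) ** (2.0 / 7.0) - 1.0)
    # TAS = M * sqrt(gamma * R * T)  =>  T = TAS^2 / (gamma * R * M^2)
    return tas**2 / (GAMMA * R_AIR * mach_sq)

# Example: 120 m/s true airspeed, 700 hPa static, 60 hPa dynamic pressure.
T = temperature_from_airspeed(120.0, 700.0, 60.0)
print(f"derived temperature: {T - 273.15:.1f} degC")
```

The pressure calibration itself needs a temperature once, as the abstract notes, but thereafter the relation can be inverted as above.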

  15. 45 CFR 160.101 - Statutory basis and purpose.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Statutory basis and purpose. 160.101 Section 160.101 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES ADMINISTRATIVE DATA STANDARDS AND RELATED... requirements of this subchapter implement sections 1171-1180 of the Social Security Act (the Act), sections 262...

  16. 45 CFR 160.101 - Statutory basis and purpose.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Statutory basis and purpose. 160.101 Section 160.101 Public Welfare Department of Health and Human Services ADMINISTRATIVE DATA STANDARDS AND RELATED... requirements of this subchapter implement sections 1171-1180 of the Social Security Act (the Act), sections 262...

  17. 42 CFR 489.1 - Statutory basis.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 5 2014-10-01 2014-10-01 false Statutory basis. 489.1 Section 489.1 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS... regulations for the administration of the Medicare program. (b) Although section 1866 of the Act speaks only...

  18. 42 CFR 489.1 - Statutory basis.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 5 2013-10-01 2013-10-01 false Statutory basis. 489.1 Section 489.1 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS... regulations for the administration of the Medicare program. (b) Although section 1866 of the Act speaks only...

  19. 42 CFR 489.1 - Statutory basis.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 5 2012-10-01 2012-10-01 false Statutory basis. 489.1 Section 489.1 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS... regulations for the administration of the Medicare program. (b) Although section 1866 of the Act speaks only...

  20. 42 CFR 483.400 - Basis and purpose.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 5 2012-10-01 2012-10-01 false Basis and purpose. 483.400 Section 483.400 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION REQUIREMENTS FOR STATES AND LONG TERM CARE FACILITIES Conditions of Participation for Intermediate Care Facilities for...

  1. 42 CFR 483.400 - Basis and purpose.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 5 2013-10-01 2013-10-01 false Basis and purpose. 483.400 Section 483.400 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION REQUIREMENTS FOR STATES AND LONG TERM CARE FACILITIES Conditions of Participation for Intermediate Care Facilities for...

  2. 42 CFR 483.400 - Basis and purpose.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 5 2014-10-01 2014-10-01 false Basis and purpose. 483.400 Section 483.400 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION REQUIREMENTS FOR STATES AND LONG TERM CARE FACILITIES Conditions of Participation for Intermediate Care Facilities for...

  3. 42 CFR 442.1 - Basis and purpose.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Basis and purpose. 442.1 Section 442.1 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS STANDARDS FOR PAYMENT TO NURSING FACILITIES AND INTERMEDIATE CARE FACILITIES FOR INDIVIDUALS WITH INTELLECTUAL DISABILITIES...

  4. 42 CFR 442.1 - Basis and purpose.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 4 2013-10-01 2013-10-01 false Basis and purpose. 442.1 Section 442.1 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS STANDARDS FOR PAYMENT TO NURSING FACILITIES AND INTERMEDIATE CARE FACILITIES FOR INDIVIDUALS WITH INTELLECTUAL DISABILITIES...

  5. 40 CFR 60.54a - Standard for municipal waste combustor acid gases.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... for Municipal Waste Combustors for Which Construction is Commenced After December 20, 1989 and on or... weight or volume) or 30 parts per million by volume, corrected to 7 percent oxygen (dry basis), whichever... by volume, corrected to 7 percent oxygen (dry basis), whichever is less stringent. ...

  6. 5 CFR 900.603 - Standards for a merit system of personnel administration.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... administration. 900.603 Section 900.603 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL... such merit principles as— (a) Recruiting, selecting, and advancing employees on the basis of their... high quality performance. (d) Retaining employees on the basis of the adequacy of their performance...

  7. 42 CFR 489.1 - Statutory basis.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Statutory basis. 489.1 Section 489.1 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS... regulations for the administration of the Medicare program. (b) Although section 1866 of the Act speaks only...

  8. 42 CFR 489.1 - Statutory basis.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 5 2011-10-01 2011-10-01 false Statutory basis. 489.1 Section 489.1 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS... regulations for the administration of the Medicare program. (b) Although section 1866 of the Act speaks only...

  9. Correlation consistent basis sets for actinides. I. The Th and U atoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Kirk A., E-mail: kipeters@wsu.edu

    New correlation consistent basis sets based on both pseudopotential (PP) and all-electron Douglas-Kroll-Hess (DKH) Hamiltonians have been developed from double- to quadruple-zeta quality for the actinide atoms thorium and uranium. Sets for valence electron correlation (5f6s6p6d), cc − pV nZ − PP and cc − pV nZ − DK3, as well as outer-core correlation (valence + 5s5p5d), cc − pwCV nZ − PP and cc − pwCV nZ − DK3, are reported (n = D, T, Q). The -PP sets are constructed in conjunction with small-core, 60-electron PPs, while the -DK3 sets utilized the 3rd-order Douglas-Kroll-Hess scalar relativistic Hamiltonian. Both series of basis sets show systematic convergence towards the complete basis set limit, both at the Hartree-Fock and correlated levels of theory, making them amenable to standard basis set extrapolation techniques. To assess the utility of the new basis sets, extensive coupled cluster composite thermochemistry calculations of ThF{sub n} (n = 2 − 4), ThO{sub 2}, and UF{sub n} (n = 4 − 6) have been carried out. After accurately accounting for valence and outer-core correlation, spin-orbit coupling, and even Lamb shift effects, the final 298 K atomization enthalpies of ThF{sub 4}, ThF{sub 3}, ThF{sub 2}, and ThO{sub 2} are all within their experimental uncertainties. Bond dissociation energies of ThF{sub 4} and ThF{sub 3}, as well as UF{sub 6} and UF{sub 5}, were similarly accurate. The derived enthalpies of formation for these species also showed a very satisfactory agreement with experiment, demonstrating that the new basis sets allow for the use of accurate composite schemes just as in molecular systems composed only of lighter atoms. The differences between the PP and DK3 approaches were found to increase with the change in formal oxidation state on the actinide atom, approaching 5-6 kcal/mol for the atomization enthalpies of ThF{sub 4} and ThO{sub 2}. The DKH3 atomization energy of ThO{sub 2} was calculated to be smaller than the DKH2 value by ∼1 kcal/mol.
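The "standard basis set extrapolation techniques" mentioned above are commonly realized as a two-point 1/n³ fit to correlation energies at consecutive cardinal numbers. A minimal sketch, with placeholder energies rather than values from the paper:

```python
# Sketch: two-point complete-basis-set (CBS) extrapolation assuming
#   E(n) = E_CBS + A / n^3,  n = cardinal number of the cc basis.
# Energies below are illustrative placeholders, not results from the study.
def cbs_two_point(e_small, e_large, n_small, n_large):
    """Eliminate A from the 1/n^3 model using two basis-set energies."""
    a, b = n_small**3, n_large**3
    return (b * e_large - a * e_small) / (b - a)

# Example: correlation energies (hartree) at triple- and quadruple-zeta.
e_tz, e_qz = -1.2345, -1.2501
e_cbs = cbs_two_point(e_tz, e_qz, 3, 4)
print(f"CBS estimate: {e_cbs:.4f} Eh")
```

The extrapolated value lies below the quadruple-zeta energy, as expected for a systematically convergent correlation-consistent series.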

  10. Discussion on the Criterion for the Safety Certification Basis Compilation - Brazilian Space Program Case

    NASA Astrophysics Data System (ADS)

    Niwa, M.; Alves, N. C.; Caetano, A. O.; Andrade, N. S. O.

    2012-01-01

    The recent advent of commercial launch and re-entry activities, promoting the expansion of human access to space for tourism and hypersonic travel within the already complex environment of global space activities, has brought additional difficulties to the development of a harmonized framework of international safety rules. In the present work, with the purpose of providing complementary elements for global safety rule development, the certification-related activities conducted in the Brazilian space program are depicted and discussed, focusing mainly on the criterion for certification basis compilation. The results suggest that composing a certification basis preferentially from internationally recognized standards, as is the case for ISO standards, can be a first step toward the development of an international safety regulation for commercial space activities.

  11. 48 CFR 9904.409-50 - Techniques for application.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... 9904.409-50 Section 9904.409-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.409-50 Techniques for application. (a) Determination of... of consumption of services in the cost accounting periods included in such life. In selecting service...

  12. 48 CFR 9904.414-50 - Techniques for application.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the case of process cost accounting systems, the contracting parties may agree to substitute an.... 9904.414-50 Section 9904.414-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD... ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.414-50 Techniques for application. (a) The investment...

  13. 48 CFR 9904.404-50 - Techniques for application.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... 9904.404-50 Section 9904.404-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.404-50 Techniques for application. (a) The cost to...

  14. 48 CFR 9904.405-50 - Techniques for application.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... 9904.405-50 Section 9904.405-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.405-50 Techniques for application. (a) The detail and...

  15. 48 CFR 9904.406-50 - Techniques for application.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... 9904.406-50 Section 9904.406-50 Federal Acquisition Regulations System COST ACCOUNTING STANDARDS BOARD, OFFICE OF FEDERAL PROCUREMENT POLICY, OFFICE OF MANAGEMENT AND BUDGET PROCUREMENT PRACTICES AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.406-50 Techniques for application. (a) The cost of an...

  16. Passive Sampling in Regulatory Chemical Monitoring of Nonpolar Organic Compounds in the Aquatic Environment.

    PubMed

    Booij, Kees; Robinson, Craig D; Burgess, Robert M; Mayer, Philipp; Roberts, Cindy A; Ahrens, Lutz; Allan, Ian J; Brant, Jan; Jones, Lisa; Kraus, Uta R; Larsen, Martin M; Lepom, Peter; Petersen, Jördis; Pröfrock, Daniel; Roose, Patrick; Schäfer, Sabine; Smedes, Foppe; Tixier, Céline; Vorkamp, Katrin; Whitehouse, Paul

    2016-01-05

    We reviewed compliance monitoring requirements in the European Union, the United States, and the Oslo-Paris Convention for the protection of the marine environment of the North-East Atlantic, and evaluated if these are met by passive sampling methods for nonpolar compounds. The strengths and shortcomings of passive sampling are assessed for water, sediments, and biota. Passive water sampling is a suitable technique for measuring concentrations of freely dissolved compounds. This method yields results that are incompatible with the EU's quality standard definition in terms of total concentrations in water, but this definition has little scientific basis. Insufficient quality control is a present weakness of passive sampling in water. Laboratory performance studies and the development of standardized methods are needed to improve data quality and to encourage the use of passive sampling by commercial laboratories and monitoring agencies. Successful prediction of bioaccumulation based on passive sampling is well documented for organisms at the lower trophic levels, but requires more research for higher levels. Despite the existence of several knowledge gaps, passive sampling presently is the best available technology for chemical monitoring of nonpolar organic compounds. Key issues to be addressed by scientists and environmental managers are outlined.

  17. Study to validate the outcome goal, competencies and educational objectives for use in intensive care orientation programs.

    PubMed

    Boyle, M; Butcher, R; Kenney, C

    1998-03-01

    Intensive care orientation programs have become an accepted component of intensive care education. To date, however, there have been no Australian-based standards defining the appropriate level of competence to be attained upon completion of orientation. The aim of this study was to validate a set of aims, competencies and educational objectives that could form the basis of intensive care orientation and which would ensure an outcome standard of safe and effective practice. An initial document containing a statement of the desired outcome goal, six competency statements and 182 educational objectives was developed through a review of the orientation programs developed by the investigators. The Delphi technique was used to gain consensus among 13 nurses recognised for their expertise in intensive care education. The expert group rated the acceptability of each of the study items and provided suggestions for objectives to be included. An approval rating of 80 per cent was required to retain each of the study items, with the document refined through three Delphi rounds. The final document contains a validated statement of outcome goal, competencies and educational objectives for intensive care orientation programs.

  18. Artifact-Based Transformation of IBM Global Financing

    NASA Astrophysics Data System (ADS)

    Chao, Tian; Cohn, David; Flatgard, Adrian; Hahn, Sandy; Linehan, Mark; Nandi, Prabir; Nigam, Anil; Pinel, Florian; Vergo, John; Wu, Frederick Y.

    IBM Global Financing (IGF) is transforming its business using the Business Artifact Method, an innovative business process modeling technique that identifies key business artifacts and traces their life cycles as they are processed by the business. IGF is a complex, global business operation with many business design challenges. The Business Artifact Method is a fundamental shift in how to conceptualize, design and implement business operations. The Business Artifact Method was extended to solve the problem of designing a global standard for a complex, end-to-end process while supporting local geographic variations. Prior to employing the Business Artifact method, process decomposition, Lean and Six Sigma methods were each employed on different parts of the financing operation. Although they provided critical input to the final operational model, they proved insufficient for designing a complete, integrated, standard operation. The artifact method resulted in a business operations model that was at the right level of granularity for the problem at hand. A fully functional rapid prototype was created early in the engagement, which facilitated an improved understanding of the redesigned operations model. The resulting business operations model is being used as the basis for all aspects of business transformation in IBM Global Financing.

  19. Noise Estimation and Adaptive Encoding for Asymmetric Quantum Error Correcting Codes

    NASA Astrophysics Data System (ADS)

    Florjanczyk, Jan; Brun, Todd; Center for Quantum Information Science and Technology Team

    We present a technique that improves the performance of asymmetric quantum error correcting codes in the presence of biased qubit noise channels. Our study is motivated by considering what useful information can be learned from the statistics of syndrome measurements in stabilizer quantum error correcting codes (QECCs). We consider the case of a qubit dephasing channel where the dephasing axis is unknown and time-varying. We are able to estimate the dephasing angle from the statistics of the standard syndrome measurements used in stabilizer QECCs. We use this estimate to rotate the computational basis of the code in such a way that the most likely type of error is covered by the highest distance of the asymmetric code. In particular, we use the [[15,1,3]] shortened Reed-Muller code, which can correct one phase-flip error but up to three bit-flip errors. In our simulations, we tune the computational basis to match the estimated dephasing axis, which in turn leads to a decrease in the probability of a phase-flip error. With a sufficiently accurate estimate of the dephasing axis, our memory's effective error is dominated by the much lower probability of four bit-flips. ARO MURI Grant No. W911NF-11-1-0268.
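    The estimation step can be illustrated with a deliberately simplified model, which is not the authors' procedure: suppose an error flagged by the syndromes is X-type with probability sin²θ and Z-type with probability cos²θ, where θ is the tilt of the dephasing axis from Z. All numbers and the function below are illustrative:

```python
import numpy as np

def estimate_dephasing_angle(n_x, n_z):
    """Toy maximum-likelihood angle estimate from syndrome counts.

    Under the assumed model, P(X-type) = sin^2(theta) and
    P(Z-type) = cos^2(theta), so tan(theta) = sqrt(n_x / n_z).
    """
    return np.arctan2(np.sqrt(n_x), np.sqrt(n_z))

true_theta = 0.3
rng = np.random.default_rng(1)
n_trials = 100_000
is_x_type = rng.random(n_trials) < np.sin(true_theta) ** 2
n_x = is_x_type.sum()
n_z = n_trials - n_x
theta_hat = estimate_dephasing_angle(n_x, n_z)
```

    With the estimate in hand, the computational basis would be rotated by θ̂ so that the dominant error channel aligns with the code's higher-distance (bit-flip) protection.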

  20. Representational analysis of extended disorder in atomistic ensembles derived from total scattering data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neilson, James R.; McQueen, Tyrel M.

    With the increased availability of high-intensity time-of-flight neutron and synchrotron X-ray scattering sources that can access wide ranges of momentum transfer, the pair distribution function method has become a standard analysis technique for studying disorder of local coordination spheres and at intermediate atomic separations. In some cases, rational modeling of the total scattering data (Bragg and diffuse) becomes intractable with least-squares approaches, necessitating reverse Monte Carlo simulations using large atomistic ensembles. However, the extraction of meaningful information from the resulting atomistic ensembles is challenging, especially at intermediate length scales. Representational analysis is used here to describe the displacements of atoms in reverse Monte Carlo ensembles from an ideal crystallographic structure in an approach analogous to tight-binding methods. Rewriting the displacements in terms of a local basis that is descriptive of the ideal crystallographic symmetry provides a robust approach to characterizing medium-range order (and disorder) and symmetry breaking in complex and disordered crystalline materials. Lastly, this method enables the extraction of statistically relevant displacement modes (orientation, amplitude and distribution) of the crystalline disorder and provides directly meaningful information in a locally symmetry-adapted basis set that is most descriptive of the crystal chemistry and physics.
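    The core projection step can be sketched in a few lines: given an orthonormal set of symmetry-adapted displacement modes, the displacement vector of one ensemble configuration decomposes into mode amplitudes by inner products (equivalently, least squares for an orthonormal basis). The modes and amplitudes below are invented for illustration:

```python
import numpy as np

# Hypothetical 4-atom cell; displacement space has 3 * n_atoms components.
n_atoms = 4
modes = np.zeros((2, 3 * n_atoms))
# Mode 0: all atoms displace along +x together (a "ferro" pattern).
modes[0, 0::3] = 0.5
# Mode 1: alternating +x / -x displacements (an "antiferro" pattern).
modes[1, 0::3] = [0.5, -0.5, 0.5, -0.5]
# Both rows have unit norm and are mutually orthogonal.

# One reverse-Monte-Carlo configuration, built here from known amplitudes.
displacements = 0.1 * modes[0] + 0.02 * modes[1]

# Projection onto the symmetry-adapted basis recovers the amplitudes.
amplitudes = modes @ displacements
```

    Applied across a whole ensemble, the distribution of each amplitude gives the orientation, magnitude, and spread of the corresponding displacement mode.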

  1. Representational analysis of extended disorder in atomistic ensembles derived from total scattering data

    DOE PAGES

    Neilson, James R.; McQueen, Tyrel M.

    2015-09-20

    With the increased availability of high-intensity time-of-flight neutron and synchrotron X-ray scattering sources that can access wide ranges of momentum transfer, the pair distribution function method has become a standard analysis technique for studying disorder of local coordination spheres and at intermediate atomic separations. In some cases, rational modeling of the total scattering data (Bragg and diffuse) becomes intractable with least-squares approaches, necessitating reverse Monte Carlo simulations using large atomistic ensembles. However, the extraction of meaningful information from the resulting atomistic ensembles is challenging, especially at intermediate length scales. Representational analysis is used here to describe the displacements of atoms in reverse Monte Carlo ensembles from an ideal crystallographic structure in an approach analogous to tight-binding methods. Rewriting the displacements in terms of a local basis that is descriptive of the ideal crystallographic symmetry provides a robust approach to characterizing medium-range order (and disorder) and symmetry breaking in complex and disordered crystalline materials. Lastly, this method enables the extraction of statistically relevant displacement modes (orientation, amplitude and distribution) of the crystalline disorder and provides directly meaningful information in a locally symmetry-adapted basis set that is most descriptive of the crystal chemistry and physics.

  2. Amobarbital treatment of multiple personality. Use of structured video tape interviews as a basis for intensive psychotherapy.

    PubMed

    Hall, R C; LeCann, A F; Schoolar, J C

    1978-09-01

    The case of a 30-year-old woman with five distinct personalities is presented. The patient was treated using a system of structured video taped sodium amobarbital interviews, in which areas to be explored were developed in psychotherapy. Tapes were played for the patient after each session. The taped material was used as the basis for psychotherapeutic investigation. The patient evidenced many of the features previously reported in cases of multiple personality, specifically: being the product of an unwanted pregnancy in a repressively rigid family; emotional distancing by one parent; strong sibling rivalry with an adopted sib; family history of mental illness; a traumatic first sexual experience (rape); a marriage to a maladjusted individual in an attempt to escape the parental home; a high internalized standard of performance and an inability to display anger or negative feelings toward the parents. In the course of treatment, the patient's personalities fused and she was able to accept each component as part of herself. No further fragmentation has occurred during the year following discharge. The therapy technique minimized dependency on, and the possibility of addiction to, amobarbital interviews; permitted more active patient involvement in therapy; and set clear-cut goals and expectations for improvement before further amobarbital interviews could be conducted.

  3. Development of Laboratory Investigations in Disorders of Sex Development.

    PubMed

    Audí, Laura; Camats, Núria; Fernández-Cancio, Mónica; Granada, María L

    2018-01-01

    Scientific knowledge to understand the biological basis of sex development was prompted by the observation of variants different from the 2 most frequent body types, and this became one of the fields first studied by modern pediatric endocrinology. The clinical observation was supported by professionals working in different areas of laboratory sciences which led to the description of adrenal and gonadal steroidogenesis, the enzymes involved, and the different deficiencies. Steroid hormone measurements evolved from colorimetry to radioimmunoassay (RIA) and automated immunoassays, although gas and liquid chromatography coupled to mass spectrometry are now the gold standard techniques for steroid measurements. Peptide hormones and growth factors were purified, and their measurement evolved from RIA to automated immunoassays. Hormone action mechanisms were described, and their specific receptors were characterized and assayed in experimental materials and in patient tissues and cell cultures. The discovery of the genetic basis for variant sex developments began with the description of the sex chromosomes. Molecular technology allowed cloning of genes coding for the different proteins involved in sex determination and development. Experimental animal models aided in verifying the roles of proteins and also suggested new genes to be investigated. New candidate genes continue to be described based on experimental models and on next-generation sequencing of patient DNAs. © 2017 S. Karger AG, Basel.

  4. Integration of GMR Sensors with Different Technologies

    PubMed Central

    Cubells-Beltrán, María-Dolores; Reig, Càndid; Madrenas, Jordi; De Marcellis, Andrea; Santos, Joana; Cardoso, Susana; Freitas, Paulo P.

    2016-01-01

    Less than thirty years after the giant magnetoresistance (GMR) effect was described, GMR sensors are the preferred choice in many applications demanding the measurement of low magnetic fields in small volumes. This rapid deployment from theoretical basis to market and state-of-the-art applications can be explained by the combination of excellent inherent properties with the feasibility of fabrication, allowing real integration with many other standard technologies. In this paper, we present a review focusing on how this capability of integration has allowed the improvement of the inherent capabilities and, therefore, the range of application of GMR sensors. After briefly describing the phenomenological basis, we discuss the benefits of low-temperature deposition techniques for the integration of GMR sensors with flexible (plastic) substrates and pre-processed CMOS chips. In this way, the limit of detection can be improved by increasing the sensitivity or reducing the noise. We also report on novel fields of application of GMR sensors by recapitulating a number of successful cases of their integration with different heterogeneous complementary elements. We finally describe three fully functional systems, two of them in the bio-technology world, as proof of how this integrability has been instrumental in the meteoric development of GMR sensors and their applications. PMID:27338415

  5. Psychological tools for knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Rueter, Henry H.; Olson, Judith Reitman

    1988-01-01

    Knowledge acquisition is said to be the biggest bottleneck in the development of expert systems. The problem is getting the knowledge out of the expert's head and into a computer. In cognitive psychology, characterizing mental structures and why experts are good at what they do is an important research area. Is there some way that the tools that psychologists have developed to uncover mental structure can be used to benefit knowledge engineers? We think that the way to find out is to browse through the psychologist's toolbox to see what there is in it that might be of use to knowledge engineers. Expert system developers have relied on two standard methods for extracting knowledge from the expert: (1) the knowledge engineer engages in an intense bout of interviews with the expert or experts, or (2) the knowledge engineer becomes an expert himself, relying on introspection to uncover the basis of his own expertise. Unfortunately, these techniques have the difficulty that often the expert himself isn't consciously aware of the basis of his expertise. If the expert himself isn't conscious of how he solves problems, introspection is useless. Cognitive psychology has faced similar problems for many years and has developed exploratory methods that can be used to discover cognitive structure from simple data.

  6. Application of p-Multigrid to Discontinuous Galerkin Formulations of the Poisson Equation

    NASA Technical Reports Server (NTRS)

    Helenbrook, B. T.; Atkins, H. L.

    2006-01-01

    We investigate p-multigrid as a solution method for several different discontinuous Galerkin (DG) formulations of the Poisson equation. Different combinations of relaxation schemes and basis sets have been combined with the DG formulations to find the best performing combination. The damping factors of the schemes have been determined using Fourier analysis for both one and two-dimensional problems. One important finding is that when using DG formulations, the standard approach of forming the coarse p matrices separately for each level of multigrid is often unstable. To ensure stability the coarse p matrices must be constructed from the fine grid matrices using algebraic multigrid techniques. Of the relaxation schemes, we find that the combination of Jacobi relaxation with the spectral element basis is fairly effective. The results using this combination are p-sensitive in both one and two dimensions, but reasonable convergence rates can still be achieved for moderate values of p and isotropic meshes. A competitive alternative is a block Gauss-Seidel relaxation. This actually outperforms a more expensive line relaxation when the mesh is isotropic. When the mesh becomes highly anisotropic, the implicit line method and the Gauss-Seidel implicit line method are the only effective schemes. Adding the Gauss-Seidel terms to the implicit line method gives a significant improvement over the line relaxation method.
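    As a point of reference for the relaxation schemes discussed, a weighted (damped) Jacobi sweep is the smoother whose high-frequency damping factor Fourier analysis quantifies. The sketch below applies it to a simple one-dimensional finite-difference Poisson matrix, a stand-in discretization rather than a DG formulation; the grid size, damping parameter and sweep count are illustrative:

```python
import numpy as np

# 1D Poisson matrix A u = f from second-order central differences
# on a unit interval with homogeneous Dirichlet boundaries.
n = 63
h = 1.0 / (n + 1)
A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

def weighted_jacobi(A, f, u, omega=2.0 / 3.0, sweeps=2000):
    """Damped Jacobi iteration: u <- u + omega * D^{-1} (f - A u)."""
    D = np.diag(A)
    for _ in range(sweeps):
        u = u + omega * (f - A @ u) / D
    return u

f = np.ones(n)
u = weighted_jacobi(A, f, np.zeros(n))
residual = np.linalg.norm(f - A @ u)
```

    With omega = 2/3, oscillatory error components are damped quickly while the smooth components decay slowly, which is exactly why Jacobi serves as a smoother inside a multigrid cycle rather than as a standalone solver.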

  7. Cloud-based adaptive exon prediction for DNA analysis

    PubMed Central

    Putluri, Srinivasareddy; Fathima, Shaik Yasmeen

    2018-01-01

    Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of such sensitive data. Under the conventional flow of gene information, gene sequence laboratories send out raw and inferred information via the Internet to several sequence libraries. DNA sequencing storage costs will be minimised by the use of cloud services. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics that aids disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques. Adaptive signal processing techniques have been found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using variable normalised least mean square and its maximum normalised variants to reduce computational complexity. Finally, performance evaluation of various AEPs is done based on measures such as sensitivity, specificity and precision using various standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database. PMID:29515813
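    The core of such an adaptive predictor is the normalised LMS recursion. The sketch below uses a plain NLMS filter, not the paper's variable/maximum normalised variants, on an artificial binary base-indicator sequence whose exact three-base periodicity stands in for an exon region; all parameters are illustrative:

```python
import numpy as np

def nlms(d, x, order=8, mu=0.5, eps=1e-6):
    """One-step NLMS prediction of d[n] from the last `order` samples of x."""
    w = np.zeros(order)
    y = np.zeros(len(d))
    for n in range(order, len(d)):
        u = x[n - order:n][::-1]            # most recent samples first
        y[n] = w @ u                        # filter output (prediction)
        e = d[n] - y[n]                     # prediction error
        w = w + mu * e * u / (u @ u + eps)  # normalised LMS weight update
    return y, w

# Hypothetical binary indicator for one base along a toy sequence with
# perfect three-base periodicity (the exon signature described above).
x = np.tile([1.0, 0.0, 0.0], 60)
y, w = nlms(d=x, x=x)
error = np.mean((x[8:] - y[8:]) ** 2)
```

    Because the filter quickly locks onto the period-3 structure, the prediction error collapses inside "exon-like" stretches; in a real AEP, that error (or the filter's spectral response near 2π/3) is what flags candidate exon regions.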

  8. Rapid authentication of the precious herb saffron by loop-mediated isothermal amplification (LAMP) based on internal transcribed spacer 2 (ITS2) sequence

    PubMed Central

    Zhao, Mingming; Shi, Yuhua; Wu, Lan; Guo, Licheng; Liu, Wei; Xiong, Chao; Yan, Song; Sun, Wei; Chen, Shilin

    2016-01-01

    Saffron is one of the most expensive species of Chinese herbs and has been subjected to various types of adulteration because of its high price and limited production. The present study introduces a loop-mediated isothermal amplification (LAMP) technique for the differentiation of saffron from its adulterants. This novel technique is sensitive, efficient and simple. Six specific LAMP primers were designed on the basis of the nucleotide sequence of the internal transcribed spacer 2 (ITS2) nuclear ribosomal DNA of Crocus sativus. All LAMP amplifications were performed successfully, and visual detection occurred within 60 min at isothermal conditions of 65 °C. The results indicated that the LAMP primers are accurate and highly specific for the discrimination of saffron from its adulterants. In particular, 10 fg of genomic DNA was determined to be the template detection limit of LAMP in saffron. Thus, the proposed novel, simple, and sensitive LAMP assay is well suited for immediate on-site discrimination of herbal materials. Based on the study, a practical standard operating procedure (SOP) for utilizing the LAMP protocol for herbal authentication is provided. PMID:27146605

  9. Palatal versus vestibular piezoelectric window osteotomy for maxillary sinus elevation: a comparative clinical study of two surgical techniques.

    PubMed

    Stübinger, Stefan; Saldamli, Belma; Seitz, Oliver; Sader, Robert; Landes, Constantin A

    2009-05-01

    The goal of this study was to compare the surgical advantages and disadvantages of a new palatal access osteotomy for sinus elevation with a conventional lateral approach. In 32 patients, either a palatal (n = 16) or a lateral (n = 16) osteotomy to the maxillary sinus was performed under local anesthesia. The palatal access included a circular paramarginal incision and elevation of a palatal mucosal flap based on a median pedicle. The lateral access was performed by vestibular standard incision and development of a mucoperiosteal flap with a vestibular and superior basis. For all osteotomies a piezoelectric device was used. The sinus cavity was augmented with synthetic nanostructured hydroxyapatite graft material. Intraoperative complications during both procedures were minimal and wound healing was uneventful. Membrane perforation occurred in 19% of the palatal group and in 19% of the lateral group. Soft tissue management of the palatal technique was superior to that of the lateral approach, because the vestibular anatomy was not altered and consequently no disharmonious soft tissue scarring and no postoperative swelling occurred. The palatal approach permitted higher postoperative comfort, especially for edentulous patients, because full dentures could be incorporated directly after surgery with almost perfect fit.

  10. Management of hepatoblastoma: an update.

    PubMed

    Kremer, Nathalie; Walther, Ashley E; Tiao, Gregory M

    2014-06-01

    To summarize the current standards and guidelines for the diagnosis and management of hepatoblastoma, a rare pediatric liver tumor. Hepatoblastoma is the most common malignant liver tumor in childhood. International collaborative efforts have led to uniform implementation of the pretreatment extent of disease (PRETEXT) staging system as a means to establish consensus classification and assess upfront resectability. Additionally, current histopathological classification, in light of more advanced molecular profiling and immunohistochemical techniques and integration of tumor biomarkers into risk stratification, is reviewed. Multimodal therapy is composed of chemotherapy and surgical intervention. Achievement of complete surgical resection plays a key role in successful treatment for hepatoblastoma. Overall, outcomes have greatly improved over the past four decades because of advances in chemotherapeutic agents and administration protocols as well as innovations of surgical approach, including the use of vascular exclusion, ultrasonic dissection techniques, and liver transplantation. Challenges remain in management of high-risk patients as well as patients with recurrent or metastatic disease. Eventually, a more individualized approach to treating the different types of the heterogeneous spectrum of hepatoblastoma, in terms of different chemotherapeutic protocols and timing as well as type and extent of surgery, may become the basis of successful treatment in the more complex or advanced types of hepatoblastoma.

  11. An Improved Framework for Confound Regression and Filtering for Control of Motion Artifact in the Preprocessing of Resting-State Functional Connectivity Data

    PubMed Central

    Satterthwaite, Theodore D.; Elliott, Mark A.; Gerraty, Raphael T.; Ruparel, Kosha; Loughead, James; Calkins, Monica E.; Eickhoff, Simon B.; Hakonarson, Hakon; Gur, Ruben C.; Gur, Raquel E.; Wolf, Daniel H.

    2013-01-01

    Several recent reports in large, independent samples have demonstrated the influence of motion artifact on resting-state functional connectivity MRI (rsfc-MRI). Standard rsfc-MRI preprocessing typically includes regression of confounding signals and band-pass filtering. However, substantial heterogeneity exists in how these techniques are implemented across studies, and no prior study has examined the effect of differing approaches for the control of motion-induced artifacts. To better understand how in-scanner head motion affects rsfc-MRI data, we describe the spatial, temporal, and spectral characteristics of motion artifacts in a sample of 348 adolescents. Analyses utilize a novel approach for describing head motion on a voxelwise basis. Next, we systematically evaluate the efficacy of a range of confound regression and filtering techniques for the control of motion-induced artifacts. Results reveal that the effectiveness of preprocessing procedures on the control of motion is heterogeneous, and that improved preprocessing provides a substantial benefit beyond typical procedures. These results demonstrate that the effect of motion on rsfc-MRI can be substantially attenuated through improved preprocessing procedures, but not completely removed. PMID:22926292
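    The two standard preprocessing steps named above, confound regression followed by band-pass filtering, can be sketched for a single voxel time series. The TR, band edges, filter order and synthetic data below are illustrative choices, not the study's pipeline:

```python
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(0)
tr = 3.0                                   # repetition time in seconds (assumed)
n_vols = 200
t = np.arange(n_vols) * tr

motion = rng.standard_normal((n_vols, 6))  # 6 realignment parameters
coeffs = 0.5 * rng.standard_normal(6)      # how strongly motion leaks in
signal = np.sin(2 * np.pi * 0.05 * t)      # "neural" fluctuation at 0.05 Hz
voxel = signal + motion @ coeffs           # observed, motion-contaminated

# Step 1: confound regression -- project out motion (plus intercept) by OLS.
X = np.column_stack([np.ones(n_vols), motion])
beta, *_ = np.linalg.lstsq(X, voxel, rcond=None)
cleaned = voxel - X @ beta

# Step 2: band-pass 0.01-0.08 Hz, a conventional resting-state band.
nyq = 0.5 / tr
b, a = butter(2, [0.01 / nyq, 0.08 / nyq], btype="band")
filtered = filtfilt(b, a, cleaned)         # zero-phase filtering
```

    The point the paper makes is that the details here matter: which confounds enter X, whether filtering precedes or follows regression, and the band edges all change how much motion artifact survives.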

  12. Rapid authentication of the precious herb saffron by loop-mediated isothermal amplification (LAMP) based on internal transcribed spacer 2 (ITS2) sequence.

    PubMed

    Zhao, Mingming; Shi, Yuhua; Wu, Lan; Guo, Licheng; Liu, Wei; Xiong, Chao; Yan, Song; Sun, Wei; Chen, Shilin

    2016-05-05

    Saffron is one of the most expensive species of Chinese herbs and has been subjected to various types of adulteration because of its high price and limited production. The present study introduces a loop-mediated isothermal amplification (LAMP) technique for the differentiation of saffron from its adulterants. This novel technique is sensitive, efficient and simple. Six specific LAMP primers were designed on the basis of the nucleotide sequence of the internal transcribed spacer 2 (ITS2) nuclear ribosomal DNA of Crocus sativus. All LAMP amplifications were performed successfully, and visual detection occurred within 60 min at isothermal conditions of 65 °C. The results indicated that the LAMP primers are accurate and highly specific for the discrimination of saffron from its adulterants. In particular, 10 fg of genomic DNA was determined to be the template detection limit of LAMP in saffron. Thus, the proposed novel, simple, and sensitive LAMP assay is well suited for immediate on-site discrimination of herbal materials. Based on the study, a practical standard operating procedure (SOP) for utilizing the LAMP protocol for herbal authentication is provided.

  13. Application of copulas to improve covariance estimation for partial least squares.

    PubMed

    D'Angelo, Gina M; Weissfeld, Lisa A

    2013-02-20

    Dimension reduction techniques, such as partial least squares, are useful for computing summary measures and examining relationships in complex settings. Partial least squares requires an estimate of the covariance matrix as a first step in the analysis, making this estimate critical to the results. In addition, the covariance matrix also forms the basis for other techniques in multivariate analysis, such as principal component analysis and independent component analysis. This paper has been motivated by an example from an imaging study in Alzheimer's disease where there is complete separation between Alzheimer's and control subjects for one of the imaging modalities. This separation occurs in one block of variables and does not occur with the second block of variables, resulting in inaccurate estimates of the covariance. We propose the use of a copula to obtain estimates of the covariance in this setting, where one set of variables comes from a mixture distribution. Simulation studies show that the proposed estimator is an improvement over the standard estimators of covariance. We illustrate the methods using the motivating example from a study of Alzheimer's disease. Copyright © 2012 John Wiley & Sons, Ltd.
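    One simple member of this family, a Gaussian-copula (normal-scores) covariance estimate, can be sketched as follows; it is an illustration of the general idea, not the authors' estimator, and the data-generating model (including the jump that mimics complete group separation in one variable) is invented:

```python
import numpy as np
from scipy.stats import norm, rankdata

def normal_scores(x):
    """Map a sample to standard-normal quantiles via its ranks."""
    return norm.ppf(rankdata(x) / (len(x) + 1))

rng = np.random.default_rng(0)
n = 500
z = rng.standard_normal(n)                 # shared latent variable
x1 = z + 0.1 * rng.standard_normal(n)      # well-behaved block
x2 = z + 5.0 * (z > 0)                     # jump mimicking complete separation

# Raw Pearson correlation is attenuated by the mixture-induced jump.
pearson_raw = np.corrcoef(x1, x2)[0, 1]

# Covariance computed on normal scores sees only the (monotone) dependence.
scores = np.column_stack([normal_scores(x1), normal_scores(x2)])
cov_copula = np.cov(scores, rowvar=False)
```

    Because the rank transform is invariant to the monotone distortion, the copula-based estimate recovers the strong underlying association that the raw Pearson estimate understates, which is the failure mode motivating the paper.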

  14. Building the School Attendance Boundary Information System (SABINS): Collecting, Processing, and Modeling K to 12 Educational Geography

    PubMed Central

    Saporito, Salvatore; Van Riper, David; Wakchaure, Ashwini

    2017-01-01

    The School Attendance Boundary Information System is a social science data infrastructure project that assembles, processes, and distributes spatial data delineating K through 12th grade school attendance boundaries for thousands of school districts in the U.S. Although geography is a fundamental organizing feature of K to 12 education, until now school attendance boundary data have not been made readily available on a massive basis and in an easy-to-use format. The School Attendance Boundary Information System removes these barriers by linking spatial data delineating school attendance boundaries with tabular data describing the demographic characteristics of populations living within those boundaries. This paper explains why a comprehensive GIS database of K through 12 school attendance boundaries is valuable, how original spatial information delineating school attendance boundaries is collected from local agencies, and techniques for modeling and storing the data so they provide maximum flexibility to the user community. An important goal of this paper is to share the techniques used to assemble the SABINS database so that local and state agencies apply a standard set of procedures and models as they gather data for their regions. PMID:29151773

  15. [MRI of focal liver lesions using a 1.5 T turbo-spin-echo technique compared with a spin-echo technique].

    PubMed

    Steiner, S; Vogl, T J; Fischer, P; Steger, W; Neuhaus, P; Keck, H

    1995-08-01

    The aim of our study was to evaluate a T2-weighted turbo-spin-echo sequence in comparison with a T2-weighted spin-echo sequence for imaging focal liver lesions. In our study, 35 patients with suspected focal liver lesions were examined. The standardised imaging protocol included a conventional T2-weighted SE sequence (TR/TE = 2000/90/45, acquisition time = 10.20) as well as a T2-weighted TSE sequence (TR/TE = 4700/90, acquisition time = 6.33). S/N and C/N ratios, the basis of the quantitative evaluation, were calculated using standard methods. A diagnostic score was implemented to enable qualitative assessment. In 7% of patients (n = 2), the TSE sequence enabled detection of additional liver lesions of less than 1 cm in diameter. In the comparison of anatomical details, the TSE sequence was superior. S/N and C/N ratios of anatomical and pathological structures were higher for the TSE sequence than for the SE sequence. Our results indicate that the T2-weighted turbo-spin-echo sequence is well suited to imaging focal liver lesions and reduces imaging time.
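    The quantitative measures mentioned (S/N and C/N ratios) follow the standard ROI-based definitions: mean region signal, or the absolute difference of two region means, divided by the standard deviation of a background-noise region. The region values below are invented for illustration:

```python
import numpy as np

def snr(roi_signal, background):
    """Signal-to-noise ratio: mean ROI signal over background noise SD."""
    return roi_signal.mean() / background.std(ddof=1)

def cnr(roi_a, roi_b, background):
    """Contrast-to-noise ratio between two ROIs."""
    return abs(roi_a.mean() - roi_b.mean()) / background.std(ddof=1)

# Hypothetical pixel intensities from three regions of interest.
liver = np.array([210.0, 205.0, 198.0, 215.0])
lesion = np.array([340.0, 352.0, 347.0, 339.0])
noise = np.array([4.0, -3.0, 2.5, -4.5, 1.0, -0.5])

liver_snr = snr(liver, noise)
lesion_cnr = cnr(lesion, liver, noise)
```

    Comparing these ratios between the TSE and SE acquisitions, region by region, is the "standard method" the abstract refers to.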

  16. A tale of two species: neural integration in zebrafish and monkeys

    PubMed Central

    Joshua, Mati; Lisberger, Stephen G.

    2014-01-01

    Selection of a model organism creates a tension between competing constraints. The recent explosion of modern molecular techniques has revolutionized the analysis of neural systems in organisms that are amenable to genetic techniques. Yet, the non-human primate remains the gold-standard for the analysis of the neural basis of behavior, and as a bridge to the operation of the human brain. The challenge is to generalize across species in a way that exposes the operation of circuits as well as the relationship of circuits to behavior. Eye movements provide an opportunity to cross the bridge from mechanism to behavior through research on diverse species. Here, we review experiments and computational studies on a circuit function called “neural integration” that occurs in the brainstems of larval zebrafish, non-human primates, and species “in between”. We show that analysis of circuit structure using modern molecular and imaging approaches in zebrafish has remarkable explanatory power for the details of the responses of integrator neurons in the monkey. The combination of research from the two species has led to a much stronger hypothesis for the implementation of the neural integrator than could have been achieved using either species alone. PMID:24797331

  17. A tale of two species: Neural integration in zebrafish and monkeys.

    PubMed

    Joshua, M; Lisberger, S G

    2015-06-18

    Selection of a model organism creates tension between competing constraints. The recent explosion of modern molecular techniques has revolutionized the analysis of neural systems in organisms that are amenable to genetic techniques. Yet, the non-human primate remains the gold-standard for the analysis of the neural basis of behavior, and as a bridge to the operation of the human brain. The challenge is to generalize across species in a way that exposes the operation of circuits as well as the relationship of circuits to behavior. Eye movements provide an opportunity to cross the bridge from mechanism to behavior through research on diverse species. Here, we review experiments and computational studies on a circuit function called "neural integration" that occurs in the brainstems of larval zebrafish, primates, and species "in between". We show that analysis of circuit structure using modern molecular and imaging approaches in zebrafish has remarkable explanatory power for details of the responses of integrator neurons in the monkey. The combination of research from the two species has led to a much stronger hypothesis for the implementation of the neural integrator than could have been achieved using either species alone. Copyright © 2014 IBRO. Published by Elsevier Ltd. All rights reserved.

  18. Efficiency of methods for Karl Fischer determination of water in oils based on oven evaporation and azeotropic distillation.

    PubMed

    Larsson, William; Jalbert, Jocelyn; Gilbert, Roland; Cedergren, Anders

    2003-03-15

    The efficiency of azeotropic distillation and oven evaporation techniques for trace determination of water in oils has recently been questioned by the National Institute of Standards and Technology (NIST), on the basis of measurements of the residual water found after the extraction step. The results were obtained by volumetric Karl Fischer (KF) titration in a medium containing a large excess of chloroform (> or = 65%), a proposed prerequisite to ensure complete release of water from the oil matrix. In this work, the extent of this residual water was studied by means of a direct zero-current potentiometric technique using a KF medium containing more than 80% chloroform, which is well above the concentration recommended by NIST. A procedure is described that makes it possible to correct the results for dilution errors as well as for chemical interference effects caused by the oil matrix. The corrected values were found to be in the range of 0.6-1.5 ppm, which should be compared with the 12-34 ppm (uncorrected values) reported by NIST for the same oils. From this, it is concluded that the volumetric KF method used by NIST gives results that are much too high.

  19. Diffusion MRI in early cancer therapeutic response assessment

    PubMed Central

    Galbán, C. J.; Hoff, B. A.; Chenevert, T. L.; Ross, B. D.

    2016-01-01

    Imaging biomarkers for the predictive assessment of treatment response in patients with cancer earlier than standard tumor volumetric metrics would provide new opportunities to individualize therapy. Diffusion-weighted MRI (DW-MRI), highly sensitive to microenvironmental alterations at the cellular level, has been evaluated extensively as a technique for the generation of quantitative and early imaging biomarkers of therapeutic response and clinical outcome. First demonstrated in a rodent tumor model, subsequent studies have shown that DW-MRI can be applied to many different solid tumors for the detection of changes in cellularity as measured indirectly by an increase in the apparent diffusion coefficient (ADC) of water molecules within the lesion. The introduction of quantitative DW-MRI into the treatment management of patients with cancer may aid physicians to individualize therapy, thereby minimizing unnecessary systemic toxicity associated with ineffective therapies, saving valuable time, reducing patient care costs and ultimately improving clinical outcome. This review covers the theoretical basis behind the application of DW-MRI to monitor therapeutic response in cancer, the analytical techniques used and the results obtained from various clinical studies that have demonstrated the efficacy of DW-MRI for the prediction of cancer treatment response. PMID:26773848
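    The ADC referred to above comes from the mono-exponential diffusion signal model S(b) = S0·exp(−b·ADC), so a two-point acquisition gives ADC = ln(S0/Sb)/b. The voxel values and b-value in this sketch are illustrative:

```python
import numpy as np

def adc_two_point(s0, sb, b):
    """Apparent diffusion coefficient from b = 0 and b = b signals."""
    return np.log(s0 / sb) / b

# Hypothetical two-voxel example; b in s/mm^2, ADC in mm^2/s.
s0 = np.array([1200.0, 900.0])           # signal at b = 0
b = 1000.0
true_adc = np.array([1.1e-3, 0.7e-3])
sb = s0 * np.exp(-b * true_adc)          # simulated diffusion-weighted signal

adc = adc_two_point(s0, sb, b)
```

    In practice more b-values are acquired and ADC is fitted by log-linear regression per voxel; a treatment-induced loss of cellularity then appears as a rise in the fitted ADC map.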

  20. Building the School Attendance Boundary Information System (SABINS): Collecting, Processing, and Modeling K to 12 Educational Geography.

    PubMed

    Saporito, Salvatore; Van Riper, David; Wakchaure, Ashwini

    2013-01-01

    The School Attendance Boundary Information System is a social science data infrastructure project that assembles, processes, and distributes spatial data delineating K through 12th grade school attendance boundaries for thousands of school districts in the U.S. Although geography is a fundamental organizing feature of K to 12 education, until now school attendance boundary data have not been made readily available on a massive basis and in an easy-to-use format. The School Attendance Boundary Information System removes these barriers by linking spatial data delineating school attendance boundaries with tabular data describing the demographic characteristics of populations living within those boundaries. This paper explains why a comprehensive GIS database of K through 12 school attendance boundaries is valuable, how original spatial information delineating school attendance boundaries is collected from local agencies, and techniques for modeling and storing the data so they provide maximum flexibility to the user community. An important goal of this paper is to share the techniques used to assemble the SABINS database so that local and state agencies apply a standard set of procedures and models as they gather data for their regions.
