Sharlow, Elizabeth R.; Close, David; Shun, Tongying; Leimgruber, Stephanie; Reed, Robyn; Mustata, Gabriela; Wipf, Peter; Johnson, Jacob; O'Neil, Michael; Grögl, Max; Magill, Alan J.; Lazo, John S.
2009-01-01
Patients with clinical manifestations of leishmaniasis, including cutaneous leishmaniasis, have limited treatment options, and existing therapies frequently have significant untoward liabilities. Rapid expansion in the diversity of available cutaneous leishmanicidal chemotypes is the initial step in finding alternative efficacious treatments. To this end, we combined a low-stringency Leishmania major promastigote growth inhibition assay with a structural computational filtering algorithm. After a rigorous assay validation process, we interrogated ∼200,000 unique compounds for L. major promastigote growth inhibition. Using iterative computational filtering of the compounds exhibiting >50% inhibition, we identified 553 structural clusters and 640 compound singletons. Secondary confirmation assays yielded 93 compounds with EC50s ≤ 1 µM; none of the identified chemotypes was structurally similar to known leishmanicidals, and most had favorable in silico predicted bioavailability characteristics. The leishmanicidal activity of a representative subset of 15 chemotypes was confirmed in two independent assay formats, and L. major parasite specificity was demonstrated by assaying against a panel of human cell lines. Thirteen chemotypes inhibited the growth of an L. major axenic amastigote-like population. Murine in vivo efficacy studies using one of the new chemotypes documented inhibition of footpad lesion development. These results demonstrate that low-stringency, large-scale compound screening combined with computational structure filtering can rapidly expand the chemotypes targeting in vitro and in vivo Leishmania growth and viability. PMID:19888337
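A minimal sketch of the hit-triage step this abstract describes, assuming RDKit is available: compounds above the low-stringency 50% inhibition cut are grouped into structural clusters by Butina clustering on Morgan-fingerprint Tanimoto distance. The toy SMILES, inhibition values, and 0.4 distance cutoff are illustrative assumptions, not the authors' actual pipeline.

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem
from rdkit.ML.Cluster import Butina

# Toy primary-screen output: (SMILES, % growth inhibition); values invented.
screen_results = [
    ("CCOc1ccccc1", 72.0),
    ("CCOc1ccccc1C", 68.5),
    ("c1ccc2ccccc2c1", 55.3),
    ("CCCCCC", 12.0),
]

# Low-stringency primary cut: keep anything with >50% inhibition.
hits = [smi for smi, inhibition in screen_results if inhibition > 50.0]

# Morgan fingerprints for the surviving compounds.
mols = [Chem.MolFromSmiles(smi) for smi in hits]
fps = [AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048) for m in mols]

# Condensed distance matrix (1 - Tanimoto), the form Butina expects.
dists = []
for i in range(1, len(fps)):
    sims = DataStructs.BulkTanimotoSimilarity(fps[i], fps[:i])
    dists.extend(1.0 - s for s in sims)

# Structural clusters; singletons come out as clusters of size 1.
clusters = Butina.ClusterData(dists, len(fps), 0.4, isDistData=True)
print(clusters)
```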
Wimpee, C F; Nadeau, T L; Nealson, K H
1991-01-01
By using two highly conserved regions of the luxA gene as primers, polymerase chain reaction amplification methods were used to prepare species-specific probes against the luciferase gene from four major groups of marine luminous bacteria. Laboratory studies with test strains indicated that three of the four probes cross-reacted with themselves and with one or more of the other species at low stringencies but were specific for members of their own species at high stringencies. The fourth probe, generated from Vibrio harveyi DNA, cross-reacted with DNAs from two closely related species, V. orientalis and V. vulnificus. When nonluminous cultures were tested with the species-specific probes, no false-positive results were observed, even at low stringencies. Two field isolates were correctly identified as Photobacterium phosphoreum by using the species-specific hybridization probes at high stringency. A mixed probe (four different hybridization probes) used at low stringency gave positive results with all of the luminous bacteria tested, including the terrestrial species, Xenorhabdus luminescens, and the taxonomically distinct marine bacterial species Shewanella hanedai; minimal cross-hybridization with these species was seen at higher stringencies. PMID:1854194
Practical and Theoretical Requirements for Controlling Rater Stringency in Peer Review.
ERIC Educational Resources Information Center
Cason, Gerald J.; Cason, Carolyn L.
This study describes a computer-based performance rating information processing system, performance rating theory, and programs for the application of the theory to obtain ratings free from the effects of reviewer stringency in reviewing abstracts of conference papers. Originally, the Performance Rating (PR) System was used to evaluate the…
40 CFR 51.351 - Enhanced I/M performance standard.
Code of Federal Regulations, 2011 CFR
2011-07-01
... vehicles. (10) Stringency. A 20% emission test failure rate among pre-1981 model year vehicles. (11) Waiver... 2001 and newer vehicles. (10) Stringency. A 20% emission test failure rate among pre-1981 model year... which will be achieved by the I/M program design in the SIP to those of the model program described in...
40 CFR 51.351 - Enhanced I/M performance standard.
Code of Federal Regulations, 2012 CFR
2012-07-01
... vehicles. (10) Stringency. A 20% emission test failure rate among pre-1981 model year vehicles. (11) Waiver... 2001 and newer vehicles. (10) Stringency. A 20% emission test failure rate among pre-1981 model year... which will be achieved by the I/M program design in the SIP to those of the model program described in...
40 CFR 51.351 - Enhanced I/M performance standard.
Code of Federal Regulations, 2010 CFR
2010-07-01
... vehicles. (10) Stringency. A 20% emission test failure rate among pre-1981 model year vehicles. (11) Waiver... 2001 and newer vehicles. (10) Stringency. A 20% emission test failure rate among pre-1981 model year... which will be achieved by the I/M program design in the SIP to those of the model program described in...
40 CFR 51.351 - Enhanced I/M performance standard.
Code of Federal Regulations, 2014 CFR
2014-07-01
... vehicles. (10) Stringency. A 20% emission test failure rate among pre-1981 model year vehicles. (11) Waiver... 2001 and newer vehicles. (10) Stringency. A 20% emission test failure rate among pre-1981 model year... which will be achieved by the I/M program design in the SIP to those of the model program described in...
40 CFR 51.351 - Enhanced I/M performance standard.
Code of Federal Regulations, 2013 CFR
2013-07-01
... vehicles. (10) Stringency. A 20% emission test failure rate among pre-1981 model year vehicles. (11) Waiver... 2001 and newer vehicles. (10) Stringency. A 20% emission test failure rate among pre-1981 model year... which will be achieved by the I/M program design in the SIP to those of the model program described in...
Certificate of need legislation and the dissemination of robotic surgery for prostate cancer.
Jacobs, Bruce L; Zhang, Yun; Skolarus, Ted A; Wei, John T; Montie, James E; Schroeck, Florian R; Hollenbeck, Brent K
2013-01-01
The uncertainty about the incremental benefit of robotic prostatectomy and its higher associated costs makes it an ideal target for state based certificate of need laws, which have been enacted in several states. We studied the relationship between certificate of need laws and market level adoption of robotic prostatectomy. We used SEER (Surveillance, Epidemiology, and End Results)-Medicare data from 2003 through 2007 to identify men 66 years old or older treated with prostatectomy for prostate cancer. Using data from the American Health Planning Association, we categorized Health Service Areas according to the stringency of certificate of need regulations (i.e., low vs high stringency) presiding over that market. We assessed our outcomes (probability of adopting robotic prostatectomy and propensity for robotic prostatectomy use in adopting Health Service Areas) using Cox proportional hazards and Poisson regression models, respectively. Compared to low stringency markets, high stringency markets were more racially diverse (54% vs 15% nonwhite, p <0.01), and had similar population densities (886 vs 861 people per square mile, p = 0.97) and median incomes ($42,344 vs $39,770, p = 0.56). In general, both market types had an increase in the adoption and utilization of robotic prostatectomy. However, the probability of robotic prostatectomy adoption (p = 0.22) did not differ based on a market's certificate of need stringency and use was lower in high stringency markets (p <0.01). State based certificate of need regulations were ineffective in constraining robotic surgery adoption. Despite decreased use in high stringency markets, similar adoption rates suggest that other factors impact the diffusion of robotic prostatectomy. Copyright © 2013 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
The first successful use of a low stringency familial match in a French criminal investigation.
Pham-Hoai, Emmanuel; Crispino, Frank; Hampikian, Greg
2014-05-01
We describe how a very simple application of familial searching resolved a decade-old, high-profile rape/murder in France. This was the first use of familial searching in a criminal case using the French STR DNA database, which contains approximately 1,800,000 profiles. When an unknown forensic profile (18 loci) was searched against the French arrestee/offender database using CODIS configured for a low stringency search, a single low stringency match was identified. This profile was attributed to the father of the man suspected to be the source of the semen recovered from the murder victim Elodie Kulik. The identification was confirmed using Y-chromosome DNA from the putative father, an STR profile from the mother, and finally a tissue sample from the exhumed body of the man who left the semen. Because of this identification, the investigators are now pursuing possible co-conspirators. © 2014 American Academy of Forensic Sciences.
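To make the low- versus high-stringency distinction concrete, here is a hedged sketch (not the CODIS implementation) of how match stringency can be scored on STR profiles, with each locus stored as a set of allele designations; a parent-child pair is expected to share at least one allele per locus, which is exactly what a low-stringency match tolerates. The profiles and loci below are invented.

```python
def match_stringency(profile_a, profile_b):
    """Return 'high', 'low', or 'none' for two profiles over their common loci."""
    shared_loci = set(profile_a) & set(profile_b)
    if all(profile_a[l] == profile_b[l] for l in shared_loci):
        return "high"   # identical alleles at every compared locus
    if all(profile_a[l] & profile_b[l] for l in shared_loci):
        return "low"    # at least one allele in common at every compared locus
    return "none"

# Toy 3-locus example: a crime-scene profile shares one allele per locus
# with the putative father, as expected for a first-degree relative.
father = {"D3S1358": {15, 17}, "VWA": {16, 18}, "FGA": {21, 24}}
scene  = {"D3S1358": {15, 16}, "VWA": {14, 16}, "FGA": {20, 24}}
print(match_stringency(father, scene))  # -> 'low'
```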
McManus, IC; Thompson, M; Mollon, J
2006-01-01
Background A potential problem of clinical examinations is known as the hawk-dove problem, some examiners being more stringent and requiring a higher performance than other examiners who are more lenient. Although the problem has been known qualitatively for at least a century, we know of no previous statistical estimation of the size of the effect in a large-scale, high-stakes examination. Here we use FACETS to carry out a multi-facet Rasch modelling of the paired judgements made by examiners in the clinical examination (PACES) of MRCP(UK), where identical candidates were assessed in identical situations, allowing calculation of examiner stringency. Methods Data were analysed from the first nine diets of PACES, which were taken between June 2001 and March 2004 by 10,145 candidates. Each candidate was assessed by two examiners on each of seven separate tasks, with the candidates assessed by a total of 1,259 examiners, resulting in a total of 142,030 marks. Examiner demographics were described in terms of age, sex, ethnicity, and total number of candidates examined. Results FACETS suggested that about 87% of main effect variance was due to candidate differences, 1% due to station differences, and 12% due to differences between examiners in leniency-stringency. Multiple regression suggested that greater examiner stringency was associated with greater examiner experience and being from an ethnic minority. Male and female examiners showed no overall difference in stringency. Examination scores were adjusted for examiner stringency and it was shown that for the present pass mark, the outcome for 95.9% of candidates would be unchanged using adjusted marks, whereas 2.6% of candidates would have passed, even though they had failed on the basis of raw marks, and 1.5% of candidates would have failed, despite passing on the basis of raw marks. Conclusion Examiners do differ in their leniency or stringency, and the effect can be estimated using Rasch modelling. The reasons for differences are not clear, but there are some demographic correlates, and the effects appear to be reliable across time. Account can be taken of differences, either by adjusting marks or, perhaps more effectively and more justifiably, by pairing high and low stringency examiners, so that raw marks can be used in the determination of pass and fail. PMID:16919156
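A minimal sketch of the mark-adjustment idea in the conclusion above: each examiner's estimated stringency (which would come from a multi-facet Rasch fit such as FACETS) is added back to the raw mark before the pass mark is applied. All examiner IDs, stringency values, and marks below are invented for illustration.

```python
# Estimated examiner stringency on the mark scale; positive = hawkish
# (under-marks), negative = dovish (over-marks). Values are invented.
examiner_stringency = {"ex01": +0.30, "ex02": -0.25, "ex03": 0.00}

def adjusted_mark(raw_mark, examiner_id):
    # A stringent examiner under-marks, so the adjustment is added back.
    return raw_mark + examiner_stringency[examiner_id]

PASS_MARK = 5.5
marks = [("cand1", "ex01", 5.3), ("cand2", "ex02", 5.6), ("cand3", "ex03", 5.4)]
for cand, ex, raw in marks:
    adj = adjusted_mark(raw, ex)
    raw_result = "pass" if raw >= PASS_MARK else "fail"
    adj_result = "pass" if adj >= PASS_MARK else "fail"
    print(f"{cand}: raw {raw_result}, adjusted {adj_result}")
```

Running this shows the two borderline flips the abstract quantifies: a candidate failed by a hawk passes after adjustment, and one passed by a dove fails.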
Measuring the stringency of states' indoor tanning regulations: instrument development and outcomes.
Woodruff, Susan I; Pichon, Latrice C; Hoerster, Katherine D; Forster, Jean L; Gilmer, Todd; Mayer, Joni A
2007-05-01
We sought to describe the development of an instrument to quantify the stringency of state indoor tanning legislation in the United States, and the instrument's psychometric properties. The instrument was then used to rate the stringency of state laws. A 35-item instrument was developed. An overall stringency measure and 9 stringency subscales were developed, including one measuring minors' access to indoor tanning. Stringency measures showed good internal consistency and interrater reliability. In all, 55% of the 50 states and the District of Columbia had any indoor tanning law, and 41% had any law addressing minors' access. Oregon, Illinois, South Carolina, Florida, Indiana, Iowa, and Rhode Island had high overall stringency scores, and Texas and New Hampshire were the most restrictive with regard to minors' access. Measurement of actual enforcement of the laws was not included in this study. The instrument appears to be an easy-to-use, reliable, and valid methodology. Application of the instrument to actual laws showed that, in general, state laws are relatively weak, although there was considerable variability by state.
Liu, Tingting; Sin, Mandy L. Y.; Pyne, Jeff D.; Gau, Vincent; Liao, Joseph C.; Wong, Pak Kin
2013-01-01
Rapid detection of bacterial pathogens is critical toward judicious management of infectious diseases. Herein, we demonstrate an in situ electrokinetic stringency control approach for a self-assembled monolayer-based electrochemical biosensor toward urinary tract infection diagnosis. The in situ electrokinetic stringency control technique generates Joule heating induced temperature rise and electrothermal fluid motion directly on the sensor to improve its performance for detecting bacterial 16S rRNA, a phylogenetic biomarker. The dependence of the hybridization efficiency reveals that in situ electrokinetic stringency control is capable of discriminating single-base mismatches. With electrokinetic stringency control, the background noise due to the matrix effects of clinical urine samples can be reduced by 60%. The applicability of the system is demonstrated by multiplex detection of three uropathogenic clinical isolates with similar 16S rRNA sequences. The results demonstrate that electrokinetic stringency control can significantly improve the signal-to-noise ratio of the biosensor for multiplex urinary tract infection diagnosis. PMID:23891989
Liu, Tingting; Sin, Mandy L Y; Pyne, Jeff D; Gau, Vincent; Liao, Joseph C; Wong, Pak Kin
2014-01-01
Rapid detection of bacterial pathogens is critical toward judicious management of infectious diseases. Herein, we demonstrate an in situ electrokinetic stringency control approach for a self-assembled monolayer-based electrochemical biosensor toward urinary tract infection diagnosis. The in situ electrokinetic stringency control technique generates Joule heating induced temperature rise and electrothermal fluid motion directly on the sensor to improve its performance for detecting bacterial 16S rRNA, a phylogenetic biomarker. The dependence of the hybridization efficiency reveals that in situ electrokinetic stringency control is capable of discriminating single-base mismatches. With electrokinetic stringency control, the background noise due to the matrix effects of clinical urine samples can be reduced by 60%. The applicability of the system is demonstrated by multiplex detection of three uropathogenic clinical isolates with similar 16S rRNA sequences. The results demonstrate that electrokinetic stringency control can significantly improve the signal-to-noise ratio of the biosensor for multiplex urinary tract infection diagnosis. Urinary tract infections remain a significant cause of mortality and morbidity as secondary conditions often related to chronic diseases or to immunosuppression. Rapid and sensitive identification of the causative organisms is critical in the appropriate management of this condition. These investigators demonstrate an in situ electrokinetic stringency control approach for a self-assembled monolayer-based electrochemical biosensor toward urinary tract infection diagnosis, establishing that such an approach significantly improves the biosensor's signal-to-noise ratio. © 2013.
Cohen-Khait, Ruth; Schreiber, Gideon
2016-01-01
Protein–protein interactions occur via well-defined interfaces on the protein surface. Whereas the location of homologous interfaces is conserved, their composition varies, suggesting that multiple solutions may support high-affinity binding. In this study, we examined the plasticity of the interface of TEM1 β-lactamase with its protein inhibitor BLIP by low-stringency selection of a random TEM1 library using yeast surface display. Our results show that most interfacial residues could be mutated without a loss in binding affinity, protein stability, or enzymatic activity, suggesting plasticity in the interface composition supporting high-affinity binding. Interestingly, many of the selected mutations promoted faster association. Further selection for faster binders was achieved by drastically decreasing the library–ligand incubation time to 30 s. Preequilibrium selection as suggested here is a novel methodology for specifically selecting faster-associating protein complexes. PMID:27956635
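A short, hedged sketch of the preequilibrium-selection idea described above: for a brief incubation time t, the fraction of a clone captured on the ligand scales roughly as 1 − exp(−kon·[L]·t), so fast associators are enriched before equilibrium (where occupancy would instead reflect KD). The rate constants, ligand concentration, and times below are invented.

```python
import math

LIGAND = 1e-7                       # ligand concentration, molar (invented)

def captured_fraction(kon, t):
    # Simple pseudo-first-order capture model, ignoring dissociation.
    return 1.0 - math.exp(-kon * LIGAND * t)

clones = {"slow": 1e4, "medium": 1e5, "fast": 1e6}   # kon in M^-1 s^-1
for t in (30.0, 3600.0):                             # 30 s vs 1 h incubation
    fracs = {name: captured_fraction(kon, t) for name, kon in clones.items()}
    total = sum(fracs.values())
    # Normalized share of the recovered pool per clone.
    print(t, {name: round(f / total, 2) for name, f in fracs.items()})
```

At 30 s the fast associator dominates the recovered pool; by an hour all three clones are captured almost equally, which is why the drastic cut in incubation time selects on kinetics rather than affinity.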
Stringency and relaxation among the halobacteria.
Cimmino, C; Scoarughi, G L; Donini, P
1993-01-01
Accumulation of stable RNA and production of guanosine polyphosphates (ppGpp and pppGpp) were studied during amino acid starvation in four species of halobacteria. In two of the four species, stable RNA was under stringent control, whereas one of the remaining two species was relaxed and the other gave an intermediate phenotype. The stringent reaction was reversed by anisomycin, an effect analogous to the chloramphenicol-induced reversal of stringency in the eubacteria. During the stringent response, neither ppGpp nor pppGpp accumulated. In both growing and starved cells a very low basal level of the two polyphosphates appeared to be present. In the stringent species the intracellular concentration of GTP did not diminish but actually increased during the course of the stringent response. These data demonstrate that (i) wild-type halobacteria can have either the stringent or the relaxed phenotype (all wild-type eubacteria tested have been shown to be stringent); (ii) stringency in the halobacteria is dependent on the deaminoacylation of tRNA, as in the eubacteria; and (iii) in the halobacteria, ppGpp is not an effector of stringent control over stable-RNA synthesis. PMID:7691798
Pena, S D; Barreto, G; Vago, A R; De Marco, L; Reinach, F C; Dias Neto, E; Simpson, A J
1994-01-01
Low-stringency single specific primer PCR (LSSP-PCR) is an extremely simple PCR-based technique that detects single or multiple mutations in gene-sized DNA fragments. A purified DNA fragment is subjected to PCR using high concentrations of a single specific oligonucleotide primer, large amounts of Taq polymerase, and a very low annealing temperature. Under these conditions the primer hybridizes specifically to its complementary region and nonspecifically to multiple sites within the fragment, in a sequence-dependent manner, producing a heterogeneous set of reaction products resolvable by electrophoresis. The complex banding pattern obtained is significantly altered by even a single-base change and thus constitutes a unique "gene signature." Therefore LSSP-PCR will have almost unlimited application in all fields of genetics and molecular medicine where rapid and sensitive detection of mutations and sequence variations is important. The usefulness of LSSP-PCR is illustrated by applications in the study of mutants of smooth muscle myosin light chain, analysis of a family with X-linked nephrogenic diabetes insipidus, and identity testing using human mitochondrial DNA. PMID:8127912
Hecht, S J; Stedman, K E; Carlson, J O; DeMartini, J C
1996-01-01
The jaagsiekte sheep retrovirus (JSRV), which appears to be a type B/D retrovirus chimera, has been incriminated as the cause of ovine pulmonary carcinoma. Recent studies suggest that sequences related to this virus are found in the genomes of normal sheep and goats. To learn whether there are breeds of sheep that lack the endogenous viral sequences and to study their distribution among other groups of mammals, we surveyed several domestic sheep and goat breeds, other ungulates, and various mammal groups for sequences related to JSRV. Probes prepared from the envelope (SU) region of JSRV and the capsid (CA) region of a Peruvian type D virus related to JSRV were used in Southern blot hybridization with genomic DNA followed by low- and high-stringency washes. Fifteen to 20 CA and SU bands were found in all members of the 13 breeds of domestic sheep and 6 breeds of goats tested. There were similar findings in 6 wild Ovis and Capra genera. Within 22 other genera of Bovidae including domestic cattle, and 7 other families of Artiodactyla including Cervidae, there were usually a few CA or SU bands at low stringency and rare bands at high stringency. Among 16 phylogenetically distant genera, there were generally fewer bands hybridizing with either probe. These results reveal widespread phylogenetic distribution of endogenous type B and type D retroviral sequences related to JSRV among mammals and argue for further investigation of their potential role in disease. PMID:8622932
Risk of breast cancer with CXCR4-using HIV defined by V3 loop sequencing.
Goedert, James J; Swenson, Luke C; Napolitano, Laura A; Haddad, Mojgan; Anastos, Kathryn; Minkoff, Howard; Young, Mary; Levine, Alexandra; Adeyemi, Oluwatoyin; Seaberg, Eric C; Aouizerat, Bradley; Rabkin, Charles S; Harrigan, P Richard; Hessol, Nancy A
2015-01-01
Evaluate the risk of female breast cancer associated with HIV-CXCR4 (X4) tropism as determined by various genotypic measures. A breast cancer case-control study, with pairwise comparisons of tropism determination methods, was conducted. From the Women's Interagency HIV Study repository, one stored plasma specimen was selected from 25 HIV-infected cases near the breast cancer diagnosis date and 75 HIV-infected control women matched for age and calendar date. HIV-gp120 V3 sequences were derived by Sanger population sequencing (PS) and 454-pyro deep sequencing (DS). Sequencing-based HIV-X4 tropism was defined using the geno2pheno algorithm, with both high-stringency DS [false-positive rate (3.5) and 2% X4 cutoff], and lower stringency DS (false-positive rate, 5.75 and 15% X4 cutoff). Concordance of tropism results by PS, DS, and previously performed phenotyping was assessed with kappa (κ) statistics. Case-control comparisons used exact P values and conditional logistic regression. In 74 women (19 cases, 55 controls) with complete results, prevalence of HIV-X4 by PS was 5% in cases vs 29% in controls (P = 0.06; odds ratio, 0.14; confidence interval: 0.003 to 1.03). Smaller case-control prevalence differences were found with high-stringency DS (21% vs 36%, P = 0.32), lower stringency DS (16% vs 35%, P = 0.18), and phenotyping (11% vs 31%, P = 0.10). HIV-X4 tropism concordance was best between PS and lower stringency DS (93%, κ = 0.83). Other pairwise concordances were 82%-92% (κ = 0.56-0.81). Concordance was similar among cases and controls. HIV-X4 defined by population sequencing (PS) had good agreement with lower stringency DS and was significantly associated with lower odds of breast cancer.
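A short illustrative sketch of the deep-sequencing tropism rules quoted above: a read counts as X4 when its geno2pheno false-positive rate (FPR) falls below a threshold, and the sample is called X4 when the X4-read fraction reaches a cutoff. Only the threshold/cutoff pairs come from the abstract; the per-read FPR values are invented.

```python
def sample_is_x4(read_fprs, fpr_threshold, x4_read_cutoff):
    """Call a sample X4 if enough reads look X4 under the geno2pheno FPR rule."""
    x4_fraction = sum(fpr < fpr_threshold for fpr in read_fprs) / len(read_fprs)
    return x4_fraction >= x4_read_cutoff

# Invented per-read geno2pheno FPR values for one sample.
reads = [1.2, 3.0, 8.5, 12.0, 40.0, 55.0, 2.9, 70.0, 9.1, 6.0]

print(sample_is_x4(reads, 3.5, 0.02))   # high-stringency DS: FPR 3.5, 2% X4 cutoff
print(sample_is_x4(reads, 5.75, 0.15))  # lower-stringency DS: FPR 5.75, 15% X4 cutoff
```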
1986-06-01
our preliminary studies hybridization with the Drosophila actin probe required such low stringency conditions that the signal to noise ratio made...Balabacensis complex of Southeast Asia (Diptera: Culicidae). Genetica 57:81-86. (14) Mahon RJ and PM Miethke. 1982. Anopheles farauti No. 3, a hitherto un
Human papillomavirus type 16 DNA in periungual squamous cell carcinomas
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moy, R.L.; Eliezri, Y.D.; Bennett, R.G.
1989-05-12
Ten squamous cell carcinomas (in situ or invasive) of the fingernail region were analyzed for the presence of DNA sequences homologous to human papillomavirus (HPV) by dot blot hybridization. In most patients, the lesions were verrucae of long-term duration that were refractory to conventional treatment methods. Eight of the lesions contained HPV DNA sequences, and in six of these the sequences were related to HPV 16 as deduced from low-stringency nucleic acid hybridization followed by low- and high-stringency washes. Furthermore, the restriction endonuclease digestion pattern of DNA isolated from four of these lesions was diagnostic of episomal HPV 16. The high-frequency association of HPV 16 with periungual squamous cell carcinoma is similar to that reported for HPV 16 with squamous cell carcinomas on mucous membranes at other sites, notably the genital tract. The findings suggest that HPV 16 may play an important role in the development of squamous cell carcinomas of the finger, most notably those lesions that are chronic and located in the periungual area.
NASA Astrophysics Data System (ADS)
Garrett, Rachael D.; Carlson, Kimberly M.; Rueda, Ximena; Noojipady, Praveen
2016-04-01
Multi-stakeholder roundtables offering certification programs are promising voluntary governance mechanisms to address sustainability issues associated with international agricultural supply chains. Yet, little is known about whether roundtable certifications confer additionality, the benefits of certification beyond what would be expected from policies and practices currently in place. Here, we examine the potential additionality of the Round Table on Responsible Soy (RTRS) and the Roundtable on Sustainable Palm Oil (RSPO) in mitigating conversion of native vegetation to cropland. We develop a metric of additionality based on business-as-usual land cover change dynamics and roundtable standard stringency relative to existing policies. We apply this metric to all countries with RTRS (n = 8) and RSPO (n = 12) certified production in 2013-2014, as well as countries that have no certified production but are among the top ten global producers in terms of soy (n = 2) and oil palm (n = 2). We find RSPO and RTRS both have substantially higher levels of stringency than existing national policies except in Brazil and Uruguay. In regions where these certification standards are adopted, the mean estimated rate of tree cover conversion to the target crop is similar for both standards. RTRS has higher mean relative stringency than the RSPO, yet RSPO countries have slightly higher enforcement levels. Therefore, mean potential additionality of RTRS and RSPO is similar across regions. Notably, countries with the highest levels of additionality have some adoption. However, with extremely low adoption rates (0.41% of 2014 global harvested area), RTRS likely has lower impact than RSPO (14%). Like most certification programs, neither roundtable is effectively targeting smallholder producers. To improve natural ecosystem protection, roundtables could target adoption to regions with low levels of environmental governance and high rates of forest-to-cropland conversion.
78 FR 66648 - Approval and Promulgation of Implementation Plans; Texas; Procedures for Stringency...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-06
... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 52 [EPA-R06-OAR-2010-0335; FRL-9902-50-Region 6] Approval and Promulgation of Implementation Plans; Texas; Procedures for Stringency Determinations and Minor Permit Revisions for Federal Operating Permits AGENCY: Environmental Protection Agency (EPA...
78 FR 55234 - Approval and Promulgation of Implementation Plans; Texas; Procedures for Stringency...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-10
... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 52 [EPA-R06-OAR-2010-0335; FRL-9900-81-Region6] Approval and Promulgation of Implementation Plans; Texas; Procedures for Stringency Determinations and Minor Permit Revisions for Federal Operating Permits AGENCY: Environmental Protection Agency (EPA...
ERIC Educational Resources Information Center
Wei, Xin
2012-01-01
This study developed a comprehensive measure of the stringency level of NCLB states' accountability systems, including the strength of their annual measurable objectives, confidence intervals, performance indexing, retesting, minimum subgroup size, and the difficulty levels of proficiency standards. This study related accountability stringency in…
ERIC Educational Resources Information Center
Peelo, Moira
2013-01-01
This paper explores one practitioner's learning development work with PhD students in a changing university context in which managerialism and financial stringency have combined. It questions how learning development practitioners can maintain their professional goals while negotiating issues arising from managerialism, financial stringency,…
Risk of Breast Cancer with CXCR4-using HIV Defined by V3-Loop Sequencing
Goedert, James J.; Swenson, Luke C.; Napolitano, Laura A.; Haddad, Mojgan; Anastos, Kathryn; Minkoff, Howard; Young, Mary; Levine, Alexandra; Adeyemi, Oluwatoyin; Seaberg, Eric C.; Aouizerat, Bradley; Rabkin, Charles S.; Harrigan, P. Richard; Hessol, Nancy A.
2014-01-01
Objective Evaluate the risk of female breast cancer associated with HIV-CXCR4 (X4) tropism as determined by various genotypic measures. Methods A breast cancer case-control study, with pairwise comparisons of tropism determination methods, was conducted. From the Women's Interagency HIV Study repository, one stored plasma specimen was selected from 25 HIV-infected cases near the breast cancer diagnosis date and 75 HIV-infected control women matched for age and calendar date. HIVgp120-V3 sequences were derived by Sanger population sequencing (PS) and 454-pyro deep sequencing (DS). Sequencing-based HIV-X4 tropism was defined using the geno2pheno algorithm, with both high-stringency DS [False-Positive-Rate (FPR 3.5) and 2% X4 cutoff], and lower stringency DS (FPR 5.75, 15% X4 cut-off). Concordance of tropism results by PS, DS, and previously performed phenotyping was assessed with kappa (κ) statistics. Case-control comparisons used exact P-values and conditional logistic regression. Results In 74 women (19 cases, 55 controls) with complete results, prevalence of HIV-X4 by PS was 5% in cases vs 29% in controls (P=0.06, odds ratio 0.14, confidence interval 0.003-1.03). Smaller case-control prevalence differences were found with high-stringency DS (21% vs 36%, P=0.32), lower-stringency DS (16% vs 35%, P=0.18), and phenotyping (11% vs 31%, P=0.10). HIV-X4-tropism concordance was best between PS and lower-stringency DS (93%, κ=0.83). Other pairwise concordances were 82%-92% (κ=0.56-0.81). Concordance was similar among cases and controls. Conclusions HIV-X4 defined by population sequencing (PS) had good agreement with lower stringency deep sequencing and was significantly associated with lower odds of breast cancer. PMID:25321183
Vinner, Lasse; Mourier, Tobias; Friis-Nielsen, Jens; Gniadecki, Robert; Dybkaer, Karen; Rosenberg, Jacob; Langhoff, Jill Levin; Cruz, David Flores Santa; Fonager, Jannik; Izarzugaza, Jose M G; Gupta, Ramneek; Sicheritz-Ponten, Thomas; Brunak, Søren; Willerslev, Eske; Nielsen, Lars Peter; Hansen, Anders Johannes
2015-08-19
Although nearly one fifth of all human cancers have an infectious aetiology, the causes for the majority of cancers remain unexplained. Despite the enormous data output from high-throughput shotgun sequencing, viral DNA in a clinical sample typically constitutes a proportion of host DNA that is too small to be detected. Sequence variation among virus genomes complicates application of sequence-specific, and highly sensitive, PCR methods. Therefore, we aimed to develop and characterize a method that permits sensitive detection of sequences despite considerable variation. We demonstrate that our low-stringency in-solution hybridization method enables detection of <100 viral copies. Furthermore, distantly related proviral sequences may be enriched by orders of magnitude, enabling discovery of hitherto unknown viral sequences by high-throughput sequencing. The sensitivity was sufficient to detect retroviral sequences in clinical samples. We used this method to conduct an investigation for novel retrovirus in samples from three cancer types. In accordance with recent studies our investigation revealed no retroviral infections in human B-cell lymphoma cells, cutaneous T-cell lymphoma or colorectal cancer biopsies. Nonetheless, our generally applicable method makes sensitive detection possible and permits sequencing of distantly related sequences from complex material.
Shah, H N; Gharbia, S E; Scully, C; Finegold, S M
1995-03-01
Eight oligonucleotides based upon regions of the small subunit 16S ribosomal RNA gene sequences were analysed against a background of their position within the molecule and their two-dimensional structure to rationalise their use in recognising Prevotella intermedia and Prevotella nigrescens. The 41 clinical isolates from both oral and respiratory sites and two reference strains were subjected to DNA-DNA hybridisation and multilocus enzyme electrophoresis to confirm their identity. Alignment of oligonucleotide probes designated 1Bi-2 to 1Bi-6 (for P. intermedia) and 2Bi-2 (for P. nigrescens) with the 16S rRNA suggested that these probes lacked specificity or were constructed from hypervariable regions. A 52-mer oligonucleotide (designated Bi) reliably detected both species. Because of the high degree of concordance between the 16S rRNAs of both species, it was necessary to vary the stringency of hybridisation conditions for detection of both species. Thus probe 1Bi-1 recognised P. intermedia, while 2Bi-1 detected both P. intermedia and P. nigrescens at low stringency. However, under conditions of high stringency only P. nigrescens was recognised by probe 2Bi-1. These probes were highly specific and did not hybridise with DNA from the closely related P. corporis, nor other periodontal pathogens such as Fusobacterium nucleatum, Actinobacillus actinomycetemcomitans, Treponema denticola and several pigmented species such as Prevotella melaninogenica, P. denticola, P. loescheii, Porphyromonas asaccharolytica, Py. endodontalis, Py. gingivalis, Py. levii, and Py. macacae.
The Effect of State Regulatory Stringency on Nursing Home Quality
Mukamel, Dana B; Weimer, David L; Harrington, Charlene; Spector, William D; Ladd, Heather; Li, Yue
2012-01-01
Objective To test the hypothesis that more stringent quality regulations contribute to better quality nursing home care and to assess their cost-effectiveness. Data Sources/Setting Primary and secondary data from all states and U.S. nursing homes between 2005 and 2006. Study Design We estimated seven models, regressing quality measures on the Harrington Regulation Stringency Index and control variables. To account for endogeneity between regulation and quality, we used instrumental variables techniques. Quality was measured by staffing hours by type per case-mix adjusted day, hotel expenditures, and risk-adjusted decline in activities of daily living, high-risk pressure sores, and urinary incontinence. Data Collection All states' licensing and certification offices were surveyed to obtain data about deficiencies. Secondary data included the Minimum Data Set, Medicare Cost Reports, and the Economic Freedom Index. Principal Findings Regulatory stringency was significantly associated with better quality for four of the seven measures studied. The cost-effectiveness for the activities-of-daily-living measure was estimated at about $72,000 (2011 dollars) per Quality Adjusted Life Year. Conclusions Quality regulations lead to better quality in nursing homes along some dimensions, but not all. Our estimates of cost-effectiveness suggest that increased regulatory stringency is in the ballpark of other acceptable cost-effective practices. PMID:22946859
NASA Astrophysics Data System (ADS)
Noirel, Josselin; Simonson, Thomas
2008-11-01
Following Kimura's neutral theory of molecular evolution [M. Kimura, The Neutral Theory of Molecular Evolution (Cambridge University Press, Cambridge, 1983) (reprinted in 1986)], it has become common to assume that the vast majority of viable mutations of a gene confer little or no functional advantage. Yet, in silico models of protein evolution have shown that mutational robustness of sequences could be selected for, even in the context of neutral evolution. The evolution of a biological population can be seen as a diffusion on the network of viable sequences. This network is called a "neutral network." Depending on the mutation rate μ and the population size N, the biological population can evolve purely randomly (μN ≪1) or it can evolve in such a way as to select for sequences of higher mutational robustness (μN ≫1). The stringency of the selection depends not only on the product μN but also on the exact topology of the neutral network, the special arrangement of which was named "superfunnel." Even though the relation between mutation rate, population size, and selection was thoroughly investigated, a study of the salient topological features of the superfunnel that could affect the strength of the selection was wanting. This question is addressed in this study. We use two different models of proteins: on lattice and off lattice. We compare neutral networks computed using these models to random networks. From this, we identify two important factors of the topology that determine the stringency of the selection for mutationally robust sequences. First, the presence of highly connected nodes ("hubs") in the network increases the selection for mutationally robust sequences. Second, the stringency of the selection increases when the correlation between a sequence's mutational robustness and its neighbors' increases. The latter finding relates a global characteristic of the neutral network to a local one, which is attainable through experiments or molecular modeling.
Noirel, Josselin; Simonson, Thomas
2008-11-14
Following Kimura's neutral theory of molecular evolution [M. Kimura, The Neutral Theory of Molecular Evolution (Cambridge University Press, Cambridge, 1983) (reprinted in 1986)], it has become common to assume that the vast majority of viable mutations of a gene confer little or no functional advantage. Yet, in silico models of protein evolution have shown that mutational robustness of sequences could be selected for, even in the context of neutral evolution. The evolution of a biological population can be seen as a diffusion on the network of viable sequences. This network is called a "neutral network." Depending on the mutation rate μ and the population size N, the biological population can evolve purely randomly (μN ≪ 1) or it can evolve in such a way as to select for sequences of higher mutational robustness (μN ≫ 1). The stringency of the selection depends not only on the product μN but also on the exact topology of the neutral network, the special arrangement of which was named "superfunnel." Even though the relation between mutation rate, population size, and selection was thoroughly investigated, a study of the salient topological features of the superfunnel that could affect the strength of the selection was wanting. This question is addressed in this study. We use two different models of proteins: on lattice and off lattice. We compare neutral networks computed using these models to random networks. From this, we identify two important factors of the topology that determine the stringency of the selection for mutationally robust sequences. First, the presence of highly connected nodes ("hubs") in the network increases the selection for mutationally robust sequences. Second, the stringency of the selection increases when the correlation between a sequence's mutational robustness and its neighbors' increases. The latter finding relates a global characteristic of the neutral network to a local one, which is attainable through experiments or molecular modeling.
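The diffusion-on-a-neutral-network picture above lends itself to a tiny simulation. The sketch below uses an invented hub-and-spoke network and invented parameters to show the reported effect qualitatively: with μN well above 1, the population concentrates on the highly connected (mutationally robust) node. Nothing here reproduces the paper's lattice or off-lattice protein models.

```python
import random

random.seed(0)
# Adjacency: each node's viable single-mutant neighbours. "hub" is the
# high-degree, mutationally robust sequence; all of this is a toy network.
neutral_net = {
    "hub": ["a", "b", "c", "d"],
    "a": ["hub", "b"], "b": ["hub", "a"],
    "c": ["hub"], "d": ["hub"],
}
TOTAL_MUTANTS = 4          # possible single mutants per sequence (toy value)
N, MU, GENERATIONS = 500, 0.5, 300   # mu*N >> 1: robustness is selected

population = ["c"] * N     # start the whole population on a weakly connected node
for _ in range(GENERATIONS):
    next_gen = []
    while len(next_gen) < N:
        seq = random.choice(population)          # pick a parent at random
        if random.random() < MU:                 # attempt a random mutation
            i = random.randrange(TOTAL_MUTANTS)
            if i >= len(neutral_net[seq]):       # inviable mutant: no offspring
                continue
            seq = neutral_net[seq][i]
        next_gen.append(seq)
    population = next_gen

# The population ends up concentrated on the hub.
print({node: population.count(node) for node in neutral_net})
```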
A process for the quantification of aircraft noise and emissions interdependencies
NASA Astrophysics Data System (ADS)
de Luis, Jorge
The main purpose of this dissertation is to develop a process to improve current policy-making procedures concerning the environmental effects of aviation. This research expands current practices with physics-based, publicly available models. The current method relies solely on information provided by industry members, and this information is usually proprietary and not physically intuitive. The process proposed here provides information about the interdependencies between the environmental effects of aircraft. These interdependencies are also tied to the actual physical parameters of the aircraft and the engine, making it more intuitive for decision-makers to understand the impacts on the vehicle under different policy scenarios. These scenarios involve the use of fleet analysis tools in which the existing aircraft are used to predict the environmental effects of imposing new stringency levels. The aircraft are reduced to a series of coefficients that represent their performance in terms of flight characteristics, fuel burn, noise, and emissions. These coefficients are then used to model flight operations and calculate the environmental impacts of those aircraft. If a particular aircraft does not meet the stringency being analyzed, a technology response is applied to it in order to meet that stringency. Depending on the level of reduction needed, this technology response can affect the fuel burn characteristics of the aircraft. Another shortcoming of the current stringency analysis process is that it does not consider noise and emissions concurrently but treats them separately, one at a time. This assumes that interdependencies between the two do not exist, which is not realistic. The latest stringency process, delineated in 2004, imposed a 2% fuel burn penalty for any required improvement in NOx, regardless of the type of aircraft or engine, on the assumption that no company had the ability to produce a vehicle with similar characteristics. This left all other performance characteristics of the aircraft, including noise, untouched, and changed only the fuel burn. The proposed alternative is to create a fleet of replacement aircraft for the current aircraft that do not meet stringency. These replacement aircraft represent the achievable physical limits of state-of-the-art systems. In this research, the interdependencies between NOx, noise, and fuel burn are not neglected; it is in fact necessary to take all three into account simultaneously to capture the physical limits that can be attained during a stringency analysis. In addition, the replacement aircraft show the linkage between environmental effects and fundamental aircraft and engine characteristics, something that has been neglected in previous policy-making procedures. Another aspect that has been ignored is the creation of the coefficients used for the fleet analyses. The current literature defines no process for creating those coefficients; this research develops such a process and demonstrates that the characteristics of the aircraft can be propagated to the coefficients and to the fleet analysis tools.
The implementation of the proposed process shows that, first, the environmental metrics can be linked to the physical attributes of the aircraft using non-proprietary, physics-based tools; second, those interdependencies can be propagated to fleet-level tools; and third, this propagation improves the policy-making process by showing what needs to change in an aircraft to meet different stringency levels.
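A hedged sketch of the coefficient-driven fleet bookkeeping described above, under invented numbers: each aircraft type is reduced to a few performance coefficients, types failing a hypothetical NOx stringency receive a technology response modeled as a flat 2% fuel-burn penalty (the figure the 2004 process used), and fleet totals are compared. This illustrates the accounting only, not the dissertation's actual models or data.

```python
FLEET = [
    # (type, annual operations, fuel burn per operation in kg,
    #  NOx margin to the current limit in %); all values invented.
    ("A-150seat", 120_000, 4_200.0, +8.0),
    ("B-300seat", 40_000, 11_500.0, -3.0),   # fails the new stringency
]

def fleet_fuel_burn(fleet, nox_stringency_cut, penalty=0.02):
    """Total fleet fuel burn after applying the technology response."""
    total = 0.0
    for name, ops, fuel_per_op, nox_margin in fleet:
        if nox_margin < nox_stringency_cut:  # technology response required
            fuel_per_op *= 1.0 + penalty     # flat fuel-burn penalty
        total += ops * fuel_per_op
    return total

baseline = fleet_fuel_burn(FLEET, nox_stringency_cut=-100)  # nobody fails
scenario = fleet_fuel_burn(FLEET, nox_stringency_cut=0)     # new stringency
print(f"fleet fuel burn change: {100 * (scenario / baseline - 1):.2f}%")
```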
Bio-Inspired Engineering of Protein-Based Heat Sensors
2004-01-01
of Thermosensitive Proteins. 3.1 Introduction. 3.2 Low Stringency PCR Identification of TRPV1 Homologues from Pit Viper Trigeminal Ganglion...Methods and Results. 3.3 Directed Evolution of TRPV1 Protein. 3.4 Methods and Results. 3.5 References. Pappas, TC F49620-01-1-0552 1. Unique...cation channel TRPV1. Thermal nociceptive neurons are fairly plentiful, and thus benefited studies linking TRPV1 to thermal responses. The snake pit
Genome-Wide Discovery of Long Non-Coding RNAs in Rainbow Trout.
Al-Tobasei, Rafet; Paneru, Bam; Salem, Mohamed
2016-01-01
The ENCODE project revealed that ~70% of the human genome is transcribed. While only 1-2% of the RNAs encode proteins, the rest are non-coding RNAs. Long non-coding RNAs (lncRNAs) form a diverse class of non-coding RNAs that are longer than 200 nt. Emerging evidence indicates that lncRNAs play critical roles in various cellular processes, including regulation of gene expression. LncRNAs show low levels of gene expression and sequence conservation, which make their computational identification in genomes difficult. In this study, more than two billion Illumina sequence reads were mapped to the genome reference using the TopHat and Cufflinks software. Transcripts shorter than 200 nt, with an ORF longer than 83-100 amino acids, or with significant homology to the NCBI nr-protein database were removed. In addition, a computational pipeline was used to filter the remaining transcripts based on a protein-coding-score test. Depending on the filtering stringency conditions, between 31,195 and 54,503 lncRNAs were identified, with only 421 matching known lncRNAs in other species. A digital gene expression atlas revealed 2,935 tissue-specific and 3,269 ubiquitously-expressed lncRNAs. This study annotates lncRNAs in the rainbow trout genome and provides a valuable resource for functional genomics research in salmonids.
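A minimal sketch of the filtering cascade this abstract describes: drop transcripts shorter than 200 nt, with a long ORF, or with a protein-database hit. The `longest_orf_aa` helper is a simplified forward-strand stand-in, and the 100-aa cutoff and the hit set are illustrative assumptions, not the study's exact pipeline.

```python
def longest_orf_aa(seq):
    """Longest ATG..stop open reading frame, in codons (forward strand only)."""
    stops = {"TAA", "TAG", "TGA"}
    best = 0
    for frame in range(3):
        codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
        start = None
        for j, codon in enumerate(codons):
            if start is None and codon == "ATG":
                start = j
            elif start is not None and codon in stops:
                best = max(best, j - start)
                start = None
    return best

def is_lncrna_candidate(name, seq, protein_hits, max_orf_aa=100):
    # Keep only transcripts that pass all three filters from the abstract.
    return (len(seq) >= 200
            and longest_orf_aa(seq) <= max_orf_aa
            and name not in protein_hits)

# Toy transcripts; tx1 has an 81-codon ORF, tx2 has none. Both pass.
transcripts = {"tx1": "ATG" + "GCA" * 80 + "TAA", "tx2": "GCA" * 100}
print([t for t, s in transcripts.items()
       if is_lncrna_candidate(t, s, protein_hits={"tx3"})])
```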
1987-11-15
analysis. However, in our preliminary studies, hybridization with the Drosophila actin probe required such low stringency conditions that the signal to...rDNA genes and could therefore contain sequences which, under normal DNA hybridization conditions, behave in a species-specific manner. We therefore...pAGr23B) behave as species-specific probes under the conditions normally used for DNA hybridization. These sequences could be used to design specific
Underage alcohol policies across 50 California cities: an assessment of best practices
2012-01-01
Background We pursue two primary goals in this article: (1) to test a methodology and develop a dataset on U.S. local-level alcohol policy ordinances, and (2) to evaluate the presence, comprehensiveness, and stringency of eight local alcohol policies in 50 diverse California cities in relationship to recommended best practices in both public health literature and governmental recommendations to reduce underage drinking. Methods Following best practice recommendations from a wide array of authoritative sources, we selected eight local alcohol policy topics (e.g., conditional use permits, responsible beverage service training, social host ordinances, window/billboard advertising ordinances), and determined the presence or absence as well as the stringency (restrictiveness) and comprehensiveness (number of provisions) of each ordinance in each of the 50 cities in 2009. Following the alcohol policy literature, we created scores for each city on each type of ordinance and its associated components. We used these data to evaluate the extent to which recommendations for best practices to reduce underage alcohol use are being followed. Results (1) Compiling datasets of local-level alcohol policy laws and their comprehensiveness and stringency is achievable, even absent comprehensive, on-line, or other legal research tools. (2) We find that, with some exceptions, most of the 50 cities do not have high scores for presence, comprehensiveness, or stringency across the eight key policies. Critical policies such as responsible beverage service and deemed approved ordinances are uncommon, and, when present, they are generally neither comprehensive nor stringent. Even within policies that have higher adoption rates, central elements are missing across many or most cities’ ordinances. Conclusion This study demonstrates the viability of original legal data collection in the U.S. pertaining to local ordinances and of creating quantitative scores for each policy type to reflect comprehensiveness and stringency. Analysis of the resulting dataset reveals that, although the 50 cities have taken important steps to improve public health with regard to underage alcohol use and abuse, there is a great deal more that needs to be done to bring these cities into compliance with best practice recommendations. PMID:22734468
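To make the scoring approach concrete, here is a hedged sketch of per-city policy scoring of the kind described above: for each policy topic, record presence, count adopted provisions for comprehensiveness, and sum restrictiveness weights for stringency. The topics, provisions, and weights below are invented placeholders, not the study's actual coding scheme.

```python
# Invented coding scheme: topic -> {provision: restrictiveness weight}.
POLICY_PROVISIONS = {
    "social_host": {"civil_penalty": 2, "criminal_penalty": 3, "landlord_notice": 1},
    "rbs_training": {"mandatory": 3, "all_staff": 2},
}

def score_city(city_ordinances):
    """Presence, comprehensiveness, and stringency scores for one city."""
    scores = {}
    for topic, provisions in POLICY_PROVISIONS.items():
        adopted = city_ordinances.get(topic, set())
        scores[topic] = {
            "presence": int(bool(adopted)),
            "comprehensiveness": len(adopted),                  # provision count
            "stringency": sum(provisions[p] for p in adopted),  # weighted sum
        }
    return scores

# A city with a partial social host ordinance and no RBS ordinance.
city = {"social_host": {"civil_penalty", "landlord_notice"}}
print(score_city(city))
```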
Leachman, Sancy A.; Tigelaar, Robert E.; Shlyankevich, Mark; Slade, Martin D.; Irwin, Michele; Chang, Ed; Wu, T. C.; Xiao, Wei; Pazhani, Sundaram; Zelterman, Daniel; Brandsma, Janet L.
2000-01-01
A cottontail rabbit papillomavirus (CRPV) E6 DNA vaccine that induces significant protection against CRPV challenge was used in a superior vaccination regimen in which the cutaneous sites of vaccination were primed with an expression vector encoding granulocyte-macrophage colony-stimulating factor (GM-CSF), a cytokine that induces differentiation and local recruitment of professional antigen-presenting cells. This treatment induced a massive influx of major histocompatibility complex class II-positive cells. In a vaccination-challenge experiment, rabbit groups were treated by E6 DNA vaccination, GM-CSF DNA inoculation, or a combination of both treatments. After two immunizations, rabbits were challenged with CRPV at low, moderate, and high stringencies and monitored for papilloma formation. As expected, all clinical outcomes were monotonically related to the stringency of the viral challenge. The results demonstrate that GM-CSF priming greatly augmented the effects of CRPV E6 vaccination. First, challenge sites in control rabbits (at the moderate challenge stringency) had a 0% probability of remaining disease free, versus a 50% probability in E6-vaccinated rabbits, and whereas GM-CSF alone had no effect, the interaction between GM-CSF priming and E6 vaccination increased disease-free survival to 67%. Second, the incubation period before papilloma onset was lengthened by E6 DNA vaccination alone or to some extent by GM-CSF DNA inoculation alone, and the combination of treatments induced additive effects. Third, the rate of papilloma growth was reduced by E6 vaccination and, to a lesser extent, by GM-CSF treatment. In addition, the interaction between the E6 and GM-CSF treatments was synergistic and yielded more than a 99% reduction in papilloma volume. Finally, regression occurred among the papillomas that formed in rabbits treated with the E6 vaccine and/or with GM-CSF, with the highest regression frequency occurring in rabbits that received the combination treatment. PMID:10954571
High-stringency screening of target-binding partners using a microfluidic device
Soh, Hyongsok; Lou, Xinhui; Lagally, Eric
2015-12-01
The invention provides a method of screening a library of candidate agents by contacting the library with a target in a reaction mixture under a condition of high stringency, wherein the target includes a tag that responds to a controllable force applied to the tag, and passing the members of the library through a microfluidic device in a manner that exposes the library members to the controllable force, thereby displacing members of the library that are bound to the target relative to their unbound counterparts. Kits and systems for use with the methods of the invention are also provided.
Flores, J; Midthun, K; Hoshino, Y; Green, K; Gorziglia, M; Kapikian, A Z; Chanock, R M
1986-01-01
RNA-RNA hybridization was performed to assess the extent of genetic relatedness among human rotaviruses isolated from children with gastroenteritis and from asymptomatic newborn infants. 32P-labeled single-stranded RNAs produced by in vitro transcription from viral cores of the different strains tested were used as probes in two different hybridization assays: undenatured genomic RNAs were resolved by polyacrylamide gel electrophoresis, denatured in situ, electrophoretically transferred to diazobenzyloxymethyl-paper (Northern blots), and then hybridized to the probes under two different conditions of stringency; and denatured genomic double-stranded RNAs were hybridized to the probes in solution and the hybrids which formed were identified by polyacrylamide gel electrophoresis. When analyzed by Northern blot hybridization at a low level of stringency, all genes from the strains tested cross-hybridized, providing evidence for some sequence homology in each of the corresponding genes. However, when hybridization stringency was increased, a difference in gene 4 sequence was detected between strains recovered from asymptomatic newborn infants ("nursery strains") and strains recovered from infants and young children with diarrhea. Although the nursery strains exhibited serotypic diversity (i.e., each of the four strains tested belonged to a different serotype), the fourth gene appeared to be highly conserved. Similarly, each of the virulent strains tested belonged to a different serotype; nonetheless, there was significant conservation of sequence among the fourth genes of three of these viruses. Significantly, the conserved fourth genes of the nursery strains were distinct from the fourth gene of each of the virulent viruses. These results were confirmed and extended during experiments in which the RNA-RNA hybridization was carried out in solution and the resulting hybrids were analyzed by polyacrylamide gel electrophoresis. Under these conditions, the fourth genes of the nursery strains were closely related to each other but not to the fourth genes of the virulent viruses. Full-length hybrids did not form between the fourth genes from the nursery strains and the corresponding genes from the strains recovered from symptomatic infants and young children. PMID:3023685
Petersen, Jesper; Poulsen, Lena; Birgens, Henrik; Dufva, Martin
2009-01-01
The development of DNA microarray assays is hampered by two important aspects: processing of the microarrays is done under a single stringency condition, and characteristics such as melting temperature are difficult to predict for immobilized probes. A technical solution to these limitations is to use a thermal gradient and information from melting curves, for instance to score genotypes. However, application of temperature gradients normally requires complicated equipment, and the size of the arrays that can be investigated is restricted due to heat dissipation. Here we present a simple microfluidic device that creates a gradient comprising zones of defined ionic strength over a glass slide, in which each zone corresponds to a subarray. Using this device, we demonstrated that ionic strength gradients function in a similar fashion as corresponding thermal gradients in assay development. More specifically, we noted that (i) the two stringency modulators generated melting curves that could be compared, (ii) both led to increased assay robustness, and (iii) both were associated with difficulties in genotyping the same mutation. These findings demonstrate that ionic strength stringency buffers can be used instead of thermal gradients. Given the flexibility of design of ionic gradients, these can be created over all types of arrays, and encompass an attractive alternative to temperature gradients, avoiding curtailment of the size or spacing of subarrays on slides associated with temperature gradients. PMID:19277213
Petersen, Jesper; Poulsen, Lena; Birgens, Henrik; Dufva, Martin
2009-01-01
The development of DNA microarray assays is hampered by two important aspects: processing of the microarrays is done under a single stringency condition, and characteristics such as melting temperature are difficult to predict for immobilized probes. A technical solution to these limitations is to use a thermal gradient and information from melting curves, for instance to score genotypes. However, application of temperature gradients normally requires complicated equipment, and the size of the arrays that can be investigated is restricted due to heat dissipation. Here we present a simple microfluidic device that creates a gradient comprising zones of defined ionic strength over a glass slide, in which each zone corresponds to a subarray. Using this device, we demonstrated that ionic strength gradients function in a similar fashion as corresponding thermal gradients in assay development. More specifically, we noted that (i) the two stringency modulators generated melting curves that could be compared, (ii) both led to increased assay robustness, and (iii) both were associated with difficulties in genotyping the same mutation. These findings demonstrate that ionic strength stringency buffers can be used instead of thermal gradients. Given the flexibility of design of ionic gradients, these can be created over all types of arrays, and encompass an attractive alternative to temperature gradients, avoiding curtailment of the size or spacing of subarrays on slides associated with temperature gradients.
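To illustrate how signals collected across stringency zones can score a genotype, here is a hedged sketch under invented numbers: for each subarray (one ionic-strength zone), the ratio of matched-probe to mismatched-probe signal is examined, and the call is made at the most discriminating zone. This illustrates the scoring idea only, not the authors' device or software.

```python
import math

# signals[zone] = (wild-type probe signal, mutant probe signal); invented data
# for a wild-type sample measured across three ionic-strength zones.
signals = {
    "low":    (980.0, 940.0),   # low stringency: both probes stay hybridized
    "medium": (850.0, 400.0),   # mismatched probe starts to melt off
    "high":   (300.0, 40.0),    # only the matched probe survives
}

def call_genotype(signals):
    # Pick the zone where the two probes are best discriminated.
    best_zone = max(signals,
                    key=lambda z: abs(math.log(signals[z][0] / signals[z][1])))
    wt, mut = signals[best_zone]
    ratio = wt / mut
    if ratio > 2.0:
        return best_zone, "wild type"
    if ratio < 0.5:
        return best_zone, "mutant"
    return best_zone, "heterozygous"

print(call_genotype(signals))   # -> ('high', 'wild type')
```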
Edwards, W. Barry
2013-01-01
The aim of this study was to identify potential ligands of PSMA suitable for further development as novel PSMA-targeted peptides using phage display technology. The human PSMA protein was immobilized as a target followed by incubation with a 15-mer phage display random peptide library. After one round of prescreening and two rounds of screening, high-stringency screening at the third round of panning was performed to identify the highest-affinity binders. Phages which had a specific binding activity to PSMA in human prostate cancer cells were isolated, and the DNA corresponding to the 15-mers was sequenced to provide three consensus sequences: GDHSPFT, SHFSVGS and EVPRLSLLAVFL, as well as other sequences that did not display consensus. Two of the peptide sequences deduced from DNA sequencing of binding phages, SHSFSVGSGDHSPFT and GRFLTGGTGRLLRIS, were labeled with 5-carboxyfluorescein and shown to bind and co-internalize with PSMA on human prostate cancer cells by fluorescence microscopy. The high-stringency requirements yielded peptides with affinities of KD ∼1 µM or greater, which are suitable starting points for affinity maturation. While these values were less than anticipated, the high stringency did yield peptide sequences that apparently bound to different surfaces on PSMA. These peptide sequences could be the basis for further development of peptides for prostate cancer tumor imaging and therapy. PMID:23935860
Raymond, James T; Lamm, Marnie; Nordhausen, Robert; Latimer, Ken; Garner, Michael M
2003-04-01
In March 2000, an approximately 30-yr-old, male coastal mountain kingsnake (Lampropeltis zonata multifasciata) presented with disequilibrium and unresponsiveness to stimuli that ultimately led to euthanasia. Histologically, there were foci of gliosis primarily within the caudal cerebrum, brainstem, and cervical spinal cord. Several glial cells and endothelial cells contained magenta, intranuclear inclusion bodies. Electron microscopy of the inclusions revealed paracrystalline arrays of 79-82 nm viral-like particles. DNA in situ hybridization of sections of formalin-fixed brain using a mixture of two digoxigenin-end-labeled, adenovirus-specific oligonucleotide probes at low and high stringency was positive for adenovirus.
Riley, D E; Wagner, B; Polley, L; Krieger, J N
1995-01-01
The protozoan parasite Tritrichomonas foetus causes infertility and spontaneous abortion in cattle. In Saskatchewan, Canada, the culture prevalence of trichomonads was 65 of 1,048 bulls (6%) tested within a 1-year period ending in April 1994. Saskatchewan was previously thought to be free of the parasite. To confirm the culture results, the possible presence of T. foetus DNA was determined by PCR. All of the 16 culture-positive isolates tested were PCR positive by a single-band test, but one PCR product was weak. DNA fingerprinting by both T17 PCR and randomly amplified polymorphic DNA PCR revealed genetic variation or polymorphism among the T. foetus isolates. T17 PCR also revealed conserved loci that distinguished these T. foetus isolates from Trichomonas vaginalis, from a variety of other protozoa, and from prokaryotes. TCO-1 PCR, a PCR test designed to sample DNA sequence homologous to the 5' flank of a highly conserved cell division control gene, detected genetic polymorphism at low stringency and a conserved, single locus at higher stringency. These findings suggested that T. foetus isolates exhibit both conserved genetic loci and polymorphic loci detectable by independent PCR methods. Both conserved and polymorphic genetic loci may prove useful for improved clinical diagnosis of T. foetus. The polymorphic loci detected by PCR suggested either a long history of infection or multiple lines of T. foetus infection in Saskatchewan. Polymorphic loci detected by PCR may provide data for epidemiologic studies of T. foetus. PMID:7615746
Marfil-Santana, Miguel David; O'Connor-Sánchez, Aileen; Ramírez-Prado, Jorge Humberto; De Los Santos-Briones, Cesar; López-Aguiar, Korynthia Lluvia; Rojas-Herrera, Rafael; Lago-Lestón, Asunción; Prieto-Davó, Alejandra
2016-11-01
The need for new antibiotics has sparked a search for the microbes that might potentially produce them. Current sequencing technologies allow us to explore the biotechnological potential of microbial communities in diverse environments without the need for cultivation, benefitting natural product discovery in diverse ways. A relatively recent method to search for the possible production of novel compounds includes studying the diverse genes belonging to polyketide synthase pathways (PKS), as these complex enzymes are an important source of novel therapeutics. In order to explore the biotechnological potential of the microbial community from the largest underground aquifer in the world, located in the Yucatan, we used a polyphasic approach in which a simple, non-computationally intensive method was coupled with direct amplification of environmental DNA to assess the diversity and novelty of PKS type I ketosynthase (KS) domains. Our results suggest that the bioinformatic method proposed can indeed be used to assess the novelty of KS enzymes; nevertheless, this in silico study did not identify some of the KS diversity due to primer bias and stringency criteria outlined by the metagenomics pipeline. Therefore, additionally implementing a method involving the direct cloning of KS domains enhanced our results. Compared to other freshwater environments, the aquifer was characterized by considerably less diversity in relation to known ketosynthase domains; however, the metagenome included a family of KS type I domains phylogenetically related, but not identical, to those found in the curamycin pathway, as well as an outstanding number of thiolases. Overall, this first look into the microbial community found in this large Yucatan aquifer and other freshwater free-living microbial communities highlights the potential of these previously overlooked environments as a source of novel natural products.
Eighty routes to a ribonucleotide world; dispersion and stringency in the decisive selection.
Yarus, Michael
2018-05-21
We examine the initial emergence of genetics; that is, of an inherited chemical capability. The crucial actors are ribonucleotides, occasionally meeting in a prebiotic landscape. Previous work identified six influential variables during such random ribonucleotide pooling. Geochemical pools can be in periodic danger (e.g., from tides) or constant danger (e.g., from unfavorable weather). Such pools receive Gaussian nucleotide amounts sporadically, at random times, or get varying substrates simultaneously. Pools use cross-templated RNA synthesis (5'-5' product from 5'-3' template) or para-templated (5'-5' product from 5'-5' template) synthesis. Pools can undergo mild or strong selection, and be recently initiated (early) or late in age. Considering > 80 combinations of these variables, selection calculations identify a superior route. Most likely, an early, sporadically fed, cross-templating pool in constant danger, receiving ≥ 1 mM nucleotides while under strong selection for a coenzyme-like product will host selection of the first encoded biochemical functions. Predominantly templated products emerge from a critical event, the starting bloc selection, which exploits inevitable differences among early pools. Favorable selection has a simple rationale; it is increased by product dispersion (sd/mean), by selection intensity (mild or strong), or by combining these factors as stringency, reciprocal fraction of pools selected (1/sfsel). To summarize: chance utility, acting via a preference for disperse, templated coenzyme-like dinucleotides, uses stringent starting bloc selection to quickly establish majority encoded/genetic expression. Despite its computational origin, starting bloc selection is largely independent of specialized assumptions. This ribodinucleotide route to inheritance may also have facilitated 5'-3' chemical RNA replication. Published by Cold Spring Harbor Laboratory Press for the RNA Society.
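The two quantities driving favorable selection in this account are directly computable: dispersion is the coefficient of variation (sd/mean) of product yields across pools, and stringency is the reciprocal of the fraction of pools selected (1/sfsel). A minimal Python sketch, with invented pool yields rather than the paper's simulation outputs:

```python
import statistics

def dispersion(yields):
    """Coefficient of variation (sd/mean) of per-pool product yields."""
    return statistics.stdev(yields) / statistics.mean(yields)

def stringency(fraction_selected):
    """Reciprocal of the fraction of pools passing selection (1/sfsel)."""
    return 1.0 / fraction_selected

# Hypothetical yields of a coenzyme-like dinucleotide in five pools
yields = [0.8, 1.2, 0.3, 2.5, 0.9]
print(f"dispersion (sd/mean): {dispersion(yields):.2f}")
# Selecting only the best pool of five corresponds to sfsel = 0.2
print(f"stringency (1/sfsel): {stringency(0.2):.1f}")
```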
Stringency of workplace air contaminant exposure limits: a case study of OSHA risk management.
Hakes, J K
1999-12-01
Political context may play a large role in influencing the efficiency of environmental and health regulations. This case study uses data from a 1989 update of the Occupational Safety and Health Administration (OSHA) Permissible Exposure Limits (PELs) program to determine the relative effects of legislative mandates, costly acquisition of information by the agency, and pressure applied by special interest groups upon exposure standards. The empirical analysis suggests that federal agencies successfully thwart legislative attempts to limit agency discretion, and that agencies exercise bounded rationality by placing greater emphasis on more easily obtained information. The 1989 PELs were less significantly related to more costly information, contained "safety factors" for chemicals presenting relatively more ambiguous risks, and the proposed standard stringencies showed evidence of being influenced by vying industry and labor interests.
Hu, Jiangfeng; Wang, Zhao; Lian, Yuehan; Huang, Qinghua
2018-01-01
This study examines the spillover effects of foreign direct investment (FDI) on the green technology progress rate (as measured by green total factor productivity). The analysis utilizes two measures of FDI, labor-based FDI and capital-based FDI, and separately investigates four sets of industry classifications: high/low discharge regulation and high/low emission standard regulation. The results indicate that in the low discharge regulation and low emission standard regulation industries, labor-based FDI has a significant negative spillover effect, and capital-based FDI has a significant positive spillover effect. However, in the high-intensity environmental regulation industries, the negative influence of labor-based FDI is completely restrained, and capital-based FDI continues to exert significant positive green technological spillover effects. These findings have clear policy implications: the government should gradually reduce labor-based FDI inflows or increase the stringency of environmental regulation in order to reduce or eliminate the negative spillover effect of labor-based FDI. PMID:29382112
Process of labeling specific chromosomes using recombinant repetitive DNA
Moyzis, R.K.; Meyne, J.
1988-02-12
Chromosome-preferential nucleotide sequences are first determined from a library of recombinant DNA clones having families of repetitive sequences. Library clones having low homology with the sequence of the repetitive DNA family to which they belong are identified, and variant sequences are then identified by selecting clones whose pattern of hybridization with genomic DNA is dissimilar to the hybridization pattern shown by the respective families. In another embodiment, variant sequences are selected from a sequence of a known repetitive DNA family. The selected variant sequence is classified as chromosome specific, chromosome preferential, or chromosome nonspecific. Sequences classified as chromosome preferential are further sequenced, and regions are identified that have low homology with other regions of the chromosome-preferential sequence, with known sequences of other family members, or with consensus sequences of the repetitive DNA families. The selected low-homology regions are then hybridized with chromosomes to determine which of them hybridize with a specific chromosome under normal stringency conditions.
Acoustic flight test of the Piper Lance
DOT National Transportation Integrated Search
1986-12-01
Research is being conducted to refine current noise regulation of propeller-driven small airplanes. Studies are examining the prospect of substituting a takeoff procedure of equal stringency for the level flyover certification test presently requir...
A tailing genome walking method suitable for genomes with high local GC content.
Liu, Taian; Fang, Yongxiang; Yao, Wenjuan; Guan, Qisai; Bai, Gang; Jing, Zhizhong
2013-10-15
Tailing genome walking strategies are simple and efficient. However, they can sometimes be restricted by the low stringency of homo-oligomeric primers. Here we modified the conventional tailing step by adding polythymidine and polyguanine to the target single-stranded DNA (ssDNA). The tailed ssDNA was then amplified exponentially with a specific primer in the known region and a primer comprising 5' polycytosine and 3' polyadenosine. The successful application of this novel method for identifying integration sites mediated by φC31 integrase in the goat genome indicates that the method is more suitable for genomes with high complexity and local GC content. Copyright © 2013 Elsevier Inc. All rights reserved.
Ahmed, Khalid; Ahmed, Sidrah
2018-03-28
This study takes environmental policy stringency and economic activity as the controlling variables and forecasts CO2 emissions in China up to 2022. In doing so, an application of the corrected grey model with convolution is used over the annual time series data between 1990 and 2012. The simulation results show that (1) between 2012 and 2022, CO2 emissions in China are expected to increase at an average rate of 17.46% annually, raising the emissions intensity from 7.04 in 2012 to 25.461 metric tons per capita by 2022; (2) stringent environmental policies reduce CO2 emissions, whereas GDP tends to increase the emissions intensity in China; (3) stringent environmental policies are found to have a negative impact on GDP in China. Based on the empirical findings, the study also provides some policy suggestions to reduce emissions intensity in China.
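The study's forecasts rest on a corrected grey model with convolution; as a rough orientation to the grey-model family, the sketch below implements only the classic GM(1,1) core, not the corrected convolution variant used in the study, and the emission series is invented:

```python
import numpy as np

def gm11_forecast(x0, horizon):
    """Fit a classic GM(1,1) grey model to series x0 and forecast `horizon` steps."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                     # accumulated generating sequence
    z1 = 0.5 * (x1[1:] + x1[:-1])          # mean generating sequence
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # grey development coefficients
    k = np.arange(len(x0) + horizon)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.diff(x1_hat, prepend=0.0)  # de-accumulate to per-period values
    return x0_hat[len(x0):]

# Hypothetical per-capita CO2 series (metric tons), not the study's data
history = [5.2, 5.6, 6.1, 6.5, 7.0]
print(gm11_forecast(history, horizon=3))
```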
ERIC Educational Resources Information Center
Pournelle, Jerry
1984-01-01
Discussion of software license agreements implies that they actually contribute to software piracy because of their stringency and indicates that competition in the software publishing field will eventually eliminate the piracy problem. (MBR)
The Political Economy of Part-Time Academic Work in Canada.
ERIC Educational Resources Information Center
Rajagopal, Indhu; Farr, William D.
1989-01-01
Under continuing financial stringency, the university administration negotiates concessions with full-time faculty to satisfy their interests and maintain the stability of the system. Part-timers, excluded from the collegium, remain peripheral to these arrangements. (Author/MLW)
Intelligent neuroprocessors for in-situ launch vehicle propulsion systems health management
NASA Technical Reports Server (NTRS)
Gulati, S.; Tawel, R.; Thakoor, A. P.
1993-01-01
The efficacy of existing on-board propulsion systems health management systems (HMS) is severely impacted by computational limitations (e.g., low sampling rates); paradigmatic limitations (e.g., low-fidelity logic/parameter redlining only, false alarms due to noisy/corrupted sensor signatures, preprogrammed diagnostics only); and telemetry bandwidth limitations on space/ground interactions. Ultra-compact/light, adaptive neural networks with massively parallel, asynchronous, fast reconfigurable and fault-tolerant information processing properties have already demonstrated significant potential for inflight diagnostic analyses and resource allocation with reduced ground dependence. In particular, they can automatically exploit correlation effects across multiple sensor streams (plume analyzer, flow meters, vibration detectors, etc.) so as to detect anomaly signatures that cannot be determined from the exploitation of a single sensor. Furthermore, neural networks have already demonstrated the potential for impacting real-time fault recovery in vehicle subsystems by adaptively regulating combustion mixture/power subsystems and optimizing resource utilization under degraded conditions. A class of high-performance neuroprocessors, developed at JPL, that has demonstrated potential for next-generation HMS for a family of space transportation vehicles envisioned for the next few decades, including HLLV, NLS, and the space shuttle, is presented. Of fundamental interest are intelligent neuroprocessors for real-time plume analysis, optimizing combustion mixture ratio, and feedback to hydraulic and pneumatic control systems. This class includes concurrently asynchronous, reprogrammable, nonvolatile analog neural processors with high-speed, high-bandwidth electronic/optical I/O interfaces, with special emphasis on NASA's unique requirements in terms of performance, reliability, ultra-high density, ultra-compactness, ultra-light weight, radiation-hardened devices, power stringency, and long lifetimes.
Towards Flexibility in Academic Labour Markets?
ERIC Educational Resources Information Center
Nieuwenhuysen, John
1985-01-01
It is argued that Australia's relatively uniform and consistent academic salary structure and personnel policies should be more flexible and competitive in order to alleviate current problems of academic labor market stagnation, uneven faculty distribution, and other results of financial stringency. (MSE)
Gene Trapping Using Gal4 in Zebrafish
Balciuniene, Jorune; Balciunas, Darius
2013-01-01
Large clutch size and external development of optically transparent embryos make zebrafish an exceptional vertebrate model system for in vivo insertional mutagenesis using fluorescent reporters to tag expression of mutated genes. Several laboratories have constructed and tested enhancer- and gene-trap vectors in zebrafish, using fluorescent proteins and Gal4- and lexA-based transcriptional activators as reporters [1-7]. These vectors had two potential drawbacks: suboptimal stringency (e.g. lack of ability to differentiate between enhancer- and gene-trap events) and low mutagenicity (e.g. integrations into genes rarely produced null alleles). Gene Breaking Transposons (GBTs) were developed to address these drawbacks [8-10]. We have modified one of the first GBT vectors, GBT-R15, for use with Gal4-VP16 as the primary gene trap reporter and added UAS:eGFP as the secondary reporter for direct detection of gene trap events. Application of Gal4-VP16 as the primary gene trap reporter provides two main advantages. First, it increases sensitivity for genes expressed at low expression levels. Second, it enables researchers to use gene trap lines as Gal4 drivers to direct expression of other transgenes in very specific tissues. This is especially pertinent for genes with non-essential or redundant functions, where gene trap integration may not result in overt phenotypes. The disadvantage of using Gal4-VP16 as the primary gene trap reporter is that genes coding for proteins with N-terminal signal sequences are not amenable to trapping, as the resulting Gal4-VP16 fusion proteins are unlikely to be able to enter the nucleus and activate transcription. Importantly, the use of Gal4-VP16 does not pre-select for nuclear proteins: we recovered gene trap mutations in genes encoding proteins which function in the nucleus, the cytoplasm and the plasma membrane. PMID:24121167
Jannotti-Passos, Liana Konovaloffi; Dos Santos Carvalho, Omar
2017-01-01
The low-stringency polymerase chain reaction (LS-PCR) and loop-mediated isothermal amplification (LAMP) assays were used to detect the presence of S. mansoni DNA in (1) Brazilian intermediate hosts (Biomphalaria glabrata, B. straminea, and B. tenagophila) with patent S. mansoni infections, (2) B. glabrata snails with prepatent S. mansoni infections, (3) various mixtures of infected and noninfected snails; and (4) snails infected with other trematode species. The assays showed high sensitivity and specificity and could detect S. mansoni DNA when one positive snail was included in a pool of 1,000 negative specimens of Biomphalaria. These molecular approaches can provide a low-cost, effective, and rapid method for detecting the presence of S. mansoni in pooled samples of field-collected Biomphalaria. These assays should aid mapping of transmission sites in endemic areas, especially in low-prevalence regions, and improve schistosomiasis surveillance. They will be a useful tool to monitor low infection rates of snails in areas where control interventions are leading towards the elimination of schistosomiasis. PMID:28246533
Harasym, Peter H; Woloschuk, Wayne; Cunning, Leslie
2008-12-01
Physician-patient communication is a clinical skill that can be learned and has a positive impact on patient satisfaction and health outcomes. A concerted effort at all medical schools is now directed at teaching and evaluating this core skill. Student communication skills are often assessed by an Objective Structured Clinical Examination (OSCE). However, it is unknown what sources of error variance are introduced into examinee communication scores by various OSCE components. This study primarily examined the effect different examiners had on the evaluation of students' communication skills assessed at the end of a family medicine clerkship rotation. The communication performance of clinical clerks from the Classes of 2005 and 2006 was assessed using six OSCE stations. Performance was rated at each station using the 28-item Calgary-Cambridge guide. Item Response Theory analysis using a Multifaceted Rasch model was used to partition the various sources of error variance and generate a "true" communication score from which the effects of examiner, case, and items are removed. Variance and reliability of scores were as follows: communication scores (.20 and .87), examiner stringency/leniency (.86 and .91), case (.03 and .96), and item (.86 and .99), respectively. All facet scores were reliable (.87-.99). Examiner variance (.86) was more than four times the examinee variance (.20). About 11% of the clerks' outcome status shifted using "true" rather than observed/raw scores. There was large variability in examinee scores due to variation in examiner stringency/leniency behaviors that may impact pass-fail decisions. Exploring the benefits of examiner training and employing "true" scores generated using Item Response Theory analyses prior to making pass/fail decisions are recommended.
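In a many-facet Rasch model of this kind, the log-odds of a successful rating outcome is additive in examinee ability, case difficulty, item difficulty, and examiner stringency, which is what allows the examiner effect to be estimated and removed. A schematic sketch with invented parameter values (the additive facet structure is the standard one, not the study's estimates):

```python
import math

def p_success(ability, case_difficulty, item_difficulty, rater_severity):
    """Many-facet Rasch: P = logistic(theta - case - item - severity)."""
    logit = ability - case_difficulty - item_difficulty - rater_severity
    return 1.0 / (1.0 + math.exp(-logit))

# Same examinee, case, and item rated by a lenient vs. a stringent examiner
print(p_success(0.5, 0.0, -0.2, rater_severity=-0.8))  # lenient: higher P
print(p_success(0.5, 0.0, -0.2, rater_severity=+0.8))  # stringent: lower P
```

Holding the other facets fixed, a stringent (high-severity) examiner depresses the predicted probability of a good rating; the "true" score is the ability estimate left once that severity term is accounted for.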
Donoviel, M. S.; Young, E. T.
1996-01-01
Two cis-acting elements have been identified that act synergistically to regulate expression of the glucose-repressed alcohol dehydrogenase 2 (ADH2) gene. UAS1 is bound by the trans-activator Adr1p. UAS2 is thought to be the binding site for an unidentified regulatory protein. A genetic selection based on a UAS2-dependent ADH2 reporter was devised to isolate genes capable of activating UAS2-dependent transcription. One set of UAS2-dependent genes contained SPT6/CRE2/SSN20. Multicopy SPT6 caused improper expression of chromosomal ADH2. A second set of UAS2-dependent clones contained a previously uncharacterized open reading frame designated MEU1 (Multicopy Enhancer of UAS2). A frameshift mutation in MEU1 abolished its ability to activate UAS2-dependent gene expression. Multicopy MEU1 expression suppressed the constitutive ADH2 expression caused by cre2-1. Disruption of MEU1 reduced endogenous ADH2 expression about twofold but had no effect on cell viability or growth. No homologues of MEU1 were identified by low-stringency Southern hybridization of yeast genomic DNA, and no significant homologues were found in the sequence databases. A MEU1/β-gal fusion protein was not localized to a particular region of the cell. MEU1 is linked to PPR1 on chromosome XII. PMID:8807288
FUEL ECONOMY AND CO2 EMISSIONS STANDARDS, MANUFACTURER PRICING STRATEGIES, AND FEEBATES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Changzheng; Greene, David L; Bunch, Dr David S.
2012-01-01
Corporate Average Fuel Economy (CAFE) standards and CO2 emissions standards for 2012 to 2016 have significantly increased the stringency of requirements for new light-duty vehicle fuel efficiency. This study investigates the role of technology adoption and pricing strategies in meeting new standards, as well as the impact of feebate policies. The analysis is carried out by means of a dynamic optimization model that simulates manufacturer decisions with the objective of maximizing social surplus while simultaneously considering consumer response and meeting CAFE and emissions standards. The results indicate that technology adoption plays the major role and that the provision of compliance flexibility and the availability of cost-effective advanced technologies help manufacturers reduce the need for pricing to induce changes in the mix of vehicles sold. Feebates, when implemented along with fuel economy and emissions standards, can bring additional fuel economy improvement and emissions reduction, but the benefit diminishes with the increasing stringency of the standards.
ERIC Educational Resources Information Center
Grimley, Michael; Green, Richard; Nilsen, Trond; Thompson, David
2012-01-01
Computer games are purported to be effective instructional tools that enhance motivation and improve engagement. The aim of this study was to investigate how tertiary student experiences change when instruction was computer game based compared to lecture based, and whether experiences differed between high and low achieving students. Participants…
Delavat, François; Lett, Marie-Claire; Lièvremont, Didier
2013-10-01
Acid mine drainages (AMDs) are often thought to harbour low biodiversity, yet little is known about the diversity distribution along the drainages. Using culture-dependent approaches, the microbial diversity from the Carnoulès AMD sediment was investigated for the first time along a transect showing progressive environmental stringency decrease. In total, 20 bacterial genera were detected, highlighting a higher bacterial diversity than previously thought. Moreover, this approach led to the discovery of 16 yeast species, demonstrating for the first time the presence of this important phylogenetic group in this AMD. All in all, the location of the microbes along the transect helps to better understand their distribution in a pollution gradient. Copyright © 2013 Elsevier B.V. All rights reserved.
Detection of Human Papillomavirus Type 2 Related Sequence in Oral Papilloma
Yamaguchi, Taihei; Shindoh, Masanobu; Amemiya, Akira; Inoue, Nobuo; Kawamura, Masaaki; Sakaoka, Hiroshi; Inoue, Masakazu; Fujinaga, Kei
1998-01-01
Oral papilloma is a benign tumourous lesion. Part of this lesion is associated with human papillomavirus (HPV) infection. We analysed the genetic and histopathological evidence for HPV type 2 infection in three oral papillomas. Southern blot hybridization showed the HPV 2a sequence in one lesion. Cells of the positive specimen appeared to contain high copy numbers of the viral DNA in an episomal state. In situ staining demonstrated virus capsid antigen in koilocytotic cells and surrounding cells in the hyperplastic epithelial layer. Two other specimens contained no HPV sequences detectable with labeled probes of full-length linear HPV 2a, 6b, 11, 16, 18, 31 and 33 DNA under low-stringency hybridization conditions. These results showed the possibility that HPV 2 plays a role in oral papilloma. PMID:9699941
Off-Setting Differences in Reviewer Stringency.
ERIC Educational Resources Information Center
Cason, Carolyn L.; And Others
A total of 1,071 rating sheets were completed by individual reviewers evaluating abstracts submitted to Sigma Theta Tau International Honor Society for the International Research Congress in Edinburgh, Scotland. Only 972 sheets contained usable data. Reviewers indicated a total of 12 ratings with possible comments for each abstract on machine…
The effects of climate sensitivity and carbon cycle interactions on mitigation policy stringency
Climate sensitivity and climate-carbon cycle feedbacks interact to determine how global carbon and energy cycles will change in the future. While the science of these connections is well documented, their economic implications are not well understood. Here we examine the effect o...
12 CFR 1204.8 - How are records secured?
Code of Federal Regulations, 2010 CFR
2010-01-01
... § 1204.8 How are records secured? (a) What controls must FHFA have in place? Each FHFA office must establish administrative and physical controls to prevent unauthorized access to its systems of records... stringency of these controls should correspond to the sensitivity of the records that the controls protect...
Achieving More, Spending Less in Schools, Districts, and States
ERIC Educational Resources Information Center
Walberg, Herbert J.
2011-01-01
In an era of financial stringency and demands for better school performance, it is useful to think about several means of raising school productivity: (1) increase learning effectiveness without increasing costs; (2) reduce costs without diminishing effectiveness; or (3) both, that is, increase effectiveness and simultaneously reduce costs. The…
Does certificate of need law enhance competition in inpatient care market? An empirical analysis.
Paul, Jomon A; Ni, Huan; Bagchi, Aniruddha
2017-06-29
This article investigates the impact of Certificate of Need (CON) laws on competition in the inpatient care market. One of the major criticisms of these laws is that they may hinder competition in the health care market, which can lead to higher prices. However, from a theoretical standpoint, CON laws could also promote competition by limiting excessive expansion by incumbents. Our main conclusion is that CON laws by and large enhanced competition in the inpatient market during the period of our study. This indicates that the effect of CON laws in hindering predatory behavior could dominate their effect of preventing new entrants into the inpatient care market. We do not find statistically significant evidence to reject the exogeneity assumption of either CON laws or their stringency in our study. We also find that factors such as the proportion of the population aged 18-44, the proportion of the Asian American population, obesity rate, and political environment in a state significantly impact competition. Our findings could help inform public policy makers when deciding on the appropriate health programs or legislative framework to promote health care market competition and thereby facilitate quality health care.
Effects of Mastery Criterion on the Emergence of Derived Equivalence Relations
ERIC Educational Resources Information Center
Fienup, Daniel M.; Brodsky, Julia
2017-01-01
In this study, we manipulated mastery criterion form (rolling or block) and stringency (across 6 or 12 trials) and measured the emergence of derived relations. College students learned neuroanatomy equivalence classes and experienced one of two rolling mastery criteria (6 or 12 consecutive correct responses) or a block mastery criterion (12 trials…
Harvard and Money; A Memorandum on Issues and Choices.
ERIC Educational Resources Information Center
Harvard Univ., Cambridge , MA. Univ. Committee on Governance.
This report discusses some of the financial issues and choices with which Harvard University will have to cope in an environment of increased stringency: issues of money-allocation, money raising, and money management. Part I presents highlights of Harvard's recent financial history and its prospects in quantitative terms. Part II presents some…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-26
....) Since for asphalt concrete plants and mineral crushers this revision (ARM 17.8.743(1)(b)) reduces the... plants and mineral crushers reduces the stringency of the current SIP approved regulations. We commented... of Subjects in 40 CFR Part 52 Environmental protection, Air pollution control, Carbon monoxide...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-04
.... Collectively, the five rulemakings elevated the stringency of exhaust emission standards and test procedures... Act for the emission standards and related test procedures contained in its urban bus regulations, as... Standards; Urban Buses; Request for Waiver of Preemption; Opportunity for Public Hearing and Comment AGENCY...
Making Decisions in a Time of Fiscal Stringency: The Longer-Term Implications.
ERIC Educational Resources Information Center
Bowen, Frank M.
The concept of traditional planning, programming, and budgeting systems (PPBS) is defined and compared with imperative planning, a term used to refer to whatever procedures higher education officials use to integrate program planning and budgeting. The University of Wisconsin system is described as an example of emerging budgetary practice in…
USDA-ARS?s Scientific Manuscript database
We have evaluated the new Swine Protein-Annotated Oligonucleotide Microarray (http://www.pigoligoarray.org) by analyzing transcriptional profiles for longissimus dorsi muscle (LD), Bronchial lymph node (BLN) and Lung. Four LD samples were used to assess the stringency of hybridization conditions com...
Garcés-Eisele, J.; Ruiz-Argüelles, A.; Estrada-Marín, Larisa; Reyes-Núñez, Virginia; Vázquez-Pérez, R.; Guzmán-García, Olga; Coutiño-Medina, R.; Acosta-Sandria, Leticia; Cedillo-Carvallo, Beatriz
2014-01-01
Clinical response to clopidogrel varies widely due to under-dosing, drug interactions and intrinsic interindividual differences resulting from genetic polymorphisms. Cytochrome P450-2C19 is the principal enzyme involved in the activation of the prodrug, and loss-of-function alleles have been described. Upon expiration of the pharmaceutical patent of clopidogrel, generic manufacturers have started to subject interchangeable formulations to bioequivalence studies. The purpose of the current investigation was to study the effect of selecting volunteers homozygous for the CYP2C19*1 haplotype on the bioavailability of clopidogrel. A regular 2×2 bioequivalence study between two formulations of clopidogrel was performed in volunteers selected and unselected for CYP2C19 haplotypes relevant to the Mexican population. It was found that selection of volunteers homozygous for the CYP2C19*1 haplotype increased the stringency of the bioequivalence statistics and resulted in bioinequivalence of a generic clopidogrel compound that otherwise proved equivalent when tested in an open, unselected population. Augmentation of bioequivalence strictness is expected to result from pharmacogenetic selection of volunteers. PMID:25284925
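Bioequivalence is conventionally declared when the 90% confidence interval of the test/reference geometric mean ratio lies within 0.80-1.25. A simplified paired-log sketch of that criterion, ignoring the period and sequence effects a full 2×2 crossover analysis would model, with invented AUC values:

```python
import math
from statistics import mean, stdev
from scipy import stats

def gmr_90ci(test_auc, ref_auc):
    """90% CI for the geometric mean ratio from paired log-differences."""
    d = [math.log(t) - math.log(r) for t, r in zip(test_auc, ref_auc)]
    se = stdev(d) / math.sqrt(len(d))
    t_crit = stats.t.ppf(0.95, df=len(d) - 1)
    return math.exp(mean(d) - t_crit * se), math.exp(mean(d) + t_crit * se)

# Hypothetical AUCs for six volunteers (generic vs. reference clopidogrel)
test = [410, 515, 388, 472, 530, 455]
ref = [425, 498, 402, 488, 510, 470]
lo, hi = gmr_90ci(test, ref)
print(f"90% CI for GMR: {lo:.3f}-{hi:.3f}; bioequivalent if inside 0.80-1.25")
```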
Land classification of south-central Iowa from computer enhanced images
NASA Technical Reports Server (NTRS)
Lucas, J. R.; Taranik, J. V.; Billingsley, F. C. (Principal Investigator)
1977-01-01
The author has identified the following significant results. Enhanced LANDSAT imagery was most useful for land classification purposes because these images could be photographically printed at large scales such as 1:63,360. The ability to see individual picture elements was no hindrance as long as general image patterns could be discerned. Low-cost photographic processing systems for color prints proved to be effective in the utilization of computer-enhanced LANDSAT products for land classification purposes. The initial investment for this type of system was very low, ranging from $100 to $200 beyond a black-and-white photo lab. The technical expertise can be acquired from reading a color printing and processing manual.
Gotzes, F; Balfanz, S; Baumann, A
1994-01-01
Members of the superfamily of G-protein coupled receptors share significant similarities in sequence and transmembrane architecture. We have isolated a Drosophila homologue of the mammalian dopamine receptor family using a low stringency hybridization approach. The deduced amino acid sequence is approximately 70% homologous to the human D1/D5 receptors. When expressed in HEK 293 cells, the Drosophila receptor stimulates cAMP production in response to dopamine application. This effect was mimicked by SKF 38393, a specific D1 receptor agonist, but inhibited by dopaminergic antagonists such as butaclamol and flupentixol. In situ hybridization revealed that the Drosophila dopamine receptor is highly expressed in the somata of the optic lobes. This suggests that the receptor might be involved in the processing of visual information and/or visual learning in invertebrates.
ERIC Educational Resources Information Center
Rosner, Yotam; Perlman, Amotz
2018-01-01
Introduction: The Israel Ministry of Social Affairs and Social Services subsidizes computer-based assistive devices for individuals with visual impairments (that is, those who are blind or have low vision) to assist these individuals in their interactions with computers and thus to enhance their independence and quality of life. The aim of this…
Glass-Kaastra, Shiona K; Finley, Rita; Hutchinson, Jim; Patrick, David M; Weiss, Karl; Conly, John
2014-01-01
The financial accessibility of antimicrobial drugs to the outpatient community in Canada is governed at the provincial level through formularies. Each province may choose to list particular drugs or impose restriction criteria on products in order to guide prescribing and/or curtail costs. Although changes to formularies have been shown to change patterns in the use of individual products and alter costs, no comparison has been made among the provincial antimicrobial formularies with regard to flexibility/stringency, or an assessment of how these formularies impact overall antimicrobial use in the provinces. The objective was to summarize provincial antimicrobial formularies and assess whether their relative flexibility/stringency had a statistical impact on provincial prescription volume during a one-year period. Provincial drug plan formularies were accessed and summarized for all prescribed antimicrobials in Canada during 2010. The number of general and restricted benefits for each plan was compiled by antimicrobial classification. Population-adjusted prescription rates for all individual antimicrobials and by antimicrobial class were obtained from the Canadian Integrated Program for Antimicrobial Resistance Surveillance. Correlations between the number of general benefits, restricted benefits, and total benefits and the prescription rate in the provinces were assessed by Spearman rank correlation coefficients. Formularies varied considerably among the Canadian provinces. Quebec had the most flexible formulary, offering the greatest number of general benefits and fewest restrictions. In contrast, Saskatchewan's formulary displayed the lowest number of general benefits and most restrictions. Correlation analyses detected a single significant result: macrolide prescription rates decreased as the number of general macrolide benefits increased. All other rates of provincial antimicrobial prescribing and measures of flexibility/stringency revealed no significant correlations. Although antimicrobial formulary listings are used to guide prescribing rates within a province, our analysis of one year's data found that antimicrobial formulary structure did not correlate with antimicrobial prescribing rates, and other factors are likely to be at play.
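The correlation analysis described reduces to a rank correlation per drug class; a sketch with invented province-level values:

```python
from scipy.stats import spearmanr

# Hypothetical data for ten provinces: number of general macrolide
# benefits listed vs. macrolide prescriptions per 1,000 population
general_benefits = [2, 3, 3, 4, 5, 6, 7, 8, 9, 10]
prescription_rate = [88, 85, 90, 79, 74, 70, 72, 65, 60, 58]

rho, p = spearmanr(general_benefits, prescription_rate)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
```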
Evasion of affinity-based selection in germinal centers by Epstein-Barr virus LMP2A.
Minamitani, Takeharu; Yasui, Teruhito; Ma, Yijie; Zhou, Hufeng; Okuzaki, Daisuke; Tsai, Chiau-Yuang; Sakakibara, Shuhei; Gewurz, Benjamin E; Kieff, Elliott; Kikutani, Hitoshi
2015-09-15
Epstein-Barr virus (EBV) infects germinal center (GC) B cells and establishes persistent infection in memory B cells. EBV-infected B cells can cause B-cell malignancies in humans with T- or natural killer-cell deficiency. We now find that EBV-encoded latent membrane protein 2A (LMP2A) mimics B-cell antigen receptor (BCR) signaling in murine GC B cells, causing altered humoral immune responses and autoimmune diseases. Investigation of the impact of LMP2A on B-cell differentiation in mice that conditionally express LMP2A in GC B cells or all B-lineage cells found LMP2A expression enhanced not only BCR signals but also plasma cell differentiation in vitro and in vivo. Conditional LMP2A expression in GC B cells resulted in preferential selection of low-affinity antibody-producing B cells despite apparently normal GC formation. GC B-cell-specific LMP2A expression led to systemic lupus erythematosus-like autoimmune phenotypes in an age-dependent manner. Epigenetic profiling of LMP2A B cells found increased H3K27ac and H3K4me1 signals at the zinc finger and bric-a-brac, tramtrack domain-containing protein 20 locus. We conclude that LMP2A reduces the stringency of GC B-cell selection and may contribute to persistent EBV infection and pathogenesis by providing GC B cells with excessive prosurvival effects.
Reliability of Fault Tolerant Control Systems. Part 2
NASA Technical Reports Server (NTRS)
Wu, N. Eva
2000-01-01
This paper reports Part II of a two part effort that is intended to delineate the relationship between reliability and fault tolerant control in a quantitative manner. Reliability properties peculiar to fault-tolerant control systems are emphasized, such as the presence of analytic redundancy in high proportion, the dependence of failures on control performance, and high risks associated with decisions in redundancy management due to multiple sources of uncertainties and sometimes large processing requirements. As a consequence, coverage of failures through redundancy management can be severely limited. The paper proposes to formulate the fault tolerant control problem as an optimization problem that maximizes coverage of failures through redundancy management. Coverage modeling is attempted in a way that captures its dependence on the control performance and on the diagnostic resolution. Under the proposed redundancy management policy, it is shown that an enhanced overall system reliability can be achieved with a control law of a superior robustness, with an estimator of a higher resolution, and with a control performance requirement of a lesser stringency.
78 FR 55221 - Approval and Promulgation of Implementation Plans; Texas; Procedures for Stringency...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-10
..., including those pertaining to the minor permit revisions. TCEQ subsequently adopted amendments to 30 TAC 122... were adopted or submitted to the Texas SIP in this rule package. 1. Minor permit revisions at 30 TAC... approaches only to the extent that such minor permit revision procedures are explicitly provided for in the...
Relationships and Values among Students and Staff in British and German Higher Education
ERIC Educational Resources Information Center
Pritchard, Rosalind
2005-01-01
Financial stringency and neo-liberal influences in higher education are impacting upon relationships and academic values in higher education. The aim of the present paper was to analyse how these forces operate differentially in the UK and Germany. The British students were significantly more satisfied than were their German counterparts with the…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... Promulgation of Implementation Plans; State of Missouri; Control of Sulfur Emissions From Stationary Boilers.... Louis nonattainment area by limiting sulfur dioxide (SO 2 ) emissions (a precursor pollutant to PM 2.5... stringency of the SIP. Missouri's revision adds 10 CSR 10- 5.570 Control of Sulfur Emissions from Stationary...
Verde, Franco; Hruban, Ralph H; Fishman, Elliot K
Small bowel gastrointestinal stromal tumors (SB-GISTs) are rare lesions with a variable appearance on computed tomography (CT). This case series analyzes the CT enhancement pattern in relation to the histologic risk assessment of tumor progression. The local institutional pathology database was searched for SB-GISTs from 2000 to 2015. Pathology reports and clinical notes were reviewed. Imaging was qualitatively reviewed for pattern of enhancement, categorized as homogeneous or heterogeneous. Nonparametric statistical analysis was performed comparing enhancement to segment of bowel involved, presence of necrosis, tumor size, histologic grade (ie, G1 or G2), and histologic risk of progression (ie, low, moderate, high). For simplicity, risk of progression was binned into low-risk or non-low-risk groups. Twenty-six pathology-proven, first-presentation, nonmetastatic SB-GISTs were included in the study. Seventeen were located in the duodenum, 7 in the jejunum, and 2 within the ileum. Dual-phase (arterial and venous) CT imaging was available for 22 cases. Four cases did not have dual-phase imaging (three venous phase and one arterial phase only). Seventeen cases demonstrated heterogeneous enhancement and 9 cases homogeneous enhancement. A statistically significant difference was found in size between enhancement groups (3.1 cm for homogeneous versus 6.8 cm for heterogeneous) (Mann-Whitney U test, n = 26, P = 0.002). The association between presence of necrosis and enhancement group was statistically significant (Pearson χ², P = 0.001). The association between low-risk/non-low-risk group and enhancement group was also highly significant (P = 0.001). Bowel segment involvement and histologic grade versus enhancement group did not reach statistical significance (P = 0.174 and P = 0.07, respectively). This case series reveals a significant association between heterogeneous enhancement and non-low-risk (ie, moderate/high) SB-GISTs. Using the enhancement pattern, the interpreting radiologist can go beyond describing the tumor and preoperatively suggest additional prognostic information, potentially helpful for surgical planning.
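The two nonparametric comparisons reported are easy to reproduce in outline; the values below are invented, not the series data:

```python
from scipy.stats import mannwhitneyu, chi2_contingency

# Hypothetical tumor sizes (cm) by CT enhancement pattern
homogeneous = [2.1, 2.8, 3.0, 3.4, 3.6, 4.0, 2.5, 3.2, 3.3]
heterogeneous = [4.5, 5.2, 6.1, 6.8, 7.4, 8.0, 5.9, 7.1, 6.4]
u, p_size = mannwhitneyu(homogeneous, heterogeneous)
print(f"Mann-Whitney U = {u}, p = {p_size:.4f}")

# Hypothetical 2x2 table: enhancement pattern vs. presence of necrosis
table = [[8, 1],    # homogeneous: necrosis absent / present
         [4, 13]]   # heterogeneous: necrosis absent / present
chi2, p_necrosis, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_necrosis:.4f}")
```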
CombiROC: an interactive web tool for selecting accurate marker combinations of omics data.
Mazzara, Saveria; Rossi, Riccardo L; Grifantini, Renata; Donizetti, Simone; Abrignani, Sergio; Bombaci, Mauro
2017-03-30
Diagnostic accuracy can be improved considerably by combining multiple markers, whose performance in identifying diseased subjects is usually assessed via receiver operating characteristic (ROC) curves. The selection of multimarker signatures is a complicated process that requires integration of data signatures with sophisticated statistical methods. We developed a user-friendly tool, called CombiROC, to help researchers accurately determine optimal marker combinations from diverse omics methods. With CombiROC, data from different domains, such as proteomics and transcriptomics, can be analyzed using sensitivity/specificity filters: the number of candidate marker panels arising from combinatorial analysis is easily optimized, bypassing limitations imposed by the nature of different experimental approaches. Leaving the user full control over initial selection stringency, CombiROC computes sensitivity and specificity for all marker combinations, the performance of the best combinations, and ROC curves for automatic comparisons, all visualized in a graphic interface. CombiROC was designed without hard-coded thresholds, allowing a custom fit to each specific dataset: this dramatically reduces the computational burden and lowers the false negative rates given by fixed thresholds. The application was validated with published data, confirming the marker combinations originally described or even finding new ones. CombiROC is a novel tool for the scientific community, freely available at http://CombiROC.eu.
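The combinatorial core of such a tool, scoring every marker panel for sensitivity and specificity, fits in a few lines. This is a toy illustration, not the CombiROC code, and the rule that a sample is panel-positive when any member marker exceeds its threshold is just one possible combination rule:

```python
from itertools import combinations

def sens_spec(panel, thresholds, samples, labels):
    """Sensitivity/specificity of a marker panel; a sample counts as
    panel-positive when any member marker exceeds its threshold."""
    tp = fn = tn = fp = 0
    for values, diseased in zip(samples, labels):
        positive = any(values[m] > thresholds[m] for m in panel)
        if diseased:
            tp, fn = tp + positive, fn + (not positive)
        else:
            fp, tn = fp + positive, tn + (not positive)
    return tp / (tp + fn), tn / (tn + fp)

# Toy data: per-sample marker levels and disease labels
samples = [{"A": 5, "B": 1}, {"A": 1, "B": 6}, {"A": 0, "B": 1}, {"A": 1, "B": 0}]
labels = [True, True, False, False]
thresholds = {"A": 3, "B": 3}

for size in (1, 2):
    for panel in combinations(thresholds, size):
        se, sp = sens_spec(panel, thresholds, samples, labels)
        print(panel, f"sens={se:.2f} spec={sp:.2f}")
```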
Effects of a rater training on rating accuracy in a physical examination skills assessment
Weitz, Gunther; Vinzentius, Christian; Twesten, Christoph; Lehnert, Hendrik; Bonnemeier, Hendrik; König, Inke R.
2014-01-01
Background: The accuracy and reproducibility of medical skills assessment is generally low. Rater training has little or no effect. Our knowledge in this field, however, relies on studies involving video ratings of overall clinical performances. We hypothesised that a rater training focussing on the frame of reference could improve accuracy in grading the curricular assessment of a highly standardised physical head-to-toe examination. Methods: Twenty-one raters assessed the performance of 242 third-year medical students. Eleven raters had been randomly assigned to undergo a brief frame-of-reference training a few days before the assessment. 218 encounters were successfully recorded on video and re-assessed independently by three additional observers. Accuracy was defined as the concordance between the raters' grade and the median of the observers' grades. After the assessment, both students and raters filled in a questionnaire about their views on the assessment. Results: Rater training did not have a measurable influence on accuracy. However, trained raters rated significantly more stringently than untrained raters, and their overall stringency was closer to the stringency of the observers. The questionnaire indicated a higher awareness of the halo effect in the trained raters group. Although the self-assessment of the students mirrored the assessment of the raters in both groups, the students assessed by trained raters felt more discontent with their grade. Conclusions: While training had some marginal effects, it failed to have an impact on the individual accuracy. These results in real-life encounters are consistent with previous studies on rater training using video assessments of clinical performances. The high degree of standardisation in this study was not suitable to harmonize the trained raters' grading. The data support the notion that the process of appraising medical performance is highly individual. A frame-of-reference training as applied does not effectively adjust the physicians' judgement on medical students in real-life assessments. PMID:25489341
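The accuracy measure used here, concordance between a rater's grade and the median of the three observers' grades, can be sketched as an exact-match fraction (the exact concordance statistic is assumed; all grades are invented):

```python
from statistics import median

def accuracy(rater_grades, observer_grades):
    """Fraction of encounters where the rater's grade equals the median
    of the independent observers' grades for that encounter."""
    hits = sum(r == median(obs) for r, obs in zip(rater_grades, observer_grades))
    return hits / len(rater_grades)

# Hypothetical grades on a 1-5 scale for four recorded encounters
rater = [4, 3, 5, 2]
observers = [[4, 4, 5], [3, 3, 2], [4, 4, 5], [2, 3, 2]]
print(f"accuracy = {accuracy(rater, observers):.2f}")  # 3 of 4 match -> 0.75
```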
2011-01-01
Background: The purpose of this study was to design and evaluate fluorescent in situ hybridization (FISH) probes for the single-cell detection and enumeration of lactic acid bacteria, in particular organisms belonging to the major phylogenetic groups and species of oral lactobacilli and to Abiotrophia/Granulicatella. Results: As lactobacilli are known for notorious resistance to probe penetration, probe-specific assay protocols were experimentally developed to provide maximum cell wall permeability, probe accessibility, hybridization stringency, and fluorescence intensity. The new assays were then applied in a pilot study to three biofilm samples harvested from variably demineralized bovine enamel discs that had been carried in situ for 10 days by different volunteers. Best probe penetration and fluorescent labeling of reference strains were obtained after combined lysozyme and achromopeptidase treatment followed by exposure to lipase. Hybridization stringency had to be established strictly for each probe. Thereafter all probes showed the expected specificity with reference strains and labeled the anticipated morphotypes in dental plaques. Applied to in situ grown biofilms the set of probes detected only Lactobacillus fermentum and bacteria of the Lactobacillus casei group. The most cariogenic biofilm contained two orders of magnitude higher L. fermentum cell numbers than the other biofilms. Abiotrophia/Granulicatella and streptococci from the mitis group were found in all samples at high levels, whereas Streptococcus mutans was detected in only one sample in very low numbers. Conclusions: Application of these new group- and species-specific FISH probes to oral biofilm-forming lactic acid bacteria will allow a clearer understanding of the supragingival biome, its spatial architecture and of structure-function relationships implicated during plaque homeostasis and caries development. The probes should prove of value far beyond the field of oral microbiology, as many of them detect non-oral species and phylogenetic groups of importance in a variety of medical conditions and the food industry. PMID:21247450
A Fine-Grained and Privacy-Preserving Query Scheme for Fog Computing-Enhanced Location-Based Service
Yang, Xue; Yin, Fan; Tang, Xiaohu
2017-01-01
Location-based services (LBS), as one of the most popular location-awareness applications, has been further developed to achieve low latency with the assistance of fog computing. However, privacy issues remain a research challenge in the context of fog computing. Therefore, in this paper, we present a fine-grained and privacy-preserving query scheme for fog computing-enhanced location-based services, hereafter referred to as FGPQ. In particular, mobile users can obtain the fine-grained searching result satisfying not only the given spatial range but also the searching content. Detailed privacy analysis shows that our proposed scheme indeed achieves the privacy preservation for the LBS provider and mobile users. In addition, extensive performance analyses and experiments demonstrate that the FGPQ scheme can significantly reduce computational and communication overheads and ensure low latency, outperforming existing state-of-the-art schemes. Hence, our proposed scheme is more suitable for real-time LBS searching. PMID:28696395
Southern Higher Education and the 1977 State Legislatures. Regional Spotlight Vol. XXI, No. 1.
ERIC Educational Resources Information Center
Anderson, Laura
While 1977 was hardly a banner year for southern higher education in terms of legislative appropriations, it was an improvement over the most recent years. Financial stringency in recent years has given way to modest revenue gains in most southern states, with many reporting small but larger than expected surpluses. Budget increases ranged from 6…
The Impact of the Economy on Libraries and the Impact of Libraries on the Economy of New York State.
ERIC Educational Resources Information Center
Johnson, Shirley B.
New York's library system, which helps to attract desirable information related industries, is endangered by inflation and budget stringency. Library costs have increased more rapidly than the general price level. Technological improvements, cost-saving in the long run, require initial capital outlays. The growth of book collections must keep pace…
SIP Shear Walls: Cyclic Performance of High-Aspect-Ratio Segments and Perforated Walls
Vladimir Kochkin; Douglas R. Rammer; Kevin Kauffman; Thomas Williamson; Robert J. Ross
2015-01-01
The increasing stringency of energy codes and the growing market demand for more energy-efficient buildings give structural insulated panel (SIP) construction an opportunity to increase its use in commercial and residential buildings. However, shear wall aspect ratio limitations and a lack of knowledge on how to design SIPs with window and door openings are barriers to the...
ERIC Educational Resources Information Center
Harasym, Peter H.; Woloschuk, Wayne; Cunning, Leslie
2008-01-01
Physician-patient communication is a clinical skill that can be learned and has a positive impact on patient satisfaction and health outcomes. A concerted effort at all medical schools is now directed at teaching and evaluating this core skill. Student communication skills are often assessed by an Objective Structured Clinical Examination (OSCE).…
Social Spending in Latin America: The Story of the 1980s. World Bank Discussion Papers No. 106.
ERIC Educational Resources Information Center
Grosh, Margaret E.
This study traces public sector expenditures for nine Latin American countries in the 1980s in order to determine how social services and social well-being fared during the economic stringencies of the decade. The countries included are Argentina, Bolivia, Brazil, Chile, Costa Rica, the Dominican Republic, El Salvador, Jamaica, and Venezuela. The…
Video enhancement of X-ray and neutron radiographs
NASA Technical Reports Server (NTRS)
Vary, A.
1973-01-01
System was devised for displaying radiographs on television screen and enhancing fine detail in picture. System uses analog-computer circuits to process television signal from low-noise television camera. Enhanced images are displayed in black and white and can be controlled to vary degree of enhancement and magnification of details in either radiographic transparencies or opaque photographs.
Blom, Mozes P K
2015-08-05
Recently developed molecular methods enable geneticists to target and sequence thousands of orthologous loci and infer evolutionary relationships across the tree of life. Large numbers of genetic markers benefit species tree inference but visual inspection of alignment quality, as traditionally conducted, is challenging with thousands of loci. Furthermore, due to the impracticality of repeated visual inspection with alternative filtering criteria, the potential consequences of using datasets with different degrees of missing data remain nominally explored in most empirical phylogenomic studies. In this short communication, I describe a flexible high-throughput pipeline designed to assess alignment quality and filter exonic sequence data for subsequent inference. The stringency criteria for alignment quality and missing data can be adapted based on the expected level of sequence divergence. Each alignment is automatically evaluated based on the stringency criteria specified, significantly reducing the number of alignments that require visual inspection. By developing a rapid method for alignment filtering and quality assessment, the consistency of phylogenetic estimation based on exonic sequence alignments can be further explored across distinct inference methods, while accounting for different degrees of missing data.
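For a sense of what such stringency-based filtering can look like in practice, here is a minimal Python sketch (not the author's actual pipeline): each exon alignment is checked against an assumed maximum missing-data fraction and minimum length, and failures are flagged for visual inspection. Biopython's AlignIO and the `alignments/` directory layout are assumptions.

```python
# A minimal sketch of stringency-based alignment filtering; thresholds
# and file layout are assumptions, not the published pipeline's defaults.
from pathlib import Path
from Bio import AlignIO

MAX_MISSING = 0.30   # assumed: max fraction of gaps/Ns per sequence
MIN_LENGTH = 150     # assumed: minimum alignment length in bp

def passes_stringency(aln):
    """Return True if an alignment meets the missing-data and length criteria."""
    if aln.get_alignment_length() < MIN_LENGTH:
        return False
    for record in aln:
        seq = str(record.seq).upper()
        missing = sum(seq.count(c) for c in "-N?")
        if missing / len(seq) > MAX_MISSING:
            return False
    return True

kept, flagged = [], []
for path in Path("alignments").glob("*.fasta"):  # hypothetical directory
    aln = AlignIO.read(str(path), "fasta")
    (kept if passes_stringency(aln) else flagged).append(path.name)

print(f"{len(kept)} alignments retained; {len(flagged)} flagged for inspection")
```

Raising MAX_MISSING or lowering MIN_LENGTH relaxes the stringency, trading matrix completeness against the number of loci retained, which is exactly the degrees-of-missing-data question the abstract raises.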
Nealson, K. H.; Wimpee, B.; Wimpee, C.
1993-01-01
Hybridization probes specific for the luxA genes of four groups of luminous bacteria were used to screen luminous isolates obtained from the Persian Gulf, near Al Khiran, Kuwait. Nine of these isolates were identified as Vibrio harveyi, a commonly encountered planktonic isolate, while three others showed no hybridization to any of the four probes (V. harveyi, Vibrio fischeri, Photobacterium phosphoreum, or Photobacterium leiognathi) under high-stringency conditions. Polymerase chain reaction amplification was used to prepare a luxA probe against one of these isolates, K-1, and this probe was screened under high-stringency conditions against a collection of DNAs from luminous bacteria; it was found to hybridize specifically to the DNA of the species Vibrio splendidus. A probe prepared against the type strain of V. splendidus (ATCC 33369) was tested against the collection of luminous bacterial DNA preparations and against the Kuwait isolates and was found to hybridize only against the type strain and the three unidentified Kuwait isolates. Extensive taxonomic analysis by standard methods confirmed the identification of the 13 isolates. Images PMID:16349023
Hybridization parameters revisited: solutions containing SDS.
Rose, Ken; Mason, John O; Lathe, Richard
2002-07-01
Salt concentration governs nucleic acid hybridization according to the Schildkraut-Lifson equation. High concentrations of SDS are used in some common protocols, but the effects of SDS on hybridization stringency have not been reported. We investigated hybridization parameters in solutions containing SDS. With targets immobilized on nylon membranes and PCR- or transcription-generated probes, we report that the 50% dissociation temperature (Tm*) in the absence of SDS was 15°C-17°C lower than the calculated Tm. SDS had only modest effects on Tm* [1% (w/v) equating to 8 mM NaCl]. RNA/DNA hybrids were approximately 11°C more stable than DNA/DNA hybrids. Incomplete homology (69%) significantly reduced the Tm* for DNA/DNA hybrids (approximately 14°C; 0.45°C/% non-homology) but far less so for RNA/DNA hybrids (approximately 2.3°C; approximately 0.07°C/% non-homology); incomplete homology also markedly reduced the extent of hybridization. On these nylon filters, SDS had a major effect on nonspecific binding. Buffers lacking SDS, or with low salt concentration, gave high hybridization backgrounds; buffers containing SDS, or high-salt buffers, gave reproducibly low backgrounds.
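The corrections reported above lend themselves to a quick estimator. The sketch below combines a textbook long-probe Tm approximation (whose 16.6·log10[Na+] salt term traces back to Schildkraut and Lifson) with the offsets measured in this study; the base formula itself is an assumption, not taken from the paper.

```python
# Back-of-the-envelope Tm* estimator using the corrections reported above.
# The base formula (81.5 + 16.6*log10[Na+] + 0.41*%GC - 500/n) is a common
# long-probe approximation and is assumed here; the offsets come from the
# paper's measurements on nylon membranes.
import math

def tm_star(na_molar, pct_gc, length, pct_sds=0.0, rna_dna=False, pct_nonhomology=0.0):
    na_eff = na_molar + 0.008 * pct_sds          # 1% (w/v) SDS ~ 8 mM NaCl
    tm = 81.5 + 16.6 * math.log10(na_eff) + 0.41 * pct_gc - 500.0 / length
    tm -= 16.0                                   # Tm* ran ~15-17 C below calculated Tm
    if rna_dna:
        tm += 11.0                               # RNA/DNA hybrids ~11 C more stable
        tm -= 0.07 * pct_nonhomology             # weak mismatch sensitivity
    else:
        tm -= 0.45 * pct_nonhomology             # DNA/DNA mismatch penalty
    return tm

# 500-bp DNA/DNA hybrid, 50% GC, 0.3 M Na+, 1% SDS, 31% non-homology.
print(round(tm_star(0.3, 50, 500, pct_sds=1.0, pct_nonhomology=31), 1))
```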
Frederick C. Meinzer; David R. Woodruff; Danielle E. Marias; Duncan D. Smith; Katherine A. McCulloh; Ava R. Howard; Alicia L. Magedman; Josep Penuelas
2016-01-01
The concept of iso- vs. anisohydry has been used to describe the stringency of stomatal regulation of plant water potential (Ψ). However, metrics that accurately and consistently quantify species' operating ranges along a continuum of iso- to anisohydry have been elusive. Additionally, most approaches to quantifying iso/anisohydry require labour-intensive measurements...
Inactive Wells: Economic and Policy Issues
NASA Astrophysics Data System (ADS)
Krupnick, A.
2016-12-01
This paper examines the economic and policy issues associated with various types of inactive oil and gas wells. It covers the costs of decommissioning wells, and compares them to the bonding requirements on these wells, looking at a large number of states. It also reviews the detailed regulations governing treatment of inactive wells by states and the federal government and compares them according to their completeness and stringency.
ERIC Educational Resources Information Center
Wasser, Henry, Ed.
Proceedings from a conference on the economics of higher education are presented. Following an introduction by Henry Wasser and opening remarks by Mina Rees, a paper by Gareth Williams is presented. In "The Buffer Under Pressure: An Examination of the British System of Financing Higher Education in Periods of Affluence and Stringency,"…
Frederick C. Meinzer; Duncan D. Smith; David R. Woodruff; Danielle E. Marias; Katherine A. McCulloh; Ava R. Howard; Alicia L. Magedman
2017-01-01
Species' differences in the stringency of stomatal control of plant water potential represent a continuum of isohydric to anisohydric behaviours. However, little is known about how quasi-steady-state stomatal regulation of water potential may relate to dynamic behaviour of stomata and photosynthetic gas exchange in species operating at different positions along this...
ERIC Educational Resources Information Center
Wong, Vivian C.; Wing, Coady; Martin, David; Krishnamachari, Anandita
2018-01-01
When No Child Left Behind (NCLB) became law in 2002, it was viewed as an effort to create uniform standards for students and schools across the country. More than a decade later, we know surprisingly little about how states actually implemented NCLB and the extent to which state implementation decisions managed to undo the centralizing objectives…
NASA Astrophysics Data System (ADS)
Chen, Biao; Jing, Zhenxue; Smith, Andrew P.; Parikh, Samir; Parisky, Yuri
2006-03-01
Dual-energy contrast-enhanced digital mammography (DE-CEDM), which is based upon the digital subtraction of low/high-energy image pairs acquired before/after the administration of contrast agents, may provide physicians with physiologic and morphologic information on breast lesions and help characterize their probability of malignancy. This paper proposes to use only one pair of post-contrast low/high-energy images to obtain digitally subtracted dual-energy contrast-enhanced images with an optimal weighting factor deduced from simulated characteristics of the imaging chain. Based upon our previous CEDM framework, quantitative characteristics of the materials and imaging components in the x-ray imaging chain, including the x-ray tube (tungsten) spectrum, filters, breast tissues/lesions, contrast agents (non-ionized iodine solution), and selenium detector, were systematically modeled. Using the base-material (polyethylene-PMMA) decomposition method based on entrance low/high-energy x-ray spectra and breast thickness, the optimal weighting factor was calculated to cancel the contrast between fatty and glandular tissues while enhancing the contrast of iodized lesions. By contrast, previous work determined the optimal weighting factor through either a calibration step or through acquisition of a pre-contrast low/high-energy image pair. Computer simulations were conducted to determine weighting factors, lesions' contrast signal values, and dose levels as functions of x-ray techniques and breast thicknesses. Phantom and clinical feasibility studies were performed on a modified Selenia full-field digital mammography system to verify the proposed method and computer-simulated results. The resulting conclusions from the computer simulations and phantom/clinical feasibility studies will be used in the upcoming clinical study.
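The tissue-cancellation step described here reduces to a weighted log subtraction. The small sketch below (attenuation coefficients are invented placeholders, not the paper's simulated values) shows how the weighting factor can be chosen so fat and glandular pixels of equal thickness produce the same dual-energy signal, leaving iodine contrast behind.

```python
# Sketch of the weighted log-subtraction at the core of DE-CEDM; the
# attenuation coefficients below are illustrative placeholders only.
import numpy as np

mu = {                      # assumed effective linear attenuation (1/cm)
    "fat":   {"low": 0.50, "high": 0.25},
    "gland": {"low": 0.80, "high": 0.32},
}

# Weighting factor that cancels fat/gland tissue contrast in the
# log-subtracted image, leaving iodine signal enhanced.
w = (mu["gland"]["high"] - mu["fat"]["high"]) / (mu["gland"]["low"] - mu["fat"]["low"])

def de_subtract(i_low, i_high):
    """Dual-energy image: ln(high) - w * ln(low); tissue contrast cancels."""
    return np.log(i_high) - w * np.log(i_low)

# Toy 2-pixel example: one fat pixel, one gland pixel, equal thickness t = 4 cm.
t = 4.0
i_low = np.exp(-t * np.array([mu["fat"]["low"], mu["gland"]["low"]]))
i_high = np.exp(-t * np.array([mu["fat"]["high"], mu["gland"]["high"]]))
print(np.allclose(*de_subtract(i_low, i_high)))  # True: tissues cancel
```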
Improving Strategies for Low-Income Family Children's Information Literacy
ERIC Educational Resources Information Center
Zhang, Haiyan; Washington, Rodney; Yin, Jianjun
2014-01-01
This article discussed the significance of improving low-income family children's information literacy, which could improve educational quality, enhance children's self-esteem, adapt children to the future competitive world market, as well as the problems in improving low-income family children's information literacy, such as no home computer and…
ERIC Educational Resources Information Center
Deng, Weiling; Monfils, Lora
2017-01-01
Using simulated data, this study examined the impact of different levels of stringency of the valid case inclusion criterion on item response theory (IRT)-based true score equating over 5 years in the context of K-12 assessment when growth in student achievement is expected. Findings indicate that the use of the most stringent inclusion criterion…
ERIC Educational Resources Information Center
Fisher, W. Halder; And Others
The purposes of this study were to test the dependability of stated employee selection criteria, to ascertain the degree of congruence between stated and actual selection criteria, and to determine the degree of change in criteria due to "looseness" or "tightness" of the local labor market. Seven test labor market areas representing a variety of…
Evolutionary Telemetry and Command Processor (TCP) architecture
NASA Technical Reports Server (NTRS)
Schneider, John R.
1992-01-01
A low cost, modular, high performance, and compact Telemetry and Command Processor (TCP) is being built as the foundation of command and data handling subsystems for the next generation of satellites. The TCP product line will support command and telemetry requirements for small to large spacecraft and from low to high rate data transmission. It is compatible with the latest TDRSS, STDN and SGLS transponders and provides CCSDS protocol communications in addition to standard TDM formats. Its high performance computer provides computing resources for hosted flight software. Layered and modular software provides common services using standardized interfaces to applications thereby enhancing software re-use, transportability, and interoperability. The TCP architecture is based on existing standards, distributed networking, distributed and open system computing, and packet technology. The first TCP application is planned for the 94 SDIO SPAS 3 mission. The architecture enhances rapid tailoring of functions thereby reducing costs and schedules developed for individual spacecraft missions.
ERIC Educational Resources Information Center
Zwick, Rebecca
2012-01-01
Differential item functioning (DIF) analysis is a key component in the evaluation of the fairness and validity of educational tests. The goal of this project was to review the status of ETS DIF analysis procedures, focusing on three aspects: (a) the nature and stringency of the statistical rules used to flag items, (b) the minimum sample size…
Diverse patterns of genomic targeting by transcriptional regulators in Drosophila melanogaster.
Slattery, Matthew; Ma, Lijia; Spokony, Rebecca F; Arthur, Robert K; Kheradpour, Pouya; Kundaje, Anshul; Nègre, Nicolas; Crofts, Alex; Ptashkin, Ryan; Zieba, Jennifer; Ostapenko, Alexander; Suchy, Sarah; Victorsen, Alec; Jameel, Nader; Grundstad, A Jason; Gao, Wenxuan; Moran, Jennifer R; Rehm, E Jay; Grossman, Robert L; Kellis, Manolis; White, Kevin P
2014-07-01
Annotation of regulatory elements and identification of the transcription-related factors (TRFs) targeting these elements are key steps in understanding how cells interpret their genetic blueprint and their environment during development, and how that process goes awry in the case of disease. One goal of the modENCODE (model organism ENCyclopedia of DNA Elements) Project is to survey a diverse sampling of TRFs, both DNA-binding and non-DNA-binding factors, to provide a framework for the subsequent study of the mechanisms by which transcriptional regulators target the genome. Here we provide an updated map of the Drosophila melanogaster regulatory genome based on the location of 84 TRFs at various stages of development. This regulatory map reveals a variety of genomic targeting patterns, including factors with strong preferences toward proximal promoter binding, factors that target intergenic and intronic DNA, and factors with distinct chromatin state preferences. The data also highlight the stringency of the Polycomb regulatory network, and show association of the Trithorax-like (Trl) protein with hotspots of DNA binding throughout development. Furthermore, the data identify more than 5800 instances in which TRFs target DNA regions with demonstrated enhancer activity. Regions of high TRF co-occupancy are more likely to be associated with open enhancers used across cell types, while lower TRF occupancy regions are associated with complex enhancers that are also regulated at the epigenetic level. Together these data serve as a resource for the research community in the continued effort to dissect transcriptional regulatory mechanisms directing Drosophila development. © 2014 Slattery et al.; Published by Cold Spring Harbor Laboratory Press.
Correlation between Mitochondrial Reactive Oxygen and Severity of Atherosclerosis.
Dorighello, Gabriel G; Paim, Bruno A; Kiihl, Samara F; Ferreira, Mônica S; Catharino, Rodrigo R; Vercesi, Anibal E; Oliveira, Helena C F
2016-01-01
Atherosclerosis has been associated with mitochondrial dysfunction and damage. Our group demonstrated previously that hypercholesterolemic mice present increased mitochondrial reactive oxygen (mtROS) generation in several tissues and a low NADPH/NADP+ ratio. Here, we investigated whether spontaneous atherosclerosis in these mice could be modulated by treatments that replenish or spare mitochondrial NADPH, namely citrate supplementation, cholesterol synthesis inhibition, or both treatments simultaneously. Robust statistical analyses of pooled group data were performed in order to explain the variation in atherosclerosis lesion areas as related to classic atherosclerosis risk factors such as plasma lipids, obesity, and oxidative stress, including liver mtROS. Using three distinct statistical tools (univariate correlation, adjusted correlation, and multiple regression) with increasing levels of stringency, we identified a novel significant association and a model that reliably predicts the extent of atherosclerosis due to variations in mtROS. Thus, results show that atherosclerosis lesion area is positively and independently correlated with liver mtROS production rates. Based on these findings, we propose that modulation of mitochondrial redox state influences the extent of atherosclerosis.
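As a schematic of the pooled-group analysis, the following sketch fits a multiple regression of lesion area on mtROS and plasma lipids with ordinary least squares; the data are synthetic, and only the modeling pattern mirrors the study.

```python
# Illustrative multiple regression in the spirit of the analysis above:
# lesion area modeled on liver mtROS plus a classic risk factor.
# Data are synthetic; only the modeling pattern is shown.
import numpy as np

rng = np.random.default_rng(0)
n = 40
mtros = rng.normal(10, 2, n)          # mtROS production rate (arbitrary units)
lipids = rng.normal(200, 30, n)       # plasma lipids
lesion = 0.8 * mtros + 0.01 * lipids + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), mtros, lipids])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, lesion, rcond=None)
print(dict(zip(["intercept", "mtROS", "lipids"], beta.round(3))))
```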
Blow, Frederic C; Walton, Maureen A; Bohnert, Amy S B; Ignacio, Rosalinda V; Chermack, Stephen; Cunningham, Rebecca M; Booth, Brenda M; Ilgen, Mark; Barry, Kristen L
2017-08-01
To examine the efficacy of drug brief interventions (BIs) among adults presenting to a low-income urban emergency department (ED). Randomized controlled trial on drug use outcomes at 3, 6 and 12 months. Participants were assigned to (1) computer-delivered BI (Computer BI), (2) therapist-delivered, computer-guided BI (Therapist BI) or (3) enhanced usual care (EUC-ED) for drug-using adults. Participants were re-randomized after the 3-month assessment to either an adapted motivational enhancement therapy (AMET) booster or an enhanced usual care booster (EUC-B). Patients were recruited from low-income urban emergency departments (ED) in Flint, Michigan, USA: a total of 780 ED patients reporting recent drug use, 44% males, mean age = 31 years. Computer BI consisted of an interactive program guided by a virtual health counselor. Therapist BI included computer guidance. The EUC-ED conditions included review of community health and HIV prevention resources. The BIs and boosters were based on motivational interviewing, focusing on reduction of drug use and HIV risk behaviors. The primary outcome was days of drug use in the past 90 days at 6 and 12 months; secondary outcomes were weighted drug-days and days of marijuana use. Percentage changes in mean days of any drug use from baseline to 12 months were: Computer BI + EUC-B: -10.9%, P = 0.0844; Therapist BI + EUC-B: -26.7%, P = 0.0041; EUC-ED + EUC-B: -20.9%, P = 0.0011. In adjusted analyses, there was no significant interaction between ED intervention and booster AMET for primary and secondary outcomes. Compared with EUC-ED, Therapist BI reduced the number of days using any drug [95% confidence interval (CI) = -0.41, -0.07, P = 0.0422] and weighted drug-days (95% CI = -0.41, -0.08, P = 0.0283). Both Therapist and Computer BI produced significantly fewer days of marijuana use compared with EUC-ED (Therapist BI: 95% CI = -0.42, -0.06, P = 0.0104; Computer BI: 95% CI = -0.34, -0.01, P = 0.0406). Booster effects were not significant. An emergency department-based motivational brief intervention, delivered by a therapist and guided by computer, appears to reduce drug use among adults seeking emergency department care compared with enhanced usual care. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
Yarygina, M V; Kiku, P F; Gorborukova, T V
2015-01-01
The results of a socio-hygienic analysis of ecologically related skin pathology in the population of Primorsky Krai are presented. The aim of the work was to establish the patterns of distribution of ecologically related skin diseases in the various ecological bioclimatic zones of Primorsky Krai. Skin diseases (form 12, according to ICD-10) were analyzed for the main demographic groups (children, adolescents, and adults) of the population of Primorsky Krai residing in the Krai's various bioclimatic zones, with different levels of environmental stringency, in rural and urban areas over the period 2000-2013. Cause-effect relationships were established for the prevalence of skin diseases as an ecologically related pathology. The level of the prevalence of skin diseases among the population of the Primorye region depends on the bioclimatic zone, the degree of stringency of the environmental situation, and combinations of environmental factors. The prevalence of skin diseases in adults, adolescents, and children, as the organism's response to the impact of environmental parameters, is affected by a complex of factors, chief among them hygienic: the sanitary-hygienic level of ambient air pollution, the specification of chemical pollution and adverse physical factors in urban and rural settlements, and the characteristics of the state of the soil. A medical-sociological study of the lifestyle of the population was performed on the basis of a specially designed questionnaire. The questionnaire included three arrays of issues: environmental, hygienic, and social. On the basis of this medical and sociological research, using multivariate analysis and the method of correlation pleiades advanced by P. V. Terentiev, lifestyle and psycho-emotional factors, along with socio-hygienic and bioclimatic factors, were found to play an important role in the prevalence of ecologically related skin pathology in residents of Primorsky Krai. Differences in responses between urban and rural residents living in areas with varying environmental stringency confirm the proposition that the approach to solving social, hygienic, and environmental problems cannot be uniform across areas with different socio-economic situations. The results of the study allow us to determine the main directions of treatment and prevention and to develop targeted prevention programs.
Fukui, Shoichi; Iwamoto, Naoki; Tsuji, Sosuke; Umeda, Masataka; Nishino, Ayako; Nakashima, Yoshikazu; Suzuki, Takahisa; Horai, Yoshiro; Koga, Tomohiro; Kawashiri, Shin-ya; Ichinose, Kunihiro; Hirai, Yasuko; Tamai, Mami; Nakamura, Hideki; Origuchi, Tomoki; Kawakami, Atsushi
2015-01-01
A 55-year-old man was diagnosed with remitting seronegative symmetrical synovitis with pitting edema (RS3PE) syndrome. Contrast-enhanced computed tomography for cancer screening showed a mass with low-density centers with an enhanced rim in the left iliopsoas muscle. We suspected an iliopsoas abscess and performed computed-tomography-guided puncture of the mass. Both Gram staining and the culture of the fluid were negative. We diagnosed the patient with RS3PE syndrome with iliopsoas bursitis and administered low-dose corticosteroids without antibiotics. The symptoms, including left hip pain, quickly disappeared following treatment. Clinicians should be aware that iliopsoas bursitis may resemble an iliopsoas abscess. As a result, it is important to make an accurate differential diagnosis.
Low-Light Image Enhancement Using Adaptive Digital Pixel Binning
Yoo, Yoonjong; Im, Jaehyun; Paik, Joonki
2015-01-01
This paper presents an image enhancement algorithm for low-light scenes in an environment with insufficient illumination. Simple amplification of intensity exhibits various undesired artifacts: noise amplification, intensity saturation, and loss of resolution. In order to enhance low-light images without undesired artifacts, a novel digital binning algorithm is proposed that considers brightness, context, noise level, and anti-saturation of a local region in the image. The proposed algorithm does not require any modification of the image sensor or additional frame-memory; it needs only two line-memories in the image signal processor (ISP). Since the proposed algorithm does not use an iterative computation, it can be easily embedded in an existing digital camera ISP pipeline containing a high-resolution image sensor. PMID:26121609
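A toy version of the idea, assuming a simple weighting curve that is not taken from the paper: dark pixels are blended toward their 3x3 neighborhood mean (noise averaging) and amplified, while bright pixels keep full resolution and are clamped against saturation.

```python
# A toy brightness-adaptive binning: dark regions borrow signal from their
# neighborhood, bright regions keep full resolution. The weighting curve
# and gain ceiling are assumptions for illustration.
import numpy as np

def adaptive_binning(img, sat=255.0):
    """img: 2-D float array. Returns a locally binned low-light enhancement."""
    # 3x3 neighborhood mean via padded shifts (no SciPy dependency).
    p = np.pad(img, 1, mode="edge")
    neigh = sum(p[i:i + img.shape[0], j:j + img.shape[1]]
                for i in range(3) for j in range(3)) / 9.0
    # Binning weight: 1 in the darkest areas, 0 near saturation.
    w = np.clip(1.0 - img / sat, 0.0, 1.0)
    gain = 1.0 + 3.0 * w                # amplify dark pixels up to 4x
    out = (1 - w) * img + w * neigh     # bin where dark, keep detail where bright
    return np.clip(out * gain, 0.0, sat)

noisy = np.clip(np.random.default_rng(1).normal(20, 5, (64, 64)), 0, 255)
print(adaptive_binning(noisy).mean() > noisy.mean())  # True: dark scene lifted
```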
Stringency of the 2-His–1-Asp Active-Site Motif in Prolyl 4-Hydroxylase
Gorres, Kelly L.; Pua, Khian Hong; Raines, Ronald T.
2009-01-01
The non-heme iron(II) dioxygenase family of enzymes contains a common 2-His–1-carboxylate iron-binding motif. These enzymes catalyze a wide variety of oxidative reactions, such as the hydroxylation of aliphatic C–H bonds. Prolyl 4-hydroxylase (P4H) is an α-ketoglutarate-dependent iron(II) dioxygenase that catalyzes the post-translational hydroxylation of proline residues in protocollagen strands, stabilizing the ensuing triple helix. Human P4H residues His412, Asp414, and His483 have been identified as an iron-coordinating 2-His–1-carboxylate motif. Enzymes that catalyze oxidative halogenation do so by a mechanism similar to that of P4H. These halogenases retain the active-site histidine residues, but the carboxylate ligand is replaced with a halide ion. We replaced Asp414 of P4H with alanine (to mimic the active site of a halogenase) and with glycine. These substitutions do not, however, convert P4H into a halogenase. Moreover, the hydroxylase activity of D414A P4H cannot be rescued with small molecules. In addition, rearranging the two His and one Asp residues in the active site eliminates hydroxylase activity. Our results demonstrate a high stringency for the iron-binding residues in the P4H active site. We conclude that P4H, which catalyzes an especially demanding chemical transformation, is recalcitrant to change. PMID:19890397
Developing an alcohol policy assessment toolkit: application in the western Pacific.
Carragher, Natacha; Byrnes, Joshua; Doran, Christopher M; Shakeshaft, Anthony
2014-10-01
To demonstrate the development and feasibility of a tool to assess the adequacy of national policies aimed at reducing alcohol consumption and related problems. We developed a quantitative tool - the Toolkit for Evaluating Alcohol policy Stringency and Enforcement (TEASE-16) - to assess the level of stringency and enforcement of 16 alcohol control policies. TEASE-16 was applied to policy data from nine study areas in the western Pacific: Australia, China excluding Hong Kong Special Administrative Region (SAR), Hong Kong SAR, Japan, Malaysia, New Zealand, the Philippines, Singapore and Viet Nam. Correlation and regression analyses were then used to examine the relationship between alcohol policy scores and income-adjusted levels of alcohol consumption per capita. Vast differences exist in how alcohol control policies are implemented in the western Pacific. Out of a possible 100 points, the nine study areas achieved TEASE-16 scores that ranged from 24.1 points for the Philippines to 67.5 points for Australia. Study areas with high policy scores - indicating relatively strong alcohol policy frameworks - had lower alcohol consumption per capita. Sensitivity analyses indicated scores and rankings for each study area remained relatively stable across different weighting schemes, indicating that TEASE-16 was robust. TEASE-16 could be used by international and national regulatory bodies and policy-makers to guide the design, implementation, evaluation and refinement of effective policies to reduce alcohol consumption and related problems.
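The scoring logic can be sketched as stringency times enforcement per policy, scaled to a 0-100 range. In the example below, the policy list, weights, and consumption figures are invented for illustration (only the 24.1- and 67.5-point endpoints come from the abstract), so it shows the shape of the computation rather than TEASE-16 itself.

```python
# Hypothetical scoring in the TEASE-16 spirit: each policy contributes
# stringency (0-1) x enforcement (0-1) x weight, scaled to 100.
# Policies, weights, and consumption data are invented for illustration.
import numpy as np

policies = {                 # name: (stringency, enforcement, weight)
    "minimum purchase age": (0.9, 0.7, 1.0),
    "outlet density limits": (0.5, 0.6, 1.0),
    "advertising restrictions": (0.4, 0.5, 1.0),
}

def policy_score(p):
    total = sum(w for _, _, w in p.values())
    return 100.0 * sum(s * e * w for s, e, w in p.values()) / total

# Five hypothetical study areas; 24.1 and 67.5 are the abstract's endpoints.
scores = np.array([24.1, 31.0, 45.2, 55.7, 67.5])
consumption = np.array([9.8, 8.9, 7.1, 6.0, 4.9])        # assumed litres per capita
print(round(policy_score(policies), 1))
print(round(np.corrcoef(scores, consumption)[0, 1], 2))  # negative association
```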
miR-MaGiC improves quantification accuracy for small RNA-seq.
Russell, Pamela H; Vestal, Brian; Shi, Wen; Rudra, Pratyaydipta D; Dowell, Robin; Radcliffe, Richard; Saba, Laura; Kechris, Katerina
2018-05-15
Many tools have been developed to profile microRNA (miRNA) expression from small RNA-seq data. These tools must contend with several issues: the small size of miRNAs, the small number of unique miRNAs, the fact that similar miRNAs can be transcribed from multiple loci, and the presence of miRNA isoforms known as isomiRs. Methods failing to address these issues can return misleading information. We propose a novel quantification method designed to address these concerns. We present miR-MaGiC, a novel miRNA quantification method, implemented as a cross-platform tool in Java. miR-MaGiC performs stringent mapping to a core region of each miRNA and defines a meaningful set of target miRNA sequences by collapsing the miRNA space to "functional groups". We hypothesize that these two features, mapping stringency and collapsing, provide more optimal quantification to a more meaningful unit (i.e., miRNA family). We test miR-MaGiC and several published methods on 210 small RNA-seq libraries, evaluating each method's ability to accurately reflect global miRNA expression profiles. We define accuracy as total counts close to the total number of input reads originating from miRNAs. We find that miR-MaGiC, which incorporates both stringency and collapsing, provides the most accurate counts.
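The "collapsing" step can be pictured as a name normalization: strip the arm suffix, locus copy number, and lineage letter so paralogues producing near-identical mature sequences count as one functional group. The regex rules below are a hypothetical approximation, not miR-MaGiC's actual grouping logic.

```python
# A rough illustration of collapsing miRNA space to functional groups.
# The regex rules are assumptions, not miR-MaGiC's published grouping.
import re
from collections import Counter

def functional_group(name):
    name = re.sub(r"-(3p|5p)$", "", name)      # drop arm annotation
    name = re.sub(r"-\d+$", "", name)          # drop locus copy number (miR-29b-1)
    name = re.sub(r"[a-z]+$", "", name)        # drop lineage letter (miR-29b -> miR-29)
    return name

reads = ["hsa-miR-29a-3p", "hsa-miR-29b-1-5p", "hsa-miR-29c-3p", "hsa-let-7a-5p"]
counts = Counter(functional_group(r) for r in reads)
print(counts)   # Counter({'hsa-miR-29': 3, 'hsa-let-7': 1})
```

Counting at the group level sidesteps the ambiguity of reads that map equally well to several near-identical family members, which is the motivation the abstract gives for collapsing.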
A Low-Tech, Hands-On Approach To Teaching Sorting Algorithms to Working Students.
ERIC Educational Resources Information Center
Dios, R.; Geller, J.
1998-01-01
Focuses on identifying the educational effects of "activity oriented" instructional techniques. Examines which instructional methods produce enhanced learning and comprehension. Discusses the problem of learning "sorting algorithms," a major topic in every Computer Science curriculum. Presents a low-tech, hands-on teaching method for sorting…
Content analysis of antiretroviral adherence enhancing interview reports.
Kamal, Susan; Nulty, Paul; Bugnon, Olivier; Cavassini, Matthias; Schneider, Marie P
2018-05-17
To identify factors associated with low or high antiretroviral (ARV) adherence through computational text analysis of interview reports from an adherence-enhancing programme. Using text from 8428 interviews with 522 patients, we constructed a term-frequency matrix for each patient, retaining words that occurred at least ten times overall and were used in at least six interviews with six different patients. The text included both the pharmacist's and the patient's verbalizations. We investigated their association with an adherence threshold (above or below 90%) using a regularized logistic regression model. In addition to this data-driven approach, we studied the contexts of words with a focus group. The analysis resulted in 7608 terms associated with low or high adherence. Terms associated with low adherence included disruption in daily schedule, side effects, socio-economic factors, stigma, cognitive factors, and smoking. Terms associated with high adherence included fixed medication intake timing, no side effects, and a positive psychological state. Computational text analysis helps to analyze a large corpus of adherence-enhancing interviews. It confirms the main known themes affecting ARV adherence and sheds light on newly emerging themes. Health care providers should be aware of factors that are associated with low or high adherence. This knowledge should reinforce the supporting factors and help resolve the barriers together with the patient. Copyright © 2018 Elsevier B.V. All rights reserved.
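The analysis pattern (a thresholded term-frequency matrix followed by regularized logistic regression against a 90% adherence cutoff) can be sketched with scikit-learn. Everything below, from texts to hyperparameters, is synthetic; only the workflow follows the description above.

```python
# Minimal sketch of the analysis pattern: per-patient term frequencies
# with an occurrence threshold, then L2-regularized logistic regression
# against an adherence label. Data are synthetic.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

docs = [                        # one concatenated interview text per patient
    "side effects nausea forgot dose travel schedule disrupted",
    "routine fixed intake morning no side effects feeling positive",
    "stigma hiding pills work stress smoking missed weekend doses",
    "alarm reminder stable routine supportive partner no side effects",
]
adherent = [0, 1, 0, 1]         # 1 = >=90% adherence

# min_df plays the role of the occurrence thresholds used in the study.
vec = CountVectorizer(min_df=1)
X = vec.fit_transform(docs)

model = LogisticRegression(penalty="l2", C=1.0).fit(X, adherent)
terms = vec.get_feature_names_out()
weights = sorted(zip(model.coef_[0], terms))
print("low-adherence terms:", [t for _, t in weights[:3]])
print("high-adherence terms:", [t for _, t in weights[-3:]])
```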
Evaluating biomarkers for prognostic enrichment of clinical trials.
Kerr, Kathleen F; Roth, Jeremy; Zhu, Kehao; Thiessen-Philbrook, Heather; Meisner, Allison; Wilson, Francis Perry; Coca, Steven; Parikh, Chirag R
2017-12-01
A potential use of biomarkers is to assist in prognostic enrichment of clinical trials, where only patients at relatively higher risk for an outcome of interest are eligible for the trial. We investigated methods for evaluating biomarkers for prognostic enrichment. We identified five key considerations in evaluating a biomarker and a screening threshold for prognostic enrichment: (1) clinical trial sample size, (2) calendar time to enroll the trial, (3) total patient screening costs and total per-patient trial costs, (4) generalizability of trial results, and (5) ethical evaluation of trial eligibility criteria. Items (1)-(3) are amenable to quantitative analysis. We developed the Biomarker Prognostic Enrichment Tool for evaluating biomarkers for prognostic enrichment at varying levels of screening stringency. We demonstrate that both modestly prognostic and strongly prognostic biomarkers can improve trial metrics using Biomarker Prognostic Enrichment Tool. Biomarker Prognostic Enrichment Tool is available as a webtool at http://prognosticenrichment.com and as a package for the R statistical computing platform. In some clinical settings, even biomarkers with modest prognostic performance can be useful for prognostic enrichment. In addition to the quantitative analysis provided by Biomarker Prognostic Enrichment Tool, investigators must consider the generalizability of trial results and evaluate the ethics of trial eligibility criteria.
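Consideration (1) can be made concrete with the standard two-proportion sample-size formula: enriching on a prognostic biomarker raises the control-arm event rate, which shrinks the required trial. The event rates and effect size below are hypothetical.

```python
# Sketch of how prognostic enrichment shrinks required sample size.
# Standard two-proportion formula; all rates and effects are hypothetical.
from scipy.stats import norm

def n_per_arm(p_control, rel_risk_reduction, alpha=0.05, power=0.8):
    p_treat = p_control * (1 - rel_risk_reduction)
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    var = p_control * (1 - p_control) + p_treat * (1 - p_treat)
    return int(round(z**2 * var / (p_control - p_treat) ** 2))

# Unselected population: 10% event rate. After screening with a prognostic
# biomarker, eligible patients have a 25% event rate (assumed).
print(n_per_arm(0.10, 0.25))   # ~2000 per arm without enrichment
print(n_per_arm(0.25, 0.25))   # ~680 per arm with enrichment
```

The trade-off the abstract highlights is visible here: the enriched trial is far smaller, but the savings must be weighed against screening costs, enrollment time, and the narrower population to which results generalize.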
Simple Penalties on Maximum-Likelihood Estimates of Genetic Parameters to Reduce Sampling Variation
Meyer, Karin
2016-01-01
Multivariate estimates of genetic parameters are subject to substantial sampling variation, especially for smaller data sets and more than a few traits. A simple modification of standard, maximum-likelihood procedures for multivariate analyses to estimate genetic covariances is described, which can improve estimates by substantially reducing their sampling variances. This is achieved by maximizing the likelihood subject to a penalty. Borrowing from Bayesian principles, we propose a mild, default penalty—derived assuming a Beta distribution of scale-free functions of the covariance components to be estimated—rather than laboriously attempting to determine the stringency of penalization from the data. An extensive simulation study is presented, demonstrating that such penalties can yield very worthwhile reductions in loss, i.e., the difference from population values, for a wide range of scenarios and without distorting estimates of phenotypic covariances. Moreover, mild default penalties tend not to increase loss in difficult cases and, on average, achieve reductions in loss of similar magnitude to computationally demanding schemes to optimize the degree of penalization. Pertinent details required for the adaptation of standard algorithms to locate the maximum of the likelihood function are outlined. PMID:27317681
Mechanisms and Evolution of Control Logic in Prokaryotic Transcriptional Regulation
van Hijum, Sacha A. F. T.; Medema, Marnix H.; Kuipers, Oscar P.
2009-01-01
Summary: A major part of organismal complexity and versatility of prokaryotes resides in their ability to fine-tune gene expression to adequately respond to internal and external stimuli. Evolution has been very innovative in creating intricate mechanisms by which different regulatory signals operate and interact at promoters to drive gene expression. The regulation of target gene expression by transcription factors (TFs) is governed by control logic brought about by the interaction of regulators with TF binding sites (TFBSs) in cis-regulatory regions. A factor that in large part determines the strength of the response of a target to a given TF is motif stringency, the extent to which the TFBS fits the optimal TFBS sequence for a given TF. Advances in high-throughput technologies and computational genomics allow reconstruction of transcriptional regulatory networks in silico. To optimize the prediction of transcriptional regulatory networks, i.e., to separate direct regulation from indirect regulation, a thorough understanding of the control logic underlying the regulation of gene expression is required. This review summarizes the state of the art of the elements that determine the functionality of TFBSs by focusing on the molecular biological mechanisms and evolutionary origins of cis-regulatory regions. PMID:19721087
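Motif stringency has a direct computational reading: a TFBS is scored against a position weight matrix, and the score threshold sets how closely a site must match the optimal sequence. The toy motif, sites, and thresholds below are invented for illustration.

```python
# Toy illustration of motif stringency: log-odds PWM scoring of candidate
# TFBSs at two thresholds. Motif and sequences are invented.
import math

# Position frequency matrix for a hypothetical 4-bp motif (consensus TGCA).
pfm = {
    "A": [0.1, 0.1, 0.1, 0.7],
    "C": [0.1, 0.1, 0.7, 0.1],
    "G": [0.1, 0.7, 0.1, 0.1],
    "T": [0.7, 0.1, 0.1, 0.1],
}
background = 0.25

def pwm_score(site):
    """Log-odds score of a site versus a uniform background."""
    return sum(math.log2(pfm[b][i] / background) for i, b in enumerate(site))

sites = ["TGCA", "TGCC", "AGCA", "ACGT"]
for threshold in (5.0, 2.5):          # high vs. low stringency
    hits = [s for s in sites if pwm_score(s) >= threshold]
    print(f"threshold {threshold}: {hits}")
```

At the high threshold only the consensus site passes; relaxing it admits single-mismatch sites, mirroring how lower motif stringency yields weaker but broader regulatory responses.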
ERIC Educational Resources Information Center
Hadi, Marham Jupri
2013-01-01
The researcher's observations of his ESL class indicate three main issues concerning writing skills: learners' low motivation to write, minimal interaction in writing, and poor writing skills. These limitations have led them to be less confident writing in English. This article discusses how computers can be used for the purpose of increasing…
ERIC Educational Resources Information Center
Fichten, Catherine S.; Asuncion, Jennison V.; Barile, Maria; Ferraro, Vittoria; Wolforth, Joan
2009-01-01
This article presents the results of two studies on the accessibility of e-learning materials and other information and computer and communication technologies for 143 Canadian college and university students with low vision and 29 who were blind. It offers recommendations for enhancing access, creating new learning opportunities, and eliminating…
The Use of Spatialized Speech in Auditory Interfaces for Computer Users Who Are Visually Impaired
ERIC Educational Resources Information Center
Sodnik, Jaka; Jakus, Grega; Tomazic, Saso
2012-01-01
Introduction: This article reports on a study that explored the benefits and drawbacks of using spatially positioned synthesized speech in auditory interfaces for computer users who are visually impaired (that is, are blind or have low vision). The study was a practical application of such systems--an enhanced word processing application compared…
Power Efficient Hardware Architecture of SHA-1 Algorithm for Trusted Mobile Computing
NASA Astrophysics Data System (ADS)
Kim, Mooseop; Ryou, Jaecheol
The Trusted Mobile Platform (TMP) is developed and promoted by the Trusted Computing Group (TCG), an industry standards body working to enhance the security of the mobile computing environment. The built-in SHA-1 engine in TMP is one of the most important circuit blocks and contributes to the performance of the whole platform because it is used as a key primitive supporting platform integrity and command authentication. Mobile platforms have very stringent limitations with respect to available power, physical circuit area, and cost. Therefore, special architecture and design methods for a low-power SHA-1 circuit are required. In this paper, we present a novel and efficient hardware architecture of a low-power SHA-1 design for TMP. Our low-power SHA-1 hardware can compute a 512-bit data block using fewer than 7,000 gates and has a power consumption of about 1.1 mA on a 0.25μm CMOS process.
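One memory-saving trick relevant to compact SHA-1 implementations is the 16-word circular message schedule: instead of expanding all 80 schedule words, W[t] is computed in place from a rolling 16-word buffer. The Python sketch below demonstrates that trick on a single 512-bit block and checks itself against hashlib; it illustrates the algorithm only and makes no claim about this paper's circuit.

```python
# Educational single-block SHA-1 using a rolling 16-word schedule buffer,
# the memory-saving pattern common in compact hardware designs.
# Illustrative only; not the paper's architecture.
import hashlib, struct

def rotl(x, n):
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

def sha1_one_block(msg: bytes) -> str:
    assert len(msg) <= 55, "sketch handles a single 512-bit block"
    block = msg + b"\x80" + b"\x00" * (55 - len(msg)) + struct.pack(">Q", 8 * len(msg))
    w = list(struct.unpack(">16I", block))          # only 16 words of storage
    h = [0x67452301, 0xEFCDAB89, 0x98BADCFE, 0x10325476, 0xC3D2E1F0]
    a, b, c, d, e = h
    for t in range(80):
        if t >= 16:                                  # rolling schedule, in place:
            w[t % 16] = rotl(w[(t - 3) % 16] ^ w[(t - 8) % 16]
                             ^ w[(t - 14) % 16] ^ w[t % 16], 1)
        if t < 20:   f, k = (b & c) | (~b & d), 0x5A827999
        elif t < 40: f, k = b ^ c ^ d, 0x6ED9EBA1
        elif t < 60: f, k = (b & c) | (b & d) | (c & d), 0x8F1BBCDC
        else:        f, k = b ^ c ^ d, 0xCA62C1D6
        # w[t % 16] now holds W[t] for this round
        a, b, c, d, e = (rotl(a, 5) + f + e + k + w[t % 16]) & 0xFFFFFFFF, a, rotl(b, 30), c, d
    digest = [(x + y) & 0xFFFFFFFF for x, y in zip(h, [a, b, c, d, e])]
    return "".join(f"{x:08x}" for x in digest)

print(sha1_one_block(b"abc") == hashlib.sha1(b"abc").hexdigest())  # True
```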
Bhateja, Vikrant; Moin, Aisha; Srivastava, Anuja; Bao, Le Nguyen; Lay-Ekuakille, Aimé; Le, Dac-Nhuong
2016-07-01
Computer-based diagnosis of Alzheimer's disease can be performed through analysis of the functional and structural changes in the brain. Multispectral image fusion combines complementary information while discarding surplus information to achieve a solitary image which encloses both spatial and spectral details. This paper presents a Non-Sub-sampled Contourlet Transform (NSCT) based multispectral image fusion model for computer-aided diagnosis of Alzheimer's disease. The proposed fusion methodology involves color transformation of the input multispectral image. The multispectral image in YIQ color space is decomposed using NSCT, followed by dimensionality reduction using a modified Principal Component Analysis algorithm on the low frequency coefficients. Further, the high frequency coefficients are enhanced using a non-linear enhancement function. Two different fusion rules are then applied to the low-pass and high-pass sub-bands: phase congruency is applied to low frequency coefficients, and a combination of directive contrast and normalized Shannon entropy is applied to high frequency coefficients. The superiority of the fusion response is depicted by comparisons made with other state-of-the-art fusion approaches (in terms of various fusion metrics).
NASA Technical Reports Server (NTRS)
Kim, B. F.; Moorjani, K.; Phillips, T. E.; Adrian, F. J.; Bohandy, J.; Dolecek, Q. E.
1993-01-01
A method for characterization of granular superconducting thin films has been developed which encompasses both the morphological state of the sample and its fabrication process parameters. The broad scope of this technique is due to the synergism between experimental measurements and their interpretation using numerical simulation. Two novel technologies form the substance of this system: the magnetically modulated resistance method for characterizing superconductors; and a powerful new computer peripheral, the Parallel Information Processor card, which provides enhanced computing capability for PC computers. This enhancement allows PC computers to operate at speeds approaching that of supercomputers. This makes atomic scale simulations possible on low cost machines. The present development of this system involves the integration of these two technologies using mesoscale simulations of thin film growth. A future stage of development will incorporate atomic scale modeling.
In Vivo EPR Resolution Enhancement Using Techniques Known from Quantum Computing Spin Technology.
Rahimi, Robabeh; Halpern, Howard J; Takui, Takeji
2017-01-01
A crucial issue with in vivo biological/medical EPR is its low signal-to-noise ratio, which gives rise to low spectroscopic resolution. We propose quantum hyperpolarization techniques based on 'Heat Bath Algorithmic Cooling', allowing possible approaches for improving the resolution in magnetic resonance spectroscopy and imaging.
Adjustable typography: an approach to enhancing low vision text accessibility.
Arditi, Aries
2004-04-15
Millions of people have low vision, a disability condition caused by uncorrectable or partially correctable disorders of the eye. The primary goal of low vision rehabilitation is increasing access to printed material. This paper describes how adjustable typography, a computer graphic approach to enhancing text accessibility, can play a role in this process, by allowing visually-impaired users to customize fonts to maximize legibility according to their own visual needs. Prototype software and initial testing of the concept is described. The results show that visually-impaired users tend to produce a variety of very distinct fonts, and that the adjustment process results in greatly enhanced legibility. But this initial testing has not yet demonstrated increases in legibility over and above the legibility of highly legible standard fonts such as Times New Roman.
Boonyapakron, Katewadee; Jaruwat, Aritsara; Liwnaree, Benjamas; Nimchua, Thidarat; Champreda, Verawat; Chitnumsub, Penchit
2017-10-10
In the pulp bleaching industry, enzymes with robust activity at high pH and temperatures are desirable for facilitating the pre-bleaching process with simplified processing and minimal use of chlorinated compounds. To engineer an enzyme for this purpose, we determined the crystal structure of the Xyn12.2 xylanase, a xylan-hydrolyzing enzyme derived from the termite gut symbiont metagenome, as the basis for structure-based protein engineering to improve Xyn12.2 stability in high heat and alkaline conditions. Engineered cysteine pairs that generated exterior disulfide bonds increased the kcat of Xyn12.2 variants and melting temperature at all tested conditions. These improvements led to up to 4.2-fold increases in catalytic efficiency at pH 9.0, 50°C for 1 h and up to 3-fold increases at 60°C. The most effective variants, XynTT and XynTTTE, exhibited 2-3-fold increases in bagasse hydrolysis at pH 9.0 and 60°C compared to the wild-type enzyme. Overall, engineering arginines and phenylalanines for increased pKa and hydrogen bonding improved enzyme catalytic efficiency at high stringency conditions. These modifications were the keys to enhancing thermostability and alkaliphilicity in our enzyme variants, with XynTT and XynTTTE being especially promising for their application to the pulp and paper industry. Copyright © 2017 Elsevier B.V. All rights reserved.
Crane, Michael; Steinwand, Dan; Beckmann, Tim; Krpan, Greg; Liu, Shu-Guang; Nichols, Erin; Haga, Jim; Maddox, Brian; Bilderback, Chris; Feller, Mark; Homer, George
2001-01-01
The overarching goal of this project is to build a spatially distributed infrastructure for information science research by forming a team of information science researchers and providing them with similar hardware and software tools to perform collaborative research. Four geographically distributed Centers of the U.S. Geological Survey (USGS) are developing their own clusters of low-cost, personal computers into parallel computing environments that provide a cost-effective way for the USGS to increase participation in the high-performance computing community. Referred to as Beowulf clusters, these hybrid systems provide the robust computing power required for conducting information science research into parallel computing systems and applications.
Infection control practices and zoonotic disease risks among veterinarians in the United States.
Wright, Jennifer G; Jung, Sherry; Holman, Robert C; Marano, Nina N; McQuiston, Jennifer H
2008-06-15
OBJECTIVE-To assess the knowledge and use of infection control practices (ICPs) among US veterinarians. DESIGN-Anonymous mail-out population survey. PROCEDURES-In 2005 a questionnaire was mailed to US small animal, large animal, and equine veterinarians who were randomly selected from the AVMA membership to assess precaution awareness (PA) and veterinarians' perceptions of zoonotic disease risks. Respondents were assigned a PA score (0 to 4) on the basis of their responses (higher scores representing higher stringency of ICPs); within a practice type, respondents' scores were categorized as being within the upper 25% or lower 75% of scores (high and low PA ranking, respectively). Characteristics associated with low PA rankings were assessed. RESULTS-Generally, respondents did not engage in protective behaviors or use personal protective equipment considered appropriate to protect against zoonotic disease transmission. Small animal and equine veterinarians employed in practices that had no written infection control policy were significantly more likely to have low PA ranking. Male gender was associated with low PA ranking among small animal and large animal veterinarians; equine practitioners not working in a teaching or referral hospital were more likely to have low PA ranking than equine practitioners working in such institutions. CONCLUSIONS AND CLINICAL RELEVANCE-Results indicated that most US veterinarians are not aware of appropriate personal protective equipment use and do not engage in practices that may help reduce zoonotic disease transmission. Gender differences may influence personal choices for ICPs. Provision of information and training on ICPs and establishment of written infection control policies could be effective means of improving ICPs in veterinary practices.
Image Analysis via Soft Computing: Prototype Applications at NASA KSC and Product Commercialization
NASA Technical Reports Server (NTRS)
Dominguez, Jesus A.; Klinko, Steve
2011-01-01
This slide presentation reviews the use of "soft computing," which differs from "hard computing" in that it is more tolerant of imprecision, partial truth, uncertainty, and approximation, and its use in image analysis. Soft computing provides flexible information processing to handle real-life ambiguous situations and achieve tractability, robustness, low solution cost, and a closer resemblance to human decision making. Several systems are or have been developed: Fuzzy Reasoning Edge Detection (FRED), Fuzzy Reasoning Adaptive Thresholding (FRAT), image enhancement techniques, and visual/pattern recognition. These systems are compared with examples that show the effectiveness of each. The NASA applications reviewed are: Real-Time (RT) Anomaly Detection, Real-Time (RT) Moving Debris Detection, and the Columbia investigation. The RT anomaly detection reviewed the case of a damaged cable for the emergency egress system. The use of these techniques is further illustrated in the Columbia investigation with the location and detection of foam debris. There are several applications in commercial usage: image enhancement, human screening and privacy protection, visual inspection, 3D heart visualization, tumor detection, and X-ray image enhancement.
Rapp, Martin; Ley, Charles J; Hansson, Kerstin; Sjöström, Lennart
2017-03-20
To describe postoperative computed tomography (CT) and magnetic resonance imaging (MRI) findings in dogs with degenerative lumbosacral stenosis (DLSS) treated by dorsal laminectomy and partial discectomy. Prospective clinical case study of dogs diagnosed with and treated for DLSS. Surgical and clinical findings were described. Computed tomography and low field MRI findings pre- and postoperatively were described and graded. Clinical, CT and MRI examinations were performed four to 18 months after surgery. Eleven of 13 dogs were clinically improved and two dogs had unchanged clinical status postoperatively despite imaging signs of neural compression. Vacuum phenomenon, spondylosis, sclerosis of the seventh lumbar (L7) and first sacral (S1) vertebrae endplates and lumbosacral intervertebral joint osteoarthritis became more frequent in postoperative CT images. Postoperative MRI showed mild disc extrusions in five cases, and in all cases contrast enhancing non-discal tissue was present. All cases showed contrast enhancement of the L7 spinal nerves both pre- and postoperatively and seven had contrast enhancement of the lumbosacral intervertebral joints and paraspinal tissue postoperatively. Articular process fractures or fissures were noted in four dogs. The study indicates that imaging signs of neural compression are common after DLSS surgery, even in dogs that have clinical improvement. Contrast enhancement of spinal nerves and soft tissues around the region of disc herniation is common both pre- and postoperatively and thus are unreliable criteria for identifying complications of the DLSS surgery.
Computer graphics application in the engineering design integration system
NASA Technical Reports Server (NTRS)
Glatt, C. R.; Abel, R. W.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Stewart, W. A.
1975-01-01
The computer graphics aspect of the Engineering Design Integration (EDIN) system and its application to design problems were discussed. Three basic types of computer graphics may be used with the EDIN system for the evaluation of aerospace vehicle preliminary designs: offline graphics systems using vellum-inking or photographic processes, online graphics systems characterized by direct-coupled, low-cost storage tube terminals with limited interactive capabilities, and a minicomputer-based refresh terminal offering highly interactive capabilities. The offline systems are characterized by high quality (resolution better than 0.254 mm) and slow turnaround (one to four days). The online systems are characterized by low cost, instant visualization of the computer results, slow line speed (300 BAUD), poor hard copy, and early limitations on vector graphic input capabilities. The recent acquisition of the Adage 330 Graphic Display system has greatly enhanced the potential for interactive computer-aided design.
Comparing climate policy co-benefits in the United States and China
NASA Astrophysics Data System (ADS)
Selin, N. E.; Li, M.; Saari, R.; Zhang, D.; Karplus, V. J.; Li, C. T.; Thompson, T. M.; Mulvaney, K. M.; Rausch, S.
2016-12-01
We use modeling approaches that integrate atmospheric chemistry, economic analysis, and health impacts calculations to evaluate and compare the co-benefits of greenhouse gas reduction policies for the U.S. and China. Climate policies can have a variety of co-benefits for air quality, in particular for concentrations of the health-damaging pollutants O3 and PM2.5 as well as toxic pollutants such as mercury. Controlling CO2 sources such as power plants and vehicles can lead to concomitant reductions in other pollutants such as SO2, NOx, and Hg. Here, we present and discuss our evaluations of co-benefits in the U.S. and China, at both national and regional (state/provincial) scales. In particular, we assess how policy design, stringency, and energy system characteristics affect projected co-benefits and their valuation for human health. We find that at the national scale in both the U.S. and China, monetized co-benefits can offset the costs of carbon policies. U.S. co-benefits exhibit diminishing returns to projected increases in policy stringency, while Chinese co-benefits show different behavior. The magnitude of health-related co-benefits is sensitive to the economic valuation methodology and to the choice of health impact function, with China-specific functions yielding substantially less co-benefit than those typically used in the U.S. In both cases, we see overall national-scale co-benefits but substantial regional variation at sub-national scales, illustrating the benefits of coupled atmospheric-economic analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chaturvedi, Vaibhav; Clarke, Leon E.; Edmonds, James A.
Electrification plays a crucial role in cost-effective greenhouse gas emissions mitigation strategies. Such strategies in turn carry implications for financial capital markets. This paper explores the implications of climate mitigation policy for capital investment demands by the electric power sector on decade-to-century time scales. We go further to explore the implications of technology performance and the stringency of climate policy for capital investment demands by the power sector. Finally, we discuss the regional distribution of investment demands. We find that stabilizing GHG emissions will require additional investment in the electricity generation sector, over and above investments that would be needed in the absence of climate policy, in the range of 16 to 29 trillion US$ (60-110%), depending on the stringency of climate policy, during the period 2015 to 2095 under default technology assumptions. This increase reflects the higher capital intensity of power systems that control emissions. Limits on the penetration of nuclear and carbon capture and storage technology could increase costs substantially. Energy efficiency improvements can reduce the investment requirement by 8 to 21 trillion US$ (default technology assumptions), depending on the climate policy scenario, with higher savings obtained under the most stringent climate policy. The heaviest investments in power generation were observed in the China, India, SE Asia, and Africa regions, with the latter three regions dominating in the second half of the 21st century.
Student Assistant for Learning from Text (SALT): a hypermedia reading aid.
MacArthur, C A; Haynes, J B
1995-03-01
Student Assistant for Learning from Text (SALT) is a software system for developing hypermedia versions of textbooks designed to help students with learning disabilities and other low-achieving students to compensate for their reading difficulties. In the present study, 10 students with learning disabilities (3 young women and 7 young men ages 15 to 17) in Grades 9 and 10 read passages from a science textbook using a basic computer version and an enhanced computer version. The basic version included the components found in the printed textbook (text, graphics, outline, and questions) and a notebook. The enhanced version added speech synthesis, an on-line glossary, links between questions and text, highlighting of main ideas, and supplementary explanations that summarized important ideas. Students received significantly higher comprehension scores using the enhanced version. Furthermore, students preferred the enhanced version and thought it helped them learn the material better.
Scoarughi, G L; Cimmino, C; Donini, P
1995-01-01
The stringent halobacterial strain Haloferax volcanii was subjected to a set of physiological conditions different from amino acid starvation that are known to cause production of guanosine polyphosphates [(p)ppGpp] in eubacteria via the relA-independent (spoT) pathway. The conditions used were temperature upshift, treatment with cyanide, and total starvation. Under none of these conditions were detectable levels of (p)ppGpp observed. This result, in conjunction with our previous finding that (p)ppGpp synthesis does not occur under amino acid starvation, leads to the conclusion that in halobacteria both growth rate control and stringency are probably governed by mechanisms that operate in the absence of ppGpp. During exponential growth, low levels of phosphorylated compounds with electrophoretic mobilities similar, but not identical, to that of (p)ppGpp were observed. The intracellular concentration of these compounds increased considerably during the stationary phase of growth and with all of the treatments used. The compounds were identified as short-chain polyphosphates identical to those found under similar conditions in Saccharomyces cerevisiae. PMID:7798153
Golczyk, Hieronim; Hasterok, Robert; Szklarczyk, Marek
2010-12-01
High- and low-stringency FISH and base-specific fluorescence were performed on the permanent translocation heterozygote Rhoeo spathacea (2n = 12). Our results indicate that 45S rDNA arrays, rDNA-related sequences and other GC-rich DNA fraction(s) are located within the pericentromeric regions of all twelve chromosomes, usually colocalizing with the chromomycin A(3)-positive bands. Homogenization of the pericentromeric regions appears to result from the concerted spread of GC-rich sequences, with differential amplification likely. We found new 5S rDNA patterns, which suggest variability in the breakpoints and in the consequent chromosome reorganizations. It was found that the large 5S rDNA locus residing on each of the 8E and 9E arms consisted of two smaller loci. On each of the two chromosome arms 3b and 4b, in addition to the major subtelomeric 5S rDNA locus, a new minor locus was found interstitially about 40% along the arm length. The arrangement of cytogenetic landmarks and chromosome arm measurements are discussed with regard to genome repatterning in Rhoeo.
Angular dose anisotropy around gold nanoparticles exposed to X-rays.
Gadoue, Sherif M; Toomeh, Dolla; Zygmanski, Piotr; Sajo, Erno
2017-07-01
Gold nanoparticle (GNP) radiotherapy has recently emerged as a promising modality in cancer treatment. The use of high atomic number nanoparticles can lead to enhanced radiation dose in tumors due to low-energy leakage electrons depositing in the vicinity of the GNP. A single metric, the dose enhancement ratio, has been used in the literature, often with substantial disagreement, to quantify the GNP's capacity to increase local energy deposition. This 1D approach neglects known sources of dose anisotropy and assumes that one average value is representative of the dose enhancement. Whether this assumption is correct, and within what accuracy limits it can be trusted, has not been studied due to computational difficulties at the nanoscale. Using a next-generation deterministic computational method, we show that significant dose anisotropy exists, which may have radiobiological consequences and can impact the treatment outcome as well as the development of treatment planning computational methods. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Suwannasri, A.; Kaewlai, R.; Asavaphatiboon, S.
2016-03-01
This study aimed to determine whether administration of a low-volume, high-concentration iodinated contrast medium can preserve image quality in comparison with a regular-concentration intravenous contrast medium in patients undergoing contrast-enhanced abdominal computed tomography (CT). Eighty-four patients were randomly divided into 3 groups of similar iodine delivery rate; A: 1.2 cc/kg of iomeprol-400, B: 1.0 cc/kg of iomeprol-400, and C: 1.5 cc/kg of ioversol-350. Contrast enhancement of the liver parenchyma, pancreas, and aorta was quantitatively measured in Hounsfield units and qualitatively assessed by a radiologist. The t-test was used to evaluate contrast enhancement, and the Chi-square test was used to evaluate the qualitative image assessment, at a significance level of 0.05 with 95% confidence intervals. There were no statistically significant differences in contrast enhancement of the liver parenchyma and pancreas between group A and group C in either the quantitative or the qualitative analysis. Group C showed superior vascular enhancement to groups A and B on quantitative analysis.
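For readers who want to reproduce this style of group comparison, the sketch below runs the stated two-sample t-test on per-group enhancement values at the 0.05 significance level; the HU samples are synthetic stand-ins, not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(110.0, 15.0, 28)   # synthetic liver enhancement, HU
group_c = rng.normal(112.0, 15.0, 28)

# Independent two-sample t-test on mean enhancement between protocols.
t, p = stats.ttest_ind(group_a, group_c)
print(f"t = {t:.2f}, p = {p:.3f}; significant at 0.05: {p < 0.05}")
```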
Inverse halftoning via robust nonlinear filtering
NASA Astrophysics Data System (ADS)
Shen, Mei-Yin; Kuo, C.-C. Jay
1999-10-01
A new blind inverse halftoning algorithm based on a nonlinear filtering technique of low computational complexity and low memory requirement is proposed in this research. It is called blind because knowledge of the halftone kernel is not required. The proposed scheme performs nonlinear filtering in conjunction with edge enhancement to improve the quality of an inverse halftoned image. Distinct features of the proposed approach include: efficient smoothing of halftone patterns in large homogeneous areas, an additional edge enhancement capability to recover edge quality, and excellent PSNR performance with only local integer operations and a small memory buffer.
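The abstract does not disclose the exact filter, so the following is only a generic sketch of the smoothing-plus-edge-enhancement pipeline it describes: a median filter suppresses the halftone dot pattern and an unsharp-mask step restores edges.

```python
import numpy as np
from scipy import ndimage

def inverse_halftone(halftone, smooth_size=5, edge_gain=0.6):
    """Generic smoothing + edge-enhancement stand-in, not the paper's filter."""
    h = halftone.astype(np.float64)
    smooth = ndimage.median_filter(h, size=smooth_size)    # kill dot patterns
    blurred = ndimage.uniform_filter(smooth, size=smooth_size)
    detail = smooth - blurred                              # high-pass residue
    return np.clip(smooth + edge_gain * detail, 0.0, 1.0)  # re-sharpen edges

# Usage: reconstruct a grayscale image from a 0/1 halftone (placeholder input).
halftone = (np.random.rand(64, 64) > 0.5).astype(float)
gray = inverse_halftone(halftone)
```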
Single cell transcriptomic analysis of prostate cancer cells.
Welty, Christopher J; Coleman, Ilsa; Coleman, Roger; Lakely, Bryce; Xia, Jing; Chen, Shu; Gulati, Roman; Larson, Sandy R; Lange, Paul H; Montgomery, Bruce; Nelson, Peter S; Vessella, Robert L; Morrissey, Colm
2013-02-16
The ability to interrogate circulating tumor cells (CTC) and disseminated tumor cells (DTC) is restricted by the small number detected and isolated (typically <10). To determine if a commercially available technology could provide a transcriptomic profile of a single prostate cancer (PCa) cell, we clonally selected and cultured a single passage of cell cycle synchronized C4-2B PCa cells. Ten sets of single, 5-, or 10-cells were isolated using a micromanipulator under direct visualization with an inverted microscope. Additionally, two groups of 10 individual DTC, each isolated from bone marrow of 2 patients with metastatic PCa were obtained. RNA was amplified using the WT-Ovation™ One-Direct Amplification System. The amplified material was hybridized on a 44K Whole Human Gene Expression Microarray. A high stringency threshold, a mean Alexa Fluor® 3 signal intensity above 300, was used for gene detection. Relative expression levels were validated for select genes using real-time PCR (RT-qPCR). Using this approach, 22,410, 20,423, and 17,009 probes were positive on the arrays from 10-cell pools, 5-cell pools, and single-cells, respectively. The sensitivity and specificity of gene detection on the single-cell analyses were 0.739 and 0.972 respectively when compared to 10-cell pools, and 0.814 and 0.979 respectively when compared to 5-cell pools, demonstrating a low false positive rate. Among 10,000 randomly selected pairs of genes, the Pearson correlation coefficient was 0.875 between the single-cell and 5-cell pools and 0.783 between the single-cell and 10-cell pools. As expected, abundant transcripts in the 5- and 10-cell samples were detected by RT-qPCR in the single-cell isolates, while lower abundance messages were not. Using the same stringency, 16,039 probes were positive on the patient single-cell arrays. Cluster analysis showed that all 10 DTC grouped together within each patient. A transcriptomic profile can be reliably obtained from a single cell using commercially available technology. As expected, fewer amplified genes are detected from a single-cell sample than from pooled-cell samples, however this method can be used to reliably obtain a transcriptomic profile from DTC isolated from the bone marrow of patients with PCa.
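The concordance metrics reported above can be computed as follows; this sketch treats the pooled-cell arrays as truth for sensitivity/specificity, applies the stated detection threshold of 300, and correlates log intensities, with synthetic values in place of the array measurements.

```python
import numpy as np

def detection_metrics(single, pool, threshold=300.0):
    """Sensitivity/specificity of single-cell calls against a pooled 'truth'."""
    s, p = single >= threshold, pool >= threshold
    sens = (s & p).sum() / p.sum()        # detected where the pool detects
    spec = (~s & ~p).sum() / (~p).sum()   # silent where the pool is silent
    return sens, spec

rng = np.random.default_rng(1)
pool10 = rng.lognormal(5.5, 1.2, 44000)           # synthetic probe intensities
single = pool10 * rng.lognormal(0.0, 0.6, 44000)  # noisier single-cell signal

sens, spec = detection_metrics(single, pool10)
r = np.corrcoef(np.log(single), np.log(pool10))[0, 1]
print(f"sensitivity {sens:.3f}, specificity {spec:.3f}, Pearson r {r:.3f}")
```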
A Two-Stream Plasma Electron Microwave Source for High-Power Millimeter Wave Generation. Phase 1
1989-03-29
[Garbled report fragment; the recoverable text notes that strong amplification is possible and discusses the stringency of the stability criteria for electrostatic and whistler modes, as treated by Guest and Sigmar (Nucl. Fusion 11, 1971), with supporting references including Abramowitz and Stegun (Eds.).]
Low-dimensional recurrent neural network-based Kalman filter for speech enhancement.
Xia, Youshen; Wang, Jun
2015-07-01
This paper proposes a new recurrent neural network-based Kalman filter for speech enhancement, based on a noise-constrained least squares estimate. The parameters of the speech signal, modeled as an autoregressive process, are first estimated by the proposed recurrent neural network, and the speech signal is then recovered by Kalman filtering. The proposed recurrent neural network is globally asymptotically stable at the noise-constrained estimate. Because the noise-constrained estimate is robust against non-Gaussian noise, the proposed recurrent neural network-based speech enhancement algorithm can minimize the estimation error of the Kalman filter parameters in non-Gaussian noise. Furthermore, owing to its low-dimensional model, the proposed neural network-based speech enhancement algorithm is much faster than two existing recurrent neural network-based speech enhancement algorithms. Simulation results show that the proposed recurrent neural network-based speech enhancement algorithm achieves good performance with fast computation and effective noise reduction. Copyright © 2015 Elsevier Ltd. All rights reserved.
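The core signal model here is a standard AR-driven Kalman filter. The sketch below implements that filtering step in companion (state-space) form; for brevity, the AR coefficients come from ordinary least squares, standing in for the paper's noise-constrained recurrent-network estimator.

```python
import numpy as np

def ar_fit(x, p):
    """Least-squares AR(p) coefficients: x[t] ~ sum_k a[k] * x[t-1-k]."""
    X = np.column_stack([x[p - k - 1:len(x) - k - 1] for k in range(p)])
    return np.linalg.lstsq(X, x[p:], rcond=None)[0]

def kalman_enhance(y, a, q, r):
    """Filter noisy samples y with AR coefficients a, noise variances q, r."""
    p = len(a)
    F = np.vstack([a, np.eye(p - 1, p)])   # companion transition matrix
    H = np.zeros(p); H[0] = 1.0            # observe the newest sample
    x, P = np.zeros(p), np.eye(p)
    out = np.empty_like(y)
    for t, yt in enumerate(y):
        x = F @ x                          # predict
        P = F @ P @ F.T
        P[0, 0] += q                       # process noise on newest sample
        k = P @ H / (H @ P @ H + r)        # Kalman gain
        x = x + k * (yt - H @ x)           # update with the noisy sample
        P = P - np.outer(k, H @ P)
        out[t] = x[0]
    return out

# Usage on a synthetic noisy frame:
clean = np.sin(0.07 * np.arange(2000))
noisy = clean + 0.3 * np.random.default_rng(2).standard_normal(2000)
enhanced = kalman_enhance(noisy, ar_fit(noisy, p=10), q=1e-2, r=0.3**2)
```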
Scheuch, Matthias; Höper, Dirk; Beer, Martin
2015-03-03
Fuelled by the advent and subsequent development of next generation sequencing technologies, metagenomics became a powerful tool for the analysis of microbial communities both scientifically and diagnostically. The biggest challenge is the extraction of relevant information from the huge sequence datasets generated for metagenomics studies. Although a plethora of tools are available, data analysis is still a bottleneck. To overcome the bottleneck of data analysis, we developed an automated computational workflow called RIEMS - Reliable Information Extraction from Metagenomic Sequence datasets. RIEMS assigns every individual read sequence within a dataset taxonomically by cascading different sequence analyses with decreasing stringency of the assignments using various software applications. After completion of the analyses, the results are summarised in a clearly structured result protocol organised taxonomically. The high accuracy and performance of RIEMS analyses were proven in comparison with other tools for metagenomics data analysis using simulated sequencing read datasets. RIEMS has the potential to fill the gap that still exists with regard to data analysis for metagenomics studies. The usefulness and power of RIEMS for the analysis of genuine sequencing datasets was demonstrated with an early version of RIEMS in 2011 when it was used to detect the orthobunyavirus sequences leading to the discovery of Schmallenberg virus.
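The cascading idea can be expressed compactly: each read is tried against classifiers of decreasing stringency, and reads left unassigned fall through to the next, looser stage. The classifier interface and thresholds below are hypothetical stand-ins, not the actual RIEMS tool chain.

```python
def cascade_assign(reads, stages):
    """stages: list of (stage_name, classify_fn, min_identity), strictest first.

    classify_fn(read) -> (taxon or None, percent_identity); in the real
    workflow this would be a BLAST-like sequence search.
    """
    assignments, remaining = {}, list(reads)
    for stage_name, classify, min_identity in stages:
        unassigned = []
        for read in remaining:
            taxon, identity = classify(read)
            if taxon is not None and identity >= min_identity:
                assignments[read] = (taxon, stage_name)
            else:
                unassigned.append(read)
        remaining = unassigned      # leftovers fall through to a looser stage
    return assignments, remaining
```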
Impact of HLA diversity on donor selection in organ and stem cell transplantation.
Tiercy, Jean-Marie; Claas, Frans
2013-01-01
The human major histocompatibility complex is a multigene system encoding polymorphic human leucocyte antigens (HLA) that present peptides derived from pathogens to the immune system. The high diversity of HLA alleles and haplotypes in worldwide populations represents a major barrier to organ and allogeneic hematopoietic stem cell transplantation, because HLA incompatibilities are efficiently recognized by T and B lymphocytes. In organ transplantation, pre-transplant anti-HLA antibodies need to be taken into account for organ allocation. Although HLA-incompatible transplants can be performed thanks to immunosuppressive drugs, the de novo production of anti-HLA antibodies still represents a major cause of graft failure. The HLAMatchmaker computer algorithm determines the immunogenicity of HLA mismatches and makes it possible to define HLA antigens that will not induce an antibody response. Because of the much higher stringency of HLA compatibility criteria in stem cell transplantation, the best donor is an HLA genotypically identical sibling. However, more than 50% of transplants are now performed with hematopoietic stem cells from volunteer donors selected from the international registry. The development of European national registries covering populations with different HLA haplotype frequencies is essential for optimizing donor search algorithms and providing the best chance for European patients to find a fully compatible donor.
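A toy version of the HLAMatchmaker idea reads as follows: alleles are reduced to sets of antibody-accessible epitopes, and a donor antigen is predicted not to induce an antibody response when it carries no epitope absent from the recipient's own repertoire. The epitope assignments below are invented placeholders, not real eplet tables.

```python
EPLETS = {                       # hypothetical allele -> epitope sets
    "A*01:01": {"e1", "e2", "e3"},
    "A*02:01": {"e2", "e4", "e5"},
    "A*03:01": {"e1", "e2", "e4"},
}

def mismatched_eplets(donor_allele, recipient_alleles):
    """Epitopes on the donor antigen that the recipient does not carry."""
    self_repertoire = set().union(*(EPLETS[a] for a in recipient_alleles))
    return EPLETS[donor_allele] - self_repertoire

# A*03:01 adds no new epitope for this recipient, so it would be predicted
# not to induce an antibody response.
print(mismatched_eplets("A*03:01", ["A*01:01", "A*02:01"]))  # -> set()
```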
Douglas, G P; Deula, R A; Connor, S E
2003-01-01
Computer-based order entry is a powerful tool for enhancing patient care. A pilot project in the pediatric department of the Lilongwe Central Hospital (LCH) in Malawi, Africa has demonstrated that computer-based order entry (COE): 1) can be successfully deployed and adopted in resource-poor settings, 2) can be built, deployed and sustained at relatively low cost and with local resources, and 3) has a greater potential to improve patient care in developing than in developed countries. PMID:14728338
Local reconstruction in computed tomography of diffraction enhanced imaging
NASA Astrophysics Data System (ADS)
Huang, Zhi-Feng; Zhang, Li; Kang, Ke-Jun; Chen, Zhi-Qiang; Zhu, Pei-Ping; Yuan, Qing-Xi; Huang, Wan-Xia
2007-07-01
Computed tomography of diffraction enhanced imaging (DEI-CT) based on a synchrotron radiation source has extremely high sensitivity for weakly absorbing low-Z samples in medical and biological fields. The authors propose a modified backprojection filtration (BPF)-type algorithm based on PI-line segments to reconstruct regions of interest from truncated refraction-angle projection data in DEI-CT. The distribution of the refractive index decrement in the sample can be directly estimated from its reconstruction images, as demonstrated by experiments at the Beijing Synchrotron Radiation Facility. The algorithm paves the way for local reconstruction of large-size samples by the use of DEI-CT with a small field of view based on a synchrotron radiation source.
Pharyngitis of infectious mononucleosis: computed tomography findings.
Kutuya, Naoki; Kurosaki, Yoshihisa; Suzuki, Kazuhiro; Takata, Koremochi; Shiraihshi, Akihiko
2008-05-01
Two women presented with sore throat and fever. Their symptoms were not alleviated by antibiotics. Cervical computed tomography (CT) with contrast enhancement demonstrated enlargement of predominantly posterior cervical lymph nodes and streaky, heterogeneous tonsils with interspersed low attenuation. Both patients were diagnosed with infectious mononucleosis on the basis of their laboratory data. Thus, when radiologists encounter these CT findings in pharyngitis that is not alleviated by antibiotic therapy, infectious mononucleosis should be considered in the differential diagnosis.
Landscape Encodings Enhance Optimization
Klemm, Konstantin; Mehta, Anita; Stadler, Peter F.
2012-01-01
Hard combinatorial optimization problems deal with the search for the minimum cost solutions (ground states) of discrete systems under strong constraints. A transformation of state variables may enhance computational tractability. It has been argued that these state encodings are to be chosen invertible to retain the original size of the state space. Here we show how redundant non-invertible encodings enhance optimization by enriching the density of low-energy states. In addition, smooth landscapes may be established on encoded state spaces to guide local search dynamics towards the ground state. PMID:22496860
Enhanced computer vision with Microsoft Kinect sensor: a review.
Han, Jungong; Shao, Ling; Xu, Dong; Shotton, Jamie
2013-10-01
With the invention of the low-cost Microsoft Kinect sensor, high-resolution depth and visual (RGB) sensing has become available for widespread use. The complementary nature of the depth and visual information provided by the Kinect sensor opens up new opportunities to solve fundamental problems in computer vision. This paper presents a comprehensive review of recent Kinect-based computer vision algorithms and applications. The reviewed approaches are classified according to the type of vision problems that can be addressed or enhanced by means of the Kinect sensor. The covered topics include preprocessing, object tracking and recognition, human activity analysis, hand gesture analysis, and indoor 3-D mapping. For each category of methods, we outline their main algorithmic contributions and summarize their advantages/differences compared to their RGB counterparts. Finally, we give an overview of the challenges in this field and future research trends. This paper is expected to serve as a tutorial and source of references for Kinect-based computer vision researchers.
2014-01-01
Background Determination of fetal aneuploidy is central to the evaluation of recurrent pregnancy loss (RPL). However, obtaining this information at the time of a miscarriage is not always possible, or the testing may not have been ordered. Here we report on “rescue karyotyping”, wherein DNA extracted from archived paraffin-embedded pregnancy loss tissue from a prior dilation and curettage (D&C) is evaluated by array-based comparative genomic hybridization (aCGH). Methods A retrospective case series was conducted at an academic medical center. Patients included had unexplained RPL and a prior pregnancy loss for which karyotype information would be clinically informative but was unavailable. After extracting DNA from slides of archived tissue, aCGH with a reduced-stringency approach was performed, allowing for analysis of partially degraded DNA. Statistics were computed using STATA v12.1 (College Station, TX). Results Rescue karyotyping was attempted on 20 specimens from 17 women. DNA was successfully extracted in 16 samples (80.0%), enabling analysis at either high or low resolution. The longest interval from tissue collection to DNA extraction was 4.2 years. Specimen sufficiency for analysis did not differ significantly by collection-to-extraction interval (p = 0.14) or by gestational age at pregnancy loss (p = 0.32). Eight specimens showed copy number variants: 3 trisomies, 2 partial chromosomal deletions, 1 mosaic abnormality and 2 unclassified variants. Conclusions Rescue karyotyping using aCGH on DNA extracted from paraffin-embedded tissue provides the opportunity to obtain critical fetal cytogenetic information from a prior loss, even if it occurred years earlier. Given the ubiquitous archiving of paraffin-embedded tissue obtained during a D&C and the ease of obtaining results despite long loss-to-testing intervals or early gestational age at the time of fetal demise, this may be a useful technique in the evaluation of couples with recurrent pregnancy loss. PMID:24589081
Mapping neurofibromatosis 1 homologous loci by fluorescence in situ hybridization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Viskochil, D.; Breidenbach, H.H.; Cawthon, R.
Neurofibromatosis 1 maps to chromosome band 17q11.2, and the NF1 gene is comprised of 59 exons that span approximately 335 kb of genomic DNA. In order to further analyze the structure of NF1 from exons 2 through 27b, we isolated a number of cosmid and bacteriophage P-1 genomic clones using NF1-exon probes under high-stringency hybridization conditions. Using tagged, intron-based primers and DNA from various clones as a template, we PCR-amplified and sequenced individual NF1 exons. The exon sequences in PCR products from several genomic clones differed from the exon sequence derived from cloned NF1 cDNAs. Clones with variant sequences were mapped by fluorescence in situ hybridization under high-stringency conditions. Three clones mapped to chromosome band 15q11.2, one mapped to 14q11.2, one mapped to both 2q14.1-14.3 and 14q11.2, one mapped to 2q33-34, and one mapped to both 18q11.2 and 21q21. Even though some PCR-product sequences retained proper splice junctions and open reading frames, we have yet to identify cDNAs that correspond to the variant exon sequences. We are now sequencing clones that map to NF1-homologous loci in order to develop discriminating primer pairs for the exclusive amplification of NF1-specific sequences in our efforts to develop a comprehensive NF1 mutation screen using genomic DNA as template. The role that NF1-homologous sequences may play in neurofibromatosis 1 is not clear.
Quantification of indium in steel using PIXE
NASA Astrophysics Data System (ADS)
Oliver, A.; Miranda, J.; Rickards, J.; Cheang, J. C.
1989-04-01
The quantitative analysis of steel used in endodontic tools was carried out using low-energy protons (≤ 700 keV). A computer program for thick-target analysis which includes enhancement due to secondary fluorescence was used. In this experiment the L-lines of indium are enhanced due to the proximity of other elements' K-lines to the indium absorption edge. The results show that the ionization cross section expression employed to evaluate this quantity is important.
A Platform for Combined DNA and Protein Microarrays Based on Total Internal Reflection Fluorescence
Asanov, Alexander; Zepeda, Angélica; Vaca, Luis
2012-01-01
We have developed a novel microarray technology based on total internal reflection fluorescence (TIRF) in combination with DNA and protein bioassays immobilized at the TIRF surface. Unlike conventional microarrays that exhibit reduced signal-to-background ratio, require several stages of incubation, rinsing and stringency control, and measure only end-point results, our TIRF microarray technology provides several orders of magnitude better signal-to-background ratio, performs analysis rapidly in one step, and measures the entire course of association and dissociation kinetics between target DNA and protein molecules and the bioassays. In many practical cases detection of only DNA or protein markers alone does not provide the necessary accuracy for diagnosing a disease or detecting a pathogen. Here we describe TIRF microarrays that detect DNA and protein markers simultaneously, which reduces the probabilities of false responses. Supersensitive and multiplexed TIRF DNA and protein microarray technology may provide a platform for accurate diagnosis or enhanced research studies. Our TIRF microarray system can be mounted on upright or inverted microscopes or interfaced directly with CCD cameras equipped with a single objective, facilitating the development of portable devices. As proof-of-concept we applied TIRF microarrays for detecting molecular markers from Bacillus anthracis, the pathogen responsible for anthrax. PMID:22438738
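As an illustration of what full-course kinetic readout enables, the sketch below fits a 1:1 binding model to a synthetic association trace; the model form is standard kinetics, not a description of the authors' fitting software.

```python
import numpy as np
from scipy.optimize import curve_fit

def association(t, amplitude, k_obs):
    """1:1 binding rise: signal approaches amplitude at rate k_obs."""
    return amplitude * (1.0 - np.exp(-k_obs * t))

t = np.linspace(0, 300, 200)                   # seconds
rng = np.random.default_rng(3)
signal = association(t, 1.0, 0.02) + 0.03 * rng.standard_normal(t.size)

(amplitude, k_obs), _ = curve_fit(association, t, signal, p0=[0.5, 0.01])
print(f"fitted amplitude {amplitude:.2f}, k_obs {k_obs:.4f} 1/s")
```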
Roth, Justin C.; Ismail, Mourad; Reese, Jane S.; Lingas, Karen T.; Ferrari, Giuliana; Gerson, Stanton L.
2012-01-01
The P140K point mutant of MGMT allows robust hematopoietic stem cell (HSC) enrichment in vivo. Thus, dual-gene vectors that couple MGMT and therapeutic gene expression have allowed enrichment of gene-corrected HSCs in animal models. However, expression levels from dual-gene vectors are often reduced for one or both genes. Further, it may be desirable to express selection and therapeutic genes at distinct stages of cell differentiation. In this regard, we evaluated whether hematopoietic cells could be efficiently cotransduced using low MOIs of two separate single-gene lentiviruses, including MGMT for dual-positive cell enrichment. Cotransduction efficiencies were evaluated using a range of MGMT : GFP virus ratios, MOIs, and selection stringencies in vitro. Cotransduction was optimal when equal proportions of each virus were used, but low MGMT : GFP virus ratios resulted in the highest proportion of dual-positive cells after selection. This strategy was then evaluated in murine models for in vivo selection of HSCs cotransduced with a ubiquitous MGMT expression vector and an erythroid-specific GFP vector. Although the MGMT and GFP expression percentages were variable among engrafted recipients, drug selection enriched MGMT-positive leukocyte and GFP-positive erythroid cell populations. These data demonstrate cotransduction as a means to rapidly enrich and evaluate therapeutic lentivectors in vivo. PMID:22888445
Fontenete, Sílvia; Leite, Marina; Guimarães, Nuno; Madureira, Pedro; Ferreira, Rui Manuel; Figueiredo, Céu; Wengel, Jesper; Azevedo, Nuno Filipe
2015-01-01
In recent years, there have been several attempts to improve the diagnosis of infection caused by Helicobacter pylori. Fluorescence in situ hybridization (FISH) is a commonly used technique to detect H. pylori infection but it requires biopsies from the stomach. Thus, the development of an in vivo FISH-based method (FIVH) that directly detects and allows the visualization of the bacterium within the human body would significantly reduce the time of analysis, allowing the diagnosis to be performed during endoscopy. In a previous study we designed and synthesized a phosphorothioate locked nucleic acid (LNA)/ 2’ O-methyl RNA (2’OMe) probe using standard phosphoramidite chemistry and FISH hybridization was then successfully performed both on adhered and suspended bacteria at 37°C. In this work we simplified, shortened and adapted FISH to work at gastric pH values, meaning that the hybridization step now takes only 30 minutes and, in addition to the buffer, uses only urea and probe at non-toxic concentrations. Importantly, the sensitivity and specificity of the FISH method was maintained in the range of conditions tested, even at low stringency conditions (e.g., low pH). In conclusion, this methodology is a promising approach that might be used in vivo in the future in combination with a confocal laser endomicroscope for H. pylori visualization. PMID:25915865
Roles of the amino group of purine bases in the thermodynamic stability of DNA base pairing.
Nakano, Shu-ichi; Sugimoto, Naoki
2014-08-05
The energetic aspects of hydrogen-bonded base-pair interactions are important for the design of functional nucleotide analogs and for practical applications of oligonucleotides. The present study investigated the contribution of the 2-amino group of DNA purine bases to the thermodynamic stability of oligonucleotide duplexes under different salt and solvent conditions, using 2'-deoxyriboinosine (I) and 2'-deoxyribo-2,6-diaminopurine (D) as non-canonical nucleotides. The stability of DNA duplexes was changed by substitution of a single base pair in the following order: G • C > D • T ≈ I • C > A • T > G • T > I • T. The apparent stabilization energy due to the presence of the 2-amino group of G and D varied depending on the salt concentration, and decreased in the water-ethanol mixed solvent. The effects of salt concentration on the thermodynamics of DNA duplexes were found to be partially sequence-dependent, and the 2-amino group of the purine bases might have an influence on the binding of ions to DNA through the formation of a stable base-paired structure. Our results also showed that physiological salt conditions were energetically favorable for complementary base recognition, and conversely, low salt concentration media and ethanol-containing solvents were effective for low stringency oligonucleotide hybridization, in the context of conditions employed in this study.
DOT National Transportation Integrated Search
2017-05-01
Climate change introduces infrastructure flooding challenges, especially for coastal regions with low topographic relief. More frequently occurring intense storms and sea level rise are two projected impacts of climate change that will lead to increa...
Ozkurt, Huseyin; Tokgoz, Safiye; Karabay, Esra; Ucan, Berna; Akdogan, Melek Pala; Basak, Muzaffer
2014-01-01
Aim To evaluate the diagnostic quality of a new multiple detector-row computed tomography angiography (MDCT-A) protocol using low-dose radiation and low-volume contrast medium techniques for the evaluation of non-cardiac chest pain. Methods Forty-five consecutive patients with clinically suspected non-cardiac chest pain and requiring contrast-enhanced chest computed tomography (CT) were examined. The patients were assigned to the protocol, with 80 kilovolt (peak) (kV[p]) and 150 effective milliampere-seconds (eff mA-s). In our study group, 40 mL of low-osmolar contrast material was administered at 3.0 mL/s. Results In the study group, four patients with pulmonary embolism, four with pleural effusion, two with ascending aortic aneurysm, and eight patients with pneumonic consolidation were detected. The mean attenuation at the pulmonary truncus and ascending aortic locations was 264±44 and 249±51 HU, respectively. The mean effective radiation dose was 0.83 mSv for MDCT-A. Conclusions Scanning the pulmonary artery and the aorta simultaneously significantly reduced radiation exposure with this dose-saving technique. Additionally, injection of a low volume (40 cc) of contrast material may reduce the risk of contrast-induced nephropathy and therefore facilitate the diagnostic approach. This technique can be applied to all cases, and particularly to patients at high risk of contrast-induced nephropathy, because it offers similar diagnostic quality at a low dose with high levels of arteriovenous enhancement simultaneously. PMID:25392818
Niu, Shanzhou; Zhang, Shanli; Huang, Jing; Bian, Zhaoying; Chen, Wufan; Yu, Gaohang; Liang, Zhengrong; Ma, Jianhua
2016-01-01
Cerebral perfusion x-ray computed tomography (PCT) is an important functional imaging modality for evaluating cerebrovascular diseases and has been widely used in clinics over the past decades. However, due to the protocol of PCT imaging with repeated dynamic sequential scans, the associated radiation dose unavoidably increases compared with that used in conventional CT examinations. Minimizing the radiation exposure in PCT examination is a major task in the CT field. In this paper, considering the rich similarity redundancy information among enhanced sequential PCT images, we propose a low-dose PCT image restoration model by incorporating the low-rank and sparse matrix characteristics of sequential PCT images. Specifically, the sequential PCT images are first stacked into a matrix (i.e., a low-rank matrix), and a non-convex spectral norm/regularization and a spatio-temporal total variation norm/regularization are then built on the low-rank matrix to describe the low rank and sparsity of the sequential PCT images, respectively. Subsequently, an improved split Bregman method is adopted to minimize the associated objective function with a reasonable convergence rate. Both qualitative and quantitative studies were conducted using a digital phantom and clinical cerebral PCT datasets to evaluate the present method. Experimental results show that the presented method achieves images with several noticeable advantages over the existing methods in terms of noise reduction and universal quality index. More importantly, the present method can produce more accurate kinetic enhanced details and diagnostic hemodynamic parameter maps. PMID:27440948
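Of the two priors described, the low-rank term is the easier to illustrate. The sketch below applies one singular-value-thresholding (SVT) proximal step to a stack of frames; the paper couples this with a spatio-temporal total variation term inside a split Bregman loop, which is omitted here.

```python
import numpy as np

def svt(frames, tau):
    """frames: (n_frames, H, W) stack; tau: singular-value threshold."""
    n, h, w = frames.shape
    M = frames.reshape(n, h * w).T       # pixels x frames matrix
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)         # soft-threshold the spectrum
    return (U * s) @ Vt                  # low-rank reconstruction

# Usage: denoise a synthetic sequence whose frames share spatial structure.
rng = np.random.default_rng(4)
base = rng.random((64, 64))
noisy = np.stack([base + 0.1 * rng.standard_normal((64, 64))
                  for _ in range(12)])
clean = svt(noisy, tau=5.0).T.reshape(noisy.shape)  # back to (n, H, W)
```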
Vierling-Claassen, Dorea; Cardin, Jessica A.; Moore, Christopher I.; Jones, Stephanie R.
2010-01-01
Selective optogenetic drive of fast-spiking (FS) interneurons (INs) leads to enhanced local field potential (LFP) power across the traditional “gamma” frequency band (20–80 Hz; Cardin et al., 2009). In contrast, drive to regular-spiking (RS) pyramidal cells enhances power at lower frequencies, with a peak at 8 Hz. The first result is consistent with previous computational studies emphasizing the role of FS and the time constant of GABAA synaptic inhibition in gamma rhythmicity. However, the same theoretical models do not typically predict low-frequency LFP enhancement with RS drive. To develop hypotheses as to how the same network can support these contrasting behaviors, we constructed a biophysically principled network model of primary somatosensory neocortex containing FS, RS, and low-threshold spiking (LTS) INs. Cells were modeled with detailed cell anatomy and physiology, multiple dendritic compartments, and included active somatic and dendritic ionic currents. Consistent with prior studies, the model demonstrated gamma resonance during FS drive, dependent on the time constant of GABAA inhibition induced by synchronous FS activity. Lower-frequency enhancement during RS drive was replicated only on inclusion of an inhibitory LTS population, whose activation was critically dependent on RS synchrony and evoked longer-lasting inhibition. Our results predict that differential recruitment of FS and LTS inhibitory populations is essential to the observed cortical dynamics and may provide a means for amplifying the natural expression of distinct oscillations in normal cortical processing. PMID:21152338
VLSI implementation of a new LMS-based algorithm for noise removal in ECG signal
NASA Astrophysics Data System (ADS)
Satheeskumaran, S.; Sabrigiriraj, M.
2016-06-01
Least mean square (LMS)-based adaptive filters are widely deployed for removing artefacts in the electrocardiogram (ECG) because of their low computational cost. However, they exhibit high mean square error (MSE) in noisy environments. The transform domain variable step-size LMS algorithm reduces the MSE at the cost of computational complexity. In this paper, a variable step-size delayed LMS adaptive filter is used to remove artefacts from the ECG signal for improved feature extraction. Dedicated digital signal processors provide fast processing, but they are not flexible. With field programmable gate arrays, pipelined architectures can be used to enhance system performance. The pipelined architecture can improve the operating efficiency of the adaptive filter and save power. This technique provides a high signal-to-noise ratio and low MSE with reduced computational complexity; hence, it is a useful method for monitoring patients with heart-related problems.
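A generic variable step-size LMS update of the kind such architectures implement is sketched below; the specific adaptation rule (an error-power-driven step size with clipping) is a common textbook choice, not necessarily the one synthesized in the paper.

```python
import numpy as np

def vs_lms(noisy_ecg, noise_ref, taps=16, mu_min=1e-4, mu_max=5e-2,
           alpha=0.97, gamma=1e-3):
    """Adaptive noise cancellation: noise_ref is a correlated noise reference."""
    w = np.zeros(taps)
    mu, p = mu_min, 0.0
    out = np.zeros_like(noisy_ecg)
    for n in range(taps, len(noisy_ecg)):
        x = noise_ref[n - taps:n][::-1]          # reference tap vector
        e = noisy_ecg[n] - w @ x                 # error = cleaned ECG sample
        p = alpha * p + (1 - alpha) * e * e      # smoothed error power
        mu = np.clip(gamma * p, mu_min, mu_max)  # power-driven step size
        w = w + 2 * mu * e * x                   # LMS weight update
        out[n] = e
    return out
```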
NASA Technical Reports Server (NTRS)
Faust, N.; Jordon, L.
1981-01-01
Since the implementation of the GRID and IMGRID computer programs for multivariate spatial analysis in the early 1970's, geographic data analysis subsequently moved from large computers to minicomputers and now to microcomputers, with a radical reduction in the costs associated with planning analyses. Once NIMGRID (new IMGRID), a raster-oriented geographic information system, was implemented on the microcomputer, programs for processing LANDSAT data for use as one element in a geographic data base were added, including programs for training field selection, supervised and unsupervised classification, and image enhancement. Enhancements to the color graphics capabilities of the microsystem allow display of three channels of LANDSAT data in color infrared format. The basic microcomputer hardware needed to run NIMGRID and most LANDSAT analyses is listed, as well as the software available for LANDSAT processing.
2015-01-01
a spatial resolution of 250-m. The Gumley et al. computation for MODIS sharpening is given as a ratio of high to low resolution top of the atmosphere... NIR) correction (Stumpf, Arnone, Gould, Martinolich, & Ransibrahamanakul, 2003). Standard flags were used to mask interference from land, clouds, sun... technique. This new approach expands on the methodology described by Gumley et al. (2010), with some modifications. We will compute a similar spatial...
High performance, low cost, self-contained, multipurpose PC based ground systems
NASA Technical Reports Server (NTRS)
Forman, Michael; Nickum, William; Troendly, Gregory
1993-01-01
The use of embedded processors greatly enhances the capabilities of personal computers when used for telemetry processing and command control center functions. Parallel architectures based on the use of transputers are shown to be very versatile and reusable, and the synergism between the PC and the embedded processor with transputers results in single-unit, low-cost workstations of 20 < MIPS ≤ 1000.
Twisted Vanes Would Enhance Fuel/Air Mixing In Turbines
NASA Technical Reports Server (NTRS)
Nguyen, H. Lee; Micklow, Gerald J.; Dogra, Anju S.
1994-01-01
Computations of flow show performance of high-shear airblast fuel injector in gas-turbine engine enhanced by use of appropriately proportioned twisted (instead of flat) dome swirl vanes. Resultant more nearly uniform fuel/air mixture burns more efficiently, emitting smaller amounts of nitrogen oxides. Twisted-vane high-shear airblast injectors also incorporated into paint sprayers, providing advantages of low pressure drop characteristic of airblast injectors in general and finer atomization of advanced twisted-blade design.
An RNA Domain Imparts Specificity and Selectivity to a Viral DNA Packaging Motor
Zhao, Wei; Jardine, Paul J.
2015-01-01
During assembly, double-stranded DNA viruses, including bacteriophages and herpesviruses, utilize a powerful molecular motor to package their genomic DNA into a preformed viral capsid. An integral component of the packaging motor in the Bacillus subtilis bacteriophage ϕ29 is a viral genome-encoded pentameric ring of RNA (prohead RNA [pRNA]). pRNA is a 174-base transcript comprised of two domains, domains I and II. Early studies initially isolated a 120-base form (domain I only) that retains high biological activity in vitro; hence, no function could be assigned to domain II. Here we define a role for this domain in the packaging process. DNA packaging using restriction digests of ϕ29 DNA showed that motors with the 174-base pRNA supported the correct polarity of DNA packaging, selectively packaging the DNA left end. In contrast, motors containing the 120-base pRNA had compromised specificity, packaging both left- and right-end fragments. The presence of domain II also provides selectivity in competition assays with genomes from related phages. Furthermore, motors with the 174-base pRNA were restrictive, in that they packaged only one DNA fragment into the head, whereas motors with the 120-base pRNA packaged several fragments into the head, indicating multiple initiation events. These results show that domain II imparts specificity and stringency to the motor during the packaging initiation events that precede DNA translocation. Heteromeric rings of pRNA demonstrated that one or two copies of domain II were sufficient to impart this selectivity/stringency. Although ϕ29 differs from other double-stranded DNA phages in having an RNA motor component, the function provided by pRNA is carried on the motor protein components in other phages. IMPORTANCE During virus assembly, genome packaging involves the delivery of newly synthesized viral nucleic acid into a protein shell. In the double-stranded DNA phages and herpesviruses, this is accomplished by a powerful molecular motor that translocates the viral DNA into a preformed viral shell. A key event in DNA packaging is recognition of the viral DNA among other nucleic acids in the host cell. Commonly, a DNA-binding protein mediates the interaction of viral DNA with the motor/head shell. Here we show that for the bacteriophage ϕ29, this essential step of genome recognition is mediated by a viral genome-encoded RNA rather than a protein. A domain of the prohead RNA (pRNA) imparts specificity and stringency to the motor by ensuring the correct orientation of DNA packaging and restricting initiation to a single event. Since this assembly step is unique to the virus, DNA packaging is a novel target for the development of antiviral drugs. PMID:26423956
An RNA Domain Imparts Specificity and Selectivity to a Viral DNA Packaging Motor.
Zhao, Wei; Jardine, Paul J; Grimes, Shelley
2015-12-01
During assembly, double-stranded DNA viruses, including bacteriophages and herpesviruses, utilize a powerful molecular motor to package their genomic DNA into a preformed viral capsid. An integral component of the packaging motor in the Bacillus subtilis bacteriophage ϕ29 is a viral genome-encoded pentameric ring of RNA (prohead RNA [pRNA]). pRNA is a 174-base transcript comprised of two domains, domains I and II. Early studies initially isolated a 120-base form (domain I only) that retains high biological activity in vitro; hence, no function could be assigned to domain II. Here we define a role for this domain in the packaging process. DNA packaging using restriction digests of ϕ29 DNA showed that motors with the 174-base pRNA supported the correct polarity of DNA packaging, selectively packaging the DNA left end. In contrast, motors containing the 120-base pRNA had compromised specificity, packaging both left- and right-end fragments. The presence of domain II also provides selectivity in competition assays with genomes from related phages. Furthermore, motors with the 174-base pRNA were restrictive, in that they packaged only one DNA fragment into the head, whereas motors with the 120-base pRNA packaged several fragments into the head, indicating multiple initiation events. These results show that domain II imparts specificity and stringency to the motor during the packaging initiation events that precede DNA translocation. Heteromeric rings of pRNA demonstrated that one or two copies of domain II were sufficient to impart this selectivity/stringency. Although ϕ29 differs from other double-stranded DNA phages in having an RNA motor component, the function provided by pRNA is carried on the motor protein components in other phages. During virus assembly, genome packaging involves the delivery of newly synthesized viral nucleic acid into a protein shell. In the double-stranded DNA phages and herpesviruses, this is accomplished by a powerful molecular motor that translocates the viral DNA into a preformed viral shell. A key event in DNA packaging is recognition of the viral DNA among other nucleic acids in the host cell. Commonly, a DNA-binding protein mediates the interaction of viral DNA with the motor/head shell. Here we show that for the bacteriophage ϕ29, this essential step of genome recognition is mediated by a viral genome-encoded RNA rather than a protein. A domain of the prohead RNA (pRNA) imparts specificity and stringency to the motor by ensuring the correct orientation of DNA packaging and restricting initiation to a single event. Since this assembly step is unique to the virus, DNA packaging is a novel target for the development of antiviral drugs. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
NASA Technical Reports Server (NTRS)
Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)
1983-01-01
Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.
Metal oxide resistive random access memory based synaptic devices for brain-inspired computing
NASA Astrophysics Data System (ADS)
Gao, Bin; Kang, Jinfeng; Zhou, Zheng; Chen, Zhe; Huang, Peng; Liu, Lifeng; Liu, Xiaoyan
2016-04-01
The traditional Boolean computing paradigm based on the von Neumann architecture is facing great challenges for future information technology applications such as big data, the Internet of Things (IoT), and wearable devices, due to limited processing capability issues such as binary data storage and computing, non-parallel data processing, and the buses required between memory units and logic units. The brain-inspired neuromorphic computing paradigm is believed to be one of the promising solutions for realizing more complex functions at a lower cost. To perform such brain-inspired computing with low cost and low power consumption, novel devices for use as electronic synapses are needed. Metal oxide resistive random access memory (ReRAM) devices have emerged as the leading candidate for electronic synapses. This paper comprehensively addresses the recent work on the design and optimization of metal oxide ReRAM-based synaptic devices. A performance enhancement methodology and optimized operation scheme to achieve analog resistive switching and low-energy training behavior are provided. A three-dimensional vertical synapse network architecture is proposed for high-density integration and low-cost fabrication. The impacts of the ReRAM synaptic device features on the performance of neuromorphic systems are also discussed on the basis of a constructed neuromorphic visual system with a pattern recognition function. Possible solutions to achieve high recognition accuracy and efficiency in neuromorphic systems are presented.
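The synaptic role of a ReRAM array can be made concrete with a small idealization: conductances store weights, a read-voltage vector drives the rows, and the summed column currents realize a vector-matrix multiply. The conductance window and mapping below are illustrative and ignore device variability and wire resistance.

```python
import numpy as np

G_MIN, G_MAX = 1e-6, 1e-4     # conductance window in siemens (illustrative)

def weights_to_conductances(w):
    """Map trained weights linearly into the device conductance window."""
    wn = (w - w.min()) / (np.ptp(w) + 1e-12)
    return G_MIN + wn * (G_MAX - G_MIN)

def crossbar_read(G, v_read):
    """Column currents from Kirchhoff summation: I_j = sum_i G_ij * V_i."""
    return G.T @ v_read

w = np.random.default_rng(5).standard_normal((128, 10))  # 128 inputs, 10 neurons
G = weights_to_conductances(w)
currents = crossbar_read(G, v_read=0.2 * np.ones(128))   # one analog inference
```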
Ramírez De La Pinta, Javier; Maestre Torreblanca, José María; Jurado, Isabel; Reyes De Cozar, Sergio
2017-03-06
In this paper, we explore the possibilities offered by the integration of home automation systems and service robots. In particular, we examine how advanced computationally expensive services can be provided by using a cloud computing approach to overcome the limitations of the hardware available at the user's home. To this end, we integrate two wireless low-cost, off-the-shelf systems in this work, namely, the service robot Rovio and the home automation system Z-wave. Cloud computing is used to enhance the capabilities of these systems so that advanced sensing and interaction services based on image processing and voice recognition can be offered.
Off the Shelf Cloud Robotics for the Smart Home: Empowering a Wireless Robot through Cloud Computing
Ramírez De La Pinta, Javier; Maestre Torreblanca, José María; Jurado, Isabel; Reyes De Cozar, Sergio
2017-01-01
In this paper, we explore the possibilities offered by the integration of home automation systems and service robots. In particular, we examine how advanced computationally expensive services can be provided by using a cloud computing approach to overcome the limitations of the hardware available at the user’s home. To this end, we integrate two wireless low-cost, off-the-shelf systems in this work, namely, the service robot Rovio and the home automation system Z-wave. Cloud computing is used to enhance the capabilities of these systems so that advanced sensing and interaction services based on image processing and voice recognition can be offered. PMID:28272305
Evaluating some computer enhancement algorithms that improve the visibility of cometary morphology
NASA Technical Reports Server (NTRS)
Larson, Stephen M.; Slaughter, Charles D.
1992-01-01
Digital enhancement of cometary images is a necessary tool in studying cometary morphology. Many image processing algorithms, some developed specifically for comets, have been used to enhance the subtle, low contrast coma and tail features. We compare some of the most commonly used algorithms on two different images to evaluate their strong and weak points, and conclude that there currently exists no single 'ideal' algorithm, although the radial gradient spatial filter gives the best overall result. This comparison should aid users in selecting the best algorithm to enhance particular features of interest.
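As a rough illustration of what the radial gradient spatial filter accomplishes, the sketch below removes the steep radial brightness profile of the coma by dividing each pixel by the azimuthal median at its radius; this is a generic realization, not the exact algorithm compared in the paper.

```python
import numpy as np

def radial_profile_enhance(img, xc, yc, nbins=200):
    """Divide out the azimuthal median brightness profile around (xc, yc)."""
    y, x = np.indices(img.shape)
    r = np.hypot(x - xc, y - yc)
    bins = np.linspace(0.0, r.max() + 1e-9, nbins + 1)
    idx = np.digitize(r.ravel(), bins) - 1            # annulus index per pixel
    flat = img.ravel()
    profile = np.array([np.median(flat[idx == k]) if np.any(idx == k) else 0.0
                        for k in range(nbins)])       # median coma profile
    model = profile[idx].reshape(img.shape)           # smooth radial model
    safe = np.where(model > 0, model, 1.0)
    return img / safe                                 # residual jets and tail
```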
NASA Technical Reports Server (NTRS)
Dunbar, P. M.; Hauser, J. R.
1976-01-01
Various mechanisms which limit the conversion efficiency of silicon solar cells were studied. The effects of changes in solar cell geometry such as layer thickness on performance were examined. The effects of various antireflecting layers were also examined. It was found that any single film antireflecting layer results in a significant surface loss of photons. The use of surface texturing techniques or low loss antireflecting layers can enhance by several percentage points the conversion efficiency of silicon cells. The basic differences between n(+)-p-p(+) and p(+)-n-n(+) cells are treated. A significant part of the study was devoted to the importance of surface region lifetime and heavy doping effects on efficiency. Heavy doping bandgap reduction effects are enhanced by low surface layer lifetimes, and conversely, the reduction in solar cell efficiency due to low surface layer lifetime is further enhanced by heavy doping effects. A series of computer studies is reported which seeks to determine the best cell structure and doping levels for maximum efficiency.
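The single-film antireflection loss mentioned above follows from standard thin-film optics: at the quarter-wave design wavelength, normal-incidence reflectance vanishes only when the film index equals the geometric mean of the ambient and substrate indices. The short check below uses nominal refractive indices.

```python
def quarter_wave_reflectance(n1, n0=1.0, ns=3.9):
    """Normal-incidence reflectance of a quarter-wave film (index n1) on a
    substrate of index ns in a medium of index n0, at the design wavelength."""
    return ((n0 * ns - n1**2) / (n0 * ns + n1**2)) ** 2

# Residual loss for three candidate film indices on silicon (ns ~ 3.9):
for n1 in (1.38, 1.9, 2.3):   # MgF2-like, near-ideal, TiO2-like (nominal)
    print(f"n1 = {n1}: R = {quarter_wave_reflectance(n1):.3f}")
```

Only a film with n1 near sqrt(n0 * ns) ≈ 1.97 approaches zero reflectance, which is why any single realizable film still loses a significant fraction of photons.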
Coal fired air turbine cogeneration
NASA Astrophysics Data System (ADS)
Foster-Pegg, R. W.
Fuel options and generator configurations for installation of cogenerator equipment are reviewed, noting that the use of oil or gas may be precluded by cost or legislation within the lifetime of any cogeneration equipment yet to be installed. A coal fueled air turbine cogenerator plant is described, which uses external combustion in a limestone bed at atmospheric pressure and in which air tubes are sunk to gain heat for a gas turbine. The limestone in the 26 MW unit absorbs sulfur from the coal, and can be replaced by other sorbents depending on types of coal available and stringency of local environmental regulations. Low temperature combustion reduces NOx formation and release of alkali salts and corrosion. The air heat is exhausted through a heat recovery boiler to produce process steam, then can be refed into the combustion chamber to satisfy preheat requirements. All parts of the cogenerator are designed to withstand full combustion temperature (1500 F) in the event of air flow stoppage. Costs are compared with those of a coal fired boiler and purchased power, and it is shown that the increased capital requirements for cogenerator apparatus will yield a 2.8 year payback. Detailed flow charts, diagrams and costs schedules are included.
2012-03-01
[Garbled report front matter; the recoverable fragments are two cited references (Lowe, David G., "Distinctive Image Features from Scale-Invariant Keypoints", International Journal of Computer Vision, 2004; Maybeck, Peter S., ...) followed by DARPA contact and distribution boilerplate.]
A Computational Framework for Efficient Low Temperature Plasma Simulations
NASA Astrophysics Data System (ADS)
Verma, Abhishek Kumar; Venkattraman, Ayyaswamy
2016-10-01
Over the past years, scientific computing has emerged as an essential tool for the investigation and prediction of low temperature plasma (LTP) applications, which include electronics, nanomaterial synthesis, metamaterials, etc. To further explore LTP behavior with greater fidelity, we present a computational toolbox developed to perform LTP simulations. This framework will allow us to enhance our understanding of multiscale plasma phenomena using high performance computing tools, mainly based on the OpenFOAM FVM distribution. Although aimed at microplasma simulations, the modular framework is able to perform multiscale, multiphysics simulations of physical systems comprising LTPs. Salient introductory features include the capability to perform parallel, 3D simulations of LTP applications on unstructured meshes. Performance of the solver is tested with numerical benchmarks assessing accuracy and efficiency for problems in microdischarge devices. Numerical simulation of a microplasma reactor at atmospheric pressure with hemispherical dielectric-coated electrodes will be discussed, providing an overview of the applicability and future scope of this framework.
PRESAGE: Protecting Structured Address Generation against Soft Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram
Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and low computational overhead. Unfortunately, efficient detectors to detect faults during address generation (to index large arrays) have not been widely researched. We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that flows an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Enabling the flow of errors allows one to situate detectors at loop exit points, and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.
PRESAGE: Protecting Structured Address Generation against Soft Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram
Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and low computational overhead. Unfortunately, efficient detectors to detect faults during address generation have not been widely researched (especially in the context of indexing large arrays). We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that propagates an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Ensuring the propagation of errors allows one to place detectors at loop exit points and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.
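A language-level toy model of the PRESAGE insight is sketched below: indices are computed incrementally so that an injected bit-flip propagates to the final address, which a single loop-exit check can catch. This illustrates the error-flow principle only; the actual technique is a compiler transformation over address arithmetic, not library code.

```python
def strided_sum(data, start, stride, count, flip_at=None, flip_bit=0):
    """Sum data[start], data[start+stride], ... with an optional injected flip."""
    total, addr = 0.0, start
    for i in range(count):
        if i == flip_at:                  # inject a soft error into the address
            addr ^= (1 << flip_bit)
        total += data[addr]
        addr += stride                    # the error now flows to later indices
    expected_final = start + count * stride
    if addr != expected_final:            # single detector at loop exit
        raise RuntimeError("address corruption detected")
    return total

data = list(range(1000))
strided_sum(data, 0, 3, 100)                   # clean run passes the check
# strided_sum(data, 0, 3, 100, flip_at=50)     # would raise at loop exit
```

Recomputing each index from scratch would instead confine the flip to one access and leave nothing for a loop-exit detector to see, which is exactly the failure mode the abstract describes.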
Similarity-based gene detection: using COGs to find evolutionarily-conserved ORFs.
Powell, Bradford C; Hutchison, Clyde A
2006-01-19
Experimental verification of gene products has not kept pace with the rapid growth of microbial sequence information. However, existing annotations of gene locations contain sufficient information to screen for probable errors. Furthermore, comparisons among genomes become more informative as more genomes are examined. We studied all open reading frames (ORFs) of at least 30 codons from the genomes of 27 sequenced bacterial strains. We grouped the potential peptide sequences encoded from the ORFs by forming Clusters of Orthologous Groups (COGs). We used this grouping in order to find homologous relationships that would not be distinguishable from noise when using simple BLAST searches. Although COG analysis was initially developed to group annotated genes, we applied it to the task of grouping anonymous DNA sequences that may encode proteins. "Mixed COGs" of ORFs (clusters in which some sequences correspond to annotated genes and some do not) are attractive targets when seeking errors of gene prediction. Examination of mixed COGs reveals some situations in which genes appear to have been missed in current annotations and a smaller number of regions that appear to have been annotated as gene loci erroneously. This technique can also be used to detect potential pseudogenes or sequencing errors. Our method uses an adjustable parameter for degree of conservation among the studied genomes (stringency). We detail results for one level of stringency at which we found 83 potential genes which had not previously been identified, 60 potential pseudogenes, and 7 sequences with existing gene annotations that are probably incorrect. Systematic study of sequence conservation offers a way to improve existing annotations by identifying potentially homologous regions where the annotation of the presence or absence of a gene is inconsistent among genomes.
Riou, Virginie; Périot, Marine; Biegala, Isabelle C
2017-01-01
Oligonucleotide probes are increasingly being used to characterize natural microbial assemblages by Tyramide Signal Amplification-Fluorescent in situ Hybridization (TSA-FISH, or CAtalysed Reporter Deposition CARD-FISH). In view of the fast-growing rRNA databases, we re-evaluated the in silico specificity of eleven bacterial and eukaryotic probes and competitors frequently used for the quantification of marine picoplankton, and performed tests on cell cultures to reduce the risk of non-specific hybridization before the probes are used on environmental samples. The probes were checked against recent databases, and hybridization conditions were tested against target strains matching the probes perfectly and against the closest non-target strains presenting one to four mismatches. We increased the hybridization stringency from 55 to 65% formamide for the Eub338+EubII+EubIII probe mix to be specific to the Eubacteria domain. In addition, we found that recent changes in the Gammaproteobacteria classification decreased the specificity of the Gam42a probe, and that the Roseo536R and Ros537 probes were not specific to the Roseobacter clade and missed part of it. Changes in stringency conditions were important for the bacterial probes; in the investigated natural environment they induced a significant increase in measured Eubacteria and Roseobacter concentrations and no significant change in Gammaproteobacteria concentrations. We confirmed the original hybridization conditions for the eukaryotic probes, and propose the Euk1209+NChlo01+Chlo02 probe mix to target the largest picoeukaryotic diversity. Experience acquired through these investigations leads us to propose a seven-step protocol for a complete FISH probe specificity check-up, to improve data quality in environmental studies.
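The in silico part of such a re-evaluation reduces to counting mismatches between each probe and candidate target sites; a minimal sketch, with sequence retrieval, reverse complementing, and alignment assumed handled upstream:

```python
# Classify candidate rRNA target sites by mismatch count against a probe.
# Sites with 1-4 mismatches are the ones that risk non-specific
# hybridization when stringency is insufficient.
def mismatches(probe, site):
    assert len(probe) == len(site)
    return sum(p != s for p, s in zip(probe, site))

def classify_sites(probe, sites, max_mm=0):
    hits, near_misses = [], []
    for name, site in sites.items():
        mm = mismatches(probe, site)
        if mm <= max_mm:
            hits.append(name)              # intended targets
        elif mm <= 4:
            near_misses.append((name, mm)) # cross-hybridization risks
    return hits, near_misses
```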
Schneider, Uffe Vest; Mikkelsen, Nikolaj Dam; Lindqvist, Anja; Okkels, Limei Meng; Jøhnk, Nina; Lisby, Gorm
2012-01-01
We introduce quantitative polymerase chain reaction (qPCR) primers and multiplex end-point PCR primers modified by the addition of a single ortho-Twisted Intercalating Nucleic Acid (o-TINA) molecule at the 5′-end. In qPCR, the 5′-o-TINA modified primers allow for a qPCR efficiency of 100% at significantly stressed reaction conditions, increasing the robustness of qPCR assays compared to unmodified primers. In samples spiked with genomic DNA, 5′-o-TINA modified primers improve robustness through increased sensitivity and specificity compared to unmodified DNA primers. In unspiked samples, replacement of unmodified DNA primers with 5′-o-TINA modified primers permits an increased qPCR stringency. Compared to unmodified DNA primers, this allows for a qPCR efficiency of 100% at lowered primer concentrations and at increased annealing temperatures, with unaltered cross-reactivity for primers with single-nucleobase mismatches. In a previously published octaplex end-point PCR targeting diarrheagenic Escherichia coli, application of 5′-o-TINA modified primers allows for a further reduction (>45%, or approximately one hour) in overall PCR program length while sustaining the amplification and analytical sensitivity for all targets in crude bacterial lysates. For all crude bacterial lysates, 5′-o-TINA modified primers permit a substantial increase in PCR stringency in terms of lower primer concentrations and higher annealing temperatures for all eight targets. Additionally, crude bacterial lysates spiked with human genomic DNA show less formation of non-target amplicons, implying increased robustness. Thus, 5′-o-TINA modified primers are advantageous in PCR assays where one or more primer pairs are required to perform at stressed reaction conditions. PMID:22701644
Implementing the undergraduate mini-CEX: a tailored approach at Southampton University.
Hill, Faith; Kendall, Kathleen; Galbraith, Kevin; Crossley, Jim
2009-04-01
The mini-clinical evaluation exercise (mini-CEX) is widely used in the UK to assess clinical competence, but there is little evidence regarding its implementation in the undergraduate setting. This study aimed to estimate the validity and reliability of the undergraduate mini-CEX and discuss the challenges involved in its implementation. A total of 3499 mini-CEX forms were completed. Validity was assessed by estimating associations between mini-CEX score and a number of external variables, examining the internal structure of the instrument, checking competency domain response rates and profiles against expectations, and by qualitative evaluation of stakeholder interviews. Reliability was evaluated by overall reliability coefficient (R), estimation of the standard error of measurement (SEM), and from stakeholders' perceptions. Variance component analysis examined the contribution of relevant factors to students' scores. Validity was threatened by various confounding variables, including: examiner status; case complexity; attachment specialty; patient gender, and case focus. Factor analysis suggested that competency domains reflect a single latent variable. Maximum reliability can be achieved by aggregating scores over 15 encounters (R = 0.73; 95% confidence interval [CI] +/- 0.28 based on a 6-point assessment scale). Examiner stringency contributed 29% of score variation and student attachment aptitude 13%. Stakeholder interviews revealed staff development needs but the majority perceived the mini-CEX as more reliable and valid than the previous long case. The mini-CEX has good overall utility for assessing aspects of the clinical encounter in an undergraduate setting. Strengths include fidelity, wide sampling, perceived validity, and formative observation and feedback. Reliability is limited by variable examiner stringency, and validity by confounding variables, but these should be viewed within the context of overall assessment strategies.
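The reported gain from aggregating scores over 15 encounters follows the usual reliability-of-means logic; a sketch using the standard Spearman-Brown relation (the study itself used a fuller variance-components model):

```python
# Spearman-Brown prophecy: reliability of the mean score over k encounters,
# given the single-encounter reliability r.
def aggregated_reliability(r, k):
    return k * r / (1 + (k - 1) * r)

# Solving 0.73 = 15r / (1 + 14r) gives r ≈ 0.15 per encounter, which is why
# roughly 15 observed encounters are needed to reach R ≈ 0.73.
```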
NASA Astrophysics Data System (ADS)
Hejazi, M. I.; Edmonds, J.; Clarke, L.; Kyle, P.; Davies, E.; Chaturvedi, V.; Eom, J.; Wise, M.; Patel, P.; Calvin, K.
2013-03-01
We investigate the effects of emission mitigation policies on water scarcity both globally and regionally using the Global Change Assessment Model (GCAM), a leading community integrated assessment model of energy, agriculture, climate, and water. Three climate policy scenarios with increasing mitigation stringency of 7.7, 5.5, and 4.2 W m⁻² in year 2095 (equivalent to the SRES A2, B2, and B1 emission scenarios, respectively), under two carbon tax regimes (a universal carbon tax (UCT), which includes land use change emissions, and a fossil fuel and industrial emissions carbon tax (FFICT), which excludes land use change emissions) are analyzed. The results are compared to a baseline scenario (i.e., no climate change mitigation policy) with radiative forcing reaching 8.8 W m⁻² (equivalent to the SRES A1Fi emission scenario) by 2095. When compared to the baseline scenario and maintaining the same baseline socioeconomic assumptions, water scarcity declines under a UCT mitigation policy but increases with a FFICT mitigation scenario by the year 2095, particularly with more stringent climate mitigation targets. The decreasing trend with UCT policy stringency is due to substitution from more water-intensive to less water-intensive choices in food and energy production, and in land use. Under the FFICT scenario, water scarcity is projected to increase, driven by higher water demands for bio-energy crops. This study implies an increasingly prominent role for water availability in future human decisions, and highlights the importance of including water in integrated assessment of global change. Future research will be directed at incorporating water shortage feedbacks in GCAM to better understand how such stresses will propagate across the various human and natural systems in GCAM.
Feagan, Brian; Sandborn, William J; Rutgeerts, Paul; Levesque, Barrett G; Khanna, Reena; Huang, Bidan; Zhou, Qian; Maa, Jen-Fue; Wallace, Kori; Lacerda, Ana; Thakkar, Roopal B; Robinson, Anne M
2018-04-23
Clinical trial endpoints for Crohn's disease (CD) activity correlate poorly with mucosal inflammation; to assess treatment efficacy, patient-reported outcomes and endoscopic assessments are preferred. This study assessed the impact on treatment efficacy estimations of using different definitions of clinical and endoscopic remission and endoscopic response, and of using site- or central-based endoscopy evaluation. This post hoc analysis of data from EXTEND (Extend the Safety and Efficacy of Adalimumab Through Endoscopic Healing), a placebo (PBO)-controlled, randomized trial of adalimumab (ADA) for mucosal healing, included adults with moderate-to-severe CD. Subsets of patients meeting specified Simplified Endoscopic Score for CD (SES-CD) inclusion criteria, according to site or central reading, and baseline stool frequency (SF) and/or abdominal pain score (AP) thresholds were evaluated. Various endpoint definitions based on the Crohn's Disease Activity Index (CDAI), its SF and AP components, SES-CD, and composite endpoints were compared between treatment groups. Increasing the stringency of Week 12 clinical endpoints from CDAI<150 to SF≤3.0/1.5 & AP≤1.0 reduced PBO response rates by ≥12% and increased treatment effects by ≤10%. Tightening the SES-CD endpoint from ≤4 to ≤2 reduced the treatment effect from 24% to 8%. Composite endpoints further diminished response rates and effect sizes. Site-based evaluation was associated with lower remission rates versus central reading in the PBO group and, thus, greater ADA-related treatment effects. This analysis is the first to demonstrate that increasing the stringency of clinical and endoscopic endpoint definitions in CD trials, especially tightening SF or SES-CD definitions, reduces the ability to detect treatment-related change in CD activity; a focus on endpoints that reflect clinical change is warranted.
Materials Selection Criteria for Nuclear Power Applications: A Decision Algorithm
NASA Astrophysics Data System (ADS)
Rodríguez-Prieto, Álvaro; Camacho, Ana María; Sebastián, Miguel Ángel
2016-02-01
An innovative methodology based on stringency levels is proposed in this paper to improve the current selection method for structural materials used in demanding industrial applications. This paper describes a new approach for quantifying the stringency of materials requirements, based on a novel deterministic algorithm to prevent potential failures. We applied the new methodology to different standardized specifications used in pressure vessel design, such as the SA-533 Grade B Cl.1 and SA-508 Cl.3 (issued by the American Society of Mechanical Engineers), DIN 20MnMoNi55 (issued by the German Institute of Standardization) and 16MND5 (issued by the French Nuclear Commission) specifications, and determined the influence of design code selection. The study is based on key scientific publications on the influence of chemical composition on the mechanical behavior of materials, which were not considered when the technological requirements were established in the aforementioned specifications. For this purpose, a new method to quantify the efficacy of each standard was developed using a deterministic algorithm. The relative weights were assigned by consulting a panel of experts in materials selection for reactor pressure vessels to provide a more objective methodology; the resulting mathematical calculations for quantitative analysis are thus greatly simplified. The final results show that steel DIN 20MnMoNi55 is the best material option. Additionally, more recently developed materials such as DIN 20MnMoNi55, 16MND5 and SA-508 Cl.3 exhibit more stringent mechanical requirements than SA-533 Grade B Cl.1. The methodology presented in this paper can be used as a decision tool in the selection of materials for a wide range of applications.
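The scoring idea reduces to an expert-weighted sum over requirement criteria; a schematic sketch in which the criteria names, weights, and normalized values are invented for illustration, not taken from the paper:

```python
# Expert-weighted stringency score for a materials specification.
def stringency_score(requirements, weights):
    """requirements: criterion -> normalized stringency in [0, 1];
    weights: criterion -> expert-assigned relative weight."""
    total_w = sum(weights.values())
    return sum(weights[c] * requirements[c] for c in weights) / total_w

# Hypothetical numbers, for shape only:
specs = {
    "SA-533 Grade B Cl.1": {"impurity_limits": 0.5, "toughness": 0.5},
    "DIN 20MnMoNi55":      {"impurity_limits": 0.9, "toughness": 0.8},
}
weights = {"impurity_limits": 2.0, "toughness": 3.0}
ranked = sorted(specs, key=lambda s: stringency_score(specs[s], weights),
                reverse=True)
```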
Body-wide anatomy recognition in PET/CT images
NASA Astrophysics Data System (ADS)
Wang, Huiqian; Udupa, Jayaram K.; Odhner, Dewey; Tong, Yubing; Zhao, Liming; Torigian, Drew A.
2015-03-01
With the rapid growth of positron emission tomography/computed tomography (PET/CT)-based medical applications, body-wide anatomy recognition on whole-body PET/CT images becomes crucial for quantifying body-wide disease burden. This, however, is a challenging problem and seldom studied due to unclear anatomy reference frame and low spatial resolution of PET images as well as low contrast and spatial resolution of the associated low-dose CT images. We previously developed an automatic anatomy recognition (AAR) system [15] whose applicability was demonstrated on diagnostic computed tomography (CT) and magnetic resonance (MR) images in different body regions on 35 objects. The aim of the present work is to investigate strategies for adapting the previous AAR system to low-dose CT and PET images toward automated body-wide disease quantification. Our adaptation of the previous AAR methodology to PET/CT images in this paper focuses on 16 objects in three body regions - thorax, abdomen, and pelvis - and consists of the following steps: collecting whole-body PET/CT images from existing patient image databases, delineating all objects in these images, modifying the previous hierarchical models built from diagnostic CT images to account for differences in appearance in low-dose CT and PET images, automatically locating objects in these images following object hierarchy, and evaluating performance. Our preliminary evaluations indicate that the performance of the AAR approach on low-dose CT images achieves object localization accuracy within about 2 voxels, which is comparable to the accuracies achieved on diagnostic contrast-enhanced CT images. Object recognition on low-dose CT images from PET/CT examinations without requiring diagnostic contrast-enhanced CT seems feasible.
NASA Astrophysics Data System (ADS)
Bismuth, Vincent; Vancamberg, Laurence; Gorges, Sébastien
2009-02-01
During interventional radiology procedures, guide-wires are usually inserted into the patient's vascular tree for diagnostic or therapeutic purposes. These procedures are monitored with an X-ray interventional system providing images of the interventional devices navigating through the patient's body. The automatic detection of such tools by image processing means has gained maturity over the past years and enables applications ranging from image enhancement to multimodal image fusion. Sophisticated detection methods are emerging, which rely on a variety of device enhancement techniques. In this article we reviewed and classified these techniques into three families. We chose a state-of-the-art approach in each of them and built a rigorous framework to compare their detection capability and computational complexity. Through simulations and the intensive use of ROC curves, we demonstrated that the Hessian-based methods are the most robust to strong curvature of the devices and that the rotated-filters family is the best suited for detecting low-CNR, low-curvature devices. The steerable-filter approach demonstrated weaker detection capability and appears to be the most expensive to compute. Finally, we demonstrated the interest of automatic guide-wire detection on a clinical topic: the compensation of respiratory motion in multimodal image fusion.
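As a flavor of the Hessian-based family that the comparison favors for strongly curved devices, here is a simplified ridge-response filter; this is a toy relative of the benchmarked methods, not their exact implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def hessian_ridge_response(img, sigma=2.0):
    """Toy Hessian-based enhancement of thin dark curvilinear structures
    (e.g., guide-wires) on a brighter background."""
    Ixx = gaussian_filter(img, sigma, order=(0, 2))
    Iyy = gaussian_filter(img, sigma, order=(2, 0))
    Ixy = gaussian_filter(img, sigma, order=(1, 1))
    # Closed-form eigenvalues of the 2x2 Hessian at every pixel.
    half_diff = np.sqrt(((Ixx - Iyy) / 2.0) ** 2 + Ixy ** 2)
    mean = (Ixx + Iyy) / 2.0
    lam_max = mean + half_diff
    # A dark ridge yields one large positive eigenvalue across the wire.
    return np.maximum(lam_max, 0.0)
```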
Ali, N; Heslop-Harrison, Js Pat; Ahmad, H; Graybosch, R A; Hein, G L; Schwarzacher, T
2016-08-01
Pyramiding of alien-derived Wheat streak mosaic virus (WSMV) resistance and resistance enhancing genes in wheat is a cost-effective and environmentally safe strategy for disease control. PCR-based markers and cytogenetic analysis with genomic in situ hybridisation were applied to identify alien chromatin in four genetically diverse populations of wheat (Triticum aestivum) lines incorporating chromosome segments from Thinopyrum intermedium and Secale cereale (rye). Out of 20 experimental lines, 10 carried Th. intermedium chromatin as T4DL*4Ai#2S translocations, while, unexpectedly, 7 lines were positive for alien chromatin (Th. intermedium or rye) on chromosome 1B. The newly described rye 1RS chromatin, transmitted from early in the pedigree, was associated with enhanced WSMV resistance. Under field conditions, the 1RS chromatin alone showed some resistance, while together with the Th. intermedium 4Ai#2S offered superior resistance to that demonstrated by the known resistant cultivar Mace. Most alien wheat lines carry whole chromosome arms, and it is notable that these lines showed intra-arm recombination within the 1BS arm. The translocation breakpoints between 1BS and alien chromatin fell in three categories: (i) at or near to the centromere, (ii) intercalary between markers UL-Thin5 and Xgwm1130 and (iii) towards the telomere between Xgwm0911 and Xbarc194. Labelled genomic Th. intermedium DNA hybridised to the rye 1RS chromatin under high stringency conditions, indicating the presence of shared tandem repeats among the cereals. The novel small alien fragments may explain the difficulty in developing well-adapted lines carrying Wsm1 despite improved tolerance to the virus. The results will facilitate directed chromosome engineering producing agronomically desirable WSMV-resistant germplasm.
plasmaFoam: An OpenFOAM framework for computational plasma physics and chemistry
NASA Astrophysics Data System (ADS)
Venkattraman, Ayyaswamy; Verma, Abhishek Kumar
2016-09-01
As emphasized in the 2012 Roadmap for low temperature plasmas (LTP), scientific computing has emerged as an essential tool for the investigation and prediction of the fundamental physical and chemical processes associated with these systems. While several in-house and commercial codes exist, each with its own advantages and disadvantages, a common framework that can be developed by researchers from all over the world will likely accelerate the impact of computational studies on advances in low-temperature plasma physics and chemistry. In this regard, we present a finite volume computational toolbox to perform high-fidelity simulations of LTP systems. This framework, primarily based on the OpenFOAM solver suite, allows us to enhance our understanding of multiscale plasma phenomena by performing massively parallel, three-dimensional simulations on unstructured meshes using well-established high performance computing tools that are widely used in the computational fluid dynamics community. In this talk, we will present preliminary results obtained using the OpenFOAM-based solver suite with benchmark three-dimensional simulations of microplasma devices including both dielectric and plasma regions. We will also discuss the future outlook for the solver suite.
Mohammadi, M R; Vadamalai, G; Joseph, H
2010-01-01
Coconut cadang-cadang viroid (CCCVd) causes the lethal cadang-cadang disease of coconut palms in the Philippines, and it has recently been reported to be associated with orange spotting disease of oil palm in Malaysia. The low concentration of the viroid RNA in oil palm, as well as the high content of polyphenols and polysaccharides in this plant, which interfere with the purification steps, makes it difficult to extract and detect this viroid from oil palm. A previously described method was modified and optimized for extraction and detection of CCCVd from infected oil palms. Briefly, 7 g of leaf material was homogenized in a mortar or a blender using liquid nitrogen. 10 ml of extraction buffer (100 mM Tris-HCl pH 7.5, 100 mM NaCl, 10 mM EDTA) along with 100 mM 2-mercaptoethanol and 10 ml of water-saturated phenol were added to the frozen powder. After centrifuging at 4 degrees C, 4000 g for 30 min, the aqueous phase was extracted once more with phenol and then once with chloroform-isoamyl alcohol (24:1). After adding sodium acetate, pH 5.6, to 200 mM, the mixture was precipitated with 2.5 vol ethanol overnight at -20 degrees C, and the pellet was then washed with 70% ethanol and air-dried. One milliliter of 8 M LiCl was added to the dried pellet; after shaking overnight at 4 degrees C and another centrifugation step, the supernatant was collected and precipitated again with ethanol, and the resulting pellet was washed and air-dried. For northern blotting, samples equivalent to 40 g of plant tissue were mixed with formamide buffer, loaded onto a 12% polyacrylamide gel containing 7 M urea, separated by electrophoresis, electroblotted onto a membrane, and fixed by UV cross-linking. Pre-hybridization and hybridization in hybridization buffer (50% formamide, 25% SSPE, 0.1% Ficoll and PVP, 0.1% SDS, 0.02% DNA (5 mg/ml)) were carried out at 45 degrees C for 90 min and 16 h, respectively, followed by two low-stringency washes (0.5X SSC, 0.1% SDS, at room temperature for 5 min) and one high-stringency wash (0.1X SSC, 0.1% SDS, at 60 degrees C for 1 hour). An in vitro synthesized, DIG-labeled, full-length CCCVd(-) RNA probe was used in the hybridization step. The DIG Nucleic Acid Detection Kit (Roche) instructions were followed for the detection procedure, and blue bands corresponding to the position of the viroid appeared on the membrane. This study demonstrated the ability of the DIG-labeled probe to detect the viroid, and provided a suitable extraction and hybridization method for the detection of CCCVd from oil palm.
An online semi-supervised brain-computer interface.
Gu, Zhenghui; Yu, Zhuliang; Shen, Zhifang; Li, Yuanqing
2013-09-01
Practical brain-computer interface (BCI) systems should require only low training effort from the user, and the algorithms used to classify the intent of the user should be computationally efficient. However, due to inter- and intra-subject variations in the EEG signal, intermittent training/calibration is often unavoidable. In this paper, we present an online semi-supervised P300 BCI speller system. After a short initial training (around or less than 1 min in our experiments), the system is switched to a mode where the user can input characters through selective attention. In this mode, a self-training least squares support vector machine (LS-SVM) classifier is gradually enhanced in the back end with the unlabeled EEG data collected online after every character input. Even though the user may experience some input errors at the beginning due to the small initial training dataset, the accuracy approaches that of the fully supervised method within a few minutes. The algorithm based on LS-SVM and its sequential update has low computational complexity; thus, it is suitable for online applications. The effectiveness of the algorithm has been validated through data analysis on BCI Competition III dataset II (P300 speller BCI data). The performance of the online system was evaluated through experimental results on eight healthy subjects, all of whom achieved a spelling accuracy of 85% or above within an average online semi-supervised learning time of around 3 min.
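The self-training loop can be sketched generically; the paper uses an LS-SVM with an efficient sequential update, while this illustration substitutes an off-the-shelf probabilistic classifier and retrains from scratch:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def online_self_training(X_init, y_init, epoch_stream, threshold=0.9):
    """After brief supervised calibration, fold confidently pseudo-labeled
    EEG feature vectors back into the training set after each input."""
    X, y = [list(x) for x in X_init], list(y_init)
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    for x in epoch_stream:
        proba = clf.predict_proba([x])[0]
        if proba.max() >= threshold:          # accept confident pseudo-label
            X.append(list(x))
            y.append(int(np.argmax(proba)))
            clf = LogisticRegression(max_iter=1000).fit(X, y)
        yield clf                             # current classifier state
```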
Sera, Toshihiro; Yokota, Hideo; Tanaka, Gaku; Uesugi, Kentaro; Yagi, Naoto; Schroter, Robert C
2013-07-15
We visualized pulmonary acini in the core regions of the mouse lung in situ using synchrotron refraction-enhanced computed tomography (CT) and evaluated their kinematics during quasi-static inflation. This CT system (with a cube voxel of 2.8 μm) allows excellent visualization of not just the conducting airways, but also the alveolar ducts and sacs, and tracking of the acinar shape and its deformation during inflation. The kinematics of individual alveoli and alveolar clusters with a group of terminal alveoli is influenced not only by the connecting alveolar duct and alveoli, but also by the neighboring structures. Acinar volume was not a linear function of lung volume. The alveolar duct diameter changed dramatically during inflation at low pressures and remained relatively constant above an airway pressure of ∼8 cmH2O during inflation. The ratio of acinar surface area to acinar volume indicates that acinar distension during low-pressure inflation differed from that during inflation over a higher pressure range; in particular, acinar deformation was accordion-like during low-pressure inflation. These results indicated that the alveoli and duct expand differently as total acinar volume increases and that the alveolar duct may expand predominantly during low-pressure inflation. Our findings suggest that acinar deformation in the core regions of the lung is complex and heterogeneous.
Wan, Tao; Bloch, B. Nicolas; Plecha, Donna; Thompson, CheryI L.; Gilmore, Hannah; Jaffe, Carl; Harris, Lyndsay; Madabhushi, Anant
2016-01-01
To identify computer-extracted imaging features for estrogen receptor (ER)-positive breast cancers on dynamic contrast-enhanced (DCE)-MRI that are correlated with the low and high OncotypeDX risk categories, we collected 96 ER-positive breast lesions with low (<18, N = 55) and high (>30, N = 41) OncotypeDX recurrence scores. Each lesion was quantitatively characterized via 6 shape features, 3 pharmacokinetic, 4 enhancement kinetic, 4 intensity kinetic, 148 textural kinetic, 5 dynamic histogram of oriented gradient (DHoG), and 6 dynamic local binary pattern (DLBP) features. The extracted features were evaluated by a linear discriminant analysis (LDA) classifier in terms of their ability to distinguish low and high OncotypeDX risk categories. Classification performance was evaluated by area under the receiver operating characteristic curve (Az). The DHoG and DLBP features achieved Az values of 0.84 and 0.80, respectively. The 6 top features identified via feature selection were subsequently combined with the LDA classifier to yield an Az of 0.87. The correlation analysis showed that DHoG (ρ = 0.85, P < 0.001) and DLBP (ρ = 0.83, P < 0.01) were significantly associated with the low and high risk classifications from the OncotypeDX assay. Our results indicate that computer-extracted texture features of DCE-MRI were highly correlated with the high and low OncotypeDX risk categories for ER-positive cancers. PMID:26887643
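The evaluation pipeline is standard once the features are extracted; a minimal sketch with scikit-learn (feature extraction for the DHoG, DLBP, and kinetic families is assumed done elsewhere):

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

def evaluate_feature_set(X, y):
    """X: lesions x features matrix; y: 0 = low risk, 1 = high risk.
    Returns the cross-validated area under the ROC curve (Az)."""
    lda = LinearDiscriminantAnalysis()
    scores = cross_val_predict(lda, X, y, cv=5, method="decision_function")
    return roc_auc_score(y, scores)
```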
Large-Scale Campus Computer Technology Implementation: Lessons from the First Year.
ERIC Educational Resources Information Center
Nichols, Todd; Frazer, Linda H.
The purpose of the Elementary Technology Demonstration Schools (ETDS) Project, funded by IBM and Apple, Inc., was to demonstrate the effectiveness of technology in accelerating the learning of low achieving at-risk students and enhancing the education of high achieving students. The paper begins by giving background information on the district,…
A fast method to emulate an iterative POCS image reconstruction algorithm.
Zeng, Gengsheng L
2017-10-01
Iterative image reconstruction algorithms are commonly used to optimize an objective function, especially when the objective function is nonquadratic. Generally speaking, iterative algorithms are computationally inefficient. This paper presents a fast algorithm that has one backprojection and no forward projection, and derives a new method to solve the associated optimization problem. The nonquadratic constraint, for example an edge-preserving denoising constraint, is implemented as a nonlinear filter. The algorithm is derived based on the POCS (projections onto convex sets) approach. A windowed FBP (filtered backprojection) algorithm enforces data fidelity, while an iterative procedure, divided into segments, enforces edge-enhancing denoising; each segment performs nonlinear filtering. The derived iterative algorithm is computationally efficient: it contains only one backprojection and no forward projection. Low-dose CT data are used for algorithm feasibility studies, with the nonlinearity implemented as an edge-enhancing, noise-smoothing filter. The patient study results demonstrate its effectiveness in processing low-dose X-ray CT data. This fast algorithm can be used to replace many iterative algorithms.
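The overall flow the abstract describes can be emulated in a few lines; `fbp` and `edge_preserving_filter` below are placeholders for the windowed FBP and the nonlinear filter, passed in by the caller rather than real library calls:

```python
# Non-iterative emulation of a POCS reconstruction: one windowed-FBP
# backprojection enforces data fidelity, then repeated nonlinear filtering
# stands in for the edge-preserving denoising constraint. No forward
# projections are needed.
def fast_pocs_emulation(sinogram, fbp, edge_preserving_filter, n_passes=10):
    x = fbp(sinogram)                  # the single backprojection
    for _ in range(n_passes):          # segmented nonlinear filtering
        x = edge_preserving_filter(x)
    return x
```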
Mazziotti, David A
2016-10-07
A central challenge of physics is the computation of strongly correlated quantum systems. The past ten years have witnessed the development and application of the variational calculation of the two-electron reduced density matrix (2-RDM) without the wave function. In this Letter we present an orders-of-magnitude improvement in the accuracy of 2-RDM calculations without an increase in their computational cost. The advance is based on a low-rank, dual formulation of an important constraint on the 2-RDM, the T2 condition. Calculations are presented for metallic chains and a cadmium-selenide dimer. The low-scaling T2 condition will have significant applications in atomic and molecular, condensed-matter, and nuclear physics.
Relevance feedback-based building recognition
NASA Astrophysics Data System (ADS)
Li, Jing; Allinson, Nigel M.
2010-07-01
Building recognition is a nontrivial task in computer vision research that can be utilized in robot localization, mobile navigation, etc. However, existing building recognition systems usually encounter two problems: 1) extracted low-level features cannot reveal the true semantic concepts; and 2) they usually involve high-dimensional data, which incurs heavy computational and memory costs. Relevance feedback (RF), widely applied in multimedia information retrieval, is able to bridge the gap between low-level visual features and high-level concepts, while dimensionality reduction methods can mitigate the high-dimensionality problem. In this paper, we propose a building recognition scheme that integrates RF and subspace learning algorithms. Experimental results on our own building database show that the newly proposed scheme appreciably enhances recognition accuracy.
Aydın, Eda Akman; Bay, Ömer Faruk; Güler, İnan
2016-01-01
Brain-computer interface (BCI) based environment control systems could facilitate the lives of people with neuromuscular diseases, reduce dependence on caregivers, and improve quality of life. Besides ease of use, low cost, and robust system performance, mobility is an important functionality expected of a practical BCI system in real life. In this study, in order to enhance users' mobility, we propose internet-based wireless communication between the BCI system and the home environment. We designed and implemented a prototype of an embedded low-cost, low-power, easy-to-use web server which is employed in internet-based wireless control of a BCI-operated home environment. The embedded web server provides remote access to the environmental control module through BCI and web interfaces. While the proposed system offers BCI users enhanced mobility, it also provides remote control of the home environment by caregivers as well as by individuals in the initial stages of neuromuscular disease. The input to the BCI system is P300 potentials, and we used the Region Based Paradigm (RBP) as the stimulus interface. Performance of the BCI system was evaluated on data recorded from 8 non-disabled subjects. The experimental results indicate that the proposed web server enables successful internet-based wireless control of electrical home appliances through BCIs.
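A minimal stand-in for such an embedded control endpoint, sketched with Python's standard library; the URL scheme and appliance names are invented, and the real system runs on a dedicated low-power embedded board:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class ControlHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        parts = self.path.strip("/").split("/")
        if len(parts) == 3 and parts[0] == "appliance":
            name, action = parts[1], parts[2]   # e.g. /appliance/lamp/on
            # here: drive the relay or IR module for (name, action)
            self.send_response(200)
            self.end_headers()
            self.wfile.write(f"{name}: {action}\n".encode())
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8080), ControlHandler).serve_forever()
```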
Image jitter enhances visual performance when spatial resolution is impaired.
Watson, Lynne M; Strang, Niall C; Scobie, Fraser; Love, Gordon D; Seidel, Dirk; Manahilov, Velitchko
2012-09-06
Visibility of low-spatial frequency stimuli improves when their contrast is modulated at 5 to 10 Hz compared with stationary stimuli. Therefore, temporal modulations of visual objects could enhance the performance of low vision patients who primarily perceive images of low-spatial frequency content. We investigated the effect of retinal-image jitter on word recognition speed and facial emotion recognition in subjects with central visual impairment. Word recognition speed and accuracy of facial emotion discrimination were measured in volunteers with AMD under stationary and jittering conditions. Computer-driven and optoelectronic approaches were used to induce retinal-image jitter with duration of 100 or 166 ms and amplitude within the range of 0.5 to 2.6° visual angle. Word recognition speed was also measured for participants with simulated (Bangerter filters) visual impairment. Text jittering markedly enhanced word recognition speed for people with severe visual loss (101 ± 25%), while for those with moderate visual impairment, this effect was weaker (19 ± 9%). The ability of low vision patients to discriminate the facial emotions of jittering images improved by a factor of 2. A prototype of optoelectronic jitter goggles produced similar improvement in facial emotion discrimination. Word recognition speed in participants with simulated visual impairment was enhanced for interjitter intervals over 100 ms and reduced for shorter intervals. Results suggest that retinal-image jitter with optimal frequency and amplitude is an effective strategy for enhancing visual information processing in the absence of spatial detail. These findings will enable the development of novel tools to improve the quality of life of low vision patients.
Fang, Ruogu; Chen, Tsuhan; Sanelli, Pina C
2013-05-01
Computed tomography perfusion (CTP) is an important functional imaging modality in the evaluation of cerebrovascular diseases, particularly in acute stroke and vasospasm. However, the post-processed parametric maps of blood flow tend to be noisy, especially in low-dose CTP, due to the noisy contrast enhancement profile and the oscillatory nature of the results generated by current computational methods. In this paper, we propose a robust sparse perfusion deconvolution method (SPD) to estimate cerebral blood flow in CTP performed at low radiation dose. We first build a dictionary from high-dose perfusion maps using online dictionary learning and then perform deconvolution-based hemodynamic parameter estimation on the low-dose CTP data. Our method is validated on clinical data of patients with normal and pathological CBF maps. The results show that we achieve superior performance to existing methods, and potentially improve the differentiation between normal and ischemic tissue in the brain.
Reducing the latency of the Fractal Iterative Method to half an iteration
NASA Astrophysics Data System (ADS)
Béchet, Clémentine; Tallon, Michel
2013-12-01
The fractal iterative method for atmospheric tomography (FRiM-3D) has been introduced to solve the wavefront reconstruction problem at the dimensions of an ELT with a low computational cost. Previous studies reported that only 3 iterations of the algorithm are required to provide the best adaptive optics (AO) performance. Nevertheless, any iterative method in adaptive optics suffers from the intrinsic latency induced by the fact that one iteration can start only once the previous one is completed; iterations hardly match the low-latency requirement of the AO real-time computer. We present here a new approach to avoid iterations in the computation of the commands with FRiM-3D, thus allowing a low-latency AO response even at the scale of the European ELT (E-ELT). The method highlights the importance of the "warm-start" strategy in adaptive optics; to our knowledge, this particular way of using the warm start has not been reported before. Furthermore, by removing the requirement of iterating to compute the commands, the computational cost of the reconstruction with FRiM-3D can be simplified and reduced to at most half the computational cost of a classical iteration. Thanks to simulations of both single-conjugate and multi-conjugate AO for the E-ELT with FRiM-3D on the ESO Octopus simulator, we demonstrate the benefit of this approach. We finally enhance the robustness of this new implementation with respect to increasing measurement noise, wind speed, and even modeling errors.
Help for the Visually Impaired
NASA Technical Reports Server (NTRS)
1995-01-01
The Low Vision Enhancement System (LVES) is a video headset that offers people with low vision a view of their surroundings equivalent to the image on a five-foot television screen four feet from the viewer. It will not make the blind see but for many people with low vision, it eases everyday activities such as reading, watching TV and shopping. LVES was developed over almost a decade of cooperation between Stennis Space Center, the Wilmer Eye Institute of the Johns Hopkins Medical Institutions, the Department of Veteran Affairs, and Visionics Corporation. With the aid of Stennis scientists, Wilmer researchers used NASA technology for computer processing of satellite images and head-mounted vision enhancement systems originally intended for the space station. The unit consists of a head-mounted video display, three video cameras, and a control unit for the cameras. The cameras feed images to the video display in the headset.
NASA Technical Reports Server (NTRS)
Oluwole, Oluwayemisi O.; Wong, Hsi-Wu; Green, William
2012-01-01
AdapChem software enables high efficiency, low computational cost, and enhanced accuracy in computational fluid dynamics (CFD) simulations used for combustion studies. The software dynamically allocates smaller, reduced chemical models instead of the larger, full chemistry model to evolve the calculation, while ensuring the same accuracy is obtained for steady-state CFD reacting-flow simulations. The software enables detailed chemical kinetic modeling in combustion CFD simulations: AdapChem adapts the reaction mechanism used in the CFD to the local reaction conditions. Instead of a single, comprehensive reaction mechanism throughout the computation, a dynamic distribution of smaller, reduced models is used to capture the chemical kinetics accurately at a fraction of the cost of the traditional single-mechanism approach.
Texton-based super-resolution for achieving high spatiotemporal resolution in hybrid camera system
NASA Astrophysics Data System (ADS)
Kamimura, Kenji; Tsumura, Norimichi; Nakaguchi, Toshiya; Miyake, Yoichi
2010-05-01
Many super-resolution methods have been proposed to enhance the spatial resolution of images by using iteration and multiple input images. In a previous paper, we proposed an example-based super-resolution method that enhances an image through pixel-based texton substitution to reduce the computational cost. In that method, however, we only considered the enhancement of a texture image. In this study, we modified the texton substitution method for a hybrid camera to reduce the required bandwidth of a high-resolution video camera. We applied our algorithm to pairs of high- and low-spatiotemporal-resolution videos, which were synthesized to simulate a hybrid camera. The results showed that the fine detail of the low-resolution video could be reproduced, in contrast with bicubic interpolation, and that the required bandwidth of the video camera could be reduced to about 1/5. The peak signal-to-noise ratios (PSNRs) of the images improved by about 6 dB in a trained frame and by 1.0-1.5 dB in a test frame relative to images processed with bicubic interpolation, and the average PSNRs were higher than those obtained with Freeman's well-known patch-based super-resolution method, while the computational time of our method was reduced to almost 1/10 of that method's.
Tongrod, Nattapong; Lokavee, Shongpun; Watthanawisuth, Natthapol; Tuantranont, Adisorn; Kerdcharoen, Teerakiat
2013-03-01
Current trends in Human-Computer Interface (HCI) have brought on a wave of new consumer devices that can track the motion of our hands. These devices have enabled more natural interfaces with computer applications. Data gloves are commonly used as input devices, equipped with sensors that detect the movements of the hands and a communication unit that interfaces those movements with a computer. Unfortunately, the high cost of sensor technology puts a burden on most users. In this research, we propose a low-cost data glove concept based on printed polymeric sensors: pressure and bending sensors fabricated with a consumer ink-jet printer. These sensors were realized using a conductive polymer (poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) [PEDOT:PSS]) thin film printed on glossy photo paper. The performance of these sensors can be enhanced by the addition of dimethyl sulfoxide (DMSO) to the aqueous dispersion of PEDOT:PSS. The concept of surface resistance was successfully adopted for the design and fabrication of the sensors. To demonstrate the printed sensors, we constructed a data glove using them and developed software for real-time hand tracking. Wireless networks based on low-cost ZigBee technology were used to transfer data from the glove to a computer. To our knowledge, this is the first report of a low-cost data glove based on paper pressure sensors. The low-cost implementation of both sensors and communication network proposed in this paper should pave the way toward widespread use of data gloves for real-time hand-tracking applications.
Sonoporation enhances liposome accumulation and penetration in tumors with low EPR.
Theek, Benjamin; Baues, Maike; Ojha, Tarun; Möckel, Diana; Veettil, Seena Koyadan; Steitz, Julia; van Bloois, Louis; Storm, Gert; Kiessling, Fabian; Lammers, Twan
2016-06-10
The Enhanced Permeability and Retention (EPR) effect is a highly variable phenomenon. To enhance EPR-mediated passive drug targeting to tumors, several different pharmacological and physical strategies have been evaluated over the years, including e.g. TNFα-treatment, vascular normalization, hyperthermia and radiotherapy. Here, we systematically investigated the impact of sonoporation, i.e. the combination of ultrasound (US) and microbubbles (MB), on the tumor accumulation and penetration of liposomes. Two different MB formulations were employed, and their ability to enhance liposome accumulation and penetration was evaluated in two different tumor models, which are both characterized by relatively low levels of EPR (i.e. highly cellular A431 epidermoid xenografts and highly stromal BxPC-3 pancreatic carcinoma xenografts). The liposomes were labeled with two different fluorophores, enabling in vivo computed tomography/fluorescence molecular tomography (CT-FMT) and ex vivo two-photon laser scanning microscopy (TPLSM). In both models, in spite of relatively high inter- and intra-individual variability, a trend towards improved liposome accumulation and penetration was observed. In treated tumors, liposome concentrations were up to twice as high as in untreated tumors, and sonoporation enhanced the ability of liposomes to extravasate out of the blood vessels into the tumor interstitium. These findings indicate that sonoporation may be a useful strategy for improving drug targeting to tumors with low EPR.
Fernandez, Maria E.; LaRue, Denise M.; Bartholomew, L. Kay
2012-01-01
Computer-based multimedia technologies can be used to tailor health messages, but promotoras (Spanish-speaking community health workers) rarely use these tools. Promotoras delivered health messages about colorectal cancer screening to medically underserved Latinos in South Texas using two small media formats: a “low-tech” format (flipchart and video); and a “high-tech” format consisting of a tailored, interactive computer program delivered on a tablet computer. Using qualitative methods, we observed promotora training and intervention delivery, and conducted interviews with five promotoras to compare and contrast program implementation of both formats. We discuss the ways each format aided or challenged promotoras’ intervention delivery. Findings reveal that some aspects of both formats enhanced intervention delivery by tapping into Latino health communication preferences and facilitating interpersonal communication, while other aspects hindered intervention delivery. This study contributes to our understanding of how community health workers use low- and high-tech small media formats when delivering health messages to Latinos. PMID:21986243
A meta-analysis of outcomes from the use of computer-simulated experiments in science education
NASA Astrophysics Data System (ADS)
Lejeune, John Van
The purpose of this study was to synthesize the findings from existing research on the effects of computer simulated experiments on students in science education. Results from 40 reports were integrated by the process of meta-analysis to examine the effect of computer-simulated experiments and interactive videodisc simulations on student achievement and attitudes. Findings indicated significant positive differences in both low-level and high-level achievement of students who use computer-simulated experiments and interactive videodisc simulations as compared to students who used more traditional learning activities. No significant differences in retention, student attitudes toward the subject, or toward the educational method were found. Based on the findings of this study, computer-simulated experiments and interactive videodisc simulations should be used to enhance students' learning in science, especially in cases where the use of traditional laboratory activities are expensive, dangerous, or impractical.
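The core arithmetic of such a synthesis is inverse-variance weighting of per-study effect sizes; a minimal fixed-effect sketch (not necessarily the exact model used in this study):

```python
# Fixed-effect meta-analysis: pool per-study effect sizes (e.g., Cohen's d)
# weighted by the inverse of their variances.
def fixed_effect_pool(effects, variances):
    weights = [1.0 / v for v in variances]
    pooled = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    std_error = (1.0 / sum(weights)) ** 0.5
    return pooled, std_error   # pooled effect and its standard error
```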
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, L; Shen, C; Wang, J
Purpose: To reduce cone beam CT (CBCT) imaging dose, we previously proposed a progressive dose control (PDC) scheme that employs temporal correlation between CBCT images at different fractions for image quality enhancement. A temporal non-local means (TNLM) method was developed to enhance the quality of a new low-dose CBCT using an existing high-quality CBCT. To enhance a voxel value, the TNLM method searches for similar voxels in a window. Due to patient deformation between the two CBCTs, a large search window was required, reducing image quality and computational efficiency. This abstract proposes a deformation-assisted TNLM (DA-TNLM) method to solve this problem. Methods: For a low-dose CBCT to be enhanced using a high-quality CBCT, we first performed deformable image registration between the low-dose CBCT and the high-quality CBCT to approximately establish voxel correspondence between the two. A search window for each voxel was then set based on the deformation vector field; specifically, the search window for each voxel was shifted by its deformation vector. A TNLM step was then applied using only voxels within this window to correct image intensity in the low-dose CBCT. Results: We tested the proposed scheme on simulated CIRS phantom data and real patient data. The CIRS phantom was scanned on a Varian on-board imaging CBCT system with couch shifts and dose reduction at each scan. The real patient data were acquired in four fractions with dose reduced from the standard CBCT dose to 12.5% of standard dose. It was found that the DA-TNLM method can reduce total dose by over 75% on average in the first four fractions. Conclusion: We have developed a PDC scheme which can enhance the quality of images scanned at low dose using a DA-TNLM method. Tests in phantom and patient studies demonstrated promising results.
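The key step is re-centring each voxel's non-local-means search window by its deformation vector; an unoptimized 2D sketch for interior voxels, with bounds checking omitted and parameter values invented:

```python
import numpy as np

def da_tnlm_voxel(low, prior, dvf, i, j, half=3, h=0.05, patch=1):
    """Enhance low-dose voxel (i, j) using a high-quality prior image whose
    search window is shifted by the deformation vector dvf[i, j]."""
    di, dj = dvf[i, j]                                # deformation at (i, j)
    ci, cj = i + int(round(di)), j + int(round(dj))   # shifted window centre
    ref = low[i - patch:i + patch + 1, j - patch:j + patch + 1]
    num = den = 0.0
    for u in range(ci - half, ci + half + 1):
        for v in range(cj - half, cj + half + 1):
            cand = prior[u - patch:u + patch + 1, v - patch:v + patch + 1]
            w = np.exp(-np.mean((ref - cand) ** 2) / h)  # patch similarity
            num += w * prior[u, v]
            den += w
    return num / den
```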
Southam, Lorraine; Panoutsopoulou, Kalliope; Rayner, N William; Chapman, Kay; Durrant, Caroline; Ferreira, Teresa; Arden, Nigel; Carr, Andrew; Deloukas, Panos; Doherty, Michael; Loughlin, John; McCaskie, Andrew; Ollier, William E R; Ralston, Stuart; Spector, Timothy D; Valdes, Ana M; Wallis, Gillian A; Wilkinson, J Mark; Marchini, Jonathan; Zeggini, Eleftheria
2011-05-01
Imputation is an extremely valuable tool in conducting and synthesising genome-wide association studies (GWASs). Directly typed SNP quality control (QC) is thought to affect imputation quality. It is, therefore, common practice to use quality-controlled (QCed) data as an input for imputing genotypes. This study aims to determine the effect of commonly applied QC steps on imputation outcomes. We performed several iterations of imputing SNPs across chromosome 22 in a dataset consisting of 3177 samples with Illumina 610k (Illumina, San Diego, CA, USA) GWAS data, applying different QC steps each time. The imputed genotypes were compared with the directly typed genotypes. In addition, we investigated the correlation between alternatively QCed data. We also applied a series of post-imputation QC steps, balancing elimination of poorly imputed SNPs against information loss. We found that the difference between the unQCed data and the fully QCed data in imputation outcome was minimal. Our study shows that imputation of common variants is generally very accurate and robust to GWAS QC, which is not a major factor affecting imputation outcome. A minority of common-frequency SNPs with particular properties cannot be accurately imputed regardless of QC stringency. These findings may not generalise to the imputation of low frequency and rare variants.
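Imputation accuracy in such comparisons is typically summarized as concordance between masked directly typed genotypes and their imputed calls; a minimal sketch with genotypes coded 0/1/2:

```python
# Per-SNP concordance between directly typed and imputed genotype calls;
# missing typed calls (None) are skipped.
def concordance(typed, imputed):
    pairs = [(t, i) for t, i in zip(typed, imputed) if t is not None]
    agree = sum(1 for t, i in pairs if t == i)
    return agree / len(pairs)
```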
Single genome retrieval of context-dependent variability in mutation rates for human germline.
Sahakyan, Aleksandr B; Balasubramanian, Shankar
2017-01-13
Accurate knowledge of the core components of substitution rates is of vital importance to understand genome evolution and dynamics. By performing a single-genome and direct analysis of 39,894 retrotransposon remnants, we reveal sequence context-dependent germline nucleotide substitution rates for the human genome. The rates are characterised through rate constants in a time-domain, and are made available through a dedicated program (Trek) and a stand-alone database. Due to the nature of the method design and the imposed stringency criteria, we expect our rate constants to be good estimates for the rates of spontaneous mutations. Benefiting from such data, we study the short-range nucleotide (up to 7-mer) organisation and the germline basal substitution propensity (BSP) profile of the human genome; characterise novel, CpG-independent, substitution prone and resistant motifs; confirm a decreased tendency of moieties with low BSP to undergo somatic mutations in a number of cancer types; and, produce a Trek-based estimate of the overall mutation rate in human. The extended set of rate constants we report may enrich our resources and help advance our understanding of genome dynamics and evolution, with possible implications for the role of spontaneous mutations in the emergence of pathological genotypes and neutral evolution of proteomes.
Chen, Shen-Yi; Chou, Li-Chieh
2016-08-01
Heavy metals can be removed from sludge using bioleaching technologies under thermophilic conditions, thereby providing an option for biotreatment of waste sludge generated from wastewater treatment. The purpose of this study was to establish a molecular biology technique, real-time PCR, for the detection and enumeration of sulfur-oxidizing bacteria during thermophilic sludge bioleaching. The 16S rRNA gene targets for real-time PCR quantification were the bioleaching bacteria Sulfobacillus thermosulfidooxidans, Sulfobacillus acidophilus, and Acidithiobacillus caldus. The specificity and stringency of the assays for thermophilic sulfur-oxidizing bacteria were tested before they were applied to monitoring the bacterial community and bacterial numbers during thermophilic sludge bioleaching, and before future application to various environmental samples. The results showed that S. acidophilus was the dominant sulfur-oxidizing bacterium, while A. caldus and S. thermosulfidooxidans occurred in relatively low numbers. The total number of sulfur-oxidizing bacteria increased during the thermophilic bioleaching process. Meanwhile, the decrease of pH, production of sulfate, degradation of SS/VSS, and solubilization of heavy metals were found to correlate well with the population of thermophilic sulfur-oxidizing bacteria during the bioleaching process. The real-time PCR method used in this study is suitable for monitoring the numbers of thermophilic sulfur-oxidizing bacteria during the bioleaching process.
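Quantification with such an assay rests on the standard-curve relation between Ct and log10 copy number; a generic sketch in which the dilution series and Ct values are invented:

```python
import numpy as np

# Ct is linear in log10(copies); a slope near -3.32 corresponds to 100%
# amplification efficiency.
def copies_from_ct(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)

log10_copies = np.array([3, 4, 5, 6, 7])          # serial dilution standards
cts = np.array([33.1, 29.8, 26.5, 23.2, 19.9])    # measured Ct values
slope, intercept = np.polyfit(log10_copies, cts, 1)
unknown = copies_from_ct(25.0, slope, intercept)  # copies in an unknown
```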
Tam, S F
2000-10-15
The aim of this controlled, quasi-experimental study was to evaluate the effects of both self-efficacy enhancement and social comparison training strategies on the computer skills learning and self-concept outcomes of trainees with physical disabilities. The self-efficacy enhancement group comprised 16 trainees, the tutorial training group comprised 15 trainees, and there were 25 subjects in the control group. Both the self-efficacy enhancement group and the tutorial training group received a 15-week computer skills training course covering generic Chinese computer operation, Chinese word processing and Chinese desktop publishing skills. The self-efficacy enhancement group received training with tutorial instructions that incorporated self-efficacy enhancement strategies and experienced self-enhancing social comparisons. The tutorial training group received behavioural learning-based tutorials only, and the control group did not receive any training. The following measurements were employed to evaluate the outcomes: the Self-Concept Questionnaire for the Physically Disabled Hong Kong Chinese (SCQPD), the computer self-efficacy rating scale and the computer performance rating scale. The self-efficacy enhancement group showed significantly better computer skills learning outcomes, total self-concept, and social self-concept than the tutorial training group. The self-efficacy enhancement group did not show significant changes in computer self-efficacy; however, the tutorial training group showed a significant lowering of their computer self-efficacy. The training strategy that incorporated self-efficacy enhancement and positive social comparison experiences maintained the computer self-efficacy of trainees with physical disabilities. This strategy was more effective in improving the learning outcome (p = 0.01) and self-concept (p = 0.05) of the trainees than the conventional tutorial-based training strategy.
Martín-Blanco, E; Kornberg, T B
1993-11-16
Degenerate oligodeoxyribonucleotides were designed for both ends of the DNA-binding domain of members of the nuclear receptor superfamily. PCR-amplified Drosophila melanogaster DNA was purified and cloned (DR plasmids). Genomic lambda DASH clones were identified by high-stringency screening with amplified DR-78 plasmid DNA and isolated. The partial sequence shows a probable open reading frame that would encode a peptide highly homologous to members of the thyroid hormone-retinoic acid-vitamin D receptor subfamily. The fragment corresponds to a single-copy gene and was mapped to position 78D of chromosome 3 by in situ hybridization.
Impacts of Hospital Budget Limits in Rochester, New York
Friedman, Bernard; Wong, Herbert S.
1995-01-01
During 1980-87, eight hospitals in the Rochester, New York area participated in an experimental program to limit total revenue. This article analyzes cost growth for Rochester hospitals, trends in inputs and compensation, and cash flow margins. Real expense per case grew annually by about 3 percent less in Rochester. However, after 1984, Medicare prospective payment had an effect of similar size outside Rochester. Some capital inputs to hospital care were restrained, as were wages and particularly benefits. The program did not generally raise or stabilize hospital revenue margins, while the ratio of cash flow to debt trended down. The financial stringency of this program relative to alternatives may have contributed to its end. PMID:10151889
Tobita, Kenji; Matsumoto, Takuya; Ohashi, Satoru; Bessho, Masahiko; Kaneko, Masako; Ohnishi, Isao
2012-07-01
It has been previously demonstrated that low-intensity pulsed ultrasound stimulation (LIPUS) enhances formation of the medullary canal and cortex in a tibial gap-healing model in rabbits, shortens the time required for remodeling, and enhances mineralization of the callus. In the current study, the mechanical integrity of these models was confirmed by calculating the cross-sectional moment of inertia (CSMI) from quantitative micro-computed tomography scans and comparing it with a four-point bending test. This parameter can be analyzed in any direction, and three directions were selected to define an XYZ coordinate system (X and Y for bending; Z for torsion). The present results demonstrate that LIPUS promoted earlier restoration of bending stiffness at the healing site. In addition, LIPUS was effective not only in the ultrasound-irradiated plane but also in the other two planes. CSMI may provide structural as well as compositional determinants for assessing fracture healing and could be a useful substitute for mechanical testing.
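For readers unfamiliar with CSMI, the hedged sketch below computes the second moment of area of a segmented cross-section about a centroidal bending axis, I = Σ d²·dA over bone pixels; the synthetic annular section and pixel size are assumptions, not the study's micro-CT pipeline.

```python
# Hedged sketch: second moment of area of a binary CT cross-section.
import numpy as np

def csmi(mask, pixel_mm, axis=0):
    """Second moment of area (mm^4) of a binary cross-section.

    mask     -- 2D boolean array, True where bone is present
    pixel_mm -- pixel edge length in mm
    axis     -- 0: bending about the row centroidal axis, 1: about the column axis
    """
    rows, cols = np.nonzero(mask)
    coords = (rows if axis == 0 else cols) * pixel_mm
    d = coords - coords.mean()              # distance from centroidal axis
    dA = pixel_mm ** 2                      # area of one pixel
    return np.sum(d ** 2) * dA              # I = sum(d^2 * dA)

# Synthetic annular section: cortex around a medullary canal.
section = np.zeros((60, 60), dtype=bool)
yy, xx = np.ogrid[:60, :60]
section[(yy - 30) ** 2 + (xx - 30) ** 2 < 25 ** 2] = True
section[(yy - 30) ** 2 + (xx - 30) ** 2 < 15 ** 2] = False
print(f"I_x = {csmi(section, 0.05):.4f} mm^4")
```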
Giant ferrimagnetism and polarization in a mixed metal perovskite metal-organic framework
NASA Astrophysics Data System (ADS)
Rout, Paresh C.; Srinivasan, Varadharajan
2018-01-01
Perovskite metal-organic frameworks (MOFs) have recently emerged as potential candidates for multiferroicity. However, the compounds synthesized so far possess only weak ferromagnetism and low polarization. Additionally, the very low magnetic transition temperatures (Tc) pose a further challenge to applications of these materials. We have computationally designed a mixed metal perovskite MOF, [C(NH2)3][(Cu0.5Mn0.5)(HCOO)3], that is predicted to have a magnetization two orders of magnitude larger than its parent ([C(NH2)3][Cu(HCOO)3]), a significantly larger polarization (9.9 μC/cm2), and an enhanced Tc of up to 56 K, unprecedented in perovskite MOFs. A detailed study of the magnetic interactions revealed a mechanism leading to the large moments as well as the increase in Tc. Mixing a non-Jahn-Teller ion (Mn2+) into a Jahn-Teller host (Cu2+) leads to competing lattice distortions that are directly responsible for the enhanced polarization. The MOF is thermodynamically stable, as evidenced by the computed enthalpy of formation, and can likely be synthesized. Our work represents a first step towards the rational design of multiferroic perovskite MOFs through the largely unexplored mixed metal approach.
Chen, I-Jen; Foloppe, Nicolas
2013-12-15
Computational conformational sampling underpins much of molecular modeling and design in pharmaceutical work. The sampling of smaller drug-like compounds has been an active area of research. However, few studies have tested in detail the sampling of larger, more flexible compounds, which are also relevant to drug discovery, including therapeutic peptides, macrocycles, and inhibitors of protein-protein interactions. Here, we extensively investigate mainstream conformational sampling methods on three carefully curated compound sets, namely the 'Drug-like', larger 'Flexible', and 'Macrocycle' compounds. These test molecules are chemically diverse with reliable X-ray protein-bound bioactive structures. The compared sampling methods include Stochastic Search and the recent LowModeMD from MOE, all the low-mode based approaches from MacroModel, and MD/LLMOD, recently developed for macrocycles. In addition to default settings, key parameters of the sampling protocols were explored. The performance of the computational protocols was assessed via (i) the reproduction of the X-ray bioactive structures, (ii) the size, coverage and diversity of the output conformational ensembles, (iii) the compactness/extendedness of the conformers, and (iv) the ability to locate the global energy minimum. The influence of the stochastic nature of the searches on the results was also examined. Much better results were obtained by adopting search parameters enhanced over the default settings, while maintaining computational tractability. In MOE, the recent LowModeMD emerged as the method of choice. Mixed torsional/low-mode search from MacroModel performed as well as LowModeMD, and MD/LLMOD performed well for macrocycles. The low-mode based approaches yielded very encouraging results with the flexible and macrocycle sets. Thus, one can productively tackle the computational conformational search of larger flexible compounds for drug discovery, including macrocycles.
Quantum-Enhanced Cyber Security: Experimental Computation on Quantum-Encrypted Data
2017-03-02
AFRL-AFOSR-UK-TR-2017-0020. Final report for award FA9550-1-6-1-0004, Universität Wien (Philip Walther); period covered: 15 Oct 2015 to 31 Dec 2016. Quantum-enhanced cyber security: experimental computation on quantum-encrypted data.
Nakano, Sachiko; Tsushima, Yoshito; Taketomi-Takahashi, Ayako; Higuchi, Tetsuya; Amanuma, Makoto; Oriuchi, Noboru; Endo, Keigo
2011-07-01
A 63-year-old man underwent computed tomography (CT) with intravenous low-osmolar iodine contrast medium (LOCM) 6 days after undergoing high-dose (131)I-MIBG therapy for metastatic pheochromocytoma. Immediately after the CT examination, his blood pressure increased to 260/160 mmHg (from 179/101 mmHg before the examination). Phentolamine mesilate was administered, and the blood pressure rapidly returned to normal. Although hypertensive crisis after administration of LOCM is rare, this case suggests that high-dose (131)I-MIBG therapy may be a risk factor for hypertensive crisis after administration of intravenous LOCM.
Jaafar, Haryati; Ibrahim, Salwani; Ramli, Dzati Athiar
2015-01-01
Mobile implementation is a current trend in biometric design. This paper proposes a new approach to palm print recognition in which smartphones are used to capture palm print images at a distance. A touchless system was developed because of public demand for privacy and sanitation. Robust hand tracking, image enhancement, and fast computation processing algorithms are required for effective touchless and mobile-based recognition. In this project, hand tracking and region-of-interest (ROI) extraction are addressed. A sliding neighborhood operation with local histogram equalization followed by local adaptive thresholding (the LHEAT approach) was proposed in the image enhancement stage to manage low-quality palm print images. To accelerate the recognition process, a new classifier, the improved fuzzy-based k nearest centroid neighbor (IFkNCN), was implemented. By removing outliers and reducing the amount of training data, this classifier exhibited faster computation. Our experimental results demonstrate that a touchless palm print system using LHEAT and IFkNCN achieves a promising recognition rate of 98.64%. PMID:26113861
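The sketch below illustrates the LHEAT idea under stated simplifications: a local contrast stretch stands in for local histogram equalization, followed by a mean-offset local adaptive threshold over a sliding window; the window size and offset are illustrative, not the authors' settings.

```python
# Hedged sketch of a LHEAT-style enhancement stage for palm print images.
import numpy as np
from scipy.ndimage import uniform_filter, minimum_filter, maximum_filter

def lheat(img, win=15, offset=0.02):
    """img: 2D float array in [0, 1]; returns a boolean palm-line map."""
    lo = minimum_filter(img, win)
    hi = maximum_filter(img, win)
    eq = (img - lo) / np.maximum(hi - lo, 1e-6)    # local contrast stretch
    thresh = uniform_filter(eq, win) - offset      # local adaptive threshold
    return eq < thresh                             # dark ridges/lines -> True

rng = np.random.default_rng(0)
palm = rng.random((64, 64))                        # stand-in for a palm image
print(lheat(palm).mean())    # fraction of pixels flagged as line pixels
```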
Enhancing Hi-C data resolution with deep convolutional neural network HiCPlus.
Zhang, Yan; An, Lin; Xu, Jie; Zhang, Bo; Zheng, W Jim; Hu, Ming; Tang, Jijun; Yue, Feng
2018-02-21
Although Hi-C technology is one of the most popular tools for studying 3D genome organization, the resolution of most Hi-C datasets is coarse due to sequencing cost, and they cannot be used to link distal regulatory elements to their target genes. Here we develop HiCPlus, a computational approach based on a deep convolutional neural network, to infer high-resolution Hi-C interaction matrices from low-resolution Hi-C data. We demonstrate that HiCPlus can impute interaction matrices highly similar to the original ones while using only 1/16 of the original sequencing reads. We show that the models learned from one cell type can be applied to make predictions in other cell or tissue types. Our work not only provides a computational framework to enhance Hi-C data resolution but also reveals features underlying the formation of 3D chromatin interactions.
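A minimal PyTorch sketch of the general approach, an SRCNN-style convolutional mapping from a low-resolution contact-map patch to an enhanced one; the layer sizes, 40x40 patches, and training step are illustrative assumptions, not the published HiCPlus architecture.

```python
# Hedged sketch: a small CNN that maps a low-resolution Hi-C sub-matrix
# to an enhanced one, trained with a mean-squared-error loss.
import torch
import torch.nn as nn

class HiCEnhancer(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv2d(8, 8, kernel_size=1), nn.ReLU(),
            nn.Conv2d(8, 1, kernel_size=5, padding=2),
        )

    def forward(self, x):        # x: (batch, 1, n, n) down-sampled contact map
        return self.net(x)

model = HiCEnhancer()
low_res = torch.rand(4, 1, 40, 40)   # toy patch-based training batch
target = torch.rand(4, 1, 40, 40)
loss = nn.functional.mse_loss(model(low_res), target)
loss.backward()                      # one illustrative training step
print(loss.item())
```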
Evaluation of pulmonary function using single-breath-hold dual-energy computed tomography with xenon
Kyoyama, Hiroyuki; Hirata, Yusuke; Kikuchi, Satoshi; Sakai, Kosuke; Saito, Yuriko; Mikami, Shintaro; Moriyama, Gaku; Yanagita, Hisami; Watanabe, Wataru; Otani, Katharina; Honda, Norinari; Uematsu, Kazutsugu
2017-01-01
Xenon-enhanced dual-energy computed tomography (xenon-enhanced CT) can provide lung ventilation maps that may be useful for assessing structural and functional abnormalities of the lung. Xenon-enhanced CT has been performed using a multiple-breath-hold technique during xenon washout. We recently developed xenon-enhanced CT using a single-breath-hold technique to assess ventilation. We sought to evaluate whether xenon-enhanced CT using a single-breath-hold technique correlates with pulmonary function testing (PFT) results. Twenty-six patients, including 11 chronic obstructive pulmonary disease (COPD) patients, underwent xenon-enhanced CT and PFT. Three of the COPD patients underwent xenon-enhanced CT before and after bronchodilator treatment. Images from xenon-CT were obtained by dual-source CT during a breath-hold after a single vital-capacity inspiration of a xenon–oxygen gas mixture. Image postprocessing by 3-material decomposition generated conventional CT and xenon-enhanced images. Low-attenuation areas on xenon images matched low-attenuation areas on conventional CT in 21 cases but matched normal-attenuation areas in 5 cases. Volumes of Hounsfield unit (HU) histograms of xenon images correlated moderately and highly with vital capacity (VC) and total lung capacity (TLC), respectively (r = 0.68 and 0.85). Means and modes of histograms weakly correlated with VC (r = 0.39 and 0.38), moderately with forced expiratory volume in 1 second (FEV1) (r = 0.59 and 0.56), weakly with the ratio of FEV1 to FVC (r = 0.46 and 0.42), and moderately with the ratio of FEV1 to its predicted value (r = 0.64 and 0.60). Mode and volume of histograms increased in 2 COPD patients after the improvement of FEV1 with bronchodilators. Inhalation of xenon gas caused no adverse effects. Xenon-enhanced CT using a single-breath-hold technique depicted functional abnormalities not detectable on thin-slice CT. Mode, mean, and volume of HU histograms of xenon images reflected pulmonary function. Xenon images obtained with xenon-enhanced CT using a single-breath-hold technique can qualitatively depict pulmonary ventilation. A larger study comprising only COPD patients should be conducted, as xenon-enhanced CT is expected to be a promising technique for the management of COPD. PMID:28099359
NASA Astrophysics Data System (ADS)
Wang, Shiyang; Lu, Zhengfeng; Fan, Xiaobing; Medved, Milica; Jiang, Xia; Sammet, Steffen; Yousuf, Ambereen; Pineda, Federico; Oto, Aytekin; Karczmar, Gregory S.
2018-02-01
The purpose of this study was to evaluate the accuracy of arterial input functions (AIFs) measured from dynamic contrast-enhanced (DCE) MRI following a low dose of contrast media injection. The AIFs measured from DCE computed tomography (CT) were used as the 'gold standard'. A total of twenty patients received CT and MRI scans on the same day. Patients received 120 ml Iohexol in DCE-CT and a low dose (0.015 mmol kg-1) of gadobenate dimeglumine in DCE-MRI. The AIFs were measured in the iliac artery and normalized to the CT and MRI contrast agent doses. To correct for the different temporal resolutions and sampling periods of CT and MRI, an empirical mathematical model (EMM) was first used to fit the AIFs. Numerical AIFs (AIFCT and AIFMRI) were then calculated based on the fitting parameters. The AIFMRI was convolved with a 'contrast agent injection' function (AIFMRICON) to correct for the difference between MRI and CT contrast agent injection times (~1.5 s versus 30 s). The results show that the EMMs accurately fitted the AIFs measured from CT and MRI. There was no significant difference (p > 0.05) between the maximum peak amplitude of AIFs from CT (22.1 ± 4.1 mM/dose) and MRI after convolution (22.3 ± 5.2 mM/dose). The shapes of the AIFCT and AIFMRICON were very similar. Our results demonstrate that AIFs can be accurately measured by MRI following low-dose contrast agent injection.
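The injection-time correction can be illustrated with a short NumPy sketch: convolving a short-bolus AIF with a unit-area boxcar emulates a longer injection while preserving total dose; the gamma-variate curve below stands in for the fitted EMM, and all numbers are assumptions.

```python
# Hedged sketch: stretch a short-injection AIF to a longer injection by
# convolution with a unit-area boxcar "injection function".
import numpy as np

dt = 0.5                                          # s, sampling interval
t = np.arange(0, 120, dt)
aif_short = (t / 6.0) ** 2 * np.exp(-t / 6.0)     # illustrative bolus AIF

def stretch_injection(aif, dt, inj_short=1.5, inj_long=30.0):
    """Convolve with a unit-area boxcar spanning the extra injection time."""
    extra = inj_long - inj_short
    n = max(int(round(extra / dt)), 1)
    box = np.ones(n) / n                  # unit area preserves total dose
    return np.convolve(aif, box)[: aif.size]

aif_long = stretch_injection(aif_short, dt)
print(aif_short.max(), aif_long.max())   # longer injection -> lower, later peak
```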
Kim, Jae Heon; Sun, Hwa Yeon; Hwang, Jiyoung; Hong, Seong Sook; Cho, Yong Jin; Doo, Seung Whan; Yang, Won Jae; Song, Yun Seob
2016-10-12
The aim of this study was to investigate the diagnostic accuracy of contrast-enhanced computed tomography (CT) and contrast-enhanced magnetic resonance imaging (MRI) of small renal masses in real practice. Contrast-enhanced CT and MRI were performed between February 2008 and February 2013 on 68 patients who had suspected small (≤4 cm) renal cell carcinoma (RCC) based on ultrasonographic measurements. CT and MRI radiographs were reviewed, and the findings of small renal masses were re-categorized into five dichotomized scales by the same two radiologists who had interpreted the original images. Receiver operating characteristic curve analysis was performed, and sensitivity and specificity were determined. Among the 68 patients, 60 (88.2%) had RCC and eight had benign disease. The diagnostic accuracy rates of contrast-enhanced CT and MRI were 79.41% and 88.23%, respectively. Diagnostic accuracy was greater with contrast-enhanced MRI, under which most masses (67.6%) were characterized as "4 (probably solid cancer)" or "5 (definitely solid cancer)". The sensitivities of contrast-enhanced CT and MRI for predicting RCC were 79.7% and 88.1%, respectively. The specificities of contrast-enhanced CT and MRI for predicting RCC were 44.4% and 33.3%, respectively. Fourteen diagnoses (20.5%) were missed or inconsistent compared with the final pathological diagnoses. One appropriate nephroureterectomy and five unnecessary percutaneous biopsies were performed for RCC. Seven unnecessary partial nephrectomies were performed for benign disease. Although contrast-enhanced CT and MRI showed high sensitivity for detecting small renal masses, specificity remained low.
Progress and challenges in bioinformatics approaches for enhancer identification
Kleftogiannis, Dimitrios; Kalnis, Panos
2016-01-01
Enhancers are cis-acting DNA elements that play critical roles in distal regulation of gene expression. Identifying enhancers is an important step for understanding distinct gene expression programs that may reflect normal and pathogenic cellular conditions. Experimental identification of enhancers is constrained by the set of conditions used in the experiment. This requires multiple experiments to identify enhancers, as they can be active under specific cellular conditions but not in different cell types/tissues or cellular states. This has opened prospects for computational prediction methods that can be used for high-throughput identification of putative enhancers to complement experimental approaches. Potential functions and properties of predicted enhancers have been catalogued and summarized in several enhancer-oriented databases. Because the current methods for the computational prediction of enhancers produce significantly different enhancer predictions, it will be beneficial for the research community to have an overview of the strategies and solutions developed in this field. In this review, we focus on the identification and analysis of enhancers by bioinformatics approaches. First, we describe a general framework for computational identification of enhancers, present relevant data types and discuss possible computational solutions. Next, we cover over 30 existing computational enhancer identification methods that were developed since 2000. Our review highlights advantages, limitations and potentials, while suggesting pragmatic guidelines for development of more efficient computational enhancer prediction methods. Finally, we discuss challenges and open problems of this topic, which require further consideration. PMID:26634919
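As a toy instance of the general framework discussed above (sequence features plus a supervised model), the sketch below classifies sequences from k-mer composition with logistic regression; the synthetic data, k = 3, and model choice are illustrative assumptions and do not correspond to any specific method covered in the review.

```python
# Hedged sketch: k-mer features + logistic regression as a minimal
# enhancer-vs-background classifier on synthetic sequences.
import itertools
import numpy as np
from sklearn.linear_model import LogisticRegression

K = 3
KMERS = {"".join(p): i for i, p in enumerate(itertools.product("ACGT", repeat=K))}

def kmer_counts(seq):
    v = np.zeros(len(KMERS))
    for i in range(len(seq) - K + 1):
        v[KMERS[seq[i:i + K]]] += 1
    return v / max(len(seq) - K + 1, 1)   # normalized k-mer frequencies

rng = np.random.default_rng(0)
pos = ["".join(rng.choice(list("ACGT"), 50, p=[0.3, 0.2, 0.3, 0.2])) for _ in range(50)]
neg = ["".join(rng.choice(list("ACGT"), 50)) for _ in range(50)]
X = np.array([kmer_counts(s) for s in pos + neg])
y = np.array([1] * 50 + [0] * 50)

clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.score(X, y))    # training accuracy on the toy data
```

Real methods add chromatin, conservation, and expression features on top of sequence, which is precisely why their predictions diverge.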
Bashir, Mustafa R; Weber, Paul W; Husarik, Daniela B; Howle, Laurens E; Nelson, Rendon C
2012-08-01
To assess whether a scan triggering technique based on the slope of the time-attenuation curve, combined with table speed optimization, can improve arterial enhancement in aortic CT angiography compared with conventional threshold-based triggering techniques. Measurements of arterial enhancement were performed in a physiologic flow phantom over a range of simulated cardiac outputs (2.2-8.1 L/min) using contrast media boluses of 80 and 150 mL injected at 4 mL/s. These measurements were used to construct computer models of aortic attenuation in CT angiography, using cardiac output, aortic diameter, and CT table speed as input parameters. In-plane enhancement was calculated for normal and aneurysmal aortic diameters. Calculated arterial enhancement was poor (<150 HU) along most of the scan length using the threshold-based triggering technique for low cardiac outputs and the aneurysmal aorta model. Implementation of the slope-based triggering technique with table speed optimization improved enhancement in all scenarios and yielded good- (>200 HU; 13/16 scenarios) to excellent-quality (>300 HU; 3/16 scenarios) enhancement in all cases. Slope-based triggering with table speed optimization may improve the technical quality of aortic CT angiography over conventional threshold-based techniques and may reduce technical failures related to low cardiac output and slow flow through an aneurysmal aorta.
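A hedged sketch of the slope-based idea: monitor the time-attenuation curve (TAC) in a monitoring region and fire when its smoothed slope, in HU/s, exceeds a threshold rather than waiting for an absolute HU level; the sigmoid TAC and the threshold value are illustrative assumptions, not the authors' calibration.

```python
# Hedged sketch: slope-based scan triggering from a monitored TAC.
import numpy as np

def trigger_time(t, hu, slope_thresh=15.0, win=3):
    """Return the first time the TAC slope (HU/s) exceeds slope_thresh."""
    slope = np.gradient(hu, t)
    smooth = np.convolve(slope, np.ones(win) / win, mode="same")
    idx = np.argmax(smooth > slope_thresh)
    return t[idx] if smooth[idx] > slope_thresh else None

t = np.arange(0, 30, 0.5)                       # monitoring scans every 0.5 s
hu = 40 + 250 / (1 + np.exp(-(t - 12) / 2))     # illustrative enhancement curve
print(trigger_time(t, hu))
```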
Advanced Computational Methods for High-accuracy Refinement of Protein Low-quality Models
NASA Astrophysics Data System (ADS)
Zang, Tianwu
Predicting the three-dimensional structure of proteins has been a major interest in modern computational biology. While many successful methods can generate models with 3-5 Å root-mean-square deviation (RMSD) from the solution, progress in refining these models has been slow. Effective methods are therefore urgently needed to bring low-quality models into higher-accuracy ranges (e.g., less than 2 Å RMSD). In this thesis, I present several novel computational methods to address the high-accuracy refinement problem. First, an enhanced sampling method, named parallel continuous simulated tempering (PCST), is developed to accelerate molecular dynamics (MD) simulation. Second, two energy biasing methods, the Structure-Based Model (SBM) and the Ensemble-Based Model (EBM), are introduced to perform targeted sampling around important conformations. Third, a three-step method is developed to blindly select high-quality models along the MD simulation. These methods work together to achieve significant refinement of low-quality models without any knowledge of the solution. The effectiveness of these methods is examined in different applications. Using the PCST-SBM method, models with higher global distance test scores (GDT_TS) are generated and selected in the MD simulation of 18 targets from the refinement category of the 10th Critical Assessment of Structure Prediction (CASP10). In addition, the refinement test of two CASP10 targets using the PCST-EBM method indicates that EBM may bring the initial model to even higher-quality levels. Furthermore, a multi-round refinement protocol of PCST-SBM improves the model quality of a protein to a level sufficiently high for molecular replacement in X-ray crystallography. Our results justify the crucial position of enhanced sampling in protein structure prediction and demonstrate that considerable improvement of low-accuracy structures is still achievable with current force fields.
Mousa-Pasandi, Mohammad E; Zhuge, Qunbi; Xu, Xian; Osman, Mohamed M; El-Sahn, Ziad A; Chagnon, Mathieu; Plant, David V
2012-07-02
We experimentally investigate the performance of a low-complexity, non-iterative phase-noise-induced inter-carrier interference (ICI) compensation algorithm in reduced-guard-interval dual-polarization coherent-optical orthogonal-frequency-division-multiplexing (RGI-DP-CO-OFDM) transport systems. This interpolation-based ICI compensator estimates the time-domain phase noise samples by linear interpolation between the CPE estimates of consecutive OFDM symbols. We experimentally study the performance of this scheme for a 28 Gbaud QPSK RGI-DP-CO-OFDM system employing a low-cost distributed feedback (DFB) laser. Experimental results using a DFB laser with a linewidth of 2.6 MHz demonstrate 24% and 13% improvements in transmission reach with respect to the conventional equalizer (CE) in the presence of weak and strong dispersion-enhanced phase noise (DEPN), respectively. A brief analysis of the computational complexity of this scheme, in terms of the number of required complex multiplications, is provided. This practical approach does not suffer from error propagation while enjoying low computational complexity.
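The interpolation-based compensator can be sketched in a few lines of NumPy under toy assumptions (a carrier-only signal with Wiener phase noise and hypothetical symbol sizes): estimate the common phase error (CPE) per OFDM symbol, linearly interpolate between consecutive estimates, and derotate the samples.

```python
# Hedged sketch: interpolation-based phase-noise compensation for OFDM.
import numpy as np

rng = np.random.default_rng(1)
n_sym, n_fft = 8, 64
phase = np.cumsum(rng.normal(0, 0.01, n_sym * n_fft))   # Wiener phase noise
signal = np.exp(1j * phase)                             # carrier-only toy signal

cpe = np.angle(signal.reshape(n_sym, n_fft).mean(axis=1))  # per-symbol CPE
sym_centers = (np.arange(n_sym) + 0.5) * n_fft
t = np.arange(n_sym * n_fft)
phase_est = np.interp(t, sym_centers, np.unwrap(cpe))   # linear interpolation
corrected = signal * np.exp(-1j * phase_est)            # derotated samples

print(np.std(phase - phase_est))   # residual phase error after compensation
```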
NASA Astrophysics Data System (ADS)
Nugraha, Muhamad Gina; Kaniawati, Ida; Rusdiana, Dadi; Kirana, Kartika Hajar
2016-02-01
The purposes of physics learning in high school include mastering physics concepts, cultivating a scientific attitude (including a critical attitude), and developing inductive and deductive reasoning skills. According to Ennis et al., inductive and deductive reasoning skills are part of critical thinking. Preliminary studies indicate that both competences are poorly achieved: student learning outcomes are low, and learning processes are not conducive to cultivating critical thinking (teacher-centered learning). One learning model predicted to increase concept mastery and train critical thinking skills (CTS) is the inquiry learning model aided by computer simulations. In this model, students are given the opportunity to be actively involved in experiments and also receive good explanations through the computer simulations. From research with a randomized control group pretest-posttest design, we found that the inquiry learning model aided by computer simulations significantly improves students' concept mastery compared with the conventional (teacher-centered) method. With the inquiry learning model aided by computer simulations, 20% of students had high CTS, 63.3% medium, and 16.7% low. CTS contributed strongly to students' concept mastery, with a correlation coefficient of 0.697, and contributed moderately to the enhancement of concept mastery, with a correlation coefficient of 0.603.
Dick, W
1994-01-01
Emergency medicine is subjected worldwide to financial stringencies and organizational evaluations of cost-effectiveness. The various links in the chain of survival are affected differently. Bystander assistance or bystander CPR is available in only 30% of emergencies; response intervals, if required by legislation at all, are observed only to a limited degree or are too long for survival in cardiac arrest. A single emergency telephone number is lacking, and too many different phone numbers for emergency reporting result in confusion and delays. Organizational shortcomings are not fully overcome and impair efficiency. The position of the emergency physician in the EMS system is inadequately defined, and the qualifications of too many emergency physicians are unsatisfactory. In spite of this, emergency physicians are frequently forced to answer out-of-hospital emergency calls. Conflicts between emergency physicians and EMTs may be overcome by providing both groups with comparable qualifications as well as by an explicit definition of emergency competence. A further source of conflict occurs at the juncture of prehospital and in-hospital emergency care in the emergency department; deficiencies on either side play a decisive role. At least in principle there are solutions to the deficiencies in the EMS system and in intensive care medicine, among them: adequate financial compensation of emergency personnel, availability of sufficient numbers of highly qualified personnel, availability of a central receiving area with an adjacent emergency ward, a constant flow of information to the dispatch center on the number of available emergency beds, maintaining 5% of all beds as emergency beds, and establishing intermediate care facilities. The efficiency of emergency physician activities can be demonstrated in polytraumatized patients, in patients with ventricular fibrillation or acute myocardial infarction, and in patients with acute myocardial insufficiency and other emergency clinical pictures. Cost-effectiveness is clearly in favor of emergency medicine. Future developments will be characterized by the consequences of new health care legislation and by the effects of financial stringencies on the emergency medical services.
Liu, Yan-chen; Huang, Ai-long; Hu, Yuan; Hu, Jie-li; Lai, Guo-qi; Zhang, Wen-lu
2011-12-01
To establish a detection method for HBV drug-resistant mutations related to lamivudine, adefovir and entecavir through optimization and assessment of a reverse hybridization system. Twenty-six degenerate probes covering 10 drug-resistance hotspots of the 3 drugs were synthesized and immobilized on the same positively charged nylon membrane. PCR products labeled with digoxigenin were hybridized with the corresponding probes. To improve sensitivity and specificity, 4 reaction steps of the reverse hybridization were optimized: the number of digoxigenin labels, the energy intensity of UV cross-linking, and the hybridization and stringency wash conditions. To prove feasibility, the specificity, sensitivity and accuracy of the system were assessed. Sensitive and specific results were obtained by optimizing the following 4 reaction steps: primers labeled with 3 digoxigenin molecules, UV cross-linking at an energy intensity of 1500 x 0.1 mJ/cm², hybridization at 42 degrees C, and stringency washing with 0.5 x SSC and 0.1% SDS solution at 44 degrees C for 30 min. In the assessment of the system, the majority of probes showed high specificity. PCR products at concentrations of 10 ng/μl or above could be detected by this method. The concordance rate between reverse hybridization and direct sequencing was 93.9% in the clinical sample test. Though the specificity of several probes needs further improvement, this is a simple, rapid and sensitive method that can detect HBV resistance mutations related to lamivudine, adefovir and entecavir simultaneously. Because codons 180 and 181, and likewise 202 and 204, are close together, the same probe sequence covers two codon positions, and the hybridization signals can interfere with each other. To avoid such interference, a possible solution is to design probes by arranging and combining the various forms of the two neighboring codons.
Berg, Marianne; Hagland, Hanne R; Søreide, Kjetil
2014-01-01
In colorectal cancer a distinct subgroup of tumours demonstrates the CpG island methylator phenotype (CIMP). However, no consensus on how to score CIMP has been reached, and variation in definition may influence the reported CIMP prevalence in tumours. We therefore sought to compare currently suggested definitions and cut-offs for methylation markers and how they influence CIMP classification in colon cancer. Methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA), with subsequent fragment analysis, was used to investigate methylation of tumour samples. In total, 31 CpG sites, located in 8 different genes (RUNX3, MLH1, NEUROG1, CDKN2A, IGF2, CRABP1, SOCS1 and CACNA1G), were investigated in 64 distinct colon cancers and 2 colon cancer cell lines. The Ogino gene panel includes all 8 genes, in addition to the Weisenberger panel, of which only 5 of the 8 genes included were investigated. In total, 18 alternative combinations of scoring CIMP positivity at the probe, gene, and panel levels were analysed and compared. For 47 samples (71%), CIMP status was constant and independent of the criteria used for scoring; 34 samples were consistently scored as CIMP negative, and 13 (20%) consistently as CIMP positive. Only four of the 31 probes (13%) investigated showed no difference in the number of positive samples using the different cut-offs. Within the panels, a trend was observed that increasing the gene-level stringency resulted in a larger difference in CIMP-positive samples than increasing the probe-level stringency. A significant difference between positive samples using the most stringent compared with the least stringent criteria (20% vs 46%, respectively; p<0.005) was demonstrated. A statistically significant variation in the frequency of CIMP depending on the cut-offs and genes included in a panel was found, with twice as many positive samples under the least stringent compared with the most stringent definition.
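The effect of stringency choices can be made concrete with a hedged sketch that scores CIMP from a probe-level methylation matrix under probe-, gene-, and panel-level cut-offs; the thresholds, random data, and probe-to-gene map below are illustrative, not the study's definitions.

```python
# Hedged sketch: CIMP scoring under different stringency cut-offs.
import numpy as np

def cimp_status(meth, probe_gene, probe_cut, gene_frac, panel_frac):
    """meth: (samples, probes) methylation ratios in [0, 1].

    A probe is methylated if >= probe_cut; a gene if >= gene_frac of its
    probes are methylated; a sample is CIMP+ if >= panel_frac of genes are.
    """
    genes = sorted(set(probe_gene))
    probe_pos = meth >= probe_cut
    gene_pos = np.column_stack([
        probe_pos[:, [i for i, g in enumerate(probe_gene) if g == gene]]
        .mean(axis=1) >= gene_frac
        for gene in genes
    ])
    return gene_pos.mean(axis=1) >= panel_frac

rng = np.random.default_rng(7)
meth = rng.random((10, 6))                              # 10 samples, 6 probes
genes = ["RUNX3", "RUNX3", "MLH1", "MLH1", "SOCS1", "SOCS1"]
lenient = cimp_status(meth, genes, 0.2, 0.5, 0.5)
strict = cimp_status(meth, genes, 0.4, 1.0, 0.75)
print(lenient.sum(), strict.sum())    # stringency changes CIMP+ counts
```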
Silveira, Cynthia B.; Vieira, Ricardo P.; Cardoso, Alexander M.; Paranhos, Rodolfo; Albano, Rodolpho M.; Martins, Orlando B.
2011-01-01
Background Planktonic bacteria are recognized as important drivers of biogeochemical processes in all aquatic ecosystems; however, the taxa that make up these communities are poorly known. The aim of this study was to investigate bacterial communities in aquatic ecosystems at Ilha Grande, Rio de Janeiro, Brazil, a preserved insular environment of the Atlantic rain forest, and how they correlate with a salinity gradient going from terrestrial aquatic habitats to the coastal Atlantic Ocean. Methodology/Principal Findings We analyzed chemical and microbiological parameters of water samples and constructed 16S rRNA gene libraries of free-living bacteria obtained at three marine (two coastal and one offshore) and three freshwater (water spring, river, and mangrove) environments. A total of 836 sequences were analyzed by MOTHUR, yielding 269 freshwater and 219 marine operational taxonomic units (OTUs) grouped at 97% stringency. Richness and diversity indexes indicated that freshwater environments were the most diverse, especially the water spring. The main bacterial group in freshwater environments was Betaproteobacteria (43.5%), whereas Cyanobacteria (30.5%), Alphaproteobacteria (25.5%), and Gammaproteobacteria (26.3%) dominated the marine ones. A Venn diagram showed no overlap between marine and freshwater OTUs at 97% stringency. LIBSHUFF statistics and PCA analysis revealed marked differences between the freshwater and marine libraries, suggesting the importance of salinity as a driver of community composition in this habitat. The phylogenetic analysis of marine and freshwater libraries showed that the differences in community composition are consistent. Conclusions/Significance Our data support the notion that a divergent evolutionary scenario is driving community composition in the studied habitats. This work also improves the comprehension of microbial community dynamics in tropical waters and how they are structured in relation to physicochemical parameters. Furthermore, this paper reveals for the first time the pristine bacterioplankton communities on a tropical island in the South Atlantic Ocean. PMID:21408023
Jibaja-Weiss, Maria L; Volk, Robert J
2007-01-01
Decision aids have been developed by using various delivery methods, including interactive computer programs. Such programs, however, still rely heavily on written information, health and digital literacy, and reading ease. We describe an approach to overcome these potential barriers for low-literate, underserved populations by making design considerations for poor readers and naïve computer users and by using concepts from entertainment education to engage the user and to contextualize the content for the user. The system design goals are to make the program both didactic and entertaining and the navigation and graphical user interface as simple as possible. One entertainment education strategy, the soap opera, is linked seamlessly to interactive learning modules to enhance the content of the soap opera episodes. The edutainment decision aid model (EDAM) guides developers through the design process. Although designing patient decision aids that are educational, entertaining, and targeted toward poor readers and those with limited computer skills is a complex task, it is a promising strategy for aiding this population. Entertainment education may be a highly effective approach to promoting informed decision making for patients with low health literacy.
Marek's disease is a natural model for lymphomas overexpressing Hodgkin's disease antigen (CD30)
Burgess, S. C.; Young, J. R.; Baaten, B. J. G.; Hunt, L.; Ross, L. N. J.; Parcells, M. S.; Kumar, P. M.; Tregaskes, C. A.; Lee, L. F.; Davison, T. F.
2004-01-01
Animal models are essential for elucidating the molecular mechanisms of carcinogenesis. Hodgkin's and many diverse non-Hodgkin's lymphomas overexpress the Hodgkin's disease antigen CD30 (CD30hi), a tumor necrosis factor receptor II family member. Here we show that chicken Marek's disease (MD) lymphoma cells are also CD30hi and are a unique natural model for CD30hi lymphoma. Chicken CD30 resembles an ancestral form, and we identify a previously undescribed potential cytoplasmic signaling domain conserved in chicken, human, and mouse CD30. Our phylogenetic analysis defines a relationship between the structures of human and mouse CD30 and confirms that mouse CD30 represents the ancestral mammalian gene structure. CD30 expression by MD virus (MDV)-transformed lymphocytes correlates with expression of the MDV Meq putative oncogene (a c-Jun homologue) in vivo. The chicken CD30 promoter has 15 predicted high-stringency Meq-binding transcription factor recognition motifs, and Meq enhances transcription from the CD30 promoter in vitro. Plasma proteomics identified a soluble form of CD30. CD30 overexpression is evolutionarily conserved and defines one class of neoplastic transformation events, regardless of etiology. We propose that CD30 is a component of a critical intracellular signaling pathway perturbed in neoplastic transformation. Specific anti-CD30 Igs occurred after infection of genetically MD-resistant chickens with oncogenic MDV, suggesting immunity to CD30 could play a role in MD lymphoma regression. PMID:15356338
Mason, Christopher E.; Shu, Feng-Jue; Wang, Cheng; Session, Ryan M.; Kallen, Roland G.; Sidell, Neil; Yu, Tianwei; Liu, Mei Hui; Cheung, Edwin; Kallen, Caleb B.
2010-01-01
Location analysis for estrogen receptor-α (ERα)-bound cis-regulatory elements was determined in MCF7 cells using chromatin immunoprecipitation (ChIP)-on-chip. Here, we present the estrogen response element (ERE) sequences that were identified at ERα-bound loci and quantify the incidence of ERE sequences under two stringencies of detection: <10% and 10–20% nucleotide deviation from the canonical ERE sequence. We demonstrate that ∼50% of all ERα-bound loci do not have a discernable ERE and show that most ERα-bound EREs are not perfect consensus EREs. Approximately one-third of all ERα-bound ERE sequences reside within repetitive DNA sequences, most commonly of the AluS family. In addition, the 3-bp spacer between the inverted ERE half-sites, rather than being random nucleotides, is C(A/T)G-enriched at bona fide receptor targets. Diverse ERα-bound loci were validated using electrophoretic mobility shift assay and ChIP-polymerase chain reaction (PCR). The functional significance of receptor-bound loci was demonstrated using luciferase reporter assays which proved that repetitive element ERE sequences contribute to enhancer function. ChIP-PCR demonstrated estrogen-dependent recruitment of the coactivator SRC3 to these loci in vivo. Our data demonstrate that ERα binds to widely variant EREs with less sequence specificity than had previously been suspected and that binding at repetitive and nonrepetitive genomic targets is favored by specific trinucleotide spacers. PMID:20047966
Transcriptome signatures of p,p'-DDE-induced liver damage in Mus spretus mice.
Morales-Prieto, Noelia; Ruiz-Laguna, Julia; Sheehan, David; Abril, Nieves
2018-07-01
The use of DDT (1,1,1-trichloro-2,2-bis(p-chlorophenyl)ethane) in some countries, although regulated, is contributing to an increased worldwide risk of exposure to this organochlorine pesticide or its derivative p,p'-DDE [1,1-dichloro-2,2-bis(p-chlorophenyl)ethylene]. Many studies have associated p,p'-DDE exposure with type 2 diabetes, obesity and alterations of the reproductive system, but its molecular mechanisms of toxicity remain poorly understood. We have addressed this issue by using commercial microarrays based on probes for the entire Mus musculus genome to determine the hepatic transcriptional signatures of p,p'-DDE in the phylogenetically close mouse species Mus spretus. High-stringency hybridization conditions and analysis assured reliable results, which were also verified, in part, by qRT-PCR, immunoblotting and/or enzymatic activity. Our data linked 198 deregulated genes to mitochondrial dysfunction and perturbations of central signaling pathways (kinases, lipids, and retinoic acid) leading to enhanced lipogenesis and aerobic glycolysis, inflammation, cell proliferation, and testosterone catabolism and excretion. Alterations in the transcript levels of genes encoding enzymes involved in testosterone catabolism and excretion would explain the relationships established between p,p'-DDE exposure and reproductive disorders, obesity and diabetes. Further studies will help to fully understand the molecular basis of p,p'-DDE toxicity in the liver and reproductive organs, to identify effective exposure biomarkers, and perhaps to design efficient strategies to counteract p,p'-DDE exposure.
NASA Technical Reports Server (NTRS)
Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Myers, Jerry
2015-01-01
Human missions beyond low Earth orbit to destinations such as Mars and asteroids will expose astronauts to novel operational conditions that may pose health risks that are currently not well understood and perhaps unanticipated. In addition, there are limited clinical and research data to inform the development and implementation of health risk countermeasures for these missions. Consequently, NASA's Digital Astronaut Project (DAP) is working to develop and implement computational models and simulations (M&S) to help predict and assess spaceflight health and performance risks, and to enhance countermeasure development. To accomplish these goals effectively, the DAP evaluates its models and simulations via a rigorous verification, validation and credibility assessment process to ensure that the computational tools are sufficiently reliable both to inform research intended to mitigate potential risk and to guide countermeasure development. In doing so, DAP works closely with end-users, such as space life science researchers, to establish appropriate M&S credibility thresholds. We will present and demonstrate the process the DAP uses to vet computational M&S for space biomedical analysis using real M&S examples. We will also provide recommendations on how the larger space biomedical community can employ these concepts to enhance the credibility of their M&S codes.
Singanamalli, Asha; Rusu, Mirabela; Sparks, Rachel E; Shih, Natalie N C; Ziober, Amy; Wang, Li-Ping; Tomaszewski, John; Rosen, Mark; Feldman, Michael; Madabhushi, Anant
2016-01-01
To identify computer-extracted in vivo dynamic contrast-enhanced (DCE) MRI markers associated with quantitative histomorphometric (QH) characteristics of microvessels and Gleason scores (GS) in prostate cancer. This study considered retrospective data from 23 biopsy-confirmed prostate cancer patients who underwent 3 Tesla multiparametric MRI before radical prostatectomy (RP). Representative slices from RP specimens were stained with the vascular marker CD31. Tumor extent was mapped from RP sections onto DCE MRI using nonlinear registration methods. Seventy-seven microvessel QH features and 18 DCE MRI kinetic features were extracted and evaluated for their ability to distinguish low from intermediate and high GS. The effect of temporal sampling on kinetic features was assessed, and correlations between those robust to temporal resolution and microvessel features discriminative of GS were examined. A total of 12 microvessel architectural features were discriminative of low versus intermediate/high grade tumors with an area under the receiver operating characteristic curve (AUC) > 0.7. These features were most highly correlated with mean washout gradient (WG) (max rho = -0.62). Independent analysis revealed WG to be moderately robust to temporal resolution (intraclass correlation coefficient [ICC] = 0.63) and WG variance, which was poorly correlated with microvessel features, to be predictive of low grade tumors (AUC = 0.77). Enhancement ratio was the most robust (ICC = 0.96) and discriminative (AUC = 0.78) kinetic feature but was moderately correlated with microvessel features (max rho = -0.52). Computer-extracted features of prostate DCE MRI appear to be correlated with microvessel architecture and may be discriminative of low versus intermediate and high GS.
Fleischmann, Ulrike; Pietsch, Hubertus; Korporaal, Johannes G; Flohr, Thomas G; Uder, Michael; Jost, Gregor; Lell, Michael M
2018-05-01
Low peak kilovoltage (kVp) protocols in computed tomography angiography (CTA) demand a review of contrast media (CM) administration practices. The aim of this study was to systematically evaluate different iodine concentrations of CM in a porcine model. Dynamic 70 kVp CTA was performed on 7 pigs using a third-generation dual-source CT system. Three CM injection protocols (A-C) with an identical total iodine dose and iodine delivery rate (150 mg I/kg, 12 s, 0.75 g I/s) differed in iodine concentration and flow rate (protocol A: 400 mg I/mL, 1.9 mL/s; B: 300 mg I/mL, 2.5 mL/s; C: 150 mg I/mL, 5 mL/s). All protocols were applied in a randomized order and compared intraindividually. Arterial enhancement at different locations in the pulmonary artery, the aorta, and aortic branches was measured over time. Time-attenuation curves, peak enhancement, time to peak, and the bolus tracking delay times needed for static CTA were calculated. The reproducibility of the optimal parameters was tested in single-phase CTA. The heart rates of the pigs were comparable for all protocols (P > 0.7). The injection pressure was significantly higher for protocol A (64 ± 5 psi) and protocol C (55 ± 3 psi) compared with protocol B (39 ± 2 psi) (P < 0.001). Average arterial peak enhancement in the dynamic scans was 359 ± 51 HU (protocol A), 382 ± 36 HU (B), and 382 ± 60 HU (C) (A compared with B and C: P < 0.01; B compared with C: P = 0.995). Time to peak enhancement decreased with increasing injection rate. The delay time for bolus tracking also depended on the injection rate and was highest for protocol A (4.7 seconds) and lowest for protocol C (3.9 seconds) (P = 0.038). The peak enhancement values of the dynamic scans correlated highly with those of the single-phase CTA scans. In 70 kVp CTA, a 300 mg I/mL iodine concentration proved superior to high-concentration CM when the iodine delivery rate was kept constant. Moreover, iodine concentrations as low as 150 mg I/mL can be administered without compromising vascular enhancement. This opens up new possibilities in CM administration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jacob, Reed B.; Ding, Feng; Ye, Dongmei
Organophosphates are widely used for peaceful (agriculture) and military purposes (chemical warfare agents). The extraordinary toxicity of organophosphates and the risk of deployment make it critical to develop means for their rapid and efficient deactivation. Organophosphate hydrolase (OPH) already plays an important role in organophosphate remediation but is insufficient for therapeutic or prophylactic purposes, primarily due to low substrate affinity. Current efforts focus on directly modifying the active site to differentiate substrate specificity and increase catalytic activity. Here, we present a novel strategy for enhancing the general catalytic efficiency of OPH through computational redesign of the residues that are allosterically coupled to the active site, and validate our design by mutagenesis. Specifically, we identify five such hot-spot residues for allosteric regulation and assay these mutants for hydrolysis activity against paraoxon, a chemical-weapons simulant. A high percentage of the predicted mutants exhibit enhanced activity over wild-type (kcat = 16.63 s-1), such as T199I/T54I (899.5 s-1) and C227V/T199I/T54I (848 s-1), while the Km remains relatively unchanged in our high-throughput cell-free expression system. Further computational studies of protein dynamics reveal four distinct distal regions coupled to the active site that display significant changes in conformational dynamics upon these identified mutations. These results validate a computational design method that is both efficient and easily adapted as a general procedure for enzymatic enhancement.
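Because Km is reported as roughly unchanged, the kcat gains translate almost directly into rate gains at a fixed substrate concentration, as the short Michaelis-Menten sketch below illustrates; the Km value and substrate concentration are assumed for illustration only.

```python
# Hedged sketch: Michaelis-Menten turnover comparison across OPH variants.
def mm_rate(kcat, km, s):
    """Michaelis-Menten rate per enzyme (s^-1) at substrate concentration s."""
    return kcat * s / (km + s)

KM = 0.5  # mM, assumed similar across variants (not a reported value)
for name, kcat in [("wild-type", 16.63), ("T199I/T54I", 899.5),
                   ("C227V/T199I/T54I", 848.0)]:
    print(name, round(mm_rate(kcat, KM, s=1.0), 1))
```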
NASA Astrophysics Data System (ADS)
Nakagawa, Tomohiko; Gonda, Kohsuke; Kamei, Takashi; Cong, Liman; Hamada, Yoh; Kitamura, Narufumi; Tada, Hiroshi; Ishida, Takanori; Aimiya, Takuji; Furusawa, Naoko; Nakano, Yasushi; Ohuchi, Noriaki
2016-01-01
Contrast agents are often used to enhance the contrast of X-ray computed tomography (CT) imaging of tumors to improve diagnostic accuracy. However, because the iodine-based contrast agents currently used in hospitals are of low molecular weight, the agent is rapidly excreted through the kidney or moves to extravascular tissues through the capillary vessels, depending on its concentration gradient. This leads to nonspecific contrast enhancement of tissues. Here, we created gold (Au) nanoparticles as a new contrast agent to specifically image tumors with CT using the enhanced permeability and retention (EPR) effect. Au has a higher X-ray absorption coefficient than iodine. The Au nanoparticles were coated with polyethylene glycol (PEG) chains to increase blood retention and were conjugated with a cancer-specific antibody via terminal PEG chains. The developed Au nanoparticles were injected into tumor-bearing mice, and the distribution of Au was examined with CT imaging, transmission electron microscopy, and elemental analysis using inductively coupled plasma optical emission spectrometry. The results show that specific localization of the developed Au nanoparticles in the tumor is affected by a slight difference in particle size and enhanced by conjugation of a tumor-specific antibody.
Common Graphics Library (CGL). Volume 2: Low-level user's guide
NASA Technical Reports Server (NTRS)
Taylor, Nancy L.; Hammond, Dana P.; Theophilos, Pauline M.
1989-01-01
This volume instructs users of the Low-Level routines of the Common Graphics Library (CGL). The Low-Level routines form an application-independent graphics package enabling the user community to construct and design scientific charts conforming to the publication and/or viewgraph process. The Low-Level routines allow the user to design unique or unusual report-quality charts from a set of graphics utilities. The features of these routines can be used stand-alone or in conjunction with other packages to enhance or augment their capabilities. The library is written in ANSI FORTRAN 77 and currently uses a CORE-based underlying graphics package; it is therefore machine-independent and supports centralized and/or distributed computer systems.
Low-complexity camera digital signal imaging for video document projection system
NASA Astrophysics Data System (ADS)
Hsia, Shih-Chang; Tsai, Po-Shien
2011-04-01
We present high-performance and low-complexity algorithms for real-time camera imaging applications. The main functions of the proposed camera digital signal processing (DSP) involve color interpolation, white balance, adaptive binary processing, auto gain control, and edge and color enhancement for video projection systems. A series of simulations demonstrates that the proposed method can achieve good image quality while keeping computation cost and memory requirements low. On the basis of the proposed algorithms, a cost-effective hardware core was developed using Verilog HDL. The prototype chip has been verified with one low-cost programmable device. Combined with a few external components, the real-time camera system achieves 1270 × 792 resolution and demonstrates each DSP function.
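The abstract names the DSP stages but not their algorithms. As a hedged illustration of the white-balance stage alone, here is the textbook gray-world baseline in Python; this is an assumed stand-in for exposition, not the paper's circuit-level method:

```python
import numpy as np

def gray_world_white_balance(rgb):
    """Scale each channel so its mean matches the global mean (gray-world assumption)."""
    # rgb: H x W x 3 float array with values in [0, 1]
    channel_means = rgb.reshape(-1, 3).mean(axis=0)
    gains = channel_means.mean() / channel_means
    return np.clip(rgb * gains, 0.0, 1.0)

balanced = gray_world_white_balance(np.random.rand(480, 640, 3))
```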
Atomistic study of mixing at high Z / low Z interfaces at Warm Dense Matter Conditions
NASA Astrophysics Data System (ADS)
Haxhimali, Tomorr; Glosli, James; Rudd, Robert; Lawrence Livermore National Laboratory Team
2016-10-01
We use atomistic simulations to study different aspects of mixing occurring at an initially sharp interface of high Z and low Z plasmas in the Warm/Hot Dense Matter regime. We consider a system of diamond (the low Z component) in contact with Ag (the high Z component), which undergoes rapid isochoric heating from room temperature up to 10 eV, rapidly changing the solids into warm dense matter at solid density. We simulate the motion of ions via a screened Coulomb potential. The electric field, the electron density, and the ionization levels are computed on the fly by solving the Poisson equation. The spatially varying screening lengths computed from the electron cloud are included in this effective interaction; the electrons are not simulated explicitly. We compute the electric field generated at the Ag-C interface as well as the dynamics of the ions during the mixing process occurring at the plasma interface. Preliminary results indicate an anomalous transport of high Z ions (Ag) into the low Z component (C), a phenomenon that is partially related to the enhanced transport of ions due to the generated electric field. These results are in agreement with recent experimental observations on the Au-diamond plasma interface. This work was performed under the auspices of the US Dept. of Energy by Lawrence Livermore National Security, LLC under Contract DE-AC52-07NA27344.
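The ion-ion interaction described above is a screened Coulomb (Yukawa) form. A minimal sketch of that pair potential with a fixed rather than spatially varying screening length (the production simulation solves Poisson's equation on the fly; the ionization states in the example are made up):

```python
import numpy as np

def screened_coulomb(r_nm, z1, z2, screening_length_nm):
    """Yukawa pair potential in eV; e^2/(4*pi*eps0) = 1.44 eV*nm."""
    e2 = 1.44
    return z1 * z2 * e2 / r_nm * np.exp(-r_nm / screening_length_nm)

# e.g. a hypothetical Ag ion (Z* = 3) near a C ion (Z* = 2),
# 0.2 nm apart with a 0.1 nm screening length
print(screened_coulomb(0.2, 3, 2, 0.1))
```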
Jones, Jeryl C; Appt, Susan E; Werre, Stephen R; Tan, Joshua C; Kaplan, Jay R
2010-06-01
The purpose of this study was to validate low radiation dose, contrast-enhanced, multi-detector computed tomography (MDCT) as a non-invasive method for measuring ovarian volume in macaques. Computed tomography scans of four known-volume phantoms and nine mature female cynomolgus macaques were acquired using a previously described, low radiation dose scanning protocol, intravenous contrast enhancement, and a 32-slice MDCT scanner. Immediately following MDCT, ovaries were surgically removed and the ovarian weights were measured. The ovarian volumes were determined using water displacement. A veterinary radiologist who was unaware of actual volumes measured ovarian CT volumes three times, using a laptop computer, pen display tablet, hand-traced regions of interest, and free image analysis software. A statistician selected and performed all tests comparing the actual and CT data. Ovaries were successfully located in all MDCT scans. The iliac arteries and veins, uterus, fallopian tubes, cervix, ureters, urinary bladder, rectum, and colon were also consistently visualized. Large antral follicles were detected in six ovaries. Phantom mean CT volume was 0.702 ± 0.504 cc (SD) and mean actual volume was 0.743 ± 0.526 cc (SD). Ovary mean CT volume was 0.258 ± 0.159 cc (SD) and mean water displacement volume was 0.257 ± 0.145 cc (SD). For phantoms, the mean coefficient of variation for CT volumes was 2.5%. For ovaries, the least squares mean coefficient of variation for CT volumes was 5.4%. The ovarian CT volume was significantly associated with the actual ovarian volume (ICC coefficient 0.79, regression coefficient 0.5, P=0.0006) and the actual ovarian weight (ICC coefficient 0.62, regression coefficient 0.6, P=0.015). There was no association between CT volume accuracy and mean ovarian CT density (degree of intravenous contrast enhancement), and there was no proportional or fixed bias in the CT volume measurements. Findings from this study indicate that MDCT is a valid non-invasive technique for measuring ovarian volume in macaques.
Negative post sunset height rise of F layer: Causes and implications
NASA Astrophysics Data System (ADS)
Joshi, Lalit Mohan; Patra, Amit
Post sunset height rise (PSHR) of the F layer is a manifestation of the pre-reversal enhancement (PRE) of the zonal electric field in the equatorial and low latitude ionosphere. Ionosonde observations, made during the equinox period from Sriharikota (13.7°N, 80.1°E; 6.7°N magnetic latitude), a low latitude station in India, have been utilized to study the PSHR of the F layer. Normally, the height of the F layer increases during the early post sunset period (positive PSHR), and the magnitude of this rise has a direct bearing on equatorial spread F (ESF). However, observations revealed that on a few nights (about 3% of nights) the height of the F layer descended in the early post sunset period itself, indicating the absence of the PRE of the zonal field. Such events have been termed negative PSHR events, and they never preceded ESF. Detailed investigations revealed that the negative PSHR events were accompanied by an enhancement of low latitude sporadic E (Es) activity, with increases in the Es blanketing (fbEs) and top (ftEs) frequencies during the post sunset period. Numerical simulations have been carried out to evaluate the effectiveness of the westward Pedersen and Hall conductivity gradients that exist in the low latitude E region during the evening hours in causing the PRE of the zonal field and the PSHR of the F layer. Model simulation reveals that the dominant cause of the PRE of the zonal field is the divergence of the Hall current in the low latitude E region. When the zonal conductivity gradient of the low latitude E region was assumed to be either zero or slightly eastward, owing to the intensification of Es, the model computation resulted in a negative PSHR of the F layer. Thus, the observational and computational results highlight the important role of low latitude Es in the PRE of the zonal electric field.
HNET - A National Computerized Health Network
Casey, Mark; Hamilton, Richard
1988-01-01
The HNET system demonstrated conceptually and technically a national text (and limited bit-mapped graphics) computer network for use between innovative members of the health care industry. The HNET configuration of a leased high-speed national packet switching network connecting any number of mainframe, mini, and micro computers was unique in its relatively low capital costs and freedom from obsolescence. With multiple simultaneous conferences, databases, bulletin boards, calendars, and advanced electronic mail and surveys, it is marketable to innovative hospitals, clinics, physicians, health care associations and societies, nurses, multisite research projects, libraries, etc. Electronic publishing and education capabilities, along with integrated voice and video transmission, are identified as future enhancements.
A new method for enhancer prediction based on deep belief network.
Bu, Hongda; Gan, Yanglan; Wang, Yang; Zhou, Shuigeng; Guan, Jihong
2017-10-16
Studies have shown that enhancers are significant regulatory elements that play crucial roles in gene expression regulation. Since enhancers are independent of the orientation of, and distance to, their target genes, accurately predicting distal enhancers remains a challenging task. In past years, with the development of high-throughput ChIP-seq technologies, several computational techniques have emerged to predict enhancers using epigenetic or genomic features. Nevertheless, the inconsistency of computational models across different cell lines and the unsatisfactory prediction performance call for further research in this area. Here, we propose a new Deep Belief Network (DBN) based computational method for enhancer prediction, called EnhancerDBN. This method combines diverse features, composed of DNA sequence compositional features, DNA methylation, and histone modifications. Our computational results indicate that 1) EnhancerDBN outperforms 13 existing methods in prediction, and 2) GC content and DNA methylation can serve as relevant features for enhancer prediction. Deep learning is effective in boosting the performance of enhancer prediction.
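Since GC content is singled out as a relevant feature, here is a minimal sketch of how such a sequence-composition feature could be computed over candidate regions. This is illustrative only; EnhancerDBN's full feature set also includes methylation and histone-mark signals:

```python
def gc_content(seq):
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def gc_profile(seq, window=200, step=50):
    """Windowed GC-content feature vector over a candidate region."""
    return [gc_content(seq[i:i + window])
            for i in range(0, len(seq) - window + 1, step)]

print(gc_profile("ACGT" * 200))  # uniform toy sequence -> all windows at 0.5
```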
Challenges facing general internal medicine in the 99th Congress.
Prout, D M
1986-01-01
Since 1976, federal support for training in general internal medicine has been provided through the primary care residency programs under Title VII of the Public Health Service Act. Continuation of these programs is now in jeopardy because of severe fiscal pressures and the response of Congress to the resulting budgetary stringency. General internal medicine faces immediate problems in the budgetary, authorization, and appropriations arenas. However, Congressional proposals for changing the method by which Medicare pays for all graduate medical education may provide an important opportunity. Under a revised method of financing graduate medical education, general internal medicine could eliminate its historical dependence on increasingly unstable federal grant funds and could contribute to the development of new federal incentives for training in the primary care specialties.
Intellectual Disability, Mild Cognitive Impairment, and Risk for Dementia
Silverman, Wayne P.; Zigman, Warren B.; Krinsky-McHale, Sharon J.; Ryan, Robert; Schupf, Nicole
2013-01-01
People with intellectual disability (ID) are living longer than ever before, raising concerns about old-age associated disorders. Dementia is among the most serious of these disorders, and theories relating cognitive reserve to risk predict that older adults with ID should be particularly vulnerable. Previous estimates of relative risk for dementia associated with ID have been inconsistent, and the present analyses examined the possible influence of variation in diagnostic criteria on findings. As expected, relaxation in the stringency of case definition for adults with ID increased relative risk, underscoring the importance of developing valid criteria for defining mild cognitive impairment, early dementia, and distinguishing between the two in adults with ID. Once available, these standards will contribute to more effective evidence-based planning. PMID:24273589
Tavares, Anthony J; Noor, M Omair; Uddayasankar, Uvaraj; Krull, Ulrich J; Vannoy, Charles H
2014-01-01
Semiconductor quantum dots (QDs) have long served as integral components in signal transduction modalities such as Förster resonance energy transfer (FRET). The majority of bioanalytical methods using QDs for FRET-based techniques simply monitor binding-induced conformational changes. In more recent work, QDs have been incorporated into solid-phase support systems, such as microfluidic chips, to serve as physical platforms in the development of functional biosensors and bioprobes. Herein, we describe a simple strategy for the transduction of nucleic acid hybridization that combines a novel design method based on FRET with an electrokinetically controlled microfluidic technology, and that offers further potential for amelioration of sample-handling issues and for simplification of dynamic stringency control.
Effect of the stringency of conditions on caloric test results in healthy subjects.
Krstulovic, Claudio; Tulsidas Mahtani, Bharti; Atrache Al Attrache, Nabil; Pérez-Garrigues, Herminio
The caloric test is widely used to assess vestibular function, but the conditions under which it is performed can vary. Caloric nystagmus responses obtained in 57 healthy subjects were compared: 24 subjects were studied under ideal conditions and 33 subjects under non-ideal conditions. A statistically significant decrease in the slow phase velocity of the 4 irrigations was observed in the subjects tested under non-ideal conditions. This must be considered, especially in subjects with suspected bilateral involvement. Stringent conditions reduce the risk of misdiagnosis of a bilateral deficit. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Otorrinolaringología y Cirugía de Cabeza y Cuello. All rights reserved.
Bu, Yifan; Cui, Yinglu; Peng, Ying; Hu, Meirong; Tian, Yu'e; Tao, Yong; Wu, Bian
2018-04-01
Xylanases, which cleave the β-1,4-glycosidic bond between xylose residues to release xylooligosaccharides (XOS), are widely used as food additives, animal feeds, and pulp bleaching agents. However, the thermally unstable nature of xylanases hampers their industrial application. In this study, we used in silico design of a glycoside hydrolase family (GH) 11 xylanase to stabilize the enzyme. A combination of the best mutations increased the apparent melting temperature by 14 °C and significantly enhanced thermostability and thermoactivation. The variant also showed an upward-shifted optimal temperature for catalysis without compromising its activity at low temperatures. Moreover, a 10-fold higher XOS production yield was obtained at 70 °C, compensating for the low yield obtained with the wild-type enzyme. Collectively, the variant constructed by the computational strategy can be used as an efficient biocatalyst for XOS production under industrially viable conditions.
Baumueller, Stephan; Hilty, Regina; Nguyen, Thi Dan Linh; Weder, Walter; Alkadhi, Hatem; Frauenfelder, Thomas
2016-01-01
The purpose of this study was to evaluate the influence of sinogram-affirmed iterative reconstruction (SAFIRE) on the quantification of lung volume and pulmonary emphysema in low-dose chest computed tomography compared with filtered back projection (FBP). Enhanced or nonenhanced low-dose chest computed tomography was performed in 20 patients with chronic obstructive pulmonary disease (group A) and in 20 patients without lung disease (group B). Data sets were reconstructed with FBP and SAFIRE strength levels 3 to 5. Two readers semiautomatically evaluated lung volumes and automatically quantified pulmonary emphysema, and another assessed image quality. Radiation dose parameters were recorded. Lung volume between FBP and SAFIRE 3 to 5 was not significantly different in either group (all P > 0.05). When compared with that of FBP, total emphysema volume was significantly lower among reconstructions with SAFIRE 4 and 5 (mean difference, 0.56 and 0.79 L; all P < 0.001). No examinations had nondiagnostic image quality. Sinogram-affirmed iterative reconstruction does not alter lung volume measurements, although quantification of lung emphysema is affected at higher strength levels.
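The abstract does not state the emphysema criterion; the conventional densitometric definition counts voxels below −950 HU. A hedged sketch of how emphysema volume could be quantified from a segmented lung, under that assumption:

```python
import numpy as np

def emphysema_quantification(lung_hu, voxel_volume_mm3, threshold_hu=-950):
    """Return lung volume (L), emphysema volume (L), and emphysema index."""
    # lung_hu: array of CT numbers restricted to the segmented lung
    lung_volume_l = lung_hu.size * voxel_volume_mm3 / 1e6
    mask = lung_hu < threshold_hu              # -950 HU is the common convention
    emphysema_volume_l = mask.sum() * voxel_volume_mm3 / 1e6
    return lung_volume_l, emphysema_volume_l, mask.mean()
```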
Coherent Backscattering in the Cross-Polarized Channel
NASA Technical Reports Server (NTRS)
Mischenko, Michael I.; Mackowski, Daniel W.
2011-01-01
We analyze the asymptotic behavior of the cross-polarized enhancement factor in the framework of the standard low-packing-density theory of coherent backscattering by discrete random media composed of spherically symmetric particles. It is shown that if the particles are strongly absorbing or if the smallest optical dimension of the particulate medium (i.e., the optical thickness of a plane-parallel slab or the optical diameter of a spherically symmetric volume) approaches zero, then the cross-polarized enhancement factor tends to its upper-limit value 2. This theoretical prediction is illustrated using direct computer solutions of the Maxwell equations for spherical volumes of discrete random medium.
ERIC Educational Resources Information Center
Tsai, Chia-Wen; Lee, Tsang-Hsiung
2012-01-01
Vocational education in Taiwan is highly competitive in that it must attract sufficient student enrollment in the environment with a rapidly increasing number of schools. Many students in this context tend to have lower levels of academic achievement, and do not adequately get involved in their schoolwork. Under such constraints but moving toward…
Ncube, S; Coleman, C; Strydom, A; Flahaut, E; de Sousa, A; Bhattacharyya, S
2018-05-23
We report on the enhancement of the magnetic properties of multiwalled carbon nanotubes (MWNTs) functionalized with a gadolinium-based supramolecular complex. By employing a newly developed synthesis technique, we find that the functionalization method of the nanocomposite enhances the strength of the magnetic interaction, leading to a large effective moment of 15.79 µB and non-superparamagnetic behaviour, unlike what has been previously reported. The saturating resistance at low temperatures is fitted with the numerical renormalization group formula, verifying the Kondo effect for magnetic impurities on a metallic electron system. Magnetoresistance measurements show that devices fabricated from aligned gadolinium-functionalized MWNTs (Gd-Fctn-MWNTs) exhibit spin-valve switching behaviour of up to 8%. This study highlights the possibility of enhancing magnetic interactions in carbon systems through chemical modification; moreover, we demonstrate rich physics that might be useful for developing spin-based quantum computing elements based on one-dimensional (1D) channels.
NASA Astrophysics Data System (ADS)
Stadler, Philipp; Farnleitner, Andreas H.; Zessner, Matthias
2016-04-01
This presentation describes in depth how a low-cost micro-computer was used to substantially improve established measuring systems through the construction and implementation of a purpose-built complementary device for on-site sample pretreatment. A fully automated on-site device was developed and field-tested that enables water sampling with simultaneous filtration as well as an effective cleaning procedure for the device's components. The described auto-sampler is controlled by a low-cost one-board computer and designed for sample pretreatment, with minimal sample alteration, to meet the requirements of on-site measurement devices that cannot handle coarse suspended solids within the measurement procedure or cycle. The automated sample pretreatment was tested for over one year for rapid, on-site enzymatic activity (beta-D-glucuronidase, GLUC) determination in sediment-laden stream water. The formerly used proprietary sampling set-up was assumed to lead to significant damping of the measurement signal due to its susceptibility to clogging and to debris and biofilm accumulation. Results show that the installation of the developed apparatus considerably enhanced the error-free running time of connected measurement devices and increased the measurement accuracy to a previously unmatched quality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, H.T.; Scriven, L.E.
1991-07-01
A major program of university research into basic mechanisms of enhancing petroleum recovery and into the underlying physics, chemistry, geology, applied mathematics, computation, and engineering science, longer-ranged and more fundamental in approach than industrial research, has been built at Minnesota. The original focus was surfactant-based chemical flooding, but the approach taken was sufficiently fundamental that the research has become quite multidirectional. Topics discussed are volume-controlled porosimetry; fluid distribution and transport in porous media at low wetting phase saturation; molecular dynamics of fluids in ultranarrow pores; molecular dynamics and molecular theory of wetting and adsorption; new numerical methods to handle initial and boundary conditions in immiscible displacement; electron microscopy of surfactant fluid microstructure; a low-cost system for animating liquid crystallites viewed with polarized light; and surfaces of constant mean curvature with prescribed contact angle.
NASCAP modelling computations on large optics spacecraft in geosynchronous substorm environments
NASA Technical Reports Server (NTRS)
Stevens, N. J.; Purvis, C. K.
1980-01-01
Satellites in geosynchronous orbits have been found to be charged to significant negative voltages during encounters with geomagnetic substorms. When satellite surfaces are charged, there is a probability of enhanced contamination from charged particles attracted back to the satellite by electrostatic forces. This could be particularly disturbing for large satellites using sensitive optical systems. In this study the NASA Charging Analyzer Program (NASCAP) is used to evaluate qualitatively the possibility of such enhanced contamination on a conceptual version of a large satellite. The evaluation is made by computing surface voltages on the satellite due to encounters with substorm environments and then computing charged-particle trajectories in the electric fields around the satellite. Particular attention is paid to the possibility of contaminants reaching a mirror surface inside a dielectric tube because this mirror represents a shielded optical surface in the satellite model used. Deposition of low energy charged particles from other parts of the spacecraft onto the mirror was found to be possible in the assumed moderate substorm environment condition. In the assumed severe substorm environment condition, however, voltage buildup on the inside and edges of the dielectric tube in which the mirror is located prevents contaminants from reaching the mirror surface.
Chen, Ying; Fu, Yan-Biao; Xu, Xiu-Fang; Pan, Yao; Lu, Chen-Ying; Zhu, Xiu-Liang; Li, Qing-Hai; Yu, Ri-Sheng
2018-01-01
The lymphadenitis associated with cat-scratch disease (CSD) is often confused with neoplasms by radiologists and clinicians, and consequently, unnecessary invasive procedures or surgeries are performed. In the present study, the contrast-enhanced computed tomography (CT) and magnetic resonance imaging (MRI) findings of 10 patients (6 men and 4 women) with clinically and pathologically confirmed lymphadenitis associated with CSD were retrospectively analyzed (CT in 3 patients, MRI in 6 patients, and CT and MRI in 1 patient) at The Second Affiliated Hospital of Zhejiang University School of Medicine (Hangzhou, China) between January 2007 and November 2014. As a result, 17 enlarged lymph nodes were identified in the 10 cases. The 5 nodes identified by CT scan exhibited relatively inhomogeneous isodensity to muscle, with patchy low density in the center. All 14 nodes identified by MRI scan exhibited homogeneous or heterogeneous isointensity to muscle, or slightly increased intensity compared with that of muscle, on T1-weighted images (T1WI), and homogeneous or heterogeneous hyperintensity on fat-suppressed T2WI. Following enhancement, the 17 enlarged lymph nodes associated with CSD demonstrated 3 different enhancement patterns: moderate homogeneous enhancement (n=8), associated with histologically identified early disease stage; marked heterogeneous enhancement with no enhancement of the necrotic areas (n=4), or heterogeneous enhancement with progressive 'spoke-wheel-like' enhancement (radiating from the center) of the patchy low-density area (n=1), both associated with histologically identified intermediate disease stage; and astral low-density/hypointensity with marked enhancement (n=2) or a 'rose flower' sign (n=2), associated with histologically identified late disease stage. We hypothesized that the CT and MRI appearances of lymphadenitis in CSD may be associated with the pathological features. In solitary or multiple enlarged lymph nodes accompanied by subcutaneous edema in the vicinity of the nodes and a history of cat exposure, a diagnosis of CSD may be suggested by the characteristic features of astral low-density/hypointensity with marked enhancement or a 'rose flower' sign (defined as marginal petaloid enhancement) in the late disease stage, homogeneous moderate enhancement in the early disease stage, or heterogeneous enhancement with a central non-enhancing area in the intermediate disease stage.
Computed tomography imaging features of hepatic perivascular epithelioid cell tumor
Han, Xu; Sun, Mei-Yu; Liu, Jing-Hong; Zhang, Xiao-Yan; Wang, Meng-Yao; Fan, Rui; Qamar, Sahrish
2017-01-01
Rationale: Perivascular epithelioid cell tumor (PEComa) is a rare tumor which is most frequently found in the uterus. A tumor arising from the liver is extremely uncommon. Patient concerns: A 36-year-old female presented with abdominal distention, cramps, and low-grade fever for over 15 days. The patient had a history of gastric adenocarcinoma with ovarian, celiac lymph node, and retroperitoneal lymph node metastases. Diagnoses: Computed tomography (CT) imaging demonstrated an ill-defined heterogeneous hypo-dense mass in segment 8 (S8) of the liver. Contrast-enhanced CT imaging showed marked enhancement in the arterial phase and mild-to-moderate enhancement in the portal and equilibrium phases. A tumor-feeding artery arising from the right hepatic artery was demonstrated on the three-dimensional reconstruction images. Biopsy was performed, and a diagnosis of PEComa was rendered. Interventions: No intervention was performed for this tumor before liver biopsy. Lessons: We present a rare case of hepatic PEComa. The information we provide is useful for summarizing the CT features of this kind of tumor, which should be included in differential diagnoses of common hypervascular neoplasms of the liver. The final diagnosis is established by histopathological and immunohistochemical studies, which are the “gold standard.” PMID:29245304
Heat-driven liquid metal cooling device for the thermal management of a computer chip
NASA Astrophysics Data System (ADS)
Ma, Kun-Quan; Liu, Jing
2007-08-01
The tremendous heat generated in a computer chip or very large scale integrated circuit raises many challenging issues to be solved. Recently, liquid metal with a low melting point was established as the most conductive coolant for efficiently cooling a computer chip. Here, by making full use of the double merits of the liquid metal, i.e. superior heat transfer performance and electromagnetically drivable ability, we demonstrate for the first time a liquid-cooling concept for the thermal management of a computer chip that uses waste heat to power a thermoelectric generator (TEG) and thus drive the flow of the liquid metal. Such a device consumes no external net energy, which makes it a self-supporting and completely silent liquid-cooling module. Experiments on devices driven by one- or two-stage TEGs indicate that a dramatic temperature drop on the simulated chip has been realized without the aid of any fans. The higher the heat load, the larger the temperature decrease caused by the cooling device. Further, the two TEGs generate a larger current if a copper plate is sandwiched between them to enhance heat dissipation there. This new method is expected to be significant in the future thermal management of desktop and notebook computers, where both efficient cooling and extremely low energy consumption are major concerns.
PlantTribes: a gene and gene family resource for comparative genomics in plants
Wall, P. Kerr; Leebens-Mack, Jim; Müller, Kai F.; Field, Dawn; Altman, Naomi S.; dePamphilis, Claude W.
2008-01-01
The PlantTribes database (http://fgp.huck.psu.edu/tribe.html) is a plant gene family database based on the inferred proteomes of five sequenced plant species: Arabidopsis thaliana, Carica papaya, Medicago truncatula, Oryza sativa and Populus trichocarpa. We used the graph-based clustering algorithm MCL [Van Dongen (Technical Report INS-R0010 2000) and Enright et al. (Nucleic Acids Res. 2002; 30: 1575–1584)] to classify all of these species’ protein-coding genes into putative gene families, called tribes, using three clustering stringencies (low, medium and high). For all tribes, we have generated protein and DNA alignments and maximum-likelihood phylogenetic trees. A parallel database of microarray experimental results is linked to the genes, which lets researchers identify groups of related genes and their expression patterns. Unified nomenclatures were developed, and tribes can be related to traditional gene families and conserved domain identifiers. SuperTribes, constructed through a second iteration of MCL clustering, connect distant, but potentially related gene clusters. The global classification of nearly 200 000 plant proteins was used as a scaffold for sorting ∼4 million additional cDNA sequences from over 200 plant species. All data and analyses are accessible through a flexible interface allowing users to explore the classification, to place query sequences within the classification, and to download results for further study. PMID:18073194
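The MCL algorithm used above alternates matrix expansion and inflation, and the inflation parameter is what sets the clustering stringency (higher inflation yields smaller, tighter tribes). A compact numpy sketch of the core iteration, a simplification of the MCL program cited in the abstract:

```python
import numpy as np

def mcl(adjacency, inflation=2.0, n_iter=50):
    """Markov clustering on a symmetric adjacency matrix (toy version)."""
    M = adjacency.astype(float) + np.eye(len(adjacency))  # self-loops stabilise the iteration
    M /= M.sum(axis=0)                                    # column-stochastic
    for _ in range(n_iter):
        M = M @ M            # expansion: spread flow along longer paths
        M = M ** inflation   # inflation: strengthen strong currents (stringency knob)
        M /= M.sum(axis=0)
    # rows retaining mass act as attractors; each attractor's support is one cluster
    return [set(np.flatnonzero(row > 1e-6)) for row in M if row.sum() > 1e-6]
```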
3'-terminal sequence of a small round structured virus (SRSV) in Japan.
Utagawa, E T; Takeda, N; Inouye, S; Kasuga, K; Yamazaki, S
1994-01-01
We determined the nucleotide sequence of about 1,000 bases from the 3'-terminus of a small round structured virus (SRSV), which caused a gastroenteritis outbreak in Chiba Prefecture, Japan, in 1987. The sequence was compared with the corresponding sequence region of Norwalk virus; it consisted of a part of the open reading frame 2 (ORF2), whole ORF3, and 3'-noncoding region (NCR). The 624-base-long ORF3 had sequence homology of 68% with the corresponding region of Norwalk virus. (The amino acid sequence homology was 74%.) The 94-base-long NCR had 65% homology with Norwalk virus. We then selected two consensus-sequence portions in the above sequence between Chiba and Norwalk viruses for primers in the reverse transcriptase-polymerase chain reaction (RT-PCR). Using this primer set, we detected 669-bp bands in agarose gel electrophoresis of RT-PCR products from feces containing Chiba or Norwalk viruses. Furthermore, in Southern hybridization with Chiba probes which were labeled with digoxigenin-dUTP in PCR, the bands of the two viruses were clearly stained under a low stringency condition. Since both Chiba and Norwalk viruses were detected by the above primer set although they are geographically and chronologically different viruses, our primer-pair may be useful for detection of a broad range of SRSVs which cause gastroenteritis in different areas.
McKinnon, R D; Danielson, P; Brow, M A; Bloom, F E; Sutcliffe, J G
1987-01-01
We examined the level of expression of small RNA transcripts hybridizing to a rodent repetitive DNA element, the identifier (ID) sequence, in a variety of cell types in vivo and in cultured mammalian cells. A 160-nucleotide (160n) cytoplasmic poly(A)+ RNA (BC1) appeared in late embryonic and early postnatal rat brain development, was enriched in the cerebral cortex, and appeared to be restricted to neural tissue and the anterior pituitary gland. A 110n RNA (BC2) was specifically enriched in brain, especially the postnatal cortex, but was detectable at low levels in peripheral tissues. A third, related 75n poly(A)- RNA (T3) was found in rat brain and at lower levels in peripheral tissues but was very abundant in the testes. The BC RNAs were found in a variety of rat cell lines, and their level of expression was dependent upon cell culture conditions. A rat ID probe detected BC-like RNAs in mouse brain but not liver and detected a 200n RNA in monkey brain but not liver at lower hybridization stringencies. These RNAs were expressed by mouse and primate cell lines. Thus, tissue-specific expression of small ID-sequence-related transcripts is conserved among mammals, but the tight regulation found in vivo is lost by cells in culture. Images PMID:2439903
Guanine nucleotide-binding regulatory proteins in retinal pigment epithelial cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Meisheng; Tran, V.T.; Fong, H.K.W.
1991-05-01
The expression of GTP-binding regulatory proteins (G proteins) in retinal pigment epithelial (RPE) cells was analyzed by RNA blot hybridization and cDNA amplification. Both adult and fetal human RPE cells contain mRNA for multiple G protein α subunits (Gα), including Gsα, Gi-1α, Gi-2α, Gi-3α, and Gzα (or Gxα), where Gs and Gi are proteins that stimulate or inhibit adenylyl cyclase, respectively, and Gz is a protein that may mediate pertussis toxin-insensitive events. Other Gα-related mRNA transcripts were detected in fetal RPE cells by low-stringency hybridization to Gi-2α and Gsα protein-coding cDNA probes. The diversity of G proteins in RPE cells was further studied by cDNA amplification with reverse transcriptase and the polymerase chain reaction. This approach revealed that, besides the above mentioned members of the Gα gene family, at least two other Gα subunits are expressed in RPE cells. Human retinal cDNA clones that encode one of the additional Gα subunits were isolated and characterized. The results indicate that this Gα subunit belongs to a separate subfamily of G proteins that may be insensitive to inhibition by pertussis toxin.
NASA Technical Reports Server (NTRS)
Wright, Jeffrey; Thakur, Siddharth
2006-01-01
Loci-STREAM is an evolving computational fluid dynamics (CFD) software tool for simulating possibly chemically reacting, possibly unsteady flows in diverse settings, including rocket engines, turbomachines, oil refineries, etc. Loci-STREAM implements a pressure- based flow-solving algorithm that utilizes unstructured grids. (The benefit of low memory usage by pressure-based algorithms is well recognized by experts in the field.) The algorithm is robust for flows at all speeds from zero to hypersonic. The flexibility of arbitrary polyhedral grids enables accurate, efficient simulation of flows in complex geometries, including those of plume-impingement problems. The present version - Loci-STREAM version 0.9 - includes an interface with the Portable, Extensible Toolkit for Scientific Computation (PETSc) library for access to enhanced linear-equation-solving programs therein that accelerate convergence toward a solution. The name "Loci" reflects the creation of this software within the Loci computational framework, which was developed at Mississippi State University for the primary purpose of simplifying the writing of complex multidisciplinary application programs to run in distributed-memory computing environments including clusters of personal computers. Loci has been designed to relieve application programmers of the details of programming for distributed-memory computers.
Automated aortic calcium scoring on low-dose chest computed tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isgum, Ivana; Rutten, Annemarieke; Prokop, Mathias
Purpose: Thoracic computed tomography (CT) scans provide information about cardiovascular risk status. These scans are non-ECG synchronized, thus precise quantification of coronary calcifications is difficult. Aortic calcium scoring is less sensitive to cardiac motion, so it is an alternative to coronary calcium scoring as an indicator of cardiovascular risk. The authors developed and evaluated a computer-aided system for automatic detection and quantification of aortic calcifications in low-dose noncontrast-enhanced chest CT. Methods: The system was trained and tested on scans from participants of a lung cancer screening trial. A total of 433 low-dose, non-ECG-synchronized, noncontrast-enhanced 16 detector row examinations of the chest was randomly divided into 340 training and 93 test data sets. A first observer manually identified aortic calcifications on training and test scans. A second observer did the same on the test scans only. First, a multiatlas-based segmentation method was developed to delineate the aorta. The segmented volume was thresholded and potential calcifications (candidate objects) were extracted by three-dimensional connected component labeling. Due to image resolution and noise, in rare cases extracted candidate objects were connected to the spine. They were separated into a part outside and parts inside the aorta, and only the latter was further analyzed. All candidate objects were represented by 63 features describing their size, position, and texture. Subsequently, a two-stage classification with a selection of features and k-nearest neighbor classifiers was performed. Based on the detected aortic calcifications, a total calcium volume score was determined for each subject. Results: The computer system correctly detected, on the average, 945 mm³ out of 965 mm³ (97.9%) of calcified plaque volume in the aorta with an average of 64 mm³ of false positive volume per scan. The Spearman rank correlation coefficient was ρ = 0.960 between the system and the first observer compared to ρ = 0.961 between the two observers. Conclusions: Automatic calcium scoring in the aorta thus appears feasible with good correlation between manual and automatic scoring.
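A hedged sketch of the thresholding and 3-D connected-component step described in Methods, using scipy. The 130 HU cutoff is the conventional calcium threshold and is assumed here rather than taken from the paper; the 63-feature k-nearest-neighbor classifier is omitted:

```python
import numpy as np
from scipy import ndimage

def candidate_calcifications(ct_hu, aorta_mask, voxel_volume_mm3, threshold_hu=130):
    """Extract candidate calcified objects inside a segmented aorta."""
    binary = (ct_hu >= threshold_hu) & aorta_mask
    labels, n = ndimage.label(binary)          # 3-D connected components
    voxels = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    return labels, voxels * voxel_volume_mm3   # per-object volume in mm^3
```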
Effect of low-dose scopolamine on autonomic control of the heart
NASA Technical Reports Server (NTRS)
Raeder, E. A.; Stys, A.; Cohen, R. J.
1997-01-01
Background: In low doses, scopolamine paradoxically enhances parasympathetic outflow to the heart. The mechanisms which mediate this action are not fully understood. Moreover, there are conflicting data regarding the potential role of sympathetic activity. This study in 17 healthy individuals was designed to characterize the influence of low-dose transdermal scopolamine on the gain of the baroreflex and the respiratory-heart rate reflex and to determine the role of sympathetic activity. Methods: The effect of scopolamine was analyzed in the time and frequency domain by computing heart rate variability indices. The gains of the respiratory-heart rate reflex and the baroreflex were estimated simultaneously by means of a cardiovascular system identification approach using an optimized autoregressive moving average algorithm. Measurements were repeated in the upright posture to assess the influence of enhanced sympathetic activity. In six subjects ambulatory ECGs were recorded to determine whether there are diurnal variations of the effect of scopolamine. Results: Scopolamine enhances vagal modulation of heart rate through both the respiratory-heart rate reflex and the baroreflex, as the gains of both were augmented by the drug in the supine and in the upright postures. Conclusions: Scopolamine increases parasympathetic cardiac control by augmenting the gains of the respiratory-heart rate reflex and the baroreflex. This action is not attenuated in the upright posture when sympathetic tone is increased.
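A minimal sketch of the system-identification idea: fit an autoregressive model with exogenous input relating, say, a respiration signal to the heart-rate series, and read the steady-state reflex gain off the fitted coefficients. This uses plain least-squares ARX rather than the optimized ARMA algorithm of the study:

```python
import numpy as np

def fit_arx(y, x, na=4, nb=4):
    """Least-squares ARX fit: y[t] = sum_i a_i*y[t-i] + sum_j b_j*x[t-j] + e[t]."""
    n0 = max(na, nb)
    Phi = np.array([np.concatenate([y[t - na:t][::-1], x[t - nb:t][::-1]])
                    for t in range(n0, len(y))])
    theta, *_ = np.linalg.lstsq(Phi, y[n0:], rcond=None)
    a, b = theta[:na], theta[na:]
    dc_gain = b.sum() / (1.0 - a.sum())   # steady-state input-to-output gain
    return a, b, dc_gain
```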
Dosimetry study of diagnostic X-ray using doped iodide normoxic polymer gels
NASA Astrophysics Data System (ADS)
Huang, Y. R.; Chang, Y. J.; Hsieh, L. L.; Liu, M. H.; Liu, J. S.; Chu, C. H.; Hsieh, B. T.
2014-11-01
In radiotherapy, polymer gel dosimeters are used for three-dimensional (3D) dose distribution measurement. However, the doses are within the Gy range. In this study, we attempted to develop a low-dose 3D dosimeter within the mGy range for diagnostic radiology. An iodinated compound was used as a dose enhancement sensitizer to enhance the dose sensitivity of normoxic polymer gel dosimeters. This study aims to use N-isopropylacrylamide (NIPAM)-based and methacrylic acid (MAGAT)-based gels to evaluate the potential dose enhancement sensitizer, as well as to compare which of the two gels may be suitable for measuring diagnostic radiation doses. A suitable formulation of NIPAM gel [5% (w/w) gelatin, 5% (w/w) NIPAM, 3% (w/w) N,N′-methylenebisacrylamide (BIS), 5 mM tetrakis (hydroxymethyl) phosphonium chloride (THPC), and 87% (w/w) deionized distilled water] and of MAGAT gel (4% MAA, 9% gelatin, 87% deionized water, and 10 mM THPC) were used and loaded with a clinical iodinated contrast medium agent (Iobitridol, Xenetix® 350). Irradiation was conducted using X-ray computed tomography, with doses ranging from 0 mGy to 80 mGy. Optical computed tomography was employed as the gel readout system. The results indicate that the iodinated contrast agent yields a quantifiable dose enhancement ratio. The dose enhancement ratios of the NIPAM and MAGAT gels are 3.35±0.6 and 1.36±0.3, respectively. The NIPAM gel developed in this study could be suitable for measuring diagnostic radiation doses.
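One common way to express a dose enhancement ratio is as the ratio of dose-response sensitivities (slopes) of the doped and plain gels. A small sketch under that assumption, since the paper's exact definition is not given in the abstract:

```python
import numpy as np

def dose_enhancement_ratio(dose_mgy, response_doped, response_plain):
    """DER = slope of doped-gel response / slope of plain-gel response."""
    slope_doped = np.polyfit(dose_mgy, response_doped, 1)[0]
    slope_plain = np.polyfit(dose_mgy, response_plain, 1)[0]
    return slope_doped / slope_plain
```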
Lim, Sara N.; Pradhan, Anil K.; Barth, Rolf F.; Nahar, Sultana N.; Nakkula, Robin J.; Yang, Weilian; Palmer, Alycia M.; Turro, Claudia; Weldon, Michael; Bell, Erica Hlavin; Mo, Xiaokui
2015-01-01
The purposes of this study were (i) to investigate the differences in effects between 160-kV low-energy and 6-MV high-energy X-rays, both by computational analysis and in vitro studies; (ii) to determine the effects of each on platinum-sensitized F98 rat glioma and murine B16 melanoma cells; and (iii) to describe the in vitro cytotoxicity and in vivo toxicity of a Pt(II) terpyridine platinum (Typ-Pt) complex. Simulations were performed using the Monte Carlo code Geant4 to determine enhancement in absorption of low- versus high-energy X-rays by Pt and to determine dose enhancement factors (DEFs) for a Pt-sensitized tumor phantom. In vitro studies were carried out using Typ-Pt and again with carboplatin due to the unexpected in vivo toxicity of Typ-Pt. Cell survival was determined using clonogenic assays. In agreement with computations and simulations, in vitro data showed up to one log unit reduction in surviving fractions (SFs) of cells treated with 1–4 µg/ml of Typ-Pt and irradiated with 160-kV versus 6-MV X-rays. DEFs showed radiosensitization in the 50–200 keV range, which fell to approximate unity at higher energies, suggesting marginal interactions at MeV energies. Cells sensitized with 1–5 or 7 µg/ml of carboplatin and then irradiated also showed a significant decrease (P < 0.05) in SFs. However, it was unlikely this was due to increased interactions. Theoretical and in vitro studies presented here demonstrated that the tumoricidal activity of low-energy X-rays was greater than that of high-energy X-rays against Pt-sensitized tumor cells. Determining whether radiosensitization is a function of increased interactions will require additional studies. PMID:25266332
Silicon photonics for neuromorphic information processing
NASA Astrophysics Data System (ADS)
Bienstman, Peter; Dambre, Joni; Katumba, Andrew; Freiberger, Matthias; Laporte, Floris; Lugnan, Alessio
2018-02-01
We present our latest results on silicon photonics neuromorphic information processing based, among other approaches, on techniques such as reservoir computing. We discuss aspects such as scalability, novel architectures for enhanced power efficiency, and all-optical readout. Additionally, we touch upon new machine learning techniques to operate these integrated readouts. Finally, we show how these systems can be used for high-speed, low-power information processing for applications such as the recognition of biological cells.
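For readers unfamiliar with reservoir computing: only a linear readout on top of a fixed random dynamical system is trained. A software echo-state-network sketch of that principle follows; the paper's reservoirs are photonic rather than simulated, and all parameters here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_esn_readout(u, y, n_res=200, rho=0.9, leak=0.3, ridge=1e-6):
    """Run a fixed random reservoir over inputs u, train only the linear readout."""
    W_in = rng.uniform(-0.5, 0.5, (n_res, u.shape[1]))
    W = rng.normal(size=(n_res, n_res))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))   # set spectral radius
    x, states = np.zeros(n_res), []
    for u_t in u:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u_t + W @ x)
        states.append(x.copy())
    X = np.asarray(states)
    # ridge regression: the only trained component in reservoir computing
    return np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
```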
2012-01-01
Background Despite computational challenges, elucidating conformations that a protein system assumes under physiologic conditions for the purpose of biological activity is a central problem in computational structural biology. While these conformations are associated with low energies in the energy surface that underlies the protein conformational space, few existing conformational search algorithms focus on explicitly sampling low-energy local minima in the protein energy surface. Methods This work proposes a novel probabilistic search framework, PLOW, that explicitly samples low-energy local minima in the protein energy surface. The framework combines algorithmic ingredients from evolutionary computation and computational structural biology to effectively explore the subspace of local minima. A greedy local search maps a conformation sampled in conformational space to a nearby local minimum. A perturbation move jumps out of a local minimum to obtain a new starting conformation for the greedy local search. The process repeats in an iterative fashion, resulting in a trajectory-based exploration of the subspace of local minima. Results and conclusions The analysis of PLOW's performance shows that, by navigating only the subspace of local minima, PLOW is able to sample conformations near a protein's native structure, either more effectively or as well as state-of-the-art methods that focus on reproducing the native structure for a protein system. Analysis of the actual subspace of local minima shows that PLOW samples this subspace more effectively than a naive sampling approach. Additional theoretical analysis reveals that the perturbation function employed by PLOW is key to its ability to sample a diverse set of low-energy conformations. This analysis also suggests directions for further research and novel applications for the proposed framework. PMID:22759582
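A toy sketch of the loop as described: a greedy local search maps a sample to a nearby minimum, a perturbation move jumps out again, and the trajectory collects distinct low-energy minima. The energy function below is a rugged stand-in, not a protein force field:

```python
import numpy as np

rng = np.random.default_rng(1)

def greedy_descent(f, x, step=0.05, n_trials=300):
    """Greedy local search: accept only improving moves."""
    for _ in range(n_trials):
        cand = x + rng.normal(scale=step, size=x.shape)
        if f(cand) < f(x):
            x = cand
    return x

def plow_like(f, x0, n_minima=20, jump=1.0):
    minima, x = [], greedy_descent(f, x0)
    for _ in range(n_minima):
        minima.append((f(x), x.copy()))
        # perturbation move: jump out of the current minimum, then descend again
        x = greedy_descent(f, x + rng.normal(scale=jump, size=x.shape))
    return sorted(minima, key=lambda m: m[0])

energy = lambda v: float(np.sum(v * v) + np.sin(5.0 * v).sum())  # rugged toy surface
print(plow_like(energy, np.zeros(3))[0])  # lowest minimum found
```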
Yang, Chin-Rang; Tongyoo, Pumipat; Emamian, Milad; Sandoval, Pablo C; Raghuram, Viswanathan; Knepper, Mark A
2015-12-15
The mouse mpkCCD cell line is a continuous cultured epithelial cell line with characteristics of renal collecting duct principal cells. This line is widely used to study epithelial transport and its regulation. To provide a data resource useful for experimental design and interpretation in studies using mpkCCD cells, we have carried out "deep" proteomic profiling of these cells using three levels of fractionation (differential centrifugation, SDS-PAGE, and HPLC) followed by tandem mass spectrometry to identify and quantify proteins. The analysis of all resulting samples generated 34.6 gigabytes of spectral data. As a result, we identified 6,766 proteins in mpkCCD cells at a high level of stringency. These proteins are expressed over eight orders of magnitude of protein abundance. The data are provided to users as a public data base (https://helixweb.nih.gov/ESBL/Database/mpkFractions/). The mass spectrometry data were mapped back to their gel slices to generate "virtual Western blots" for each protein. For most of the 6,766 proteins, the apparent molecular weight from SDS-PAGE agreed closely with the calculated molecular weight. However, a substantial fraction (>15%) of proteins was found to run aberrantly, with much higher or much lower mobilities than predicted. These proteins were analyzed to identify mechanisms responsible for altered mobility on SDS-PAGE, including high or low isoelectric point, high or low hydrophobicity, physiological cleavage, residence in the lysosome, posttranslational modifications, and expression of alternative isoforms due to alternative exon usage. Additionally, this analysis identified a previously unrecognized isoform of aquaporin-2 with apparent molecular mass <20 kDa. Copyright © 2015 the American Physiological Society.
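The "virtual Western blot" comparison reduces to sequence-derived versus gel-derived mass. A small sketch using Biopython's ProtParam module; the sequence, gel-slice value, and 25% aberrance cutoff below are invented for illustration:

```python
from Bio.SeqUtils.ProtParam import ProteinAnalysis

def mobility_check(protein_seq, gel_slice_kda):
    """Flag aberrant SDS-PAGE mobility: apparent vs. calculated mass."""
    calc_kda = ProteinAnalysis(protein_seq).molecular_weight() / 1000.0
    ratio = gel_slice_kda / calc_kda
    return calc_kda, ratio, not (0.8 <= ratio <= 1.25)  # >25% off = aberrant

calc, ratio, aberrant = mobility_check("MSKGEELFTG" * 20, gel_slice_kda=35.0)
print(f"calculated {calc:.1f} kDa, apparent/calculated {ratio:.2f}, aberrant={aberrant}")
```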
Damiati, E; Borsani, G; Giacopuzzi, Edoardo
2016-05-01
The Ion Proton platform allows whole exome sequencing (WES) to be performed at low cost, providing rapid turnaround time and great flexibility. Products for WES on the Ion Proton system include the AmpliSeq Exome kit and the recently introduced HiQ sequencing chemistry. Here, we used gold standard variants from the GIAB consortium to assess performance in variant identification, characterize the erroneous calls, and develop a filtering strategy to reduce false positives. The AmpliSeq Exome kit captures a large fraction of bases (>94%) in the human CDS, ClinVar genes, and ACMG genes, but with 2,041 (7%), 449 (13%), and 11 (19%) genes not fully represented, respectively. Overall, 515 protein-coding genes contain hard-to-sequence regions, including 90 genes from ClinVar. Performance in variant detection was maximal at mean coverage >120×, while at 90× and 70× we measured a loss of variants of 3.2 and 4.5%, respectively. WES using HiQ chemistry showed ~71/97.5% sensitivity, ~37/2% FDR, and ~0.66/0.98 F1 score for indels and SNPs, respectively. The proposed low-, medium-, or high-stringency filters reduced the amount of false positives by 10.2, 21.2, and 40.4% for indels and 21.2, 41.9, and 68.2% for SNPs, respectively. Amplicon-based WES on the Ion Proton platform using HiQ chemistry emerged as a competitive approach, with improved accuracy in variant identification. False-positive variants remain an issue for the Ion Torrent technology, but our filtering strategy can be applied to reduce erroneous variants.
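A toy illustration of a tiered post-call filter of the kind described; the field names and cutoffs are invented for the sketch, not the thresholds used in the study:

```python
def passes_filter(variant, level="medium"):
    """variant: dict with QUAL, DP (depth), and VAF keys (illustrative)."""
    cutoffs = {"low":    dict(qual=20, depth=10, vaf=0.15),
               "medium": dict(qual=30, depth=20, vaf=0.25),
               "high":   dict(qual=50, depth=30, vaf=0.35)}
    c = cutoffs[level]
    return (variant["QUAL"] >= c["qual"]
            and variant["DP"] >= c["depth"]
            and variant["VAF"] >= c["vaf"])

print(passes_filter({"QUAL": 42, "DP": 25, "VAF": 0.3}, level="high"))  # False
```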
Mertz, Joseph; Tan, Haiyan; Pagala, Vishwajeeth; Bai, Bing; Chen, Ping-Chung; Li, Yuxin; Cho, Ji-Hoon; Shaw, Timothy; Wang, Xusheng; Peng, Junmin
2015-01-01
The mind bomb 1 (Mib1) ubiquitin ligase is essential for controlling metazoan development by Notch signaling and possibly the Wnt pathway. It is also expressed in postmitotic neurons and regulates neuronal morphogenesis and synaptic activity by mechanisms that are largely unknown. We sought to comprehensively characterize the Mib1 interactome and study its potential function in neuron development utilizing a novel sequential elution strategy for affinity purification, in which Mib1 binding proteins were eluted under different stringencies and then quantified by the isobaric labeling method. The strategy identified the Mib1 interactome with both deep coverage and the ability to distinguish high-affinity partners from low-affinity partners. A total of 817 proteins were identified during the Mib1 affinity purification, including 56 high-affinity partners and 335 low-affinity partners, whereas the remaining 426 proteins are likely copurified contaminants or extremely weak binding proteins. The analysis detected all previously known Mib1-interacting proteins and revealed a large number of novel components involved in Notch and Wnt pathways, endocytosis and vesicle transport, the ubiquitin-proteasome system, cellular morphogenesis, and synaptic activities. Immunofluorescence studies further showed colocalization of Mib1 with five selected proteins: the Usp9x (FAM) deubiquitinating enzyme, alpha-, beta-, and delta-catenins, and CDKL5. Mutations of CDKL5 are associated with early infantile epileptic encephalopathy-2 (EIEE2), a severe form of mental retardation. We found that the expression of Mib1 down-regulated the protein level of CDKL5 by ubiquitination, and antagonized CDKL5 function during the formation of dendritic spines. Thus, the sequential elution strategy enables biochemical characterization of protein interactomes, and the Mib1 analysis provides a comprehensive interactome for investigating its role in signaling networks and neuronal development. PMID:25931508
An Approach to Improve the Quality of Infrared Images of Vein-Patterns
Lin, Chih-Lung
2011-01-01
This study develops an approach to improve the quality of infrared (IR) images of vein-patterns, which usually have noise, low contrast, low brightness and small objects of interest, thus requiring preprocessing to improve their quality. The main characteristics of the proposed approach are that no prior knowledge about the IR image is necessary and no parameters must be preset. Two main goals are sought: impulse noise reduction and adaptive contrast enhancement technologies. In our study, a fast median-based filter (FMBF) is developed as a noise reduction method. It is based on an IR imaging mechanism to detect the noisy pixels and on a modified median-based filter to remove the noisy pixels in IR images. FMBF has the advantage of a low computation load. In addition, FMBF can retain reasonably good edges and texture information when the size of the filter window increases. The most important advantage is that the peak signal-to-noise ratio (PSNR) caused by FMBF is higher than the PSNR caused by the median filter. A hybrid cumulative histogram equalization (HCHE) is proposed for adaptive contrast enhancement. HCHE can automatically generate a hybrid cumulative histogram (HCH) based on two different pieces of information about the image histogram. HCHE can improve the enhancement effect on hot objects rather than background. The experimental results demonstrate that the proposed approach is feasible for use as an effective and adaptive process for enhancing the quality of IR vein-pattern images. PMID:22247674
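A hedged sketch of the two stages: a median-based filter that replaces only impulse-like pixels, and plain cumulative-histogram equalization standing in for HCHE. The deviation margin and window size are illustrative, and the paper's FMBF additionally exploits the IR imaging mechanism to detect noisy pixels:

```python
import numpy as np
from scipy import ndimage

def median_based_denoise(img, impulse_margin=40, size=3):
    """Replace only pixels that deviate strongly from the local median."""
    med = ndimage.median_filter(img, size=size)
    noisy = np.abs(img.astype(int) - med.astype(int)) > impulse_margin
    out = img.copy()
    out[noisy] = med[noisy]
    return out

def equalize(img):
    """Plain cumulative-histogram equalization (HCHE blends two such histograms)."""
    cdf = np.bincount(img.ravel(), minlength=256).cumsum() / img.size
    return (cdf[img] * 255).astype(np.uint8)

noisy_ir = np.random.randint(0, 256, (240, 320), dtype=np.uint8)
enhanced = equalize(median_based_denoise(noisy_ir))
```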
Photographic copy of computer enhanced color photographic image. Photographer and ...
Photographic copy of computer enhanced color photographic image. Photographer and computer draftsman unknown. Original photographic image located in the office of Modjeski and Masters, Consulting Engineers at 1055 St. Charles Avenue, New Orleans, LA 70130. COMPUTER ENHANCED COLOR PHOTOGRAPH SHOWING THE PROPOSED HUEY P. LONG BRIDGE WIDENING LOOKING FROM THE WEST BANK TOWARD THE EAST BANK. - Huey P. Long Bridge, Spanning Mississippi River approximately midway between nine & twelve mile points upstream from & west of New Orleans, Jefferson, Jefferson Parish, LA
Strategic parenting, birth order, and school performance.
Hotz, V Joseph; Pantano, Juan
2015-10-01
Fueled by new evidence, there has been renewed interest in the effects of birth order on human capital accumulation. The underlying causal mechanisms for such effects remain unsettled. We consider a model in which parents impose more stringent disciplinary environments in response to their earlier-born children's poor performance in school in order to deter such outcomes for their later-born offspring. We provide robust empirical evidence that school performance of children in the National Longitudinal Survey of Youth - Children (NLSY-C) declines with birth order, as does the stringency of their parents' disciplinary restrictions. When asked how they would respond if a child brought home bad grades, parents state that they would be less likely to punish their later-born children. Taken together, these patterns are consistent with a reputation model of strategic parenting.
Methodological guidelines for developing accident modification functions.
Elvik, Rune
2015-07-01
This paper proposes methodological guidelines for developing accident modification functions. An accident modification function is a mathematical function describing systematic variation in the effects of road safety measures. The paper describes ten guidelines. An example is given of how to use the guidelines. The importance of exploratory analysis and an iterative approach in developing accident modification functions is stressed. The example shows that strict compliance with all the guidelines may be difficult, but represents a level of stringency that should be striven for. Currently the main limitations in developing accident modification functions are the small number of good evaluation studies and the often huge variation in estimates of effect. It is therefore still not possible to develop accident modification functions for very many road safety measures. Copyright © 2015 Elsevier Ltd. All rights reserved.
Open-Air Biowarfare Testing and the Evolution of Values
2016-01-01
The United States and the United Kingdom ended outdoor biological warfare testing in populated areas nearly half a century ago. Yet, the conduct, health effects, and propriety of those tests remain controversial. The varied views reflect the limits of currently available test information and evolving societal values on research involving human subjects. Western political culture has changed since the early days of the American and British testing programs. People have become less reluctant to question authority, and institutional review boards must now pre-approve research involving human subjects. Further, the heightened stringency of laboratory containment has accentuated the safety gap between a confined test space and one without physical boundaries. All this makes it less likely that masses of people would again be unwittingly subjected to secret open-air biological warfare tests. PMID:27564984
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flores, J.; Sears, J.; Schael, I.P.
1990-08-01
We have synthesized ³²P-labeled hybridization probes from a hyperdivergent region (nucleotides 51 to 392) of the rotavirus gene encoding the VP7 glycoprotein by using the polymerase chain reaction method. Both RNA (after an initial reverse transcription step) and cloned cDNA from human rotavirus serotypes 1 through 4 could be used as templates to amplify this region. High-stringency hybridization of each of the four probes to rotavirus RNAs dotted on nylon membranes allowed the specific detection of corresponding sequences and thus permitted identification of the serotype of the strains dotted. The procedure was useful when applied to rotaviruses isolated from field studies.
Selection of stable scFv antibodies by phage display.
Brockmann, Eeva-Christine
2012-01-01
ScFv fragments are popular recombinant antibody formats but often suffer from limited stability. Phage display is a powerful tool in antibody engineering and applicable also for stability selection. ScFv variants with improved stability can be selected from large randomly mutated phage displayed libraries with a specific antigen after the unstable variants have been inactivated by heat or GdmCl. Irreversible scFv denaturation, which is a prerequisite for efficient selection, is achieved by combining denaturation with reduction of the intradomain disulfide bonds. Repeated selection cycles of increasing stringency result in enrichment of stabilized scFv fragments. Procedures for constructing a randomly mutated scFv library by error-prone PCR and phage display selection for enrichment of stable scFv antibodies from the library are described here.
Kim, Tae Kyoung; Khalili, Korosh; Jang, Hyun-Jung
2015-01-01
A successful program for local ablation therapy for hepatocellular carcinoma (HCC) requires extensive imaging support for diagnosis and localization of HCC, imaging guidance for the ablation procedures, and post-treatment monitoring. Contrast-enhanced ultrasonography (CEUS) has several advantages over computed tomography/magnetic resonance imaging (CT/MRI), including real-time imaging capability, sensitive detection of arterial-phase hypervascularity and washout, no renal excretion, no ionizing radiation, repeatability, excellent patient compliance, and relatively low cost. CEUS is useful for image guidance for isoechoic lesions. While contrast-enhanced CT/MRI is the standard method for the diagnosis of HCC and post-ablation monitoring, CEUS is useful when CT/MRI findings are indeterminate or CT/MRI is contraindicated. This article provides a practical review of the role of CEUS in imaging algorithms for pre- and post-ablation therapy for HCC. PMID:26169081
A Rare Case of Malignant Melanoma of the Mandible: CT and MRI Findings.
Ogura, Ichiro; Sasaki, Yoshihiko; Kameta, Ayako; Sue, Mikiko; Oda, Takaaki
Malignant melanoma of the mandibular gingiva is extremely rare. It is a malignant tumour of melanocytes or their precursor cells, and is often misinterpreted as a benign pigmented process. A few reports have described computed tomography (CT) and magnetic resonance imaging (MRI) findings of malignant melanoma in the oral cavity. We report a rare case of malignant melanoma of the mandible and the related CT and MRI findings. Soft tissue algorithm contrast-enhanced CT showed an expansile mass and irregular destruction of alveolar bone in the right side of the mandibular molar area. MR images showed an enhancing mass, with the tumour exhibiting areas of low to intermediate signal intensity and areas of high signal intensity. Soft tissue algorithm contrast-enhanced CT and MR images showed lymphadenopathy involving the submandibular lymph nodes. Histopathological examination confirmed the diagnosis of malignant melanoma.
Image enhancement by spectral-error correction for dual-energy computed tomography.
Park, Kyung-Kook; Oh, Chang-Hyun; Akay, Metin
2011-01-01
Dual-energy CT (DECT) was reintroduced recently to use the additional spectral information of X-ray attenuation, and aims for accurate density measurement and material differentiation. However, the spectral information lies in the difference between low and high energy images or measurements, so it is difficult to acquire accurate spectral information due to amplification of high pixel noise in the resulting difference image. In this work, an image enhancement technique for DECT is proposed, based on the fact that the attenuation of a higher density material decreases more rapidly as X-ray energy increases. We define a spectral error as the case in which a pixel pair of low and high energy images deviates far from the expected attenuation trend. After analyzing the spectral-error sources of DECT images, we propose a DECT image enhancement method, which consists of three steps: water-reference offset correction, spectral-error correction, and anti-correlated noise reduction. The main idea of this work is to make spectral errors distribute like random noise over the true attenuation so that they can be suppressed by the well-known anti-correlated noise reduction. The proposed method suppressed noise of liver lesions and improved contrast between liver lesions and liver parenchyma in DECT contrast-enhanced abdominal images and their two-material decomposition.
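As a rough illustration of the spectral-error definition above, the sketch below flags low/high-energy pixel pairs that fall far from a fitted attenuation trend; the linear fit and the 3-sigma cutoff are assumptions of this sketch, and the paper's correction and anti-correlated noise reduction steps are not reproduced:

```python
# Hedged sketch: flag "spectral errors" as pixel pairs deviating from the
# expected low/high-energy attenuation trend. The trend here is a simple
# linear fit over the whole image, an assumption made for illustration.
import numpy as np

def spectral_error_mask(low_img, high_img, k=3.0):
    lo = low_img.ravel().astype(float)
    hi = high_img.ravel().astype(float)
    slope, intercept = np.polyfit(lo, hi, 1)      # expected high-kV value per low-kV value
    residual = hi - (slope * lo + intercept)
    mask = np.abs(residual) > k * residual.std()  # outliers from the attenuation trend
    return mask.reshape(low_img.shape)

rng = np.random.default_rng(0)
low = rng.normal(60, 5, (128, 128))
high = 0.7 * low + rng.normal(0, 2, low.shape)
high.ravel()[::997] += 25                         # inject a few spectral outliers
print(spectral_error_mask(low, high).sum(), "pixels flagged")
```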
Designing Ubiquitous Computing to Enhance Children's Learning in Museums
ERIC Educational Resources Information Center
Hall, T.; Bannon, L.
2006-01-01
In recent years, novel paradigms of computing have emerged, which enable computational power to be embedded in artefacts and in environments in novel ways. These developments may create new possibilities for using computing to enhance learning. This paper presents the results of a design process that set out to explore interactive techniques,…
Line x-ray source for diffraction enhanced imaging in clinical and industrial applications
NASA Astrophysics Data System (ADS)
Wang, Xiaoqin
Mammography is an imaging modality that uses low-dose x-rays or other radiation sources for the examination of breasts. It plays a central role in the early detection of breast cancers. The material similarity of tumor cells and healthy cells, breast implant surgery, and other factors make breast cancers hard to visualize and detect. Diffraction enhanced imaging (DEI), first proposed and investigated by D. Chapman, is a new x-ray radiographic imaging modality using monochromatic x-rays from a synchrotron source, which produces images of thick absorbing objects that are almost completely free of scatter. It shows dramatically improved contrast over standard imaging when applied to the same phantom. The contrast is based not only on attenuation but also on the refraction and diffraction properties of the sample. This imaging method may improve the image quality of mammography, other medical applications, industrial radiography for non-destructive testing, and x-ray computed tomography. However, the size and cost of a synchrotron source limit the application of the new modality at clinical levels. This research investigates the feasibility of a designed line x-ray source producing intensities comparable to synchrotron sources. It is composed of a 2-cm-long tungsten filament, installed on a carbon steel filament cup (backing plate), as the cathode, and a stationary oxygen-free copper anode with molybdenum coating on the front surface serving as the target. Characteristic properties of the line x-ray source were computationally studied and the prototype was experimentally investigated. The SIMION code was used to computationally study the electron trajectories emanating from the filament towards the molybdenum target. A Faraday cup on the proof-of-principle prototype device was used to measure the distribution of electrons on the target, which compares favorably to computational results. The intensities of characteristic x-rays for molybdenum, tungsten and rhodium targets were investigated with different window materials for −30 kV to −100 kV applied potentials. Heat loading and thermal management of the target were investigated computationally using the COMSOL code package, and experimental measurements of the target temperature rise were taken via thermocouples attached to the target. Temperature measurements in the low voltage, low current regime without active cooling were compared to computational results for code-experiment benchmarking. Two different phantoms were used in the simulation of DEI images, which showed that the designed x-ray source with the DEI setup could produce images with significantly improved contrast. The computational results, along with experimental measurements on the prototype setup, indicate the possibility of scaling up to a larger area x-ray source adequate for DEI applications.
Analysis of Selected Enhancements to the En Route Central Computing Complex
DOT National Transportation Integrated Search
1981-09-01
This report analyzes selected hardware enhancements that could improve the performance of the 9020 computer systems, which are used to provide en route air traffic control services. These enhancements could be implemented quickly, would be relatively...
Computational Investigations on the Effects of Gurney Flap on Airfoil Aerodynamics
Jain, Shubham; Sitaram, Nekkanti; Krishnaswamy, Sriram
2015-01-01
The present study comprises steady state, two-dimensional computational investigations performed on a NACA 0012 airfoil to analyze the effect of a Gurney flap (GF) on airfoil aerodynamics using the k-ε RNG turbulence model of FLUENT. The airfoil with GF is analyzed for six different heights from 0.5% to 4% of the chord length, seven positions from 0% to 20% of the chord length from the trailing edge, and seven mounting angles from 30° to 120° with the chord. Computed values of lift and drag coefficients with angle of attack are compared with experimental values and good agreement is found at low angles of attack. In addition, the static pressure distribution on the airfoil surface, and pathlines and turbulence intensities near the trailing edge, are presented. From the computational investigation, it is recommended that Gurney flaps with a height of 1.5% chord be installed perpendicular to the chord and as close to the trailing edge as possible to obtain maximum lift enhancement with minimum drag penalty. PMID:27347517
Amoeba-inspired nanoarchitectonic computing implemented using electrical Brownian ratchets.
Aono, M; Kasai, S; Kim, S-J; Wakabayashi, M; Miwa, H; Naruse, M
2015-06-12
In this study, we extracted the essential spatiotemporal dynamics that allow an amoeboid organism to solve a computationally demanding problem and adapt to its environment, thereby proposing a nature-inspired nanoarchitectonic computing system, which we implemented using a network of nanowire devices called 'electrical Brownian ratchets (EBRs)'. By utilizing the fluctuations generated from thermal energy in nanowire devices, we used our system to solve the satisfiability problem, which is a highly complex combinatorial problem related to a wide variety of practical applications. We evaluated the dependency of the solution search speed on its exploration parameter, which characterizes the fluctuation intensity of EBRs, using a simulation model of our system called 'AmoebaSAT-Brownian'. We found that AmoebaSAT-Brownian enhanced the solution searching speed dramatically when we imposed some constraints on the fluctuations in its time series and it outperformed a well-known stochastic local search method. These results suggest a new computing paradigm, which may allow high-speed problem solving to be implemented by interacting nanoscale devices with low power consumption.
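AmoebaSAT-Brownian itself is not spelled out in this abstract, so the sketch below shows a generic WalkSAT-style stochastic local search in which a noise parameter plays the role of the exploration (fluctuation-intensity) parameter discussed above; it illustrates the problem class, not the authors' algorithm:

```python
# Not the authors' AmoebaSAT-Brownian: a generic WalkSAT-style stochastic
# local search for satisfiability, where `p_noise` stands in for the
# fluctuation-intensity (exploration) parameter of the abstract.
import random

def walksat(clauses, n_vars, p_noise=0.3, max_flips=10000, seed=1):
    rng = random.Random(seed)
    assign = [rng.random() < 0.5 for _ in range(n_vars + 1)]  # 1-indexed variables

    def satisfied(clause):
        return any(assign[abs(l)] == (l > 0) for l in clause)

    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfied(c)]
        if not unsat:
            return assign
        clause = rng.choice(unsat)
        if rng.random() < p_noise:            # fluctuation-driven exploratory flip
            var = abs(rng.choice(clause))
        else:                                 # greedy flip: fewest newly broken clauses
            def breaks(v):
                assign[v] = not assign[v]
                b = sum(not satisfied(c) for c in clauses)
                assign[v] = not assign[v]
                return b
            var = min((abs(l) for l in clause), key=breaks)
        assign[var] = not assign[var]
    return None

# (x1 or not x2) and (x2 or x3) and (not x1 or not x3)
print(walksat([[1, -2], [2, 3], [-1, -3]], n_vars=3))
```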
K→ππ amplitudes from lattice QCD with a light charm quark.
Giusti, L; Hernández, P; Laine, M; Pena, C; Wennekers, J; Wittig, H
2007-02-23
We compute the leading-order low-energy constants of the ΔS = 1 effective weak Hamiltonian in the quenched approximation of QCD with up, down, strange, and charm quarks degenerate and light. They are extracted by comparing the predictions of finite-volume chiral perturbation theory with lattice QCD computations of suitable correlation functions carried out with quark masses ranging from a few MeV up to half of the physical strange mass. We observe a ΔI = 1/2 enhancement in this corner of the parameter space of the theory. Although matching with the experimental result is not observed for the ΔI = 1/2 amplitude, our computation suggests large QCD contributions to the physical ΔI = 1/2 rule in the GIM limit, and represents the first step to quantify the role of the charm-quark mass in K→ππ amplitudes. The use of fermions with an exact chiral symmetry is an essential ingredient in our computation.
NASA Astrophysics Data System (ADS)
Stout, Jane G.; Blaney, Jennifer M.
2017-10-01
Research suggests growth mindset, or the belief that knowledge is acquired through effort, may enhance women's sense of belonging in male-dominated disciplines, like computing. However, other research indicates women who spend a great deal of time and energy in technical fields experience a low sense of belonging. The current study assessed the benefits of a growth mindset on women's (and men's) sense of intellectual belonging in computing, accounting for the amount of time and effort dedicated to academics. We define "intellectual belonging" as the sense that one is believed to be a competent member of the community. Whereas a stronger growth mindset was associated with stronger intellectual belonging for men, a growth mindset only boosted women's intellectual belonging when they did not work hard on academics. Our findings suggest, paradoxically, women may not benefit from a growth mindset in computing when they exert a lot of effort.
Beyer, Jonathan A.; Lumley, Mark A.; Latsch, Deborah A.; Oberleitner, Lindsay M.S.; Carty, Jennifer N.; Radcliffe, Alison M.
2014-01-01
Standard written emotional disclosure (WED) about stress, which is private and unguided, yields small health benefits. The effect of providing individualized guidance to writers may enhance WED, but has not been tested. This trial of computer-based WED compared two novel therapist-guided forms of WED—advance guidance (before sessions) or real-time guidance (during sessions, through instant messaging)—to both standard WED and control writing; it also tested Big 5 personality traits as moderators of guided WED. Young adult participants (n = 163) with unresolved stressful experiences were randomized to conditions, had three, 30-min computer-based writing sessions, and were reassessed 6 weeks later. Contrary to hypotheses, real-time guidance WED had poorer outcomes than the other conditions on several measures, and advance guidance WED also showed some poorer outcomes. Moderator analyses revealed that participants with low baseline agreeableness, low extraversion, or high conscientiousness had relatively poor responses to guidance. We conclude that providing guidance for WED, especially in real-time, may interfere with emotional processing of unresolved stress, particularly for people whose personalities have poor fit with this interactive form of WED. PMID:24266598
Land classification of south-central Iowa from computer enhanced images
NASA Technical Reports Server (NTRS)
Lucas, J. R. (Principal Investigator); Taranik, J. V.; Billingsley, F. C.
1976-01-01
The author has identified the following significant results. The Iowa Geological Survey developed its own capability for producing color products from digitally enhanced LANDSAT data. Research showed that efficient production of enhanced images required full utilization of both computer and photographic enhancement procedures. The 29 August 1972 photo-optically enhanced color composite was more easily interpreted for land classification purposes than standard color composites.
CT of hepatic schistosomiasis mansoni
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fataar, S.; Bassiony, H.; Satyanath, S.
1985-07-01
Schistosomal periportal fibrosis produced a typical pattern on computed tomography in five patients. Low-density periportal tissue, present throughout the liver, enhanced strongly after the administration of contrast medium. While rounded in cross section, the thickened periportal tissue produced linear and branching patterns when imaged in longitudinal section. In all cases, the sonographic features were typical of schistosomal periportal fibrosis. A lack of awareness of the distinctive features of periportal fibrosis may result in a mistaken diagnosis of hepatic metastases.
Mannil, Manoj; von Spiczak, Jochen; Manka, Robert; Alkadhi, Hatem
2018-06-01
The aim of this study was to test whether texture analysis and machine learning enable the detection of myocardial infarction (MI) on non-contrast-enhanced low radiation dose cardiac computed tomography (CCT) images. In this institutional review board-approved retrospective study, we included non-contrast-enhanced electrocardiography-gated low radiation dose CCT image data (effective dose, 0.5 mSv) acquired for the purpose of calcium scoring of 27 patients with acute MI (9 female patients; mean age, 60 ± 12 years), 30 patients with chronic MI (8 female patients; mean age, 68 ± 13 years), and 30 subjects (9 female patients; mean age, 44 ± 6 years) without cardiac abnormality, hereafter termed controls. Texture analysis of the left ventricle was performed using free-hand regions of interest, and texture features were classified twice (Model I: controls versus acute MI versus chronic MI; Model II: controls versus acute and chronic MI). For both classifications, 6 commonly used machine learning classifiers were used: decision tree C4.5 (J48), k-nearest neighbors, locally weighted learning, RandomForest, sequential minimal optimization, and an artificial neural network employing deep learning. In addition, 2 blinded, independent readers visually assessed noncontrast CCT images for the presence or absence of MI. In Model I, the best classification results were obtained with the k-nearest neighbors classifier (sensitivity, 69%; specificity, 85%; false-positive rate, 0.15). In Model II, the best classification results were found with the locally weighted learning classifier (sensitivity, 86%; specificity, 81%; false-positive rate, 0.19), with an area under the curve from receiver operating characteristics analysis of 0.78. In comparison, neither reader was able to identify MI in any of the noncontrast, low radiation dose CCT images. This study indicates the ability of texture analysis and machine learning to detect MI on noncontrast low radiation dose CCT images that is not visible to the radiologist's eye.
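A hedged sketch of the classification stage only: simple first-order texture statistics from a region of interest fed to a k-nearest-neighbors classifier (scikit-learn). The study's actual texture features, labels, and data are richer; everything below is synthetic:

```python
# Illustrative only: mean/std/skewness as stand-in texture features from a
# left-ventricle ROI, classified with k-NN. The ROIs here are synthetic
# arrays, not CCT data.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def roi_texture_features(roi):
    g = roi.astype(float).ravel()
    return [g.mean(), g.std(), ((g - g.mean())**3).mean() / (g.std()**3 + 1e-9)]

rng = np.random.default_rng(0)
# Fake ROIs: class 0 = control, class 1 = MI (synthetic stand-ins)
X = [roi_texture_features(rng.normal(100 + 10*cls, 5 + 3*cls, (16, 16)))
     for cls in (0, 1) for _ in range(30)]
labels = [0]*30 + [1]*30
clf = KNeighborsClassifier(n_neighbors=5).fit(X, labels)
print(clf.predict([roi_texture_features(rng.normal(110, 8, (16, 16)))]))
```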
Super-resolution Time-Lapse Seismic Waveform Inversion
NASA Astrophysics Data System (ADS)
Ovcharenko, O.; Kazei, V.; Peter, D. B.; Alkhalifah, T.
2017-12-01
Time-lapse seismic waveform inversion is a technique which allows tracking changes in reservoirs over time. Such monitoring is computationally expensive, and it is therefore barely feasible to perform it on the fly. Most of the expense is related to the numerous FWI iterations at high temporal frequencies, which are inevitable since the low-frequency components cannot resolve fine scale features of a velocity model. Inverted velocity changes are also blurred when there is noise in the data, so the problem of low-resolution images is widely known. One of the problems intensively tackled by the computer vision research community is the recovery of high-resolution images from their low-resolution versions. Using artificial neural networks to reach super-resolution from a single downsampled image is one of the leading solutions for this problem. Each pixel of the upscaled image is affected by all the pixels of its low-resolution version, which enables the workflow to recover features that are likely to occur in the corresponding environment. In the present work, we adopt a machine learning image enhancement technique to improve the resolution of time-lapse full-waveform inversion. We first invert the baseline model with conventional FWI. Then we run a few iterations of FWI on a set of the monitoring data to find the desired model changes. These changes are blurred, and we enhance their resolution by using a deep neural network. The network is trained to map low-resolution model updates predicted by FWI into the real perturbations of the baseline model. For supervised training of the network we generate a set of random perturbations in the baseline model and perform FWI on the noisy data from the perturbed models. We test the approach on a realistic perturbation of the Marmousi II model and demonstrate that it outperforms conventional convolution-based deblurring techniques.
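The following is a minimal sketch of the supervised setup described above, with an assumed toy architecture (a three-layer fully convolutional network in PyTorch) rather than the authors' network; blurred random perturbations stand in for low-resolution FWI updates:

```python
# Assumed toy architecture, not the authors' network: a small fully
# convolutional net trained to map blurred model updates back to the
# "true" baseline perturbations.
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

# Synthetic training pairs: random "true" perturbations and their blurred,
# upsampled counterparts standing in for low-resolution FWI updates.
true_pert = torch.randn(32, 1, 64, 64)
blur = nn.AvgPool2d(4)(true_pert)
low_res = nn.functional.interpolate(blur, size=(64, 64), mode="bilinear",
                                    align_corners=False)

for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(low_res), true_pert)
    loss.backward()
    opt.step()
```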
From QSAR to QSIIR: Searching for Enhanced Computational Toxicology Models
Zhu, Hao
2017-01-01
Quantitative Structure Activity Relationship (QSAR) is the most frequently used modeling approach to explore the dependency of biological, toxicological, or other types of activities/properties of chemicals on their molecular features. In the past two decades, QSAR modeling has been used extensively in the drug discovery process. However, the predictive models resulting from QSAR studies have limited use for chemical risk assessment, especially for animal and human toxicity evaluations, due to low predictivity for new compounds. To develop enhanced toxicity models with independently validated external prediction power, novel modeling protocols were pursued by computational toxicologists based on the rapidly increasing toxicity testing data of recent years. This chapter reviews the recent effort in our laboratory to incorporate biological testing results as descriptors in the toxicity modeling process. This effort extended the concept of QSAR to Quantitative Structure In vitro-In vivo Relationship (QSIIR). The QSIIR study examples provided in this chapter indicate that QSIIR models based on hybrid (biological and chemical) descriptors are indeed superior to conventional QSAR models based only on chemical descriptors for several animal toxicity endpoints. We believe that the applications introduced in this review will be of interest and value to researchers working in the field of computational drug discovery and environmental chemical risk assessment. PMID:23086837
NASA Astrophysics Data System (ADS)
Cheng, Fang-Yi; Jian, Shan-Ping; Yang, Zhih-Min; Yen, Ming-Cheng; Tsuang, Ben-Jei
2015-02-01
The objective of this study is to understand the influence of regional climate change on local meteorological conditions and their subsequent effect on local ozone (O3) dispersion in Taiwan. The 33-year NCEP-DOE Reanalysis 2 (NNR2) data set (1979-2011) was analyzed to understand the variations in regional-scale atmospheric conditions in East Asia and the western North Pacific. To save computational processing time, two scenarios representative of past (1979-86) and current (2004-11) atmospheric conditions were selected but only targeting the autumn season (September, October and November) when the O3 concentrations were at high levels. Numerical simulations were performed using weather research and forecasting (WRF) model and Community Multiscale Air Quality (CMAQ) model for the past and current scenarios individually but only for the month of October because of limited computational resources. Analysis of NNR2 data exhibited increased air temperature, weakened Asian continental anticyclone, enhanced northeasterly monsoonal flow, and a deepened low-pressure system forming near Taiwan. With enhanced evaporation from oceans along with a deepened low-pressure system, precipitation amounts increased in Taiwan in the current scenario. As demonstrated in the WRF simulation, the land surface physical process responded to the enhanced precipitation resulting in damper soil conditions, and reduced ground temperatures that in turn restricted the development of boundary layer height. The weakened land-sea breeze flow was simulated in the current scenario. With reduced dispersion capability, air pollutants would tend to accumulate near the emission source leading to a degradation of air quality in this region. The conditions would be even worse in southwestern Taiwan due to the fact that stagnant wind fields would occur more frequently in the current scenario. On the other hand, in northern Taiwan, the simulated O3 concentrations are lower during the day in the current scenario due to the enhanced cloud conditions and reduced solar radiation.
NASCAP modelling computations on large optics spacecraft in geosynchronous substorm environments
NASA Technical Reports Server (NTRS)
Stevens, N. J.; Purvis, C. K.
1980-01-01
The NASA Charging Analyzer Program (NASCAP) is used to evaluate qualitatively the possibility of such enhanced spacecraft contamination on a conceptual version of a large satellite. The evaluation is made by computing surface voltages on the satellite due to encounters with substorm environments and then computing charged particle trajectories in the electric fields around the satellite. Particular attention is paid to the possibility of contaminants reaching a mirror surface inside a dielectric tube because this mirror represents a shielded optical surface in the satellite model used. Deposition of low energy charged particles from other parts of the spacecraft onto the mirror was found to be possible in the assumed moderate substorm environment condition. In the assumed severe substorm environment condition, however, voltage build up on the inside and edges of the dielectric tube in which the mirror is located prevents contaminants from reaching the mirror surface.
pWeb: A High-Performance, Parallel-Computing Framework for Web-Browser-Based Medical Simulation.
Halic, Tansel; Ahn, Woojin; De, Suvranu
2014-01-01
This work presents pWeb, a new language and compiler for parallelization of client-side compute intensive web applications such as surgical simulations. The recently introduced HTML5 standard has enabled creating unprecedented applications on the web. The low performance of the web browser, however, remains the bottleneck of computationally intensive applications, including visualization of complex scenes, real-time physical simulations and image processing, compared to native ones. The new proposed language is built upon web workers for multithreaded programming in HTML5. The language provides fundamental functionalities of parallel programming languages as well as the fork/join parallel model, which is not supported by web workers. The language compiler automatically generates an equivalent parallel script that complies with the HTML5 standard. A case study on realistic rendering for surgical simulations demonstrates enhanced performance with a compact set of instructions.
Technological change, depletion and environmental policy in the offshore oil and gas industry
NASA Astrophysics Data System (ADS)
Managi, Shunsuke
Technological change is central to maintaining standards of living in modern economies with finite resources and increasingly stringent environmental goals. Successful environmental policies can contribute to efficiency by encouraging, rather than inhibiting, technological innovation. However, little research to date has focused on the design and implementation of environmental regulations that encourage technological progress, or on ensuring productivity improvements in the face of depletion of natural resources and increasing stringency of environmental regulations. This study models and measures productivity change, with an application to offshore oil and gas production in the Gulf of Mexico using Data Envelopment Analysis. This is an important application because energy resources are central to sustaining our economy. The net effects of technological progress and depletion on the productivity of offshore oil and gas production are measured using a unique field-level set of data on production from all wells in the Gulf of Mexico over the period 1946-1998. Results are consistent with the hypothesis that technological progress has mitigated depletion effects over the study period, but the pattern differs from the conventional wisdom for nonrenewable resource industries. The Porter Hypothesis states that well designed environmental regulations can potentially contribute to productive efficiency in the long run by encouraging innovation. The hypothesis was recast to include market and nonmarket outputs, and the revised version was tested. Our results support the recast version of the Porter Hypothesis, which examines the productivity of the joint production of market and environmental outputs. But we find no evidence for the standard formulation of the Porter Hypothesis, that increased stringency of environmental regulation leads to increased productivity of market outputs and therefore increased industry profits. The model is used to forecast market and environmental outputs under alternative policy scenarios. Reliable baseline forecasts of production and pollution, and of their response to different policy actions, are critical to the formation of sound energy and environmental policy. Forecasts of production and pollution until the year 2050 are generated from the model. Detailed policy scenarios provide quantitative assessments that indicate the significance of the potential benefits of technological change and well-designed environmental policy.
NASA Astrophysics Data System (ADS)
Rao, S.; Dentener, F. J.; Klimont, Z.; Riahi, K.
2011-12-01
Outdoor air pollution is increasingly recognized as a significant contributor to global health outcomes. This has led to the implementation of a number of air quality policies worldwide, with total air pollution control costs in 2005 estimated at US$195 billion. More than 80% of the world's population is still found to be exposed to PM2.5 concentrations exceeding WHO air quality guidelines, with health impacts resulting from these exposures estimated at around 2-5% of the global disease burden. Key questions to answer are: 1) How will pollutant emissions evolve in the future given developments in the energy system, and how will energy and environmental policies influence such emission trends? 2) What implications will this have for resulting exposures and related health outcomes? In order to answer these questions, varying levels of stringency of air quality legislation are analyzed in combination with policies on universal access to clean cooking fuels and on limiting global temperature change to 2°C in 2100. Bottom-up methodologies using energy emissions modeling are used to derive sector-based pollutant emission trajectories until 2030. Emissions are spatially downscaled and used in combination with a global transport chemistry model to derive ambient concentrations of PM2.5. Health impacts of these exposures are further estimated consistent with WHO data and methodology. The results indicate that currently planned air quality legislation combined with rising energy demand will be insufficient to control future emissions growth in developing countries. In order to achieve significant reductions in pollutant emissions of the order of more than 50% from 2005 levels and reduce exposures to levels consistent with WHO standards, it will be necessary to increase the stringency of such legislation and combine it with policies on energy access and climate change. Combined policies also reduce air pollution control costs compared with those associated with current legislation. Health-related co-benefits of combined policies are also found to be large, especially in developing countries: a reduction of more than 50% in pollution-related mortality impacts compared with today.
Shimonobo, Toshiaki; Funama, Yoshinori; Utsunomiya, Daisuke; Nakaura, Takeshi; Oda, Seitaro; Kiguchi, Masao; Masuda, Takanori; Sakabe, Daisuke; Yamashita, Yasuyuki; Awai, Kazuo
2016-01-01
We used pediatric and adult anthropomorphic phantoms to compare the radiation dose of low- and standard-tube-voltage chest and abdominal non-contrast-enhanced computed tomography (CT) scans. We also discuss the optimal low tube voltage for non-contrast-enhanced CT. Using a female adult and three differently sized pediatric anthropomorphic phantoms, we acquired chest and abdominal non-contrast-enhanced scans on a 320-multidetector CT volume scanner. The tube voltage was set at 80, 100, and 120 kVp. The tube current was automatically assigned by the CT scanner in response to the set image noise level. On each phantom and at each tube voltage we measured the surface and center dose using high-sensitivity metal-oxide-semiconductor field-effect transistor detectors. The mean surface dose of chest and abdominal CT scans in 5-year-olds was 4.4 and 5.3 mGy at 80 kVp, 4.5 and 5.4 mGy at 100 kVp, and 4.0 and 5.0 mGy at 120 kVp, respectively. These values were similar in our 3 pediatric phantoms (p > 0.05). The mean surface dose in the adult phantom increased from 14.7 to 19.4 mGy for chest and from 18.7 to 24.8 mGy for abdominal CT as the tube voltage decreased from 120 to 80 kVp (p < 0.01). Compared to adults, the surface and center dose for pediatric patients is almost the same despite a decrease in the tube voltage, and the low-tube-voltage technique can therefore be used for non-contrast-enhanced chest and abdominal scanning. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Ramos-Infante, Samuel Jesús; Ten-Esteve, Amadeo; Alberich-Bayarri, Angel; Pérez, María Angeles
2018-01-01
This paper proposes a discrete particle model based on the random-walk theory for simulating cement infiltration within open-cell structures to prevent osteoporotic proximal femur fractures. Model parameters consider the cement viscosity (high and low) and the desired direction of injection (vertical and diagonal). In vitro and in silico characterizations of augmented open-cell structures validated the computational model and quantified the improved mechanical properties (Young's modulus) of the augmented specimens. The cement injection pattern was successfully predicted in all the simulated cases. All the augmented specimens exhibited enhanced mechanical properties computationally and experimentally (maximum improvements of 237.95 ± 12.91% and 246.85 ± 35.57%, respectively). The open-cell structures with high porosity fraction showed a considerable increase in mechanical properties. Cement augmentation in low porosity fraction specimens resulted in a lesser increase in mechanical properties. The results suggest that the proposed discrete particle model is adequate for use as a femoroplasty planning framework.
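A toy version of such a random-walk particle model, with assumed parameters: a directional bias represents the desired injection direction, and a move probability that falls with viscosity represents cement mobility; the published model's calibration is not reproduced:

```python
# Illustrative random-walk particle sketch (not the published model): cement
# "particles" step through a 2D lattice with a directional bias toward the
# injection direction; visited cells approximate the infiltration pattern.
import numpy as np

def infiltrate(n_particles=500, n_steps=400, bias=(0, 1), p_move=0.6,
               size=101, seed=0):
    """bias=(0, 1) injects 'vertically'; p_move falls with cement viscosity."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((size, size), dtype=bool)
    steps = [(0, 1), (0, -1), (1, 0), (-1, 0)]
    # Favor the injection direction when drawing each step (assumed 2:1 bias).
    weights = np.array([2.0 if s == bias else 1.0 for s in steps])
    weights /= weights.sum()
    for _ in range(n_particles):
        x = y = size // 2                    # injection point
        for _ in range(n_steps):
            if rng.random() > p_move:        # high viscosity: particle stalls
                continue
            dx, dy = steps[rng.choice(4, p=weights)]
            x = min(max(x + dx, 0), size - 1)
            y = min(max(y + dy, 0), size - 1)
            grid[x, y] = True
    return grid

print(infiltrate().sum(), "lattice cells filled")
```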
NASA Astrophysics Data System (ADS)
Hadzibeganovic, Tarik; Stauffer, Dietrich; Han, Xiao-Pu
2018-04-01
Cooperation is fundamental for the long-term survival of biological, social, and technological networks. Previously, mechanisms for the enhancement of cooperation, such as network reciprocity, have largely been studied in isolation and with often inconclusive findings. Here, we present an evolutionary, multiagent-based, and spatially explicit computer model to specifically address the interactive interplay between such mechanisms. We systematically investigate the effects of phenotypic diversity, network structure, and rewards on cooperative behavior emerging in a population of reproducing artificial decision makers playing tag-mediated evolutionary games. Cooperative interactions are rewarded such that both the benefits of recipients and costs of donators are affected by the reward size. The reward size is determined by the number of cooperative acts occurring within a given reward time frame. Our computational experiments reveal that small reward frames promote unconditional cooperation in populations with both low and high diversity, whereas large reward frames lead to cycles of conditional and unconditional strategies at high but not at low diversity. Moreover, an interaction between rewards and spatial structure shows that relative to small reward frames, there is a strong difference between the frequency of conditional cooperators populating rewired versus non-rewired networks when the reward frame is large. Notably, in a less diverse population, the total number of defections is comparable across different network topologies, whereas in more diverse environments defections become more frequent in a regularly structured than in a rewired, small-world network of contacts. Acknowledging the importance of such interaction effects in social dilemmas will have inevitable consequences for the future design of cooperation-enhancing protocols in large-scale, distributed, and decentralized systems such as peer-to-peer networks.
47 CFR 64.702 - Furnishing of enhanced services and customer-premises equipment.
Code of Federal Regulations, 2012 CFR
2012-10-01
... separate operating, marketing, installation, and maintenance personnel, and utilize separate computer... available to the separate corporation any capacity or computer system component on its computer system or... Enhanced Services and Customer-Premises Equipment by Bell Operating Companies; Telephone Operator Services...
47 CFR 64.702 - Furnishing of enhanced services and customer-premises equipment.
Code of Federal Regulations, 2013 CFR
2013-10-01
... separate operating, marketing, installation, and maintenance personnel, and utilize separate computer... available to the separate corporation any capacity or computer system component on its computer system or... Enhanced Services and Customer-Premises Equipment by Bell Operating Companies; Telephone Operator Services...
47 CFR 64.702 - Furnishing of enhanced services and customer-premises equipment.
Code of Federal Regulations, 2011 CFR
2011-10-01
... separate operating, marketing, installation, and maintenance personnel, and utilize separate computer... available to the separate corporation any capacity or computer system component on its computer system or... Enhanced Services and Customer-Premises Equipment by Bell Operating Companies; Telephone Operator Services...
47 CFR 64.702 - Furnishing of enhanced services and customer-premises equipment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... separate operating, marketing, installation, and maintenance personnel, and utilize separate computer... available to the separate corporation any capacity or computer system component on its computer system or... Enhanced Services and Customer-Premises Equipment by Bell Operating Companies; Telephone Operator Services...
47 CFR 64.702 - Furnishing of enhanced services and customer-premises equipment.
Code of Federal Regulations, 2014 CFR
2014-10-01
... separate operating, marketing, installation, and maintenance personnel, and utilize separate computer... available to the separate corporation any capacity or computer system component on its computer system or... Enhanced Services and Customer-Premises Equipment by Bell Operating Companies; Telephone Operator Services...
NASA Astrophysics Data System (ADS)
Ibragimov, Bulat; Toesca, Diego; Chang, Daniel; Koong, Albert; Xing, Lei
2017-12-01
Automated segmentation of the portal vein (PV) for liver radiotherapy planning is a challenging task due to potentially low vasculature contrast, complex PV anatomy and image artifacts originating from fiducial markers and vasculature stents. In this paper, we propose a novel framework for automated segmentation of the PV from computed tomography (CT) images. We apply convolutional neural networks (CNNs) to learn the consistent appearance patterns of the PV using a training set of CT images with reference annotations and then enhance the PV in previously unseen CT images. Markov random fields (MRFs) were further used to smooth the results of the CNN enhancement and remove isolated mis-segmented regions. Finally, CNN-MRF-based enhancement was augmented with PV centerline detection that relied on PV anatomical properties such as tubularity and branch composition. The framework was validated on a clinical database with 72 CT images of patients scheduled for liver stereotactic body radiation therapy. The obtained segmentation accuracy was DSC = 0.83.
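As an illustration of the MRF smoothing stage, the sketch below runs iterated conditional modes over a binary label field, balancing fidelity to a CNN-style probability map against agreement with 4-connected neighbors; the smoothness weight and energy form are assumptions of this sketch, and the centerline step is omitted:

```python
# Hedged sketch: iterated conditional modes (ICM) over a binary label field.
# `beta` is an assumed smoothness weight; the paper's actual MRF formulation
# is not reproduced.
import numpy as np

def icm_smooth(prob_map, beta=1.5, n_iter=5):
    labels = (prob_map > 0.5).astype(np.int8)
    eps = 1e-6
    # Unary costs: -log P(label) from the CNN-style probability map.
    unary = np.stack([-np.log(1 - prob_map + eps), -np.log(prob_map + eps)])
    for _ in range(n_iter):
        for y in range(prob_map.shape[0]):
            for x in range(prob_map.shape[1]):
                costs = []
                for lab in (0, 1):
                    disagree = sum(
                        lab != labels[ny, nx]
                        for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                        if 0 <= ny < labels.shape[0] and 0 <= nx < labels.shape[1]
                    )
                    costs.append(unary[lab, y, x] + beta * disagree)
                labels[y, x] = int(np.argmin(costs))
    return labels

rng = np.random.default_rng(0)
prob = np.clip(rng.normal(0.2, 0.15, (32, 32)), 0, 1)
prob[10:20, 12:18] = 0.9                     # a bright vessel-like blob
mask = icm_smooth(prob)
```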
Current sheath behavior and its velocity enhancement in a low energy Mather-type plasma focus device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aghamir, F. M.; Behbahani, R. A.
The dynamics of the plasma sheath layer and its velocity enhancement have been studied in a low energy (4.9 kJ) Mather-type plasma focus device. Experiments were performed to study the effect of the Lorentz force variation on the current sheath expansion and movement, as well as the existence of traction between all parts of the sheath layer. Two different anode shapes (cylindrical and step), along with an axial magnetic probe, were used to investigate the effects of various experimental conditions, namely different charging voltages and gas pressures. In order to explore the upper limit of the current sheath velocity, a comparison was made between the experimental data gathered by the probe and Lee's computational model. The limitations governing the enhancement of the current sheath velocity, which can lead to the deterioration of good focusing, were also investigated. The effects of the increased current sheath velocity obtained with the step anode on ion generation and hard x-ray emission were demonstrated by means of an ion collector and a hard x-ray detector.
Enhancing programming logic thinking using analogy mapping
NASA Astrophysics Data System (ADS)
Sukamto, R. A.; Megasari, R.
2018-05-01
Programming logic thinking is the most important competence for computer science students. However, programming is one of the difficult subjects in computer science programs. This paper reports our work on enhancing students' programming logic thinking using Analogy Mapping for a basic programming subject. Analogy Mapping is a computer application which converts source code into analogy images. This research used a time-series evaluation, and the results showed that Analogy Mapping can enhance students' programming logic thinking.
Kim, Wansun; Lee, Jae-Chul; Lee, Gi-Ja; Park, Hun-Kuk; Lee, Anbok; Choi, Samjin
2017-06-20
We introduce a label-free biosensing cellulose strip sensor with surface-enhanced Raman spectroscopy (SERS)-encoded bimetallic core@shell nanoparticles. Bimetallic nanoparticles consisting of Ag nanoparticle (AgNP) cores and Au nanoparticle (AuNP) shells were fabricated on a cellulose substrate by two-stage successive ionic layer absorption and reaction (SILAR) techniques. The bimetallic nanoparticle-enhanced localized surface plasmon resonance (LSPR) effects were theoretically verified by computational calculations with finite element models of optimized bimetallic nanoparticles interacting with an incident laser source. Well-dispersed raspberry-like bimetallic nanoparticles with a highly polycrystalline structure were confirmed through X-ray and electron analyses despite the ionic reaction synthesis. The stability against silver oxidation and high sensitivity of the low-cost SERS-encoded cellulose strip, which achieved a SERS enhancement factor (EF) of 3.98 × 10⁸, a reproducibility of 6.1% RSD, and <10% degradation in sustainability tests, indicate the possibility of practical applications in highly sensitive analytical screening methods, such as single-molecule detection. The remarkable sensitivity and selectivity of this bimetallic biosensing strip in determining aquatic toxicities of prohibited drugs, such as aniline, sodium azide, and malachite green, as well as in monitoring breast cancer progression from urine, confirmed its potential as a low-cost label-free point-of-care test chip for the early diagnosis of human diseases.
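For context on the reported enhancement factor, the standard analytical-EF formula is EF = (I_SERS/N_SERS)/(I_ref/N_ref); the sketch below evaluates it with made-up placeholder intensities and molecule counts, not values from the paper:

```python
# Standard analytical SERS enhancement factor; all numbers below are
# illustrative placeholders, not the paper's measurements.
def sers_ef(i_sers, n_sers, i_ref, n_ref):
    """EF = (I_SERS / N_SERS) / (I_ref / N_ref)."""
    return (i_sers / n_sers) / (i_ref / n_ref)

print(f"{sers_ef(i_sers=5.0e5, n_sers=1.0e4, i_ref=1.0e3, n_ref=8.0e9):.2e}")
# -> 4.00e+08, the same order of magnitude as the reported 3.98e8
```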
Low-cost conversion of the Polaroid MD-4 land camera to a digital gel documentation system.
Porch, Timothy G; Erpelding, John E
2006-04-30
A simple, inexpensive design is presented for the rapid conversion of the popular MD-4 Polaroid land camera to a high quality digital gel documentation system. Images of ethidium bromide stained DNA gels captured using the digital system were compared to images captured on Polaroid instant film. Resolution and sensitivity were enhanced using the digital system. In addition to the low cost and superior image quality of the digital system, there is also the added convenience of real-time image viewing through the swivel LCD of the digital camera, wide flexibility of gel sizes, accurate automatic focusing, variable image resolution, and consistent ease of use and quality. Images can be directly imported to a computer by using the USB port on the digital camera, further enhancing the potential of the digital system for documentation, analysis, and archiving. The system is appropriate for use as a start-up gel documentation system and for routine gel analysis.
Enhancing Learning through Human Computer Interaction
ERIC Educational Resources Information Center
McKay, Elspeth, Ed.
2007-01-01
Enhancing Learning Through Human Computer Interaction is an excellent reference source for human computer interaction (HCI) applications and designs. This "Premier Reference Source" provides a complete analysis of online business training programs and e-learning in the higher education sector. It describes a range of positive outcomes for linking…
Focal nodular hyperplasia coexistent with hepatoblastoma in a 36-d-old infant
Gong, Ying; Chen, Lian; Qiao, Zhong-Wei; Ma, Yang-Yang
2015-01-01
Focal nodular hyperplasia (FNH) is a benign hepatic tumor characterized by hepatocyte hyperplasia and a central stellate scar. The association of FNH with other hepatic lesions, such as adenomas, hemangiomas and hepatocellular carcinoma, has been previously reported, but FNH associated with another hepatic tumor is rare in infants. Here we report a case of FNH coexistent with hepatoblastoma in a 36-d-old girl. Computed tomography (CT) imaging showed an ill-delineated, inhomogeneously enhanced mass with a central star-like scar in the right lobe of the liver. The tumor showed early mild enhancement in the arterial phase (from 40 HU without contrast to 52 HU in the arterial phase), intense enhancement in the portal phase (87.7 HU), and 98.1 HU on the 3-min delayed scan. A central scar in the tumor presented as low density on non-contrast CT and enhanced slightly on delayed contrast-enhanced scanning. This infant underwent surgical resection of the tumor. Histopathology demonstrated typical FNH coexistent with a focal hepatoblastoma, which showed epithelioid tumor cells separated by proliferated fibrous tissue. PMID:25624742
Implicit preconditioned WENO scheme for steady viscous flow computation
NASA Astrophysics Data System (ADS)
Huang, Juan-Chen; Lin, Herng; Yang, Jaw-Yen
2009-02-01
A class of lower-upper symmetric Gauss-Seidel implicit weighted essentially nonoscillatory (WENO) schemes is developed for solving the preconditioned Navier-Stokes equations of primitive variables with the Spalart-Allmaras one-equation turbulence model. The numerical flux of the present preconditioned WENO schemes consists of a first-order part and a high-order part. For the first-order part, we adopt the preconditioned Roe scheme, and for the high-order part, we employ preconditioned WENO methods. For comparison purposes, a preconditioned TVD scheme is also given and tested. A time-derivative preconditioning algorithm is devised, and a discriminant is introduced for adjusting the preconditioning parameters at low Mach numbers and turning off the preconditioning at intermediate or high Mach numbers. The computations are performed for the two-dimensional lid driven cavity flow, low subsonic viscous flow over the S809 airfoil, three-dimensional low speed viscous flow over a 6:1 prolate spheroid, transonic flow over the ONERA-M6 wing and hypersonic flow over the HB-2 model. The solutions of the present algorithms are in good agreement with the experimental data. The application of the preconditioned WENO schemes to viscous flows at all speeds not only enhances the accuracy and robustness of resolving shocks and discontinuities for supersonic flows, but also improves the accuracy for low Mach number flows with complicated smooth solution structures.
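As a sketch of the high-order building block named above, the following implements the standard fifth-order Jiang-Shu WENO reconstruction of the left state at a cell interface from five cell averages; the coupling to the preconditioned Roe flux is not shown:

```python
# Standard WENO5 (Jiang-Shu) left-biased reconstruction at interface i+1/2
# from the five cell averages f_{i-2}..f_{i+2}.
import numpy as np

def weno5_left(fm2, fm1, f0, fp1, fp2, eps=1e-6):
    # Candidate third-order stencil reconstructions
    q0 = (2*fm2 - 7*fm1 + 11*f0) / 6.0
    q1 = (-fm1 + 5*f0 + 2*fp1) / 6.0
    q2 = (2*f0 + 5*fp1 - fp2) / 6.0
    # Smoothness indicators
    b0 = 13/12*(fm2 - 2*fm1 + f0)**2 + 0.25*(fm2 - 4*fm1 + 3*f0)**2
    b1 = 13/12*(fm1 - 2*f0 + fp1)**2 + 0.25*(fm1 - fp1)**2
    b2 = 13/12*(f0 - 2*fp1 + fp2)**2 + 0.25*(3*f0 - 4*fp1 + fp2)**2
    # Nonlinear weights from the ideal weights (1/10, 6/10, 3/10)
    a0, a1, a2 = 0.1/(eps + b0)**2, 0.6/(eps + b1)**2, 0.3/(eps + b2)**2
    s = a0 + a1 + a2
    return (a0*q0 + a1*q1 + a2*q2) / s

# Smooth data: the reconstruction approaches fifth-order accuracy.
x = np.linspace(0, 1, 6)
f = np.sin(2*np.pi*x)
print(weno5_left(*f[:5]))
```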
Low frequency electromagnetic fluctuations in Kappa magnetized plasmas
NASA Astrophysics Data System (ADS)
Kim, Sunjung; Lazar, M.; Schlickeiser, R.; López, R. A.; Yoon, P. H.
2018-07-01
The present paper provides a theoretical approach for the evaluation of the low frequency spontaneously emitted electromagnetic (EM) fluctuations in Kappa magnetized plasmas, which include the kinetic Alfvén, fast magnetosonic/whistler, kinetic slow mode, ion Bernstein cyclotron modes, and higher-order modes. The model predictions are consistent with particle-in-cell simulations. Effects of suprathermal particles on low frequency fluctuations are studied by varying the power index, either for ions (κi) or for electrons (κe). Computations for an arbitrary wave vector orientation and wave polarization show the intensity of spontaneous emissions to be enhanced in the presence of suprathermal populations. These results strongly suggest that spontaneous fluctuations may significantly contribute to the EM fluctuations observed in space plasmas, where suprathermal Kappa distributed particles are ubiquitous.
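For context, a common normalization of the isotropic Kappa velocity distribution whose suprathermal tails drive the enhanced fluctuation spectra is sketched below (κ → ∞ recovers a Maxwellian); parameter values are illustrative:

```python
# Isotropic Kappa velocity distribution, in a normalization common in the
# space-plasma literature; v and theta share the same (arbitrary) units.
import numpy as np
from scipy.special import gamma

def kappa_dist(v, theta, kappa, n=1.0):
    norm = (n / (np.pi**1.5 * kappa**1.5 * theta**3)
            * gamma(kappa + 1) / gamma(kappa - 0.5))
    return norm * (1 + v**2 / (kappa * theta**2))**(-(kappa + 1))

v = np.linspace(0, 5, 6)                      # speeds in units of theta
print(kappa_dist(v, theta=1.0, kappa=2.0))    # hard suprathermal tail
print(kappa_dist(v, theta=1.0, kappa=50.0))   # near-Maxwellian
```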
Edge enhancement and noise suppression for infrared image based on feature analysis
NASA Astrophysics Data System (ADS)
Jiang, Meng
2018-06-01
Infrared images often suffer from background noise, blurred edges, few details and low signal-to-noise ratios. To improve infrared image quality, it is essential to suppress noise and enhance edges simultaneously. To realize this, we propose in this paper a novel algorithm based on feature analysis in the shearlet domain. Firstly, we introduce the theory and advantages of the shearlet transform, one of the multi-scale geometric analysis (MGA) methods. Secondly, after analyzing the defects of the traditional thresholding technique for noise suppression, we propose a novel feature extraction that distinguishes image structures from noise well, and use it to improve the traditional thresholding technique. Thirdly, by computing the correlations between neighboring shearlet coefficients, feature attribute maps identifying weak details and strong edges are constructed to improve generalized unsharp masking (GUM). Finally, experimental results with infrared images captured in different scenes demonstrate that the proposed algorithm suppresses noise efficiently and enhances image edges adaptively.
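Shearlet transforms require a dedicated package, so the stand-in sketch below uses a separable wavelet transform (PyWavelets) to show the coefficient-thresholding idea the paper improves upon; the correlation-based feature maps and the GUM step are not reproduced:

```python
# Wavelet stand-in for shearlet-domain thresholding: soft-threshold the
# detail coefficients with the universal threshold, then reconstruct.
# The noise level `sigma` is assumed known here for simplicity.
import numpy as np
import pywt

def wavelet_denoise(img, wavelet="db2", level=2, sigma=10.0):
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    thr = sigma * np.sqrt(2 * np.log(img.size))      # universal threshold
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(band, thr, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)

rng = np.random.default_rng(0)
noisy = rng.normal(100, 10, (64, 64))                # synthetic noisy IR frame
clean = wavelet_denoise(noisy)
```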
Local delivery of FTY720 accelerates cranial allograft incorporation and bone formation
Huang, Cynthia; Das, Anusuya; Barker, Daniel; Tholpady, Sunil; Wang, Tiffany; Cui, Quanjun; Ogle, Roy
2012-01-01
Endogenous stem cell recruitment to the site of skeletal injury is key to enhanced osseous remodeling and neovascularization. To this end, this study utilized a novel bone allograft coating of poly(lactic-co-glycolic acid) (PLAGA) to sustain the release of FTY720, a selective agonist for sphingosine 1-phosphate (S1P) receptors, from calvarial allografts. Uncoated allografts, vehicle-coated, low dose FTY720 in PLAGA (1:200 w:w) and high dose FTY720 in PLAGA (1:40) were implanted into critical size calvarial bone defects. The ability of local FTY720 delivery to promote angiogenesis, maximize osteoinductivity and improve allograft incorporation by recruitment of bone progenitor cells from surrounding soft tissues and microcirculation was evaluated. FTY720 bioactivity after encapsulation and release was confirmed with sphingosine kinase 2 assays. HPLC-MS quantified about 50% loaded FTY720 release of the total encapsulated drug (4.5 µg) after 5 days. Following 2 weeks of defect healing, FTY720 delivery led to statistically significant increases in bone volumes compared to controls, with total bone volume increases for uncoated, coated, low FTY720 and high FTY720 of 5.98, 3.38, 7.2 and 8.9 mm3, respectively. The rate and extent of enhanced bone growth persisted through week 4 but, by week 8, increases in bone formation in FTY720 groups were no longer statistically significant. However, micro-computed tomography (microCT) of contrast enhanced vascular ingrowth (MICROFIL®) and histological analysis showed enhanced integration as well as directed bone growth in both high and low dose FTY720 groups compared to controls. PMID:21863314
Enhanced conformational sampling of carbohydrates by Hamiltonian replica-exchange simulation.
Mishra, Sushil Kumar; Kara, Mahmut; Zacharias, Martin; Koca, Jaroslav
2014-01-01
Knowledge of the structure and conformational flexibility of carbohydrates in an aqueous solvent is important to improving our understanding of how carbohydrates function in biological systems. In this study, we extend a variant of the Hamiltonian replica-exchange molecular dynamics (MD) simulation to improve the conformational sampling of saccharides in an explicit solvent. During the simulations, a biasing potential along the glycosidic-dihedral linkage between the saccharide monomer units in an oligomer is applied at various levels along the replica runs to enable effective transitions between various conformations. One reference replica runs under the control of the original force field. The method was tested on disaccharide structures and further validated on biologically relevant blood group B, Lewis X and Lewis A trisaccharides. The biasing potential-based replica-exchange molecular dynamics (BP-REMD) method provided a significantly improved sampling of relevant conformational states compared with standard continuous MD simulations, with modest computational costs. Thus, the proposed BP-REMD approach adds a new dimension to existing carbohydrate conformational sampling approaches by enhancing conformational sampling in the presence of solvent molecules explicitly at relatively low computational cost.
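A minimal sketch of the replica-exchange step under stated assumptions (two replicas at the same temperature whose Hamiltonians differ only by the level of the glycosidic-dihedral biasing potential; the MD propagation itself is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

def hrex_swap(x_i, x_j, U_i, U_j, beta):
    """Metropolis acceptance for exchanging configurations between Hamiltonian
    replicas i and j at the same inverse temperature beta; U_i and U_j are the
    (differently biased) potential-energy functions of the two replicas."""
    delta = beta * ((U_i(x_j) + U_j(x_i)) - (U_i(x_i) + U_j(x_j)))
    if delta <= 0 or rng.random() < np.exp(-delta):
        return x_j, x_i, True    # configurations swapped
    return x_i, x_j, False       # swap rejected

# Toy example: "replicas" whose potentials differ by a bias term
U0 = lambda x: 0.5 * x**2             # reference (unbiased) replica
U1 = lambda x: 0.5 * x**2 + 0.2 * x   # replica with an added bias
print(hrex_swap(0.1, -0.3, U0, U1, beta=1.0))
```

Because only the reference replica samples the original force field, accepted exchanges feed bias-assisted barrier crossings back into an unbiased ensemble, which is what improves the sampling of the glycosidic linkages.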
Cossio; Arrieta; Cebolla; Membrado; Vela; Garriga; Domingo
2000-07-27
Alkanes in the presence of berberine sulfate provide an enhancement of the fluorescent signal, which depends on alkane concentration and structure, when the system is irradiated with monochromatic UV light. Computational analysis suggests that an ion-induced dipole interaction between the alkanes and berberine sulfate is responsible for this phenomenon. This interaction can properly model the experimentally obtained fluorescent response. The proposed explanation allows other interacting systems to be designed, which have been experimentally confirmed.
1983-02-01
successfully modeled to enhance future computer design simulations; (2) a new methodology for conducting dynamic analysis of vehicle mechanics was...to preliminary design methodology for tilt rotors, advancing blade concept configuration helicopters, and compound helicopters in conjunction with...feasibility of low-level personnel parachutes has been demonstrated. A study was begun to design a free-fall water container. An experimental program to
Bitsaki, Marina; Koutras, Christos; Koutras, George; Leymann, Frank; Steimle, Frank; Wagner, Sebastian; Wieland, Matthias
2017-09-01
Lack of time or economic difficulties prevent chronic obstructive pulmonary disease patients from communicating regularly with their physicians, thus inducing exacerbation of their chronic condition and possible hospitalization. Enhancing Chronic patients' Health Online proposes a new, sustainable and innovative business model that provides, at low cost and at significant savings to the national health system, a preventive health service for chronic obstructive pulmonary disease patients, by combining human medical expertise with state-of-the-art online service delivery based on cloud computing, service-oriented architecture, data analytics, and mobile applications. In this article, we implement the frontend applications of the Enhancing Chronic patients' Health Online system and describe their functionality and the interfaces available to users.
EXPERIENCES FROM THE SOURCE-TERM ANALYSIS OF A LOW AND INTERMEDIATE LEVEL RADWASTE DISPOSAL FACILITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Jin Beak; Park, Joo-Wan; Lee, Eun-Young
2003-02-27
Enhancement of a computer code SAGE for evaluation of the Korean concept for a LILW waste disposal facility is discussed. Several features of source term analysis are embedded into SAGE to analyze: (1) effects of the degradation mode of an engineered barrier, (2) effects of dispersion phenomena in the unsaturated zone and (3) effects of a time-dependent sorption coefficient in the unsaturated zone. IAEA's Vault Safety Case (VSC) approach is used to demonstrate the ability of this assessment code. Results of MASCOT are used for comparison purposes. These enhancements of the safety assessment code, SAGE, can contribute to realistic evaluation of the Korean concept of the LILW disposal project in the near future.
Contrast enhancement in EIT imaging of the brain.
Nissinen, A; Kaipio, J P; Vauhkonen, M; Kolehmainen, V
2016-01-01
We consider electrical impedance tomography (EIT) imaging of the brain. The brain is surrounded by the poorly conducting skull which has low conductivity compared to the brain. The skull layer causes a partial shielding effect which leads to weak sensitivity for the imaging of the brain tissue. In this paper we propose an approach based on the Bayesian approximation error approach, to enhance the contrast in brain imaging. With this approach, both the (uninteresting) geometry and the conductivity of the skull are embedded in the approximation error statistics, which leads to a computationally efficient algorithm that is able to detect features such as internal haemorrhage with significantly increased sensitivity and specificity. We evaluate the approach with simulations and phantom data.
Techniques for using diazo materials in remote sensor data analysis
NASA Technical Reports Server (NTRS)
Whitebay, L. E.; Mount, S.
1978-01-01
The use of data derived from LANDSAT is facilitated when special products or computer-enhanced images can be analyzed. However, the facilities required to produce and analyze such products prevent many users from taking full advantage of the LANDSAT data. A simple, low-cost method is presented by which users can make their own specially enhanced composite images from the four-band black and white LANDSAT images using the diazo process. The diazo process is described, and a detailed procedure for making various color composites, such as color infrared, false natural color, and false color, is provided. The advantages and limitations of the diazo process are discussed. A brief discussion of the interpretation of diazo composites for land use mapping, with some typical examples, is included.
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Paradella, W. R.; Vitorello, I.
1982-01-01
Several aspects of computer-assisted analysis techniques for image enhancement and thematic classification, by which LANDSAT MSS imagery may be treated quantitatively, are explained. In geological applications, computer processing of digital data arguably allows the fullest use of LANDSAT data, by displaying enhanced and corrected data for visual analysis and by evaluating and assigning each spectral pixel's information to a given class.
Audio-Enhanced Computer Assisted Learning and Computer Controlled Audio-Instruction.
ERIC Educational Resources Information Center
Miller, K.; And Others
1983-01-01
Describes aspects of use of a microcomputer linked with a cassette recorder as a peripheral to enhance computer-assisted learning (CAL) and a microcomputer-controlled tape recorder linked with a microfiche reader in a commercially available teaching system. References and a listing of control programs are appended. (EJS)
Design of a modular digital computer system
NASA Technical Reports Server (NTRS)
1980-01-01
A Central Control Element (CCE) module which controls the Automatically Reconfigurable Modular System (ARMS) and allows both redundant processing and multi-computing in the same computer with real time mode switching, is discussed. The same hardware is used for either reliability enhancement, speed enhancement, or for a combination of both.
Enhancing the Undergraduate Computing Experience in Chemical Engineering CACHE Corporation
ERIC Educational Resources Information Center
Edgar, Thomas F.
2006-01-01
This white paper focuses on the integration and enhancement of the computing experience for undergraduates throughout the chemical engineering curriculum. The computing experience for undergraduates in chemical engineering should have continuity and be coordinated from course to course, because a single software solution is difficult to achieve in…
Using Computer-Assisted Instruction to Enhance Achievement of English Language Learners
ERIC Educational Resources Information Center
Keengwe, Jared; Hussein, Farhan
2014-01-01
Computer-assisted instruction (CAI) in English-language environments offers practice time, motivates students, enhances student learning, increases the authentic materials that students can study, and has the potential to encourage teamwork between students. The findings from this particular study suggested that students who used computer-assisted…
Plasmonically Enhanced Reflectance of Heat Radiation from Low-Bandgap Semiconductor Microinclusions.
Tang, Janika; Thakore, Vaibhav; Ala-Nissila, Tapio
2017-07-18
Increased reflectance from the inclusion of highly scattering particles at low volume fractions in an insulating dielectric offers a promising way to reduce radiative thermal losses at high temperatures. Here, we investigate plasmonic-resonance-driven enhanced scattering from microinclusions of low-bandgap semiconductors (InP, Si, Ge, PbS, InAs and Te) in an insulating composite to tailor its infrared reflectance for minimizing thermal losses from radiative transfer. To this end, we compute the spectral properties of the microcomposites using Monte Carlo modeling and compare them with results from the Fresnel equations. The roles of particle-size-dependent Mie scattering and absorption efficiencies, and of scattering anisotropy, are studied to identify the optimal microinclusion size and material parameters for maximizing the reflectance of the thermal radiation. For composites with Si and Ge microinclusions we obtain reflectance efficiencies of 57-65% for the incident blackbody radiation from sources at temperatures in the range 400-1600 °C. Furthermore, we observe a broadbanding of the reflectance spectra from the plasmonic resonances due to charge carriers generated from defect states within the semiconductor bandgap. Our results thus open up the possibility of developing efficient high-temperature thermal insulators through the use of low-bandgap semiconductor microinclusions in insulating dielectrics.
Low-complex energy-aware image communication in visual sensor networks
NASA Astrophysics Data System (ADS)
Phamila, Yesudhas Asnath Victy; Amutha, Ramachandran
2013-10-01
A low-complexity, low-bit-rate, energy-efficient image compression algorithm explicitly designed for resource-constrained visual sensor networks used in surveillance, battlefield, habitat monitoring, and similar applications, where voluminous amounts of image data must be communicated over a bandwidth-limited wireless medium, is presented. The proposed method overcomes the energy limitation of individual nodes and is investigated in terms of image quality, entropy, processing time, overall energy consumption, and system lifetime. The algorithm is highly energy efficient and extremely fast since it applies an energy-aware zonal binary discrete cosine transform (DCT) that computes only the few required significant coefficients and codes them using an enhanced complementary Golomb-Rice code without any floating point operations. Experiments were performed using the Atmel ATmega128 and MSP430 processors to measure the resultant energy savings. Simulation results show that the proposed energy-aware fast zonal transform consumes only 0.3% of the energy needed by the conventional DCT, and the algorithm consumes only 6% of the energy needed by the Independent JPEG Group (fast) version, making it suitable for embedded systems requiring low power consumption. The proposed scheme is unique in that it significantly enhances the lifetime of the camera sensor node and the network without the distributed processing traditionally required by existing algorithms.
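An illustrative sketch of the two ingredients named above, assuming a square block and a triangular low-frequency zone; the paper's binary (integer) DCT and its "enhanced complementary" Golomb-Rice variant are not reproduced here.

```python
import numpy as np

def zonal_dct(block, zone=3):
    """Compute only the low-frequency 2-D DCT-II coefficients with u+v < zone
    (zonal coding); higher-frequency coefficients are never computed."""
    N = block.shape[0]
    x = (2 * np.arange(N) + 1) * np.pi / (2 * N)
    coeffs = {}
    for u in range(zone):
        for v in range(zone - u):
            cu = np.sqrt((1 if u == 0 else 2) / N)
            cv = np.sqrt((1 if v == 0 else 2) / N)
            basis = np.outer(np.cos(u * x), np.cos(v * x))
            coeffs[(u, v)] = cu * cv * float((block * basis).sum())
    return coeffs

def rice_encode(n, k=2):
    """Plain Golomb-Rice code of a non-negative integer:
    unary-coded quotient followed by a k-bit remainder."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

print(rice_encode(9, k=2))  # quotient 2, remainder 1 -> '11001'
```

Skipping the high-frequency coefficients outright, rather than computing and then discarding them, is what yields the large energy savings reported for the zonal transform.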
Tian, Xiumei; Zeng, Dong; Zhang, Shanli; Huang, Jing; Zhang, Hua; He, Ji; Lu, Lijun; Xi, Weiwen; Ma, Jianhua; Bian, Zhaoying
2016-11-22
Dynamic cerebral perfusion x-ray computed tomography (PCT) imaging has been advocated to quantitatively and qualitatively assess hemodynamic parameters in the diagnosis of acute stroke or chronic cerebrovascular diseases. However, the associated radiation dose is a significant concern to patients due to the dynamic scan protocol. To address this issue, in this paper we propose an image restoration method that utilizes a coupled dictionary learning (CDL) scheme to yield clinically acceptable PCT images from low-dose data acquisition. Specifically, in the present CDL scheme, the 2D background information from the average of the baseline time frames of low-dose unenhanced CT images and the 3D enhancement information from normal-dose sequential cerebral PCT images are exploited to train the respective dictionary atoms. After obtaining the two trained dictionaries, we couple them to represent the desired PCT images as a spatio-temporal prior in the objective function. Finally, the low-dose dynamic cerebral PCT images are restored through general dictionary-learning-based image processing. To obtain a robust solution, the objective function is solved using a modified dictionary-learning-based image restoration algorithm. The experimental results on clinical data show that the present method yields more accurate kinetic enhanced details and diagnostic hemodynamic parameter maps than the state-of-the-art methods.
Air quality co-benefits of carbon pricing in China
NASA Astrophysics Data System (ADS)
Li, Mingwei; Zhang, Da; Li, Chiao-Ting; Mulvaney, Kathleen M.; Selin, Noelle E.; Karplus, Valerie J.
2018-05-01
Climate policies targeting energy-related CO2 emissions, which act on a global scale over long time horizons, can result in localized, near-term reductions in both air pollution and adverse human health impacts. Focusing on China, the largest energy-using and CO2-emitting nation, we develop a cross-scale modelling approach to quantify these air quality co-benefits, and compare them to the economic costs of climate policy. We simulate the effects of an illustrative climate policy, a price on CO2 emissions. In a policy scenario consistent with China's recent pledge to reach a peak in CO2 emissions by 2030, we project that national health co-benefits from improved air quality would partially or fully offset policy costs depending on chosen health valuation. Net health co-benefits are found to rise with increasing policy stringency.
Reid-Bayliss, Kate S; Loeb, Lawrence A
2017-08-29
Transcriptional mutagenesis (TM) due to misincorporation during RNA transcription can result in mutant RNAs, or epimutations, that generate proteins with altered properties. TM has long been hypothesized to play a role in aging, cancer, and viral and bacterial evolution. However, inadequate methodologies have limited progress in elucidating a causal association. We present a high-throughput, highly accurate RNA sequencing method to measure epimutations with single-molecule sensitivity. Accurate RNA consensus sequencing (ARC-seq) uniquely combines RNA barcoding and generation of multiple cDNA copies per RNA molecule to eliminate errors introduced during cDNA synthesis, PCR, and sequencing. The stringency of ARC-seq can be scaled to accommodate the quality of input RNAs. We apply ARC-seq to directly assess transcriptome-wide epimutations resulting from RNA polymerase mutants and oxidative stress.
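A toy sketch of the barcode-consensus idea behind such methods (the grouping threshold and data are illustrative; the published pipeline's error models and quality filters are not reproduced):

```python
from collections import Counter

def barcode_consensus(reads_by_barcode, min_copies=3):
    """Collapse multiple cDNA copies of each barcoded RNA molecule into a
    consensus sequence; positions where copies disagree are masked with 'N'.
    Raising min_copies raises the stringency of the consensus call."""
    consensus = {}
    for barcode, reads in reads_by_barcode.items():
        if len(reads) < min_copies:
            continue                      # too few copies to call a consensus
        seq = []
        for column in zip(*reads):
            base, count = Counter(column).most_common(1)[0]
            seq.append(base if count == len(column) else "N")
        consensus[barcode] = "".join(seq)
    return consensus

print(barcode_consensus({"AAGT": ["ACGT", "ACGT", "ACGT"],
                         "CCTA": ["ACGA", "ACGT", "ACGT"]}))
# {'AAGT': 'ACGT', 'CCTA': 'ACGN'}
```

Because a cDNA-synthesis, PCR, or sequencing error appears in only some copies of a molecule while a true epimutation appears in all of them, requiring agreement across copies removes technical errors while retaining genuine transcription errors.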
Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei
2015-01-01
Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique using a modified measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low contrast images adaptively. Contrast enhancement is obtained by a global transformation of the input intensities; it employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques, such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization, in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper. PMID:25784928
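A minimal sketch of the transformation function alone, assuming SciPy's regularized incomplete Beta function as the intensity mapping; the CS-PSO search that tunes (a, b) against the fitness criterion is omitted.

```python
import numpy as np
from scipy.special import betainc

def beta_transform(img, a, b):
    """Global contrast enhancement: map normalized intensities through the
    regularized incomplete Beta function; (a, b) are the two parameters a
    CS-PSO-style optimizer would tune against the fitness criterion."""
    lo, hi = float(img.min()), float(img.max())
    x = (img - lo) / (hi - lo + 1e-12)   # normalize to [0, 1]
    y = betainc(a, b, x)                 # monotone intensity mapping
    return lo + (hi - lo) * y            # rescale to the original range
```

With a = b > 1 the mapping is S-shaped about mid-gray and stretches mid-tone contrast; the optimizer's job is to search this (a, b) space for the image at hand.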
Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems
Wu, Jun; Su, Zhou; Li, Jianhua
2017-01-01
Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential to provide low latency for the communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, an efficient social organization can enable more flexible, secure, and collaborative networking. These advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on “friend” relationships. To the best of our knowledge, this is the first attempt at organizing a fog computing system based on a social model. Meanwhile, social networking increases the complexity and security risks of fog computing services, creating difficulties for security service recommendation in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems. PMID:28758943
Chen, Xiao; Salerno, Michael; Yang, Yang; Epstein, Frederick H.
2014-01-01
Purpose: Dynamic contrast-enhanced MRI of the heart is well-suited for acceleration with compressed sensing (CS) due to its spatiotemporal sparsity; however, respiratory motion can degrade sparsity and lead to image artifacts. We sought to develop a motion-compensated CS method for this application. Methods: A new method, Block LOw-rank Sparsity with Motion-guidance (BLOSM), was developed to accelerate first-pass cardiac MRI, even in the presence of respiratory motion. This method divides the images into regions, tracks the regions through time, and applies matrix low-rank sparsity to the tracked regions. BLOSM was evaluated using computer simulations and first-pass cardiac datasets from human subjects. Using rate-4 acceleration, BLOSM was compared to other CS methods, such as k-t SLR, which applies matrix low-rank sparsity to the whole image dataset, with and without motion tracking, and to k-t FOCUSS with motion estimation and compensation, which employs spatial and temporal-frequency sparsity. Results: BLOSM was qualitatively shown to reduce respiratory artifact compared to other methods. Quantitatively, using root mean squared error and the structural similarity index, BLOSM was superior to other methods. Conclusion: BLOSM, which exploits regional low-rank structure and uses region tracking for motion compensation, provides improved image quality for CS-accelerated first-pass cardiac MRI. PMID:24243528
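The per-region low-rank step can be sketched as singular-value soft-thresholding of a Casorati matrix (pixels × frames) formed from one tracked region; this is a generic proximal step under that assumption, not BLOSM's full alternating reconstruction.

```python
import numpy as np

def svt(region_stack, tau):
    """Singular-value soft-thresholding of a Casorati matrix (pixels x frames)
    built from one tracked region: the low-rank proximal step that
    BLOSM-like methods apply per region inside the CS iteration."""
    U, s, Vt = np.linalg.svd(region_stack, full_matrices=False)
    s = np.maximum(s - tau, 0.0)   # shrink singular values toward low rank
    return (U * s) @ Vt
```

Tracking each region before stacking it is what preserves the low-rank structure: a region that follows the heart through respiratory motion changes far less frame-to-frame than a fixed image block does.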
Campos, Eleonora; Negro Alvarez, María José; Sabarís di Lorenzo, Gonzalo; Gonzalez, Sergio; Rorig, Marcela; Talia, Paola; Grasso, Daniel H; Sáez, Felicia; Manzanares Secades, Paloma; Ballesteros Perdices, Mercedes; Cataldi, Angel A
2014-01-01
The use of lignocellulosic biomass for second-generation biofuels requires optimization of the enzymatic breakdown of plant cell walls. In this work, cellulolytic bacteria were isolated from one native and two cultivated forest soil samples. Amplification of glycosyl hydrolases was attempted using a low-stringency degenerate-primer PCR strategy, with total soil DNA and bulk DNA pooled from positive colonies as templates. A set of primers was designed based on the Acidothermus cellulolyticus genome, by searching for conserved domains of the glycosyl hydrolase (GH) families of interest. Using this approach, a fragment containing an open reading frame (ORF) with 98% identity to a putative GH43 beta-xylosidase coding gene from Enterobacter cloacae was amplified and cloned. The full protein was expressed in Escherichia coli as N-terminal or C-terminal His-tagged fusions and purified under native conditions. Only the N-terminal fusion protein, His-Xyl43, presented beta-xylosidase activity. On pNPX, optimal activity was achieved at pH 6 and 40 °C, and the Km and kcat values were 2.92 mM and 1.32 s(-1), respectively. Activity was also demonstrated on xylobiose (X2), with Km 17.8 mM and kcat 380 s(-1). These results demonstrate that Xyl43 is a functional beta-xylosidase, and this is the first evidence of this activity for Enterobacter sp. Copyright © 2013 Elsevier GmbH. All rights reserved.
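For reference, the reported constants fit the standard Michaelis-Menten rate law, from which the catalytic efficiencies follow directly (a routine derived quantity, not stated in the abstract):

```latex
v \;=\; \frac{k_{\mathrm{cat}}\,[E]_0\,[S]}{K_m + [S]},
\qquad
\frac{k_{\mathrm{cat}}}{K_m}\bigg|_{\mathrm{pNPX}}
  \approx \frac{1.32\ \mathrm{s}^{-1}}{2.92\ \mathrm{mM}}
  \approx 0.45\ \mathrm{mM}^{-1}\,\mathrm{s}^{-1},
\qquad
\frac{k_{\mathrm{cat}}}{K_m}\bigg|_{\mathrm{X2}}
  \approx \frac{380\ \mathrm{s}^{-1}}{17.8\ \mathrm{mM}}
  \approx 21\ \mathrm{mM}^{-1}\,\mathrm{s}^{-1}.
```

By this measure the enzyme is markedly more efficient on its natural substrate xylobiose than on the chromogenic substrate pNPX.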
[Invasive fungal infections in children with cancer, neutropenia and fever, in Chile].
Lucero, Yalda; Brücher, Roberto; Alvarez, Ana María; Becker, Ana; Cofré, José; Enríquez, Nancy; Payá, Ernesto; Salgado, Carmen; Santolaya, María Elena; Tordecilla, Juan; Varas, Mónica; Villarroel, Milena; Viviani, Tamara; Zubieta, Marcela; O'Ryan, Miguel
2002-10-01
Invasive fungal infections (IFI) cause prolonged hospitalizations and increase the possibility of death among patients with cancer and febrile neutropenia (FN). Up to 10% of febrile neutropenic episodes may be caused by IFI. The aim of this study was to estimate the incidence of IFI among a large group of Chilean children with cancer and FN. Clinical and laboratory information was collected from a database provided by the "Programa Infantil Nacional de Drogas Antineoplásicas" (PINDA) that included 445 FN episodes occurring in five hospitals in Santiago, Chile. This information was used to identify children who presented with signs and symptoms compatible with an IFI. According to predefined criteria based on a literature review, IFI episodes were categorized as "proven", "probable" or "possible". A total of 41/445 episodes (9.2%) were compatible with an IFI, of which 4 (0.9%) were proven, 23 (5.2%) probable, and 14 (3.1%) possible. Hospitalization was longer (27 vs 8 days, p < .01), new infectious foci appeared with higher frequency (71 vs 38%, p < .01), and mortality was higher (10 vs 1.6%, p < .001) in children with IFI-compatible episodes compared to children who did not have an IFI. The estimated incidence of IFI in Chilean children with cancer and FN ranged between 6-9%, depending on the stringency of the classification criteria used. This estimate is similar to that reported by other studies. The low detection yield of clinically compatible IFI underscores the need for improved diagnosis of fungal infections in this population.
Accumulation of Polyphosphate in Lactobacillus spp. and Its Involvement in Stress Resistance
Alcántara, Cristina; Blasco, Amalia; Zúñiga, Manuel
2014-01-01
Polyphosphate (poly-P) is a polymer of phosphate residues synthesized and in some cases accumulated by microorganisms, where it plays crucial physiological roles such as the participation in the response to nutritional stringencies and environmental stresses. Poly-P metabolism has received little attention in Lactobacillus, a genus of lactic acid bacteria of relevance for food production and health of humans and animals. We show that among 34 strains of Lactobacillus, 18 of them accumulated intracellular poly-P granules, as revealed by specific staining and electron microscopy. Poly-P accumulation was generally dependent on the presence of elevated phosphate concentrations in the culture medium, and it correlated with the presence of polyphosphate kinase (ppk) genes in the genomes. The ppk gene from Lactobacillus displayed a genetic arrangement in which it was flanked by two genes encoding exopolyphosphatases of the Ppx-GppA family. The ppk functionality was corroborated by its disruption (LCABL_27820 gene) in Lactobacillus casei BL23 strain. The constructed ppk mutant showed a lack of intracellular poly-P granules and a drastic reduction in poly-P synthesis. Resistance to several stresses was tested in the ppk-disrupted strain, showing that it presented a diminished growth under high-salt or low-pH conditions and an increased sensitivity to oxidative stress. These results show that poly-P accumulation is a characteristic of some strains of lactobacilli and may thus play important roles in the physiology of these microorganisms. PMID:24375133
A novel method for detection of H9N2 influenza viruses by an aptamer-real time-PCR.
Hmila, Issam; Wongphatcharachai, Manoosak; Laamiri, Nacira; Aouini, Rim; Marnissi, Boutheina; Arbi, Marwa; Sreevatsan, Srinand; Ghram, Abdeljelil
2017-05-01
The H9N2 influenza subtype has emerged in Tunisia, causing epidemics in poultry and resulting in major economic losses. The viruses have acquired new mutations in their hemagglutinin and neuraminidase proteins, suggesting a potential to directly infect humans. Effective surveillance tools should be implemented to help prevent potential spillover of the virus across species. We have developed a highly sensitive real-time immuno-polymerase chain reaction (RT-I-PCR) method for detecting H9N2 virus. The assay applies aptamers as ligands to capture and detect the virus. First, a panel of specific ssDNA aptamers was selected via a one-step, high-stringency protocol. Next, the panel of selected aptamers was characterized for affinity and specificity to H9N2 virus. The aptamer showing the highest binding affinity to the virus was used as a ligand to develop a highly sensitive sandwich aptamer I-PCR. A 3-log increase in analytical sensitivity was achieved compared to a routinely used ELISA antigen test, highlighting the potential of this approach to detect very low levels of virus particles. The test was validated using clinical samples and constitutes a rapid and label-free platform, opening a new avenue for the development of aptamer-based viability sensing for a variety of microorganisms of economic importance in Tunisia and surrounding regions. Copyright © 2017 Elsevier B.V. All rights reserved.
Bomfim, Maria Rosa Quaresma; Koury, Matilde Cota
2006-12-20
We evaluated the use of low-stringency single specific primer PCR (LSSP-PCR) for genetically typing Leptospira directly from urine samples of cattle with clinical suspicion of leptospirosis. Urine samples obtained from 40 cattle with clinical suspicion of leptospirosis were amplified by specific PCR using the following primers: Internal 1/Internal 2 and G1/G2. The internal primers were designed from the gene sequence of the outer membrane lipoprotein Lip32 from Leptospira kirschneri, strain RM52. The PCR products amplified with these two pairs of primers, of approximately 497 and 285 bp, respectively, were subsequently used as templates for LSSP-PCR analysis. The genetic signatures from the leptospires present in the urine samples allowed us to make a preliminary identification of the leptospires by comparing the LSSP-PCR profiles obtained directly from urine samples with those from reference leptospires. The LSSP-PCR profiles obtained with the Internal 1 primer or with the G1 primer allowed the grouping of the leptospires into serogroups. LSSP-PCR was found to be a useful and sensitive approach capable of identifying leptospires directly from biological samples without the need for prior bacterial isolation. In conclusion, the LSSP-PCR technique may be helpful in discriminating serogroups of Leptospira from different animal reservoirs, since the early identification of carrier animals and information on the shedding state are crucial to prevent the spread of leptospiral infection to other animals and humans.
Alvarenga, Janaína Sousa Campos; Ligeiro, Carla Maia; Gontijo, Célia Maria Ferreira; Cortes, Sofia; Campino, Lenea; Vago, Annamaria Ravara; Melo, Maria Norma
2012-01-01
Background: Visceral leishmaniasis (VL) caused by species from the Leishmania donovani complex is the most severe form of the disease, lethal if untreated. VL caused by Leishmania infantum is a zoonosis with an increasing number of human cases and millions of dogs infected in the Old and the New World. In this study, L. infantum (syn. L. chagasi) strains were isolated from human and canine VL cases. The strains were obtained from endemic areas of Brazil and Portugal, and their genetic polymorphism was ascertained using the LSSP-PCR (Low-Stringency Single Specific Primer PCR) technique for analyzing the kinetoplast DNA (kDNA) minicircle hypervariable region. Principal Findings: kDNA genetic signatures obtained by minicircle LSSP-PCR analysis of forty L. infantum strains allowed the grouping of strains into several clades. Furthermore, LSSP-PCR profiles of L. infantum subpopulations were closely related to the host of origin (human or canine). To our knowledge this is the first study to use this technique to compare genetic polymorphisms among strains of L. infantum originating from both the Old and the New World. Conclusions: LSSP-PCR profiles obtained by analysis of the L. infantum kDNA hypervariable region of parasites isolated from human cases and infected dogs from Brazil and Portugal exhibited a genetic correlation among isolates from the same reservoir, human or canine. However, no association was detected between the kDNA signatures and the geographical origin of the L. infantum strains. PMID:22912862
Space debris detection in optical image sequences.
Xi, Jiangbo; Wen, Desheng; Ersoy, Okan K; Yi, Hongwei; Yao, Dalei; Song, Zongxi; Xi, Shaobo
2016-10-01
We present a high-accuracy, low false-alarm rate, and low computational-cost methodology for removing stars and noise and detecting space debris with low signal-to-noise ratio (SNR) in optical image sequences. First, time-index filtering and bright star intensity enhancement are implemented to remove stars and noise effectively. Then, a multistage quasi-hypothesis-testing method is proposed to detect the pieces of space debris with continuous and discontinuous trajectories. For this purpose, a time-index image is defined and generated. Experimental results show that the proposed method can detect space debris effectively without any false alarms. When the SNR is higher than or equal to 1.5, the detection probability can reach 100%, and when the SNR is as low as 1.3, 1.2, and 1, it can still achieve 99%, 97%, and 85% detection probabilities, respectively. Additionally, two large sets of image sequences are tested to show that the proposed method performs stably and effectively.
ERIC Educational Resources Information Center
Eow, Yee Leng; Ali, Wan Zah bte Wan; Mahmud, Rosnaini bt.; Baki, Roselan
2010-01-01
Creativity is an important entity in developing human capital while computer games are the current generation's contemporary tool. This study focused on the teaching of computer games development in order to enhance the creative perception of secondary school children. The study applied randomised subjects, with control group experimental design,…
Let Me Share a Secret with You! Teaching with Computers.
ERIC Educational Resources Information Center
de Vasconcelos, Maria
The author describes her experiences teaching a computer-enhanced Modern Poetry course. The author argues that using computers enhances the concept of the classroom as learning community. It was the author's experience that students' postings on the discussion board created an atmosphere that encouraged student involvement, as opposed to the…
ERIC Educational Resources Information Center
McAndrews, Gina M.; Mullen, Russell E.; Chadwick, Scott A.
2005-01-01
Multi-media learning tools were developed to enhance student learning for an introductory agronomy course at Iowa State University. During fall 2002, the new interactive computer program, called Computer Interactive Multimedia Program for Learning Enhancement (CIMPLE) was incorporated into the teaching, learning, and assessment processes of the…
Shin, Hee Jeong; Kim, Song Soo; Lee, Jae-Hwan; Park, Jae-Hyeong; Jeong, Jin-Ok; Jin, Seon Ah; Shin, Byung Seok; Shin, Kyung-Sook; Ahn, Moonsang
2016-06-01
To evaluate the feasibility of low-concentration contrast medium (CM) for vascular enhancement, image quality, and radiation dose in computed tomography aortography (CTA) using a combined low-tube-voltage and iterative reconstruction (IR) technique. Ninety subjects underwent dual-source CT (DSCT) operating in dual-source, high-pitch mode. DSCT scans were performed using both high-concentration CM (Group A, n = 50; Iomeprol 400) and low-concentration CM (Group B, n = 40; Iodixanol 270). Group A was scanned using a reference tube potential of 120 kVp and 120 reference mAs under automatic exposure control with IR. Group B was scanned using low tube voltage (80 kVp, or 100 kVp if body mass index ≥25 kg/m(2)) at a fixed current of 150 mAs, along with IR. Images of the two groups were compared regarding attenuation, image noise, signal-to-noise ratio (SNR), contrast-to-noise ratio (CNR), iodine load, and radiation dose at various locations along the CTA. Compared with Group A, Group B showed significantly greater average attenuation (454.73 ± 86.66 vs. 515.96 ± 101.55 HU), SNR (25.28 ± 4.34 vs. 31.29 ± 4.58), and CNR (21.83 ± 4.20 vs. 27.55 ± 4.81), and significantly lower image noise (18.76 ± 2.19 vs. 17.48 ± 3.34) (all Ps < 0.05). Homogeneous contrast enhancement from the ascending thoracic aorta to the infrarenal abdominal aorta was significantly superior in Group B (P < 0.05). The combination of low-concentration CM and low tube voltage with IR is a feasible technique, providing sufficient contrast enhancement and image quality.
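The quoted SNR and CNR follow the conventional CTA definitions sketched below; the muscle ROI value used here is a hypothetical illustration, since the study's exact ROI protocol is not given in the abstract.

```python
# Conventional CTA definitions: SNR = ROI mean / noise SD,
# CNR = (aortic ROI - muscle ROI) / noise SD.
# Aortic values are the Group B averages quoted above; the muscle value is
# hypothetical. The published SNR (31.29) averages per-location ratios, so
# this single-ratio estimate differs slightly.
attenuation_hu = 515.96   # mean aortic attenuation (HU)
noise_hu = 17.48          # image noise, SD within the ROI (HU)
muscle_hu = 50.0          # hypothetical muscle ROI attenuation (HU)

snr = attenuation_hu / noise_hu
cnr = (attenuation_hu - muscle_hu) / noise_hu
print(f"SNR = {snr:.1f}, CNR = {cnr:.1f}")   # SNR = 29.5, CNR = 26.7
```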
Fog-computing concept usage as means to enhance information and control system reliability
NASA Astrophysics Data System (ADS)
Melnik, E. V.; Klimenko, A. B.; Ivanov, D. Ya
2018-05-01
This paper focuses on the reliability of information and control systems (ICS). The authors propose using elements of the fog-computing concept to enhance the reliability function. The key idea of fog computing is to shift computations to the fog layer of the network, and thus to decrease the workload of the communication environment and data processing components. In an ICS, workload can likewise be distributed among sensors, actuators and network infrastructure facilities near the sources of data. The authors simulated typical workload distribution situations for the “traditional” ICS architecture and for one using elements of the fog-computing concept. The paper contains the models, selected simulation results and conclusions about the prospects of fog computing as a means to enhance ICS reliability.
NASA Technical Reports Server (NTRS)
Harper, Richard E.; Babikyan, Carol A.; Butler, Bryan P.; Clasen, Robert J.; Harris, Chris H.; Lala, Jaynarayan H.; Masotto, Thomas K.; Nagle, Gail A.; Prizant, Mark J.; Treadwell, Steven
1994-01-01
The Army Avionics Research and Development Activity (AVRADA) is pursuing programs that would enable effective and efficient management of the large amounts of situational data that occur during tactical rotorcraft missions. The Computer Aided Low Altitude Night Helicopter Flight Program has identified automated Terrain Following/Terrain Avoidance, Nap of the Earth (TF/TA, NOE) operation as a key enabling technology for advanced tactical rotorcraft to enhance mission survivability and mission effectiveness. The processing of critical information at low altitudes with short reaction times is life-critical and mission-critical, necessitating an ultra-reliable, high-throughput computing platform for dependable flight control, fusion of sensor data, route planning, near-field/far-field navigation, and obstacle avoidance operations. To address these needs, the Army Fault Tolerant Architecture (AFTA) is being designed and developed. This computer system is based upon the Fault Tolerant Parallel Processor (FTPP) developed by the Charles Stark Draper Laboratory (CSDL). AFTA is a hard real-time, Byzantine fault-tolerant parallel processor programmed in the Ada language. This document describes the results of the Detailed Design (Phases 2 and 3 of a 3-year project) of the AFTA development. It contains detailed descriptions of the program objectives, the TF/TA NOE application requirements, the architecture, the hardware design, the operating system design, system performance measurements, and analytical models.
The microcomputer in the dental office: a new diagnostic aid.
van der Stelt, P F
1985-06-01
The first computer applications in the dental office were based upon standard accountancy procedures. Recently, more and more computer applications have become available to meet the specific requirements of dental practice. This implies not only business procedures, but also facilities to store patient records in the system and retrieve them easily. Another development concerns the automatic calculation of diagnostic data such as those provided in cephalometric analysis. Furthermore, growth and surgical results in the craniofacial area can be predicted by computerized extrapolation. Computers have been useful in obtaining the patient's anamnestic data objectively and for the making of decisions based on such data. Computer-aided instruction systems have been developed for undergraduate students to bridge the gap between textbook and patient interaction without the risks inherent in the latter. Radiology will undergo substantial changes as a result of the application of electronic imaging devices instead of the conventional radiographic films. Computer-assisted electronic imaging will enable image processing, image enhancement, pattern recognition and data transmission for consultation and storage purposes. Image processing techniques will increase image quality whilst still allowing low-dose systems. Standardization of software and system configuration and the development of 'user friendly' programs is the major concern for the near future.
ERIC Educational Resources Information Center
Kim, SugHee; Chung, KwangSik; Yu, HeonChang
2013-01-01
The purpose of this paper is to propose a training program for creative problem solving based on computer programming. The proposed program will encourage students to solve real-life problems through a creative thinking spiral related to cognitive skills with computer programming. With the goal of enhancing digital fluency through this proposed…
Transient Faults in Computer Systems
NASA Technical Reports Server (NTRS)
Masson, Gerald M.
1993-01-01
A powerful technique particularly appropriate for the detection of errors caused by transient faults in computer systems was developed. The technique can be implemented in either software or hardware; the research conducted thus far primarily considered software implementations. The error detection technique developed has the distinct advantage of having provably complete coverage of all errors caused by transient faults that affect the output produced by the execution of a program. In other words, the technique does not have to be tuned to a particular error model to enhance error coverage. Also, the correctness of the technique can be formally verified. The technique uses time and software redundancy. The foundation for an effective, low-overhead, software-based certification trail approach to real-time error detection resulting from transient fault phenomena was developed.
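The classic textbook instance of a certification trail is sorting: the primary execution emits the sorting permutation as its trail, and a second, simpler execution checks the output against that trail. The sketch below is illustrative of the idea only, not the paper's exact scheme.

```python
def sort_with_trail(xs):
    """Primary execution: sort and emit a certification trail (the permutation)."""
    trail = sorted(range(len(xs)), key=lambda i: xs[i])
    return [xs[i] for i in trail], trail

def check_with_trail(xs, result, trail):
    """Secondary execution: verify the result using the trail. A transient
    fault in the primary run makes at least one of these checks fail."""
    return (sorted(trail) == list(range(len(xs)))                 # a permutation
            and result == [xs[i] for i in trail]                  # matches trail
            and all(a <= b for a, b in zip(result, result[1:])))  # in order

data = [3, 1, 2]
out, trail = sort_with_trail(data)
assert check_with_trail(data, out, trail)
```

The time-and-software redundancy mentioned above is visible here: the computation runs twice, but the second run is a cheap check guided by the trail rather than a full recomputation.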
Overlapped Fourier coding for optical aberration removal
Horstmeyer, Roarke; Ou, Xiaoze; Chung, Jaebum; Zheng, Guoan; Yang, Changhuei
2014-01-01
We present an imaging procedure that simultaneously optimizes a camera’s resolution and retrieves a sample’s phase over a sequence of snapshots. The technique, termed overlapped Fourier coding (OFC), first digitally pans a small aperture across a camera’s pupil plane with a spatial light modulator. At each aperture location, a unique image is acquired. The OFC algorithm then fuses these low-resolution images into a full-resolution estimate of the complex optical field incident upon the detector. Simultaneously, the algorithm utilizes redundancies within the acquired dataset to computationally estimate and remove unknown optical aberrations and system misalignments via simulated annealing. The result is an imaging system that can computationally overcome its optical imperfections to offer enhanced resolution, at the expense of taking multiple snapshots over time. PMID:25321982
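A minimal sketch of the OFC forward model under simplifying assumptions (a scalar complex field and a circular aperture); the iterative reconstruction and the simulated-annealing aberration estimate are omitted.

```python
import numpy as np

def pupil_image(field, cx, cy, radius):
    """Forward model of one OFC snapshot: pan a small circular aperture across
    the pupil (Fourier) plane and record the resulting low-res intensity."""
    F = np.fft.fftshift(np.fft.fft2(field))        # pupil-plane spectrum
    ny, nx = field.shape
    y, x = np.ogrid[:ny, :nx]
    mask = (x - cx)**2 + (y - cy)**2 <= radius**2  # panned aperture
    return np.abs(np.fft.ifft2(np.fft.ifftshift(F * mask)))**2

# Toy usage with a stand-in field
field = np.random.default_rng(1).standard_normal((64, 64))
img = pupil_image(field, cx=32, cy=32, radius=8)
```

Because neighboring aperture positions overlap, the snapshots share redundant spectral content, and it is this redundancy that lets the recovery algorithm pin down both the phase and the aberrations.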
Bhinder, Bhavneet; Antczak, Christophe; Ramirez, Christina N.; Shum, David; Liu-Sullivan, Nancy; Radu, Constantin; Frattini, Mark G.
2013-01-01
RNA interference technology is becoming an integral tool for target discovery and validation. With perhaps the exception of only a few studies published using arrayed short hairpin RNA (shRNA) libraries, most reports have used either pooled siRNA or shRNA libraries, or arrayed siRNA libraries. For this purpose, we developed a workflow and performed an arrayed genome-scale shRNA lethality screen against the TRC1 library in HeLa cells. The resulting targets would be a valuable resource of candidates toward a better understanding of cellular homeostasis. Using a high-stringency hit nomination method encompassing criteria of at least three active hairpins per gene and filtering for potential off-target effects (OTEs), referred to as the Bhinder–Djaballah analysis method, we identified 1,252 lethal and 6 rescuer gene candidates, knockdown of which resulted in severe cell death or enhanced growth, respectively. Cross-referencing individual hairpins with the TRC1 validated clone database, 239 of the 1,252 candidates were deemed independently validated with at least three validated clones. Through our systematic OTE analysis, we identified 31 microRNAs (miRNAs) in lethal and 2 in rescuer genes, all having a seed heptamer mimic in the corresponding shRNA hairpins and likely causing the OTEs observed in our screen, perhaps unraveling a previously unknown plausible essentiality of these miRNAs in cellular viability. Taken together, we report a methodology for performing large-scale arrayed shRNA screens, a comprehensive analysis method to nominate high-confidence hits, and a performance assessment of the TRC1 library highlighting the intracellular inefficiencies of shRNA processing in general. PMID:23198867
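A toy sketch of the two nomination criteria named above, written with pandas; the seed-region position and the column names are illustrative, not the published pipeline.

```python
import pandas as pd

def nominate_hits(df, seed_blacklist, min_active=3):
    """Nominate genes with >= min_active active hairpins, after dropping
    hairpins whose 7-mer seed matches a known miRNA seed (potential OTE).
    df columns: gene, hairpin_seq, active (bool)."""
    seed = df["hairpin_seq"].str[1:8]          # illustrative heptamer position
    clean = df[~seed.isin(seed_blacklist)]     # remove seed-mimic hairpins
    per_gene = clean.groupby("gene")["active"].sum()
    return per_gene[per_gene >= min_active].index.tolist()

df = pd.DataFrame({
    "gene": ["A", "A", "A", "A", "B"],
    "hairpin_seq": ["GACCTGTAA", "GTTCAGGAA", "GAAGCTTCA",
                    "GTGGACCAT", "GACCTGTAA"],
    "active": [True, True, True, True, True],
})
print(nominate_hits(df, seed_blacklist={"ACCTGTA"}))  # ['A']
```

Requiring several independent active hairpins per gene is what suppresses single-hairpin artifacts, while the seed filter removes hits that may act through the miRNA pathway rather than the intended target.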
Rodriguez, A Noel; DeWitt, Peter; Fisher, Jennifer; Broadfoot, Kirsten; Hurt, K Joseph
2016-06-11
To characterize the psychometric properties of a novel Obstetric Communication Assessment Tool (OCAT) in a pilot study of standardized difficult OB communication scenarios appropriate for undergraduate medical evaluation. We developed and piloted four challenging OB Standardized Patient (SP) scenarios in a sample of twenty-one third year OB/GYN clerkship students: Religious Beliefs (RB), Angry Father (AF), Maternal Smoking (MS), and Intimate Partner Violence (IPV). Five trained Standardized Patient Reviewers (SPRs) independently scored twenty-four randomized video-recorded encounters using the OCAT. Cronbach's alpha and Intraclass Correlation Coefficient-2 (ICC-2) were used to estimate internal consistency (IC) and inter-rater reliability (IRR), respectively. Systematic variation in reviewer scoring was assessed using the Stuart-Maxwell test. IC was acceptable to excellent with Cronbach's alpha values (and 95% Confidence Intervals [CI]): RB 0.91 (0.86, 0.95), AF 0.76 (0.62, 0.87), MS 0.91 (0.86, 0.95), and IPV 0.94 (0.91, 0.97). IRR was unacceptable to poor with ICC-2 values: RB 0.46 (0.40, 0.53), AF 0.48 (0.41, 0.54), MS 0.52 (0.45, 0.58), and IPV 0.67 (0.61, 0.72). Stuart-Maxwell analysis indicated systematic differences in reviewer stringency. Our initial characterization of the OCAT demonstrates important issues in communications assessment. We identify scoring inconsistencies due to differences in SPR rigor that require enhanced training to improve assessment reliability. We outline a rational process for initial communication tool validation that may be useful in undergraduate curriculum development, and acknowledge that rigorous validation of OCAT training and implementation is needed to create a valuable OB communication assessment tool.
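Cronbach's alpha, the internal-consistency statistic quoted above, has a simple closed form; a minimal sketch with toy ratings follows (the OCAT item set itself is not reproduced, and the ICC-2 computation is omitted).

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_subjects x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# Toy ratings: 4 encounters x 3 items
print(round(cronbach_alpha([[4, 5, 4], [2, 2, 3], [5, 5, 5], [3, 4, 3]]), 2))
# 0.94
```

High alpha with low ICC-2, as reported here, means the items hang together within each reviewer while the reviewers disagree with one another, which is exactly the stringency problem the authors describe.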
Wilkinson, Katie; Harris, Samantha; Gaur, Prashant; Haile, Askale; Armour, Rosalind; Teramura, Gayle; Delaney, Meghan
2012-02-01
Sickle cell disease (SCD) patients have dissimilar red blood cell (RBC) phenotypes compared to the primarily Caucasian blood donor base due, in part, to underlying complex Rh and silenced Duffy expression. Gene array-based technology offers high-throughput antigen typing of blood donors and can identify patients with altered genotypes. The purpose of the study was to ascertain if RBC components drawn from predominantly Caucasian donors could provide highly antigen-matched products for molecularly typed SCD patients. SCD patients were genotyped by a molecular array (HEA Beadchip, BioArray Solutions). The extended antigen phenotype (C, c, E, e, K, k, Jk(a) , Jk(b) , Fy(a) , Fy(b) , S, s) was used to query the inventory using different matching algorithms; the resulting number of products was recorded. A mean of 96.2 RBC products was available for each patient at basic-level, 34 at mid-level, and 16.3 at high-level stringency. The number of negative antigens correlated negatively with the number of available products. The Duffy silencing mutation in the promoter region (67T>C) (GATA) was found in 96.5% of patients. Allowing Fy(b+) products for patients with GATA increased the number of available products by up to 180%, although it does not ensure prevention of Duffy antibodies in all patients. This feasibility study provides evidence that centers with primarily Caucasian donors may be able to provide highly antigen-matched products. Knowledge of the GATA status expands the inventory of antigen-matched products. Further work is needed to determine the most clinically appropriate match level for SCD patients. © 2012 American Association of Blood Banks.
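A sketch of the inventory-query idea, with hypothetical antigen tiers; the study's exact basic/mid/high antigen subsets are not specified in the abstract.

```python
# Hypothetical stringency tiers over the extended phenotype;
# phenotypes are dicts of antigen -> bool (antigen-positive).
BASIC = ("C", "c", "E", "e", "K")
MID   = BASIC + ("Jka", "Jkb", "Fya", "S", "s")
HIGH  = MID + ("Fyb", "k")

def matched_units(patient, inventory, antigens):
    """Return units negative for every antigen the patient lacks,
    over the given stringency tier."""
    avoid = [ag for ag in antigens if not patient.get(ag, False)]
    return [u for u in inventory
            if not any(u.get(ag, False) for ag in avoid)]

patient = {"C": True, "c": True, "e": True, "Jka": True,
           "S": True, "s": True, "k": True}          # lacks E, K, Jkb, Fya, Fyb
unit = dict(patient, Fyb=True)                        # differs only at Fyb
print(len(matched_units(patient, [unit], BASIC)),     # 1: matched at basic tier
      len(matched_units(patient, [unit], HIGH)))      # 0: excluded at high tier
```

The example reproduces the pattern in the data above: each added antigen constraint shrinks the pool of eligible units, which is why the mean count falls from 96.2 to 16.3 across tiers.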
Sonnaert, Maarten; Kerckhofs, Greet; Papantoniou, Ioannis; Van Vlierberghe, Sandra; Boterberg, Veerle; Dubruel, Peter; Luyten, Frank P; Schrooten, Jan; Geris, Liesbet
2015-01-01
To progress the fields of tissue engineering (TE) and regenerative medicine, development of quantitative methods for non-invasive three dimensional characterization of engineered constructs (i.e. cells/tissue combined with scaffolds) becomes essential. In this study, we have defined the most optimal staining conditions for contrast-enhanced nanofocus computed tomography for three dimensional visualization and quantitative analysis of in vitro engineered neo-tissue (i.e. extracellular matrix containing cells) in perfusion bioreactor-developed Ti6Al4V constructs. A fractional factorial 'design of experiments' approach was used to elucidate the influence of the staining time and concentration of two contrast agents (Hexabrix and phosphotungstic acid) and the neo-tissue volume on the image contrast and dataset quality. Additionally, the neo-tissue shrinkage that was induced by phosphotungstic acid staining was quantified to determine the operating window within which this contrast agent can be accurately applied. For Hexabrix the staining concentration was the main parameter influencing image contrast and dataset quality. Using phosphotungstic acid the staining concentration had a significant influence on the image contrast while both staining concentration and neo-tissue volume had an influence on the dataset quality. The use of high concentrations of phosphotungstic acid did however introduce significant shrinkage of the neo-tissue indicating that, despite sub-optimal image contrast, low concentrations of this staining agent should be used to enable quantitative analysis. To conclude, design of experiments allowed us to define the most optimal staining conditions for contrast-enhanced nanofocus computed tomography to be used as a routine screening tool of neo-tissue formation in Ti6Al4V constructs, transforming it into a robust three dimensional quality control methodology.
Local delivery of FTY720 accelerates cranial allograft incorporation and bone formation.
Huang, Cynthia; Das, Anusuya; Barker, Daniel; Tholpady, Sunil; Wang, Tiffany; Cui, Quanjun; Ogle, Roy; Botchwey, Edward
2012-03-01
Endogenous stem cell recruitment to the site of skeletal injury is key to enhanced osseous remodeling and neovascularization. To this end, this study utilized a novel bone allograft coating of poly(lactic-co-glycolic acid) (PLAGA) to sustain the release of FTY720, a selective agonist for sphingosine 1-phosphate (S1P) receptors, from calvarial allografts. Uncoated allografts, vehicle-coated, low dose FTY720 in PLAGA (1:200 w:w) and high dose FTY720 in PLAGA (1:40) were implanted into critical size calvarial bone defects. The ability of local FTY720 delivery to promote angiogenesis, maximize osteoinductivity and improve allograft incorporation by recruitment of bone progenitor cells from surrounding soft tissues and microcirculation was evaluated. FTY720 bioactivity after encapsulation and release was confirmed with sphingosine kinase 2 assays. HPLC-MS quantified about 50% loaded FTY720 release of the total encapsulated drug (4.5 μg) after 5 days. Following 2 weeks of defect healing, FTY720 delivery led to statistically significant increases in bone volumes compared to controls, with total bone volume increases for uncoated, coated, low FTY720 and high FTY720 of 5.98, 3.38, 7.2 and 8.9 mm(3), respectively. The rate and extent of enhanced bone growth persisted through week 4 but, by week 8, increases in bone formation in FTY720 groups were no longer statistically significant. However, micro-computed tomography (microCT) of contrast enhanced vascular ingrowth (MICROFIL®) and histological analysis showed enhanced integration as well as directed bone growth in both high and low dose FTY720 groups compared to controls.
Day, Eric Anthony; Boatman, Paul R; Kowollik, Vanessa; Espejo, Jazmine; McEntire, Lauren E; Sherwin, Rachel E
2007-12-01
This study examined the effectiveness of collaborative training for individuals with low pretraining self-efficacy versus individuals with high pretraining self-efficacy regarding the acquisition of a complex skill that involved strong cognitive and psychomotor demands. Despite support for collaborative learning from the educational literature and the similarities between collaborative learning and interventions designed to remediate low self-efficacy, no research has addressed how self-efficacy and collaborative learning interact in contexts concerning complex skills and human-machine interactions. One hundred fifty-five young male adults trained either individually or collaboratively with a more experienced partner on a complex computer task that simulated the demands of a dynamic aviation environment. Participants also completed a task-specific measure of self-efficacy before, during, and after training. Collaborative training enhanced skill acquisition significantly more for individuals with low pretraining self-efficacy than for individuals with high pretraining self-efficacy. However, collaborative training did not bring the skill acquisition levels of those persons with low pretraining self-efficacy to the levels found for persons with high pretraining self-efficacy. Moreover, tests of mediation suggested that collaborative training may have enhanced appropriate skill development strategies without actually raising self-efficacy. Although collaborative training can facilitate the skill acquisition process for trainees with low self-efficacy, future research is needed that examines how the negative effects of low pretraining self-efficacy on complex skill acquisition can be more fully remediated. The differential effects of collaborative training as a function of self-efficacy highlight the importance of person analysis and tailoring training to meet differing trainee needs.
Radar range data signal enhancement tracker
NASA Technical Reports Server (NTRS)
1975-01-01
The design, fabrication, and performance characteristics of two digital data signal enhancement filters, which are capable of being inserted between the Space Shuttle navigation sensor outputs and the guidance computer, are described. Commonality of interfaces has been stressed so that the filters may be evaluated through operation with simulated sensors or with actual prototype sensor hardware. The filters will provide both a smoothed range and a smoothed range rate output. Different conceptual approaches are utilized for each filter. The first filter combines a low pass nonrecursive filter for range with a cascaded simple average smoother for range rate. Filter number two is a tracking filter which is capable of following transient data of the type encountered during burn periods. A test simulator was also designed which generates typical shuttle navigation sensor data.
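A rough illustration of the first filter's concept, in Python: a nonrecursive (FIR) low-pass filter smooths range and a simple average smooths range rate. The window lengths and the synthetic range track are invented for the sketch and are not the flight design values.

    import numpy as np

    def fir_lowpass(x, taps=15):
        # Nonrecursive low-pass: convolve with a normalized Hamming window.
        h = np.hamming(taps)
        h /= h.sum()
        return np.convolve(x, h, mode="same")

    def moving_average(x, n=8):
        # Simple average smoother.
        return np.convolve(x, np.ones(n) / n, mode="same")

    t = np.linspace(0.0, 10.0, 501)                    # s
    true_range = 1000.0 - 30.0 * t                     # closing at 30 m/s
    meas_range = true_range + np.random.normal(0.0, 5.0, t.size)

    smooth_range = fir_lowpass(meas_range)             # smoothed range
    range_rate = np.gradient(meas_range, t)            # raw range rate
    smooth_rate = moving_average(range_rate)           # smoothed range rate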
NASA Astrophysics Data System (ADS)
Kaganskiy, Arsenty; Fischbach, Sarah; Strittmatter, André; Rodt, Sven; Heindel, Tobias; Reitzenstein, Stephan
2018-04-01
We report on the realization of scalable single-photon sources (SPSs) based on single site-controlled quantum dots (SCQDs) and deterministically fabricated microlenses. The fabrication process comprises the buried-stressor growth technique complemented with low-temperature in-situ electron-beam lithography for the integration of SCQDs into microlens structures with high yield and high alignment accuracy. The microlens approach leads to a broadband enhancement of the photon-extraction efficiency of up to (21 ± 2)% and a high suppression of multi-photon events with g(2)(τ = 0) < 0.06 without background subtraction. The demonstrated combination of site-controlled growth of QDs and in-situ electron-beam lithography is relevant for arrays of efficient SPSs, which can be applied in photonic quantum circuits and advanced quantum computation schemes.
Moore, William; Chaya, Yair; Chaudhry, Ammar; Depasquale, Britney; Glass, Samantha; Lee, Susan; Shin, James; Mikhail, George; Bhattacharji, Priya; Kim, Bong; Bilfinger, Thomas
2015-01-01
Stereotactic ablative radiotherapy (SABR) offers a curative treatment for lung cancer in patients who are marginal surgical candidates. However, unlike traditional surgery, the lung cancer remains in place after treatment. Thus, imaging follow-up for evaluation of recurrence is of paramount importance. In this retrospectively designed, Institutional Review Board-approved study, follow-up contrast-enhanced computed tomography (CT) exams were performed on sixty-one patients to evaluate the enhancement pattern in the ablation zone at 1, 3, 6, and 12 months after SABR. Eleven patients had recurrence within the ablation zone after SABR. The postcontrast enhancement in the recurrence group showed a washin and washout phenomenon, whereas the radiation-induced lung injury group showed continuous enhancement suggesting an inflammatory process. The textural features of ablation zone enhancement and perfusion demonstrated by CT nodule enhancement may allow early differentiation of recurrence from radiation-induced lung injury in patients after SABR for primary lung cancer.
Educational Technology: Best Practices from America's Schools.
ERIC Educational Resources Information Center
Bozeman, William C.; Baumbach, Donna J.
This book begins with an overview of computer technology concepts, including computer system configurations, computer communications, and software. Instructional computer applications are then discussed; topics include computer-assisted instruction, computer-managed instruction, computer-enhanced instruction, LOGO, authoring programs, presentation…
Learning a No-Reference Quality Assessment Model of Enhanced Images With Big Data.
Gu, Ke; Tao, Dacheng; Qiao, Jun-Fei; Lin, Weisi
2018-04-01
In this paper, we investigate the problem of image quality assessment (IQA) and enhancement via machine learning. This issue has long attracted a wide range of attention in the computational intelligence and image processing communities, since, for many practical applications, e.g., object detection and recognition, raw images usually need to be appropriately enhanced to raise the visual quality (e.g., visibility and contrast). In fact, proper enhancement can noticeably improve the quality of input images, even beyond that of the originally captured images, which are generally thought to be of the best quality. This paper makes two main contributions. The first is a new no-reference (NR) IQA model. Given an image, our quality measure first extracts 17 features through analysis of contrast, sharpness, brightness and more, and then yields a measure of visual quality using a regression module, which is learned with big-data training samples that are much bigger than the size of relevant image data sets. The results of experiments on nine data sets validate the superiority and efficiency of our blind metric compared with typical state-of-the-art full-reference, reduced-reference and NR IQA methods. The second contribution is a robust image enhancement framework established based on quality optimization. For an input image, guided by the proposed NR-IQA measure, we conduct histogram modification to successively rectify image brightness and contrast to a proper level. Thorough tests demonstrate that our framework can well enhance natural images, low-contrast images, low-light images, and dehazed images. The source code will be released at https://sites.google.com/site/guke198701/publications.
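A hedged sketch of the two-stage idea in Python: extract a few simple quality features and regress them onto opinion scores. The three features and the random-forest regressor stand in for the paper's 17 features and its learned regression module; the training data here are synthetic.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def features(img):
        gy, gx = np.gradient(img.astype(float))
        return np.array([
            img.std(),                      # global contrast
            np.mean(np.hypot(gx, gy)),      # sharpness proxy (gradient energy)
            img.mean(),                     # brightness
        ])

    rng = np.random.default_rng(0)
    train_imgs = [rng.integers(0, 256, (64, 64)) for _ in range(100)]
    train_mos = rng.uniform(1, 5, 100)      # dummy mean opinion scores

    X = np.stack([features(im) for im in train_imgs])
    model = RandomForestRegressor(n_estimators=50).fit(X, train_mos)
    quality = model.predict(features(train_imgs[0])[None, :])

The same predicted score can then drive histogram modification, accepting the brightness/contrast setting that maximizes the blind quality measure.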
NASA Astrophysics Data System (ADS)
Rodriguez, Sarah L.; Lehman, Kathleen
2017-10-01
This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.
[Voluntary alpha power increase training for cognition enhancement].
Alekseeva, M V; Balioz, N V; Muravleva, K B; Sapina, E V; Bazanova, O M
2012-01-01
To assess the impact of biofeedback training that simultaneously increases alpha EEG power and decreases EMG on alpha activity and cognitive function, 27 healthy male subjects (18-34 years) were investigated before and after 10 sessions of training to voluntarily increase alpha power in the individual upper alpha range. To discriminate the role of the feedback, accuracy in a conceptual span task, fluency and flexibility in an alternative-uses task, and alpha-activity indices were compared between real (14 participants) and sham (13 participants) biofeedback groups. The follow-up effect of training was studied one month after the training sessions. Results showed that alpha biofeedback training enhanced fluency and accuracy in cognitive performance and increased the resting frequency, width and power of the individual upper alpha range only in participants with low baseline alpha frequency. Mock biofeedback increased resting alpha power only in participants with high baseline resting alpha frequency and did not change cognitive performance. Biofeedback training eliminated the alpha power decrease in response to an arithmetic task in participants with both high and low alpha frequency, and this effect persisted over the month; mock biofeedback training had no such effect. It can be concluded that alpha-EEG-EMG biofeedback has applications not only for cognition enhancement, but also for prognostic aims in clinical practice and brain-computer interface technology.
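The quantity fed back in such training is essentially band power in an individual upper-alpha range. A minimal Python sketch, with an assumed sampling rate and assumed band edges (the study derives the band per participant):

    import numpy as np
    from scipy.signal import welch

    fs = 250.0                                   # Hz, assumed sampling rate
    t = np.arange(0, 60 * fs) / fs               # one minute of signal
    eeg = np.random.normal(0, 10, t.size) + 8 * np.sin(2 * np.pi * 10.5 * t)

    f, pxx = welch(eeg, fs=fs, nperseg=int(4 * fs))
    band = (f >= 10.0) & (f <= 12.0)             # individual upper alpha (assumed)
    alpha_power = np.trapz(pxx[band], f[band])   # feedback quantity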
NASA Astrophysics Data System (ADS)
Sridharan, S.; Sathishkumar, S.; Raghunath, K.
2009-01-01
Rayleigh lidar observations of temperature structure and gravity wave activity were carried out at Gadanki (13.5° N, 79.2° E) during January-February 2006. A major stratospheric warming event occurred at high latitude during the end of January and early February. There was a sudden enhancement in the stratopause temperature over Gadanki coinciding with the date of onset of the major stratospheric warming event which occurred at high latitudes. The temperature enhancement persisted even after the end of the high latitude major warming event. During the same time, the UKMO (United Kingdom Meteorological Office) zonal mean temperature showed a similar warming episode at 10° N and cooling episode at 60° N around the region of stratopause. This could be due to ascending (descending) motions at high (low) latitudes above the critical level of planetary waves, where there was no planetary wave flux. The time variation of the gravity wave potential energy computed from the temperature perturbations over Gadanki shows variabilities at planetary wave periods, suggesting a non-linear interaction between gravity waves and planetary waves. The space-time analysis of UKMO temperature data at high and low latitudes shows the presence of similar periodicities of planetary wave of zonal wavenumber 1.
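The gravity wave potential energy mentioned above is conventionally estimated from lidar temperatures as Ep = (1/2)(g/N)^2 <(T'/T̄)^2>, with N^2 = (g/T̄)(dT̄/dz + g/cp). A Python sketch with an illustrative synthetic profile (constants and wave amplitude are assumptions, not the Gadanki data):

    import numpy as np

    g, cp = 9.5, 1004.0                          # m s^-2, J kg^-1 K^-1 (assumed)
    z = np.linspace(30e3, 60e3, 301)             # altitude, m
    Tbar = 240.0 + 0.002 * (z - 30e3)            # background temperature, K
    Tpert = 1.5 * np.sin(2 * np.pi * z / 5e3)    # wave perturbation T', K

    N2 = (g / Tbar) * (np.gradient(Tbar, z) + g / cp)      # Brunt-Vaisala freq.^2
    Ep = 0.5 * np.mean((g**2 / N2) * (Tpert / Tbar) ** 2)  # J kg^-1, layer mean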
NASA Astrophysics Data System (ADS)
Hao, Qiushi; Zhang, Xin; Wang, Yan; Shen, Yi; Makis, Viliam
2018-07-01
Acoustic emission (AE) technology is sensitive to subliminal rail defects; however, strong wheel-rail rolling contact noise under high-speed conditions has severely impeded the detection of rail defects using traditional denoising methods. In this context, the paper develops an adaptive detection method for rail cracks, which combines multiresolution analysis with an improved adaptive line enhancer (ALE). To obtain detailed multiresolution information about transient crack signals at low computational cost, a lifting scheme-based undecimated wavelet packet transform is adopted. To capture the impulsive character of crack signals, a Shannon entropy-improved ALE is proposed as the signal enhancing approach, where Shannon entropy is introduced to improve the cost function. A rail defect detection plan based on the proposed method for high-speed conditions is then put forward. Theoretical analysis and experimental verification demonstrate that the proposed method has superior performance in enhancing the rail defect AE signal and reducing the strong background noise, offering an effective multiresolution approach for rail defect detection under high-speed, strong-noise conditions.
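As a baseline for the ALE component, a classic normalized-LMS adaptive line enhancer in Python: the delayed input predicts the correlated rolling-noise part, and the prediction error retains the transient bursts. The paper's contribution, replacing the squared-error cost with a Shannon-entropy-based one, is not reproduced here, and the signal parameters are invented.

    import numpy as np

    def ale_nlms(x, order=32, delay=5, mu=0.5, eps=1e-8):
        w = np.zeros(order)
        out = np.zeros_like(x)
        for n in range(order + delay, len(x)):
            u = x[n - delay - order:n - delay][::-1]   # delayed tap vector
            e = x[n] - w @ u                           # error keeps transients
            w += mu * e * u / (eps + u @ u)            # NLMS update
            out[n] = e
        return out

    fs = 100_000
    t = np.arange(0, 0.05, 1 / fs)
    tone = np.sin(2 * np.pi * 3000 * t)                # toy wheel-rail tone
    burst = np.exp(-((t - 0.025) * 2000) ** 2) * np.sin(2 * np.pi * 15000 * t)
    enhanced = ale_nlms(tone + 0.3 * burst)            # burst-enhanced output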
Molecular dynamics simulations of the dielectric properties of fructose aqueous solutions
NASA Astrophysics Data System (ADS)
Sonoda, Milton T.; Elola, M. Dolores; Skaf, Munir S.
2016-10-01
The static dielectric permittivity and dielectric relaxation properties of fructose aqueous solutions of different concentrations ranging from 1.0 to 4.0 mol l⁻¹ are investigated by means of molecular dynamics simulations. The contributions from intra- and interspecies molecular correlations were computed individually for both the static and frequency-dependent dielectric properties, and the results were compared with the available experimental data. Simulation results in the time and frequency domains were analyzed and indicate that the presence of fructose has little effect on the position of the fast, high-frequency (>500 cm⁻¹) components of the dielectric response spectrum. The low-frequency (<0.1 cm⁻¹) components, however, are markedly influenced by sugar concentration. Our analysis indicates that fructose-fructose and fructose-water interactions strongly affect the rotational-diffusion regime of molecular motions in the solutions. Increasing fructose concentration not only enhances sugar-sugar and sugar-water low-frequency contributions to the dielectric loss spectrum but also slows down the reorientational dynamics of water molecules. These results are consistent with previous computer simulations carried out for other disaccharide aqueous solutions.
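For the static part, MD studies of this kind typically use the dipole-fluctuation formula (under conducting boundary conditions), eps = eps_inf + (<M²> - <M>²)/(3 eps0 V kB T). A Python sketch on a synthetic total-dipole series (box volume and dipole statistics are toy values):

    import numpy as np

    kB, eps0 = 1.380649e-23, 8.8541878128e-12
    T, V, eps_inf = 298.15, 8.0e-26, 1.0             # K, m^3 (toy box), assumed

    M = np.random.normal(0.0, 2.0e-28, (10000, 3))   # total dipole M(t), C m
    fluct = (M**2).sum(axis=1).mean() - np.square(M.mean(axis=0)).sum()
    eps_static = eps_inf + fluct / (3.0 * eps0 * V * kB * T)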
Assistive technology applied to education of students with visual impairment.
Alves, Cássia Cristiane de Freitas; Monteiro, Gelse Beatriz Martins; Rabello, Suzana; Gasparetto, Maria Elisabete Rodrigues Freire; de Carvalho, Keila Monteiro
2009-08-01
This study verified the application of assistive technology, especially information technology, in the education of blind and low-vision students, from the perceptions of their teachers. It was a descriptive survey study in public schools in three municipalities of the state of São Paulo, Brazil. The sample comprised 134 teachers. According to the teachers' opinions, there are differences in the specificities and applicability of assistive technology for blind and low-vision students, for whom specific computer programs are important. Information technology enhances reading and writing skills, as well as communication with the world on an equal basis, thereby improving quality of life and facilitating the learning process. The main reason for not using information technology is the lack of planning courses. The main requirements for the use of information technology in schools are enough computers for all students, advisers to help teachers, and pedagogical support. Assistive technology is applied to the education of students with visual impairment; however, teachers indicate the need for infrastructure and pedagogical support. Information technology is an important tool in the inclusion process and can promote independence and autonomy of students with visual impairment.
Vectorlike particles, Z′ and Yukawa unification in F-theory inspired E6
NASA Astrophysics Data System (ADS)
Karozas, Athanasios; Leontaris, George K.; Shafi, Qaisar
2018-03-01
We explore the low energy implications of an F-theory inspired E6 model whose breaking yields, in addition to the MSSM gauge symmetry, a Z′ gauge boson associated with a U(1) symmetry broken at the TeV scale. The zero mode spectrum of the effective low energy theory is derived from the decomposition of the 27 and 27-bar representations of E6, and we parametrise their multiplicities in terms of a minimum number of flux parameters. We perform a two-loop renormalisation group analysis of the gauge and Yukawa couplings of the effective theory and estimate lower bounds on the new vectorlike particles predicted in the model. We compute the third generation Yukawa couplings in an F-theory context assuming an E8 point of enhancement and express our results in terms of the local flux densities associated with the gauge symmetry breaking. We find that their values are compatible with the ones computed by the renormalisation group equations, and we identify points in the parameter space of the flux densities where the t-b-τ Yukawa couplings unify.
Computer-Supported Feedback Message Tailoring for Healthcare Providers in Malawi: Proof-of-Concept.
Landis-Lewis, Zach; Douglas, Gerald P; Hochheiser, Harry; Kam, Matthew; Gadabu, Oliver; Bwanali, Mwatha; Jacobson, Rebecca S
2015-01-01
Although performance feedback has the potential to help clinicians improve the quality and safety of care, healthcare organizations generally lack knowledge about how this guidance is best provided. In low-resource settings, tools for theory-informed feedback tailoring may enhance limited clinical supervision resources. Our objectives were to establish proof-of-concept for computer-supported feedback message tailoring in Malawi, Africa. We conducted this research in five stages: clinical performance measurement, modeling the influence of feedback on antiretroviral therapy (ART) performance, creating a rule-based message tailoring process, generating tailored messages for recipients, and finally analysis of performance and message tailoring data. We retrospectively generated tailored messages for 7,448 monthly performance reports from 11 ART clinics. We found that tailored feedback could be routinely generated for four guideline-based performance indicators, with 35% of reports having messages prioritized to optimize the effect of feedback. This research establishes proof-of-concept for a novel approach to improving the use of clinical performance feedback in low-resource settings and suggests possible directions for prospective evaluations comparing alternative designs of feedback messages.
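A minimal sketch of the rule-based tailoring step in Python; the indicator names, thresholds, and message texts are invented for illustration, not the study's actual rules:

    RULES = [
        # (priority, predicate, message template)
        (1, lambda r: r["height_recorded_pct"] < 80,
            "Height was recorded for only {height_recorded_pct}% of visits."),
        (2, lambda r: r["missed_appointments_pct"] > 10,
            "{missed_appointments_pct}% of patients missed appointments."),
    ]

    def tailor(report):
        # Evaluate each rule against a monthly performance report and
        # return the triggered messages, highest priority first.
        hits = [(p, msg.format(**report))
                for p, pred, msg in RULES if pred(report)]
        return [m for _, m in sorted(hits)]

    print(tailor({"height_recorded_pct": 72, "missed_appointments_pct": 4}))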
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
NASA Technical Reports Server (NTRS)
Deiwert, George S.
1993-01-01
Critical issues concerning the modeling of low density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools, and the activity in the NASA Ames Research Center's Aerothermodynamics Branch is described. Inherent in the process is a strong synergism between ground test and real gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flowfield simulation codes are discussed. These models were partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions is sparse and reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high enthalpy flow facilities, such as shock tubes and ballistic ranges.
Robust stereo matching with trinary cross color census and triple image-based refinements
NASA Astrophysics Data System (ADS)
Chang, Ting-An; Lu, Xiao; Yang, Jar-Ferr
2017-12-01
For future 3D TV broadcasting systems and navigation applications, it is necessary to have accurate stereo matching which can precisely estimate the depth map from two separated cameras. In this paper, we first suggest a trinary cross color (TCC) census transform, which helps to achieve an accurate disparity raw matching cost with low computational cost. A two-pass cost aggregation (TPCA) is formed to compute the aggregation cost, and the disparity map can then be obtained by a range winner-take-all (RWTA) process and a white hole filling procedure. To further enhance the accuracy performance, a range left-right checking (RLRC) method is proposed to classify the results as correct, mismatched, or occluded pixels. Then, image-based refinements for the mismatched and occluded pixels are proposed to refine the classified errors. Finally, image-based cross voting and a median filter are employed to complete the fine depth estimation. Experimental results show that the proposed semi-global stereo matching system achieves considerably accurate disparity maps with reasonable computation cost.
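A simplified illustration of the census idea in Python: each neighbor is coded 0/1/2 according to whether it is darker than, similar to, or brighter than the window center, and the matching cost is the Hamming distance between codes. The actual TCC transform operates on color channels with a cross pattern; this grayscale version and its parameters are assumptions.

    import numpy as np

    def trinary_census(img, win=5, eps=4):
        r = win // 2
        pad = np.pad(img.astype(int), r, mode="edge")
        h, w = img.shape
        codes = []
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                if dy == 0 and dx == 0:
                    continue
                nb = pad[r + dy:r + dy + h, r + dx:r + dx + w]
                codes.append(np.where(nb > img + eps, 2,
                              np.where(nb < img - eps, 0, 1)))
        return np.stack(codes, axis=-1)        # (H, W, win*win - 1)

    def matching_cost(cl, cr):
        return (cl != cr).sum(axis=-1)         # Hamming distance per pixel

    left = np.random.randint(0, 256, (32, 32))
    right = np.roll(left, 2, axis=1)           # toy 2-pixel disparity
    cost = matching_cost(trinary_census(left),
                         trinary_census(np.roll(right, -2, axis=1)))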
Issues and approach to develop validated analysis tools for hypersonic flows: One perspective
NASA Technical Reports Server (NTRS)
Deiwert, George S.
1992-01-01
Critical issues concerning the modeling of low-density hypervelocity flows where thermochemical nonequilibrium effects are pronounced are discussed. Emphasis is on the development of validated analysis tools. A description of the activity in the Ames Research Center's Aerothermodynamics Branch is also given. Inherent in the process is a strong synergism between ground test and real-gas computational fluid dynamics (CFD). Approaches to develop and/or enhance phenomenological models and incorporate them into computational flow-field simulation codes are discussed. These models have been partially validated with experimental data for flows where the gas temperature is raised (compressive flows). Expanding flows, where temperatures drop, however, exhibit somewhat different behavior. Experimental data for these expanding flow conditions are sparse; reliance must be made on intuition and guidance from computational chemistry to model transport processes under these conditions. Ground-based experimental studies used to provide necessary data for model development and validation are described. Included are the performance characteristics of high-enthalpy flow facilities, such as shock tubes and ballistic ranges.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wally Melnitchouk; John Tjon
We compute the corrections from two-photon and \gamma-Z exchange in parity-violating elastic electron-proton scattering, used to extract the strange form factors of the proton. We use a hadronic formalism that successfully reconciled the earlier discrepancy in the proton's electric to magnetic form factor ratio, suitably extended to the weak sector. Implementing realistic electroweak form factors, we find effects of the order 2-3% at Q^2 <~ 0.1 GeV^2, which are largest at backward angles, and have a strong Q^2 dependence at low Q^2. Two-boson contributions to the weak axial current are found to be enhanced at low Q^2 and for forward angles. We provide corrections at kinematics relevant for recent and upcoming parity-violating experiments.
Genomic clones for human cholinesterase
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kott, M.; Venta, P.J.; Larsen, J.
1987-05-01
A human genomic library was prepared from peripheral white blood cells from a single donor by inserting an MboI partial digest into the BamHI poly-linker sites of EMBL3. This library was screened using an oligolabeled human cholinesterase cDNA probe over 700 bp long. The latter probe was obtained from a human basal ganglia cDNA library. Of approximately 2 million clones screened under high stringency conditions, several positive clones were identified; two have been plaque purified. One of these clones has been partially mapped using restriction enzymes known to cut within the coded region of the cDNA for human serum cholinesterase. Hybridization of the fragments and their sizes are as expected if the genomic clone is cholinesterase. Sequencing of the DNA fragments in M13 is in progress to verify the identity of the clone and the location of introns.
Mosquito Infectivity and Parasitemia after Controlled Human Malaria Infection.
Walk, Jona; van Gemert, Geert-Jan; Graumans, Wouter; Sauerwein, Robert; Bijker, Else M
2018-04-30
Controlled Human Malaria Infection (CHMI) has become an increasingly important tool for the evaluation of drugs and vaccines. Controlled Human Malaria Infection has been demonstrated to be a reproducible model; however, there is some variability in time to onset of parasitemia between volunteers and studies. At our center, mosquitoes infected with Plasmodium falciparum by membrane feeding have variable and high salivary gland sporozoite load (mean 78,415; range 26,500-160,500). To determine whether this load influences parasitemia after CHMI, we analyzed data from 13 studies. We found no correlation between the sporozoite load of a mosquito batch and time to parasitemia or parasite density of first-wave parasitemia. These findings support the use of infected mosquito bite as a reproducible means of inducing P. falciparum infection and suggest that within this range, salivary gland sporozoite load does not influence the stringency of a CHMI.
Computer-assisted Behavioral Therapy and Contingency Management for Cannabis Use Disorder
Budney, Alan J.; Stanger, Catherine; Tilford, J. Mick; Scherer, Emily; Brown, Pamela C.; Li, Zhongze; Li, Zhigang; Walker, Denise
2015-01-01
Computer-assisted behavioral treatments hold promise for enhancing access to and reducing costs of treatments for substance use disorders. This study assessed the efficacy of a computer-assisted version of an efficacious, multicomponent treatment for cannabis use disorders (CUD), i.e., motivational enhancement therapy, cognitive-behavioral therapy, and abstinence-based contingency-management (MET/CBT/CM). An initial cost comparison was also performed. Seventy-five adult participants, 59% African Americans, seeking treatment for CUD received either MET only (BRIEF), therapist-delivered MET/CBT/CM (THERAPIST), or computer-delivered MET/CBT/CM (COMPUTER). During treatment, the THERAPIST and COMPUTER conditions engendered longer durations of continuous cannabis abstinence than BRIEF (p < .05), but did not differ from each other. Abstinence rates and reduction in days of use over time were maintained in COMPUTER at least as well as in THERAPIST. COMPUTER averaged approximately $130 (p < .05) less per case than THERAPIST in therapist costs, which offset most of the costs of CM. Results add to promising findings that illustrate the potential for computer-assisted delivery methods to enhance access to evidence-based care, reduce costs, and possibly improve outcomes. The observed maintenance effects and the cost findings require replication in larger clinical trials. PMID:25938629
DEEP: a general computational framework for predicting enhancers
Kleftogiannis, Dimitrios; Kalnis, Panos; Bajic, Vladimir B.
2015-01-01
Transcription regulation in multicellular eukaryotes is orchestrated by a number of DNA functional elements located at gene regulatory regions. Some regulatory regions (e.g. enhancers) are located far away from the gene they affect. Identification of distal regulatory elements is a challenge for bioinformatics research. Although existing methodologies have increased the number of computationally predicted enhancers, performance inconsistency of computational models across different cell lines, class imbalance within the learning sets and ad hoc rules for selecting enhancer candidates for supervised learning are some key questions that require further examination. In this study we developed DEEP, a novel ensemble prediction framework. DEEP integrates three components with diverse characteristics that streamline the analysis of enhancers' properties in a great variety of cellular conditions. In our method we train many individual classification models that we combine to classify DNA regions as enhancers or non-enhancers. DEEP uses features derived from histone modification marks or attributes coming from sequence characteristics. Experimental results indicate that DEEP performs better than four state-of-the-art methods on the ENCODE data. We report the first computational enhancer prediction results on FANTOM5 data, where DEEP achieves 90.2% accuracy and 90% geometric mean (GM) of specificity and sensitivity across 36 different tissues. We further present results derived using in vivo-derived enhancer data from the VISTA database. DEEP-VISTA, when tested on an independent test set, achieved GM of 80.1% and accuracy of 89.64%. The DEEP framework is publicly available at http://cbrc.kaust.edu.sa/deep/. PMID:25378307
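The ensemble idea can be illustrated with a small Python sketch: heterogeneous classifiers trained on histone-mark-like features vote on enhancer versus non-enhancer. The components, features, and labels below are synthetic stand-ins, not DEEP's actual models.

    import numpy as np
    from sklearn.ensemble import VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 10))                 # e.g., histone-mark signals
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # toy enhancer labels

    ensemble = VotingClassifier([
        ("svm", SVC(probability=True)),
        ("lr", LogisticRegression(max_iter=1000)),
        ("tree", DecisionTreeClassifier(max_depth=5)),
    ], voting="soft").fit(X[:400], y[:400])

    print("held-out accuracy:", ensemble.score(X[400:], y[400:]))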
Robotic space simulation integration of vision algorithms into an orbital operations simulation
NASA Technical Reports Server (NTRS)
Bochsler, Daniel C.
1987-01-01
In order to successfully plan and analyze future space activities, computer-based simulations of activities in low earth orbit will be required to model and integrate vision and robotic operations with vehicle dynamics and proximity operations procedures. The orbital operations simulation (OOS) is configured and enhanced as a testbed for robotic space operations. Vision integration algorithms are being developed in three areas: preprocessing, recognition, and attitude/attitude rates. The vision program (Rice University) was modified for use in the OOS. Systems integration testing is now in progress.
Solutions for medical databases optimal exploitation.
Branescu, I; Purcarea, V L; Dobrescu, R
2014-03-15
The paper discusses methods for applying OLAP techniques to multidimensional databases that leverage the existing performance-enhancing technique known as practical pre-aggregation, making this technique relevant to a much wider range of medical applications as logistic support for data warehousing. The transformations have low computational complexity in practice and may be implemented using standard relational database technology. The paper also describes how to integrate the transformed hierarchies into current OLAP systems, transparently to the user, and proposes a flexible, "multimodel" federated system for extending OLAP querying to external object databases.
Patterns of contrast enhancement in the brain and meninges.
Smirniotopoulos, James G; Murphy, Frances M; Rushing, Elizabeth J; Rees, John H; Schroeder, Jason W
2007-01-01
Contrast material enhancement for cross-sectional imaging has been used since the mid 1970s for computed tomography and the mid 1980s for magnetic resonance imaging. Knowledge of the patterns and mechanisms of contrast enhancement facilitate radiologic differential diagnosis. Brain and spinal cord enhancement is related to both intravascular and extravascular contrast material. Extraaxial enhancing lesions include primary neoplasms (meningioma), granulomatous disease (sarcoid), and metastases (which often manifest as mass lesions). Linear pachymeningeal (dura-arachnoid) enhancement occurs after surgery and with spontaneous intracranial hypotension. Leptomeningeal (pia-arachnoid) enhancement is present in meningitis and meningoencephalitis. Superficial gyral enhancement is seen after reperfusion in cerebral ischemia, during the healing phase of cerebral infarction, and with encephalitis. Nodular subcortical lesions are typical for hematogenous dissemination and may be neoplastic (metastases) or infectious (septic emboli). Deeper lesions may form rings or affect the ventricular margins. Ring enhancement that is smooth and thin is typical of an organizing abscess, whereas thick irregular rings suggest a necrotic neoplasm. Some low-grade neoplasms are "fluid-secreting," and they may form heterogeneously enhancing lesions with an incomplete ring sign as well as the classic "cyst-with-nodule" morphology. Demyelinating lesions, including both classic multiple sclerosis and tumefactive demyelination, may also create an open ring or incomplete ring sign. Thick and irregular periventricular enhancement is typical for primary central nervous system lymphoma. Thin enhancement of the ventricular margin occurs with infectious ependymitis. Understanding the classic patterns of lesion enhancement--and the radiologic-pathologic mechanisms that produce them--can improve image assessment and differential diagnosis.
Comparison of subpixel image registration algorithms
NASA Astrophysics Data System (ADS)
Boye, R. R.; Nelson, C. L.
2009-02-01
Research into the use of multiframe superresolution has led to the development of algorithms for providing images with enhanced resolution using several lower resolution copies. An integral component of these algorithms is the determination of the registration of each of the low resolution images to a reference image. Without this information, no resolution enhancement can be attained. We have endeavored to find a suitable method for registering severely undersampled images by comparing several approaches. To test the algorithms, an ideal image is input to a simulated image formation program, creating several undersampled images with known geometric transformations. The registration algorithms are then applied to the set of low resolution images and the estimated registration parameters compared to the actual values. This investigation is limited to monochromatic images (extension to color images is not difficult) and only considers global geometric transformations. Each registration approach will be reviewed and evaluated with respect to the accuracy of the estimated registration parameters as well as the computational complexity required. In addition, the effects of image content, specifically spatial frequency content, as well as the immunity of the registration algorithms to noise will be discussed.
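One common member of this algorithm family is phase correlation with a parabolic peak fit; a Python sketch for a pure global translation follows (noise handling and rotation/scale are omitted, and the test shift is integer for simplicity):

    import numpy as np

    def phase_correlation_shift(a, b):
        # Cross-power spectrum; the peak of its inverse FFT sits at the
        # shift of b relative to a.
        F = np.fft.fft2(b) * np.conj(np.fft.fft2(a))
        corr = np.fft.ifft2(F / (np.abs(F) + 1e-12)).real
        py, px = np.unravel_index(corr.argmax(), corr.shape)

        def subpixel(c, p, n):
            # Parabolic interpolation around the integer peak.
            l, cen, r = c[(p - 1) % n], c[p], c[(p + 1) % n]
            return p + 0.5 * (l - r) / (l - 2 * cen + r + 1e-12)

        sy = subpixel(corr[:, px], py, corr.shape[0])
        sx = subpixel(corr[py, :], px, corr.shape[1])
        wrap = lambda s, n: s - n if s > n / 2 else s
        return wrap(sy, corr.shape[0]), wrap(sx, corr.shape[1])

    a = np.random.rand(64, 64)
    b = np.roll(a, (3, 5), axis=(0, 1))
    print(phase_correlation_shift(a, b))       # approximately (3, 5)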
High job control enhances vagal recovery in media work.
Lindholm, Harri; Sinisalo, Juha; Ahlberg, Jari; Jahkola, Antti; Partinen, Markku; Hublin, Christer; Savolainen, Aslak
2009-12-01
Job strain has been linked to increased risk of cardiovascular diseases. In modern media work, time pressures, rapidly changing situations, computer work and irregular working hours are common. Heart rate variability (HRV) has been widely used to monitor sympathovagal balance, and autonomic imbalance may play an additive role in the development of cardiovascular diseases. The aim was to study the effects of work demands and job control on autonomic nervous system recovery among media personnel. From a cross-sectional postal survey of the employees of the Finnish Broadcasting Company (n = 874), three age cohorts (n = 132) were randomly selected for an analysis of HRV in 24 h electrocardiography recordings. In the middle-aged group, those who experienced high job control had significantly better vagal recovery than those with low or moderate control (P < 0.01). Among young and ageing employees, job control was not associated with autonomic recovery. High job control over work, rather than low demands, seemed to enhance autonomic recovery in middle-aged media workers. This was independent of poor health habits such as smoking, physical inactivity or alcohol consumption.
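Vagally mediated HRV is commonly summarized by RMSSD over successive normal-to-normal intervals; a minimal Python sketch on synthetic RR data (artifact and ectopy handling, and the study's specific recovery windows, are omitted):

    import numpy as np

    rr_ms = 800 + 40 * np.random.randn(5000)        # RR intervals, ms
    rmssd = np.sqrt(np.mean(np.diff(rr_ms) ** 2))   # vagal index, ms
    sdnn = rr_ms.std()                              # overall variability, ms
    print(f"RMSSD = {rmssd:.1f} ms, SDNN = {sdnn:.1f} ms")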
Implementing Computer Technology in the Rehabilitation Process.
ERIC Educational Resources Information Center
McCollum, Paul S., Ed.; Chan, Fong, Ed.
1985-01-01
This special issue contains seven articles, addressing rehabilitation in the information age, computer-assisted rehabilitation services, computer technology in rehabilitation counseling, computer-assisted career exploration and vocational decision making, computer-assisted assessment, computer enhanced employment opportunities for persons with…
Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blyth, Taylor S.; Avramova, Maria
The research described in this PhD thesis contributes to the development of efficient methods for utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interest using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects, which are introduced by the presence of spacer grids in light water reactor (LWR) cores, are dissected into four corresponding basic building processes, and corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in subchannel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.
Development and Implementation of CFD-Informed Models for the Advanced Subchannel Code CTF
NASA Astrophysics Data System (ADS)
Blyth, Taylor S.
The research described in this PhD thesis contributes to the development of efficient methods for utilization of high-fidelity models and codes to inform low-fidelity models and codes in the area of nuclear reactor core thermal-hydraulics. The objective is to increase the accuracy of predictions of quantities of interest using high-fidelity CFD models while preserving the efficiency of low-fidelity subchannel core calculations. An original methodology named Physics-based Approach for High-to-Low Model Information has been further developed and tested. The overall physical phenomena and corresponding localized effects, which are introduced by the presence of spacer grids in light water reactor (LWR) cores, are dissected into four corresponding basic building processes, and corresponding models are informed using high-fidelity CFD codes. These models are a spacer grid-directed cross-flow model, a grid-enhanced turbulent mixing model, a heat transfer enhancement model, and a spacer grid pressure loss model. The localized CFD models are developed and tested using the CFD code STAR-CCM+, and the corresponding global model development and testing in subchannel formulation is performed in the thermal-hydraulic subchannel code CTF. The improved CTF simulations utilize data files derived from CFD STAR-CCM+ simulation results covering the spacer grid design desired for inclusion in the CTF calculation. The current implementation of these models is examined and possibilities for improvement and further development are suggested. The validation experimental database is extended by including the OECD/NRC PSBT benchmark data. The outcome is an enhanced accuracy of CTF predictions while preserving the computational efficiency of a low-fidelity subchannel code.
Aşkar, Petek; Altun, Arif; Cangöz, Banu; Cevik, Vildan; Kaya, Galip; Türksoy, Hasan
2012-04-01
The purpose of this study was to assess whether a computerized battery of neuropsychological tests could produce similar results as the conventional forms. Comparisons on 77 volunteer undergraduates were carried out with two neuropsychological tests: Line Orientation Test and Enhanced Cued Recall Test. Firstly, students were assigned randomly across the test medium (paper-and-pencil versus computerized). Secondly, the groups were given the same test in the other medium after a 30-day interval between tests. Results showed that the Enhanced Cued Recall Test-Computer-based did not correlate with the Enhanced Cued Recall Test-Paper-and-pencil results. Line Orientation Test-Computer-based scores, on the other hand, did correlate significantly with the Line Orientation Test-Paper-and-pencil version. In both tests, scores were higher on paper-and-pencil tests compared to computer-based tests. Total score difference between modalities was statistically significant for both Enhanced Cued Recall Tests and for the Line Orientation Test. In both computer-based tests, it took less time for participants to complete the tests.
Enhanced Fan Noise Modeling for Turbofan Engines
NASA Technical Reports Server (NTRS)
Krejsa, Eugene A.; Stone, James R.
2014-01-01
This report describes work by consultants to Diversitech Inc. for the NASA Glenn Research Center (GRC) to revise the fan noise prediction procedure based on fan noise data obtained in the 9- by 15 Foot Low-Speed Wind Tunnel at GRC. The purpose of this task is to begin development of an enhanced, analytical, more physics-based, fan noise prediction method applicable to commercial turbofan propulsion systems. The method is to be suitable for programming into a computational model for eventual incorporation into NASA's current aircraft system noise prediction computer codes. The scope of this task is in alignment with the mission of the Propulsion 21 research effort conducted by the coalition of NASA, state government, industry, and academia to develop aeropropulsion technologies. A model for fan noise prediction was developed based on measured noise levels for the R4 rotor with several outlet guide vane variations and three fan exhaust areas. The model predicts the complete fan noise spectrum, including broadband noise, tones, and for supersonic tip speeds, combination tones. Both spectra and directivity are predicted. Good agreement with data was achieved for all fan geometries. Comparisons with data from a second fan, the ADP fan, also showed good agreement.
The technological obsolescence of the Brazilian electronic ballot box.
Camargo, Carlos Rogério; Faust, Richard; Merino, Eugênio; Stefani, Clarissa
2012-01-01
The electronic ballot box has played a significant role in the consolidation of the Brazilian political process. It has enabled the extinction of the paper ballot as the medium for the elector's vote as well as for vote-counting processes. It is also widely known that election automation has decisively contributed to the legitimization of Brazilian democracy, removing doubts about the winning candidates. In 1995, when the project was conceived, it represented a compromise solution, balancing technical efficiency and cost trade-offs. However, this architecture currently limits ergonomic enhancements to the device's operation, transportation, maintenance and storage. Nowadays, devices of reduced dimensions based on a novel computational architecture, namely tablet computers, are available in the market, emphasizing usability, autonomy, portability, security and low power consumption. Therefore, the proposal under discussion is the replacement of the current electronic ballot boxes with tablet-based devices to improve the ergonomic aspects of the Brazilian voting process. These devices offer a plethora of integrated features (e.g., capacitive touchscreen, speakers, microphone) that enable highly usable and simple user interfaces, in addition to enhancing the voting process security mechanisms. Finally, their operating systems' features allow for the development of highly secure applications, suitable to the requirements of a voting process.
Code of Federal Regulations, 2011 CFR
2011-01-01
... “technology” controlled by 9A011, 9E003.a.1, or 9E003.a.3.a; and (H) Controlled by 9D002, specially designed... computationally access computers that have been enhanced by “electronic assemblies”, which have been exported or reexported under License Exception GOV and have been used to enhance such computers by aggregation of...
DURIP: High Performance Computing in Biomathematics Applications
2017-05-10
The goal of this award was to enhance the capabilities of the Department of Applied Mathematics and Statistics (AMS) at the University of California, Santa Cruz (UCSC) to conduct research and research-related education in areas of high performance computing in biomathematics applications.
Code of Federal Regulations, 2013 CFR
2013-01-01
... “technology” controlled by 9A011, 9E003.a.1, or 9E003.a.3.a; and (H) Controlled by 9D002, specially designed... computationally access computers that have been enhanced by “electronic assemblies”, which have been exported or reexported under License Exception GOV and have been used to enhance such computers by aggregation of...
Code of Federal Regulations, 2012 CFR
2012-01-01
... “technology” controlled by 9A011, 9E003.a.1, or 9E003.a.3.a; and (H) Controlled by 9D002, specially designed... computationally access computers that have been enhanced by “electronic assemblies”, which have been exported or reexported under License Exception GOV and have been used to enhance such computers by aggregation of...
Kanumuri, Prathima; Ganai, Sabha; Wohaibi, Eyad M.; Bush, Ronald W.; Grow, Daniel R.
2008-01-01
Background: The study aim was to compare the effectiveness of virtual reality and computer-enhanced video-scopic training devices for training novice surgeons in complex laparoscopic skills. Methods: Third-year medical students received instruction on laparoscopic intracorporeal suturing and knot tying and then underwent a pretraining assessment of the task using a live porcine model. Students were then randomized to objectives-based training on either the virtual reality (n=8) or computer-enhanced (n=8) training devices for 4 weeks, after which the assessment was repeated. Results: Posttraining performance had improved compared with pretraining performance in both task completion rate (94% versus 18%; P<0.001*) and time [181±58 (SD) versus 292±24*]. Performance of the 2 groups was comparable before and after training. Of the subjects, 88% thought that haptic cues were important in simulators. Both groups agreed that their respective training systems were effective teaching tools, but computer-enhanced device trainees were more likely to rate their training as representative of reality (P<0.01). Conclusions: Training on virtual reality and computer-enhanced devices had equivalent effects on skills improvement in novices. Despite the perception that haptic feedback is important in laparoscopic simulation training, its absence in the virtual reality device did not impede acquisition of skill. PMID:18765042
A QRS Detection and R Point Recognition Method for Wearable Single-Lead ECG Devices.
Chen, Chieh-Li; Chuang, Chun-Te
2017-08-26
In new-generation wearable electrocardiogram (ECG) systems, signal processing with low power consumption is required to transmit data when dangerous rhythms are detected and to record signals when abnormal rhythms are detected. The QRS complex is a combination of three of the graphic deflections seen on a typical ECG. This study proposes a real-time QRS detection and R point recognition method with low computational complexity that maintains high accuracy. The enhancement of QRS segments and suppression of P and T waves are carried out by the proposed ECG signal transformation, which also eliminates baseline wandering. In this study, the QRS fiducial point is determined based on the detected crests and troughs of the transformed signal. Subsequently, the R point can be recognized based on four QRS waveform templates, and a preliminary heart rhythm classification can be achieved at the same time. The performance of the proposed approach is demonstrated using the benchmark MIT-BIH Arrhythmia Database, where the QRS detection sensitivity (Se) and positive prediction (+P) are 99.82% and 99.81%, respectively. The results reveal the approach's advantage of low computational complexity, as well as the feasibility of real-time application on a mobile phone and an embedded system.
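A hedged sketch of such a processing chain in Python, using a classic derivative-square-integrate enhancement (not the paper's exact transform) followed by crest picking with a refractory period; the threshold and window constants are illustrative:

    import numpy as np

    def enhance(ecg, fs):
        d = np.diff(ecg, prepend=ecg[0])        # derivative removes baseline drift
        win = max(1, int(0.12 * fs))            # ~QRS-width integration window
        return np.convolve(d ** 2, np.ones(win) / win, mode="same")

    def detect_r(ecg, fs):
        env = enhance(ecg, fs)
        thr = 0.4 * env.max()                   # simple fixed threshold (toy)
        refractory = int(0.25 * fs)             # 250 ms lockout
        peaks, last = [], -refractory
        for n in range(1, len(env) - 1):
            if env[n] > thr and env[n] >= env[n - 1] and env[n] > env[n + 1]:
                if n - last >= refractory:
                    peaks.append(n)
                    last = n
        return peaks

    fs = 360
    ecg = np.zeros(10 * fs)
    ecg[::int(0.8 * fs)] = 1.0                  # toy impulse "R waves", 75 bpm
    print(detect_r(ecg, fs))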
Are some CEMP-s stars the daughters of spinstars?
NASA Astrophysics Data System (ADS)
Choplin, Arthur; Hirschi, Raphael; Meynet, Georges; Ekström, Sylvia
2017-11-01
Carbon-enhanced metal-poor (CEMP)-s stars are long-lived low-mass stars with a very low iron content as well as overabundances of carbon and s-elements. Their peculiar chemical pattern is often explained by pollution from an asymptotic giant branch (AGB) star companion. Recent observations have shown that most CEMP-s stars are in binary systems, providing support to the AGB companion scenario. A few CEMP-s stars, however, appear to be single. We inspect four apparently single CEMP-s stars and discuss the possibility that they formed from the ejecta of a previous-generation massive star, referred to as the "source" star. In order to investigate this scenario, we computed low-metallicity massive-star models with and without rotation and including complete s-process nucleosynthesis. We find that non-rotating source stars cannot explain the observed abundances of any of the four CEMP-s stars. Three out of the four CEMP-s stars can be explained by a 25 M⊙ source star with v_ini ≈ 500 km s⁻¹ (a spinstar). The fourth CEMP-s star has a high Pb abundance that cannot be explained by any of the models we computed. Since spinstars and AGB stars predict different ranges of [O/Fe] and [ls/hs], these ratios could be an interesting way to further test these two scenarios.
Face recognition using tridiagonal matrix enhanced multivariance products representation
NASA Astrophysics Data System (ADS)
Özay, Evrim Korkmaz
2017-01-01
This study aims to retrieve face images from a database according to a target face image. For this purpose, Tridiagonal Matrix Enhanced Multivariance Products Representation (TMEMPR) is taken into consideration. TMEMPR is a recursive algorithm based on Enhanced Multivariance Products Representation (EMPR). TMEMPR decomposes a matrix into three components which are a matrix of left support terms, a tridiagonal matrix of weight parameters for each recursion, and a matrix of right support terms, respectively. In this sense, there is an analogy between Singular Value Decomposition (SVD) and TMEMPR. However TMEMPR is a more flexible algorithm since its initial support terms (or vectors) can be chosen as desired. Low computational complexity is another advantage of TMEMPR because the algorithm has been constructed with recursions of certain arithmetic operations without requiring any iteration. The algorithm has been trained and tested with ORL face image database with 400 different grayscale images of 40 different people. TMEMPR's performance has been compared with SVD's performance as a result.
Television, computer and portable display device use by people with central vision impairment
Woods, Russell L; Satgunam, PremNandhini
2011-01-01
Purpose: To survey the viewing experience (e.g. hours watched, difficulty) and viewing metrics (e.g. distance viewed, display size) for television (TV), computers and portable visual display devices for normally-sighted (NS) and visually impaired participants. This information may guide visual rehabilitation. Methods: The survey was administered either in person or by telephone interview to 223 participants, of whom 104 had low vision (LV, worse than 6/18, age 22 to 90y, 54 males) and 94 were NS (visual acuity 6/9 or better, age 20 to 86y, 50 males). Depending on their situation, NS participants answered up to 38 questions and LV participants answered up to a further 10 questions. Results: Many LV participants reported at least “some” difficulty watching TV (71/103), reported at least “often” having difficulty with computer displays (40/76) and extreme difficulty watching videos on handheld devices (11/16). The average daily TV viewing was slightly, but not significantly, higher for the LV participants (3.6h) than the NS (3.0h). Only 18% of LV participants used visual aids (all optical) to watch TV. Most LV participants obtained effective magnification from a reduced viewing distance for both TV and computer display. Younger LV participants also used a larger display when compared to older LV participants to obtain increased magnification. About half of the TV viewing time occurred in the absence of a companion for both the LV and the NS participants. The mean number of TVs at home reported by LV participants (2.2) was slightly but not significantly (p=0.09) higher than that reported by NS participants (2.0). LV participants were equally likely to have a computer but were significantly (p=0.004) less likely to access the internet (73/104) compared to NS participants (82/94). Most LV participants expressed an interest in image-enhancing technology for TV viewing (67/104) and for computer use (50/74), if they used a computer. Conclusion: In this study, both NS and LV participants had comparable video viewing habits. Most LV participants in our sample reported difficulty watching TV, and indicated an interest in assistive technology, such as image enhancement. As our participants reported that at least half their video viewing hours are spent alone and that there is usually more than one TV per household, this suggests that there are opportunities to use image enhancement on the TVs of LV viewers without interfering with the viewing experience of NS viewers. PMID:21410501
NASA Astrophysics Data System (ADS)
Tian, Rui; Yan, Dongpeng; Li, Chunyang; Xu, Simin; Liang, Ruizheng; Guo, Lingyan; Wei, Min; Evans, David G.; Duan, Xue
2016-05-01
Gold nanoclusters (Au NCs) as ultrasmall fluorescent nanomaterials possess discrete electronic energy and unique physicochemical properties, but suffer from relatively low quantum yield (QY), which severely affects their application in displays and imaging. To solve this conundrum and obtain highly efficient fluorescent emission, 2D exfoliated layered double hydroxide (ELDH) nanosheets were employed to localize Au NCs with a density as high as 5.44 × 10¹³ cm⁻², by virtue of the surface confinement effect of ELDH. Both experimental studies and computational simulations testify that the excited electrons of Au NCs are strongly confined by MgAl-ELDH nanosheets, which results in a largely promoted QY as well as a prolonged fluorescence lifetime (both ~7 times enhancement). In addition, the as-fabricated Au NC/ELDH hybrid material exhibits excellent imaging properties with good stability and biocompatibility in the intracellular environment. Therefore, this work provides a facile strategy to achieve highly luminescent Au NCs via surface-confined emission enhancement imposed by ultrathin inorganic nanosheets, which can potentially be used in bio-imaging and cell labelling.
Impact of fitting algorithms on errors of parameter estimates in dynamic contrast-enhanced MRI
NASA Astrophysics Data System (ADS)
Debus, C.; Floca, R.; Nörenberg, D.; Abdollahi, A.; Ingrisch, M.
2017-12-01
Parameter estimation in dynamic contrast-enhanced MRI (DCE MRI) is usually performed by non-linear least squares (NLLS) fitting of a pharmacokinetic model to a measured concentration-time curve. The two-compartment exchange model (2CXM) describes the compartments ‘plasma’ and ‘interstitial volume’ and their exchange in terms of plasma flow and capillary permeability. The model function can be defined by either a system of two coupled differential equations or a closed-form analytical solution. The aim of this study was to compare these two representations in terms of accuracy, robustness and computation speed, depending on parameter combination and temporal sampling. The impact on parameter estimation errors was investigated by fitting the 2CXM to simulated concentration-time curves. Parameter combinations representing five tissue types were used, together with two arterial input functions, a measured one and a theoretical population-based one, to generate 4D concentration images at three different temporal resolutions. Images were fitted by NLLS techniques, where the sum of squared residuals was calculated by either numeric integration with the Runge-Kutta method or by convolution. Furthermore, two example cases, a prostate carcinoma and a glioblastoma multiforme patient, were analyzed in order to investigate the validity of our findings in real patient data. The convolution approach yields improved precision and robustness of the determined parameters. Precision and stability are limited in curves with low blood flow. The model parameter v_e shows great instability and little reliability in all cases. Decreased temporal resolution results in significant errors for the differential equation approach in several curve types. The convolution excelled in computational speed by three orders of magnitude. Uncertainties in parameter estimation at low temporal resolution cannot be compensated by usage of the differential equations. Fitting with the convolution approach is superior in computational time, with better stability and accuracy at the same time.
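The closed-form route can be sketched as the AIF convolved with a biexponential impulse response, fitted by NLLS; a Python sketch follows (the AIF, the noise level, and fitting amplitudes/rates directly rather than Fp, PS, vp, ve are simplifying assumptions):

    import numpy as np
    from scipy.optimize import curve_fit

    t = np.arange(0, 300, 2.0)                    # s, assumed sampling grid
    dt = t[1] - t[0]
    aif = 5.0 * (t / 30.0) * np.exp(-t / 30.0)    # toy population AIF, mM

    def two_cxm(t, A, alpha, B, beta):
        # Biexponential impulse response convolved with the AIF.
        irf = A * np.exp(-alpha * t) + B * np.exp(-beta * t)
        return np.convolve(aif, irf)[: t.size] * dt

    true = (0.02, 0.05, 0.005, 0.002)
    data = two_cxm(t, *true) + np.random.normal(0, 0.01, t.size)
    popt, _ = curve_fit(two_cxm, t, data,
                        p0=(0.01, 0.1, 0.01, 0.01), bounds=(0, np.inf))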
Implementation of a low-cost Interim 21CFR11 compliance solution for laboratory environments.
Greene, Jack E
2003-01-01
In the recent past, compliance with 21CFR11 has become a major buzzword within the pharmaceutical and biotechnology industries. While commercial solutions exist, implementation and validation are expensive and cumbersome. Frequent implementation of new features via point releases further complicates purchasing decisions by making it difficult to weigh the risk of non-compliance against the costs of too frequent upgrades. This presentation discusses a low-cost interim solution to the problem. While this solution does not address 100% of the issues raised by 21CFR11, it does implement and validate: (1) computer system security; (2) backup and restore ability on the electronic records store; and (3) an automated audit trail mechanism that captures the date, time and user identification whenever electronic records are created, modified or deleted. When coupled with enhanced procedural controls, this solution provides an acceptable level of compliance at extremely low cost.
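As a rough illustration of the third element, an automated audit trail can live almost entirely inside the database layer. The sketch below is a minimal, hypothetical example in Python/SQLite, not the presentation's system; all table and column names are invented. Triggers capture the date, time, user identification, and action whenever a record is created, modified, or deleted.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT,"
             " modified_by TEXT)")
conn.execute("CREATE TABLE audit_trail (event_time TEXT, user_id TEXT,"
             " action TEXT, record_id INTEGER)")
for action in ("INSERT", "UPDATE", "DELETE"):
    ref = "OLD" if action == "DELETE" else "NEW"
    # For DELETE this logs the last recorded modifier, not the deleting
    # user; a real system would resolve the session user instead.
    conn.execute(f"""
        CREATE TRIGGER audit_{action.lower()} AFTER {action} ON records
        BEGIN
            INSERT INTO audit_trail
            VALUES (datetime('now'), {ref}.modified_by, '{action}', {ref}.id);
        END""")

conn.execute("INSERT INTO records VALUES (1, 'assay result', 'jsmith')")
conn.execute("UPDATE records SET payload='corrected', modified_by='jdoe'"
             " WHERE id=1")
print(conn.execute("SELECT * FROM audit_trail").fetchall())
```

Because the triggers fire inside the database engine, the trail is captured even when records are changed by tools that bypass the application code.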
High-Speed Noninvasive Eye-Tracking System
NASA Technical Reports Server (NTRS)
Talukder, Ashit; LaBaw, Clayton; Michael-Morookian, John; Monacos, Steve; Serviss, Orin
2007-01-01
The figure schematically depicts a system of electronic hardware and software that noninvasively tracks the direction of a person's gaze in real time. Like prior commercial noninvasive eye-tracking systems, this system is based on (1) illumination of an eye by a low-power infrared light-emitting diode (LED); (2) acquisition of video images of the pupil, iris, and cornea in the reflected infrared light; (3) digitization of the images; and (4) processing of the digital image data to determine the direction of gaze from the centroids of the pupil and cornea in the images. Relative to the prior commercial systems, the present system operates at much higher speed and thereby offers enhanced capability for applications that involve human-computer interactions, including typing and computer command and control by handicapped individuals, and eye-based diagnosis of physiological disorders that affect gaze responses.
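Step (4) reduces to simple image-moment arithmetic. The sketch below is a minimal illustration with a synthetic frame and invented thresholds, not the flight code: it estimates the pupil and corneal-glint centroids and returns their offset, which, after per-user calibration, is roughly proportional to gaze direction.

```python
import numpy as np

def centroid(mask):
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())

def gaze_offset(frame, pupil_thresh=40, glint_thresh=220):
    pupil = centroid(frame < pupil_thresh)    # pupil appears dark in IR
    glint = centroid(frame > glint_thresh)    # corneal reflection is bright
    return pupil[0] - glint[0], pupil[1] - glint[1]

# Synthetic 8-bit frame: gray background, dark pupil, bright corneal glint.
frame = np.full((120, 160), 128, dtype=np.uint8)
frame[50:70, 70:90] = 10      # pupil
frame[58:62, 82:86] = 250     # glint
print(gaze_offset(frame))
```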
SNP discovery by high-throughput sequencing in soybean
2010-01-01
Background With the advance of new massively parallel genotyping technologies, quantitative trait locus (QTL) fine mapping and map-based cloning have become more achievable for identifying genes for important and complex traits. Development of high-density genetic markers in the QTL regions of specific mapping populations is essential for fine mapping and map-based cloning of economically important genes. Single nucleotide polymorphisms (SNPs) are the most abundant form of genetic variation existing between the diverse genotypes that are usually used for QTL mapping studies. Massively parallel sequencing technologies (Roche GS/454, Illumina GA/Solexa, and ABI/SOLiD) have been widely applied to identify genome-wide sequence variations. However, it remains unclear whether sequence data at a low sequencing depth are enough to detect the variations existing in any QTL region of interest in a crop genome, and how to prepare sequencing samples for a complex genome such as soybean. Therefore, with the aims of identifying SNP markers in a cost-effective way for fine mapping of several QTL regions, and of testing the validation rate of putative SNPs predicted from Solexa short sequence reads at a low sequencing depth, we evaluated a pooled DNA fragment reduced representation library and SNP detection methods applied to short read sequences generated by Solexa high-throughput sequencing technology. Results A total of 39,022 putative SNPs were identified by the Illumina/Solexa sequencing system using a reduced representation DNA library of two parental lines of a mapping population. The validation rates of putative SNPs predicted with low and high stringency were 72% and 85%, respectively. One hundred sixty-four SNP markers resulting from the validation were selectively chosen to target a known QTL, increasing the marker density of the targeted region to one marker per 42 kbp. Conclusions We have demonstrated how to quickly identify large numbers of SNPs for fine mapping of QTL regions by applying massively parallel sequencing combined with genome complexity reduction techniques. This SNP discovery approach is especially efficient for targeting multiple QTL regions in the same genetic population, and it can be applied to other crops. PMID:20701770
2012-01-01
Background DNA microarrays are used both for research and for diagnostics. In research, Affymetrix arrays are commonly used for genome-wide association studies, resequencing, and gene expression analysis. These arrays provide large amounts of data, which are analyzed using statistical methods that quite often discard a large portion of the information. Most of the information that is lost comes from probes that systematically fail across chips and from batch effects. The aim of this study was to develop a comprehensive model of hybridization that predicts probe intensities for Affymetrix arrays and could provide a basis for improved microarray analysis and probe development. The first part of the model calculates probe binding affinities to all possible targets in the hybridization solution using the Langmuir isotherm. In the second part of the model we integrate details that are specific to each experiment and contribute to the differences between hybridization in solution and on the microarray. These details include fragmentation, wash stringency, temperature, salt concentration, and scanner settings. Furthermore, the model fits probe synthesis efficiency and target concentration parameters directly to the data. All the parameters used in the model have a well-established physical origin. Results For the 302 chips that were analyzed, the mean correlation between expected and observed probe intensities was 0.701, with a range of 0.55 to 0.88. All available chips were included in the analysis regardless of data quality. Our results show that batch effects arise from differences in probe synthesis, scanner settings, wash strength, and target fragmentation. We also show that probe synthesis efficiencies for different nucleotides are not uniform. Conclusions To date this is the most complete model for binding on microarrays, and the first that includes both probe synthesis efficiency and hybridization kinetics/cross-hybridization. These two factors are sequence dependent and have a large impact on probe intensity. The results presented here provide novel insight into the effect of probe synthesis errors on Affymetrix microarrays; furthermore, the algorithms developed in this work provide useful tools for analyzing the effects of cross-hybridization, probe synthesis efficiency, fragmentation, wash stringency, temperature, and salt concentration on microarray intensities. PMID:23270536
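The first model component can be made concrete in a few lines of code. The sketch below is our illustration; the constants are placeholders, not fitted values from the paper. It evaluates a Langmuir isotherm mapping a probe-target binding free energy and a target concentration to an expected fractional probe occupancy, and hence to a predicted intensity.

```python
import numpy as np

def langmuir_intensity(conc, dG, T=318.15, I_max=1.0):
    """Expected probe intensity for target concentration `conc` (mol/L) and
    probe-target binding free energy `dG` (kcal/mol) at temperature T (K)."""
    R = 1.987e-3                          # gas constant, kcal/(mol K)
    K = np.exp(-dG / (R * T))             # equilibrium association constant
    theta = K * conc / (1.0 + K * conc)   # Langmuir fractional occupancy
    return I_max * theta

print(langmuir_intensity(conc=1e-9, dG=-15.0))
```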
Tao, Hui; Steel, John; Lowen, Anice C.
2015-01-01
A high particle to infectivity ratio is a feature common to many RNA viruses, with ~90–99% of particles unable to initiate a productive infection under low multiplicity conditions. A recent publication by Brooke et al. revealed that, for influenza A virus (IAV), a proportion of these seemingly non-infectious particles are in fact semi-infectious. Semi-infectious (SI) particles deliver an incomplete set of viral genes to the cell, and therefore cannot support a full cycle of replication unless complemented through co-infection. In addition to SI particles, IAV populations often contain defective-interfering (DI) particles, which actively interfere with production of infectious progeny. With the aim of understanding the significance to viral evolution of these incomplete particles, we tested the hypothesis that SI and DI particles promote diversification through reassortment. Our approach combined computational simulations with experimental determination of infection, co-infection and reassortment levels following co-inoculation of cultured cells with two distinct influenza A/Panama/2007/99 (H3N2)-based viruses. Computational results predicted enhanced reassortment at a given % infection or multiplicity of infection with increasing semi-infectious particle content. Comparison of experimental data to the model indicated that the likelihood that a given segment is missing varies among the segments and that most particles fail to deliver ≥1 segment. To verify the prediction that SI particles augment reassortment, we performed co-infections using viruses exposed to low dose UV. As expected, the introduction of semi-infectious particles with UV-induced lesions enhanced reassortment. In contrast to SI particles, inclusion of DI particles in modeled virus populations could not account for observed reassortment outcomes. DI particles were furthermore found experimentally to suppress detectable reassortment, relative to that seen with standard virus stocks, most likely by interfering with production of infectious progeny from co-infected cells. These data indicate that semi-infectious particles increase the rate of reassortment and may therefore accelerate adaptive evolution of IAV. PMID:26440404
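The core of the computational model can be illustrated simply. In the sketch below (our assumptions, not the authors' model: each virion independently delivers each of the eight IAV segments with probability p, and virions per cell are Poisson-distributed), the reported fraction counts cells whose only route to a complete genome is mixing segments from both parents, a conservative lower bound on reassortment.

```python
import numpy as np

rng = np.random.default_rng(0)

def infect(n_cells, moi, p_deliver, n_seg=8):
    """Per-cell counts of each segment received from each of two parents."""
    counts = np.zeros((2, n_cells, n_seg), dtype=int)
    for parent in (0, 1):
        virions = rng.poisson(moi / 2, n_cells)   # virions entering each cell
        for cell, k in enumerate(virions):
            # Each virion delivers each segment independently with prob. p.
            counts[parent, cell] = (rng.random((k, n_seg)) < p_deliver).sum(0)
    return counts

def reassortment_fraction(counts):
    """Fraction of genome-complete cells that must mix parental segments.
    (A lower bound: cells holding a complete single-parent set can also
    produce reassortants.)"""
    full = (counts.sum(axis=0) > 0).all(axis=1)   # all 8 segments present
    pure = (counts[0] > 0).all(axis=1) | (counts[1] > 0).all(axis=1)
    return (full & ~pure).sum() / max(full.sum(), 1)

counts = infect(20_000, moi=1.0, p_deliver=0.6)
print(f"obligate-reassortant fraction: {reassortment_fraction(counts):.2f}")
```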
Hot horizontal branch stars: Predictions for mass loss. Winds, rotation, and the low gravity problem
NASA Astrophysics Data System (ADS)
Vink, Jorick S.; Cassisi, Santi
2002-09-01
We predict mass-loss rates for the late evolutionary phases of low-mass stars, with special emphasis on the consequences for the morphology of the Horizontal Branch (HB). We show that the computed rates, as predicted by the most plausible mechanism of radiation pressure on spectral lines, are too low to produce EHB/sdB stars. This invalidates the scenario recently outlined by Yong et al. (2000) to create these objects by mass loss on the HB. We argue, however, that mass loss plays a role in the distribution of rotational velocities of hot HB stars, and may - together with the enhancement of heavy element abundances due to radiative levitation - provide an explanation for the so-called "low gravity" problem. The mass loss recipe derived for hot HB (and extreme HB, sdB, sdOB) stars may also be applied to post-HB (AGB-manqué, UV-bright) stars over a range of effective temperatures between 12,500 and 40,000 K.
CT Imaging Biomarkers Predict Clinical Outcomes After Pancreatic Cancer Surgery
Zhu, Liang; Shi, Xiaohua; Xue, Huadan; Wu, Huanwen; Chen, Ge; Sun, Hao; He, Yonglan; Jin, Zhengyu; Liang, Zhiyong; Zhang, Zhuoli
2016-01-01
This study aimed to determine whether changes in contrast-enhanced computed tomography (CT) parameters could predict postsurgical overall survival (OS) and progression-free survival (PFS) in pancreatic cancer patients. Seventy-nine patients with a final pathological diagnosis of pancreatic adenocarcinoma were included in this study from June 2008 to August 2012. Dynamic contrast-enhanced (DCE) CT of tumors was obtained before curative-intent surgery. Absolute enhancement change (AEC) and relative enhancement change (REC) were evaluated on DCE-CT. PFS and OS were compared based on CT enhancement patterns. The markers of fibrogenic alpha-smooth muscle actin (α-SMA) and periostin in tumor specimens were evaluated by immunohistochemical staining. The χ2 test was performed to determine whether CT enhancement patterns were associated with α-SMA and periostin expression levels (recorded as positive or negative). Lower REC (<0.9) was associated with shorter PFS (HR 0.51, 95% CI: 0.31–0.89) and OS (HR 0.44, 95% CI: 0.25–0.78). The α-SMA and periostin expression levels were negatively correlated with REC (both P = 0). Among several CT enhancement parameters, REC was the best predictor of postsurgical survival. Low REC was associated with a short progression-free time and poor survival. The pathological studies suggested that REC might be a reflection of cancer fibrogenic potential. PMID:26844495
NASA Astrophysics Data System (ADS)
Kwon, Jihun; Sutherland, Kenneth; Hashimoto, Takayuki; Shirato, Hiroki; Date, Hiroyuki
2016-10-01
Gold nanoparticles (GNPs) have been recognized as promising candidates for use as radiation sensitizers. A proton beam incident on a GNP can produce secondary electrons, resulting in an enhancement of the dose around the GNP. However, little is known about the spatial distribution of dose enhancement around the GNP, especially in the direction along the incident proton. The purpose of this study is to determine the spatial distribution of dose enhancement by taking the incident direction into account. Two steps of calculation were conducted using the Geant4 Monte Carlo simulation toolkit. First, the energy spectra of 100 and 195 MeV protons colliding with a GNP were calculated at the Bragg peak and three other depths around the peak in liquid water. Second, the GNP was bombarded by protons with the obtained energy spectra. Radial dose distributions were computed along the incident beam direction. The spatial distributions of the dose enhancement factor (DEF) and subtracted dose (Dsub) were then evaluated. The spatial DEF distributions showed hot spots in the distal radial region from the proton beam axis. The spatial Dsub distribution spread out isotropically around the GNP. Low energy protons caused higher and wider dose enhancement. The macroscopic dose enhancement in clinical applications was also evaluated. The results suggest that consideration of the spatial distribution of GNPs in treatment planning will maximize the potential of GNPs.
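Given scored dose maps from such a simulation, the two evaluated quantities are straightforward ratios and differences. A minimal sketch with placeholder arrays, not Geant4 output:

```python
import numpy as np

r = np.linspace(2, 500, 250)      # nm from the GNP surface (placeholder grid)
dose_gnp = 1.0e3 / r**2           # placeholder: scored dose with GNP present
dose_water = 0.8e3 / r**2         # placeholder: same voxels, water replacing Au

def_r = dose_gnp / dose_water     # dose enhancement factor DEF(r)
dsub_r = dose_gnp - dose_water    # subtracted dose Dsub(r)
print(def_r[0], dsub_r[0])
```

In the study these profiles are resolved both along and transverse to the beam axis; the one-dimensional grid here is only for illustration.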
Association between mammogram density and background parenchymal enhancement of breast MRI
NASA Astrophysics Data System (ADS)
Aghaei, Faranak; Danala, Gopichandh; Wang, Yunzhi; Zarafshani, Ali; Qian, Wei; Liu, Hong; Zheng, Bin
2018-02-01
Breast density has been widely considered an important risk factor for breast cancer. The purpose of this study is to examine the association between mammographic density results and background parenchymal enhancement (BPE) of breast MRI. A dataset of breast MR images was acquired from 65 high-risk women. Based on mammographic density (BI-RADS) results, the dataset was divided into two groups of low and high breast density cases. The low-density group comprised 15 cases rated BI-RADS 1 or 2, while the high-density group comprised 50 cases rated BI-RADS 3 or 4 by radiologists. A computer-aided detection (CAD) scheme was applied to segment and register breast regions depicted on sequential images of breast MRI scans. The CAD scheme computed 20 global BPE features from the two breast regions together, from the left and right breast regions separately, and from the bilateral difference between left and right breast regions. A correlation-based feature selection (CFS) method was applied to remove the most redundant features and select optimal features from the initial feature pool. Then, a logistic regression classifier was built using the optimal features to predict mammographic density from the BPE features. Using a leave-one-case-out validation method, the classifier yielded an accuracy of 82% and an area under the ROC curve of AUC = 0.81+/-0.09. Box-plot analysis also showed a negative association between mammographic density results and BPE features in the MRI images. This study demonstrated a negative association between mammographic density and BPE of breast MRI images.
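The classification step can be sketched in a few lines of scikit-learn. This is a simplified stand-in, not the authors' CAD code: SelectKBest replaces the CFS evaluator, the feature matrix is random placeholder data, and in practice feature selection should be nested inside the cross-validation loop to avoid leakage.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(1)
X = rng.normal(size=(65, 20))                       # placeholder BPE features
y = np.r_[np.zeros(15), np.ones(50)].astype(int)    # 0: low, 1: high density

# Stand-in for CFS: keep the features most associated with the label.
X_sel = SelectKBest(f_classif, k=8).fit_transform(X, y)
pred = cross_val_predict(LogisticRegression(max_iter=1000), X_sel, y,
                         cv=LeaveOneOut())
print("LOO accuracy:", (pred == y).mean())
```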
Lv, Hai-Ting; Cui, Ying; Zhang, Yu-Min; Li, Hua-Min; Zou, Guo-Dong; Duan, Rui-Huan; Cao, Jun-Tao; Jing, Qiang-Shan; Fan, Yang
2017-09-28
Organic donor-π-bridge-acceptor (D-π-A) dyes with arylamines as electron donors have been widely used as photosensitizers for dye-sensitized solar cells (DSSCs). However, titanium-oxo clusters (TOCs) functionalized with this kind of D-π-A structured dye molecule have rarely been explored. In the present study, the 4-dimethylaminobenzoate-functionalized titanium-oxo cluster [Ti6(μ3-O)6(OiPr)6(DMABA)6]·2C6H5CH3 (DMABA = 4-dimethylaminobenzoate) was synthesized and structurally characterized by single-crystal X-ray diffraction. For comparison, two other Ti6-oxo clusters, namely [Ti6(μ3-O)6(OiPr)6(AD)6] (AD = 1-adamantanecarboxylate) and [Ti6(μ3-O)2(μ2-O)(μ2-OiPr)4(OiPr)10(DMM)2] (DMM = dimethylmalonate), were also studied. The DMABA-functionalized cluster exhibits a remarkably reduced band gap of ~2.5 eV and a much enhanced photocurrent response in comparison with the other two clusters. The electronic structures and electronic transitions of the clusters were studied by DFT and TDDFT calculations. The computational results suggest that the low-energy transitions of the DMABA-functionalized cluster have substantial charge-transfer character arising from DMABA → {Ti6} cluster core ligand-to-core charge transfer (LCCT), along with DMABA-based intra-ligand charge transfer (ILCT). These low-energy charge-transfer transitions provide efficient electron-injection pathways for photon-to-electron conversion.
Additional development of the XTRAN3S computer program
NASA Technical Reports Server (NTRS)
Borland, C. J.
1989-01-01
Additional developments of and enhancements to the XTRAN3S computer program, a code for calculating steady and unsteady aerodynamics and associated aeroelastic solutions for 3-D wings in the transonic flow regime, are described. Algorithm improvements to the XTRAN3S program were made, including an implicit finite difference scheme to increase the allowable time step, and vectorization for improved computational efficiency. The code was modified to treat configurations with a fuselage, multiple stores/nacelles/pylons, and winglets. Computer program changes (updates) for error corrections and version control are provided.
Mihl, Casper; Wildberger, Joachim E; Jurencak, Tomas; Yanniello, Michael J; Nijssen, Estelle C; Kalafut, John F; Nalbantov, Georgi; Mühlenbruch, Georg; Behrendt, Florian F; Das, Marco
2013-11-01
Both iodine delivery rate (IDR) and iodine concentration are decisive factors for vascular enhancement in computed tomographic angiography. It is unclear, however, whether high-iodine-concentration contrast media offer any benefit over lower iodine concentrations when the IDR is kept identical. This study evaluates the effect of different iodine concentrations on intravascular attenuation in a circulation phantom while maintaining a constant IDR. A circulation phantom with a low-pressure venous compartment and a high-pressure arterial compartment simulating physiological circulation parameters was used (heart rate, 60 beats per minute; stroke volume, 60 mL; blood pressure, 120/80 mm Hg). Maintaining a constant IDR (2.0 g/s) and a constant total iodine load (20 g), prewarmed (37°C) contrast media with differing iodine concentrations (240-400 mg/mL) were injected into the phantom using a double-headed power injector. Serial computed tomographic scans at the level of the ascending aorta (AA), the descending aorta (DA), and the left main coronary artery (LM) were obtained. Total contrast volume (milliliters), iodine delivery (grams of iodine), peak flow rate (milliliters per second), and intravascular pressure (pounds per square inch) were monitored using a dedicated data acquisition program. Attenuation values in the AA, the DA, and the LM were measured continuously (in Hounsfield units [HU]). In addition, time-enhancement curves, aortic peak enhancement, and time to peak were determined. All contrast injection protocols resulted in similar attenuation values: the AA (516 [11] to 531 [37] HU), the DA (514 [17] to 531 [32] HU), and the LM (490 [10] to 507 [17] HU). No significant differences were found between the AA, the DA, and the LM for either peak enhancement (all P > 0.05) or mean time to peak (AA, 19.4 [0.58] to 20.1 [1.05] seconds; DA, 21.1 [1.0] to 21.4 [1.15] seconds; LM, 19.8 [0.58] to 20.1 [1.05] seconds). This phantom study demonstrates that constant injection parameters (IDR, overall iodine load) lead to robust enhancement patterns, regardless of the contrast material used. Higher iodine concentration itself does not lead to higher attenuation levels. These results may stimulate a paradigm shift toward clinical use of contrast media with lower iodine concentrations (eg, 240 mg iodine/mL) in individually tailored contrast protocols. The use of low-iodine-concentration contrast media is desirable because of the lower viscosity and the resulting lower injection pressure.
An enhanced hydrogen adsorption enthalpy for fluoride intercalated graphite compounds.
Cheng, Hansong; Sha, Xianwei; Chen, Liang; Cooper, Alan C; Foo, Maw-Lin; Lau, Garret C; Bailey, Wade H; Pez, Guido P
2009-12-16
We present a combined theoretical and experimental study of H2 physisorption in partially fluorinated graphite. This material, first predicted computationally using ab initio molecular dynamics simulation and subsequently synthesized and characterized experimentally, represents a novel class of "acceptor-type" graphite intercalation compounds that exhibit a significantly higher isosteric heat of adsorption for H2 at near-ambient temperatures than previously demonstrated for commonly available porous carbon-based materials. The unusually strong interaction arises from the semi-ionic nature of the C-F bonds. Although a high H2 storage capacity (>4 wt %) at room temperature is predicted not to be feasible due to the low heat of adsorption, enhanced storage properties can be envisaged by doping the graphitic host with appropriate species to promote higher levels of charge transfer from graphene to F- anions.
NASA Technical Reports Server (NTRS)
Gordon, H. R.
1979-01-01
The radiative transfer equation is modified to include the effect of fluorescent substances and solved in the quasi-single scattering approximation for a homogeneous ocean containing fluorescent particles with wavelength independent quantum efficiency and a Gaussian shaped emission line. The results are applied to the in vivo fluorescence of chlorophyll a (in phytoplankton) in the ocean to determine if the observed quantum efficiencies are large enough to explain the enhancement of the ocean's diffuse reflectance near 685 nm in chlorophyll rich waters without resorting to anomalous dispersion. The computations indicate that the required efficiencies are sufficiently low to account completely for the enhanced reflectance. The validity of the theory is further demonstrated by deriving values for the upwelling irradiance attenuation coefficient at 685 nm which are in close agreement with the observations.
Costales, Jaime A; Kotton, Camille N; Zurita-Leal, Andrea C; Garcia-Perez, Josselyn; Llewellyn, Martin S; Messenger, Louisa A; Bhattacharyya, Tapan; Burleigh, Barbara A
2015-08-25
Trypanosoma cruzi, causative agent of Chagas disease, displays high intraspecific genetic diversity: six genetic lineages or discrete typing units (DTUs) are currently recognized, termed TcI through TcVI. Each DTU presents a particular distribution pattern across the Americas and is loosely associated with different transmission cycles and hosts. Several DTUs are known to circulate in Central America. It has been previously suggested that TcI infection is benign and does not lead to chronic chagasic cardiomyopathy (CCC). In this study, we genotyped T. cruzi parasites circulating in the blood and from explanted cardiac tissue of an El Salvadorian patient who developed reactivation Chagas disease while on immunosuppressive medications after undergoing heart transplant in the U.S. as treatment for end-stage CCC. Parasite typing was performed through molecular methods (restriction fragment length polymorphism of polymerase chain reaction amplified products, microsatellite typing, maxicircle sequence typing, and low-stringency single-primer PCR [LSSP-PCR]) as well as lineage-specific serology. We show that the parasites infecting the patient belong exclusively to the TcI DTU. Our data indicate that the parasites isolated from the patient belong to a genotype frequently associated with human infection throughout the Americas (TcIDOM). Our results constitute compelling evidence in support of the TcI DTU's ability to cause end-stage CCC and help dispel any residual bias that infection with this lineage is benign, pointing to the need for increased surveillance for dissemination of this genotype in endemic regions, the USA, and globally.
Bayesian Redshift Classification of Emission-line Galaxies with Photometric Equivalent Widths
NASA Astrophysics Data System (ADS)
Leung, Andrew S.; Acquaviva, Viviana; Gawiser, Eric; Ciardullo, Robin; Komatsu, Eiichiro; Malz, A. I.; Zeimann, Gregory R.; Bridge, Joanna S.; Drory, Niv; Feldmeier, John J.; Finkelstein, Steven L.; Gebhardt, Karl; Gronwall, Caryl; Hagen, Alex; Hill, Gary J.; Schneider, Donald P.
2017-07-01
We present a Bayesian approach to the redshift classification of emission-line galaxies when only a single emission line is detected spectroscopically. We consider the case of surveys for high-redshift Lyα-emitting galaxies (LAEs), which have traditionally been classified via an inferred rest-frame equivalent width W_Lyα greater than 20 Å. Our Bayesian method relies on known prior probabilities in measured emission-line luminosity functions and EW distributions for the galaxy populations, and returns the probability that an object in question is an LAE given the characteristics observed. This approach will be directly relevant for the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX), which seeks to classify ~10^6 emission-line galaxies into LAEs and low-redshift [O II] emitters. For a simulated HETDEX catalog with realistic measurement noise, our Bayesian method recovers 86% of LAEs missed by the traditional W_Lyα > 20 Å cutoff over 2 < z < 3, outperforming the EW cut in both contamination and incompleteness. This is due to the method's ability to trade off between the two types of binary classification error by adjusting the stringency of the probability requirement for classifying an observed object as an LAE. In our simulations of HETDEX, this method reduces the uncertainty in cosmological distance measurements by 14% with respect to the EW cut, equivalent to recovering 29% more cosmological information. Rather than using binary object labels, this method enables the use of classification probabilities in large-scale structure analyses. It can be applied to narrowband emission-line surveys as well as upcoming large spectroscopic surveys including Euclid and WFIRST.
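The decision rule itself is a one-line application of Bayes' theorem. The sketch below uses toy EW densities and an invented prior, not the HETDEX inputs; it returns the posterior probability that a single detected line belongs to an LAE, the quantity whose threshold is tuned to trade contamination against incompleteness.

```python
import numpy as np

def p_lae(ew_obs, p_ew_lae, p_ew_oii, prior_lae=0.5):
    """Posterior probability that a single-line detection is an LAE."""
    num = p_ew_lae(ew_obs) * prior_lae
    return num / (num + p_ew_oii(ew_obs) * (1.0 - prior_lae))

# Toy exponential EW likelihoods; scale lengths are placeholders.
like_lae = lambda w: np.exp(-w / 50.0) / 50.0   # p(EW | LAE)
like_oii = lambda w: np.exp(-w / 10.0) / 10.0   # p(EW | [O II] emitter)
print(f"P(LAE | EW = 25 A) = {p_lae(25.0, like_lae, like_oii):.2f}")
```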
CRISPR-Cas Adaptive Immune Systems of the Sulfolobales: Unravelling Their Complexity and Diversity
Garrett, Roger A.; Shah, Shiraz A.; Erdmann, Susanne; Liu, Guannan; Mousaei, Marzieh; León-Sobrino, Carlos; Peng, Wenfang; Gudbergsdottir, Soley; Deng, Ling; Vestergaard, Gisle; Peng, Xu; She, Qunxin
2015-01-01
The Sulfolobales have provided good model organisms for studying CRISPR-Cas systems of the crenarchaeal kingdom of the archaea. These organisms are infected by a wide range of exceptional archaea-specific viruses and conjugative plasmids, and their CRISPR-Cas systems generally exhibit extensive structural and functional diversity. They carry large and multiple CRISPR loci and often multiple copies of diverse Type I and Type III interference modules as well as more homogeneous adaptation modules. These acidothermophilic organisms have recently provided seminal insights into the adaptation process, the diverse modes of interference, and their modes of regulation. The functions of the adaptation and interference modules tend to be loosely coupled, and the stringency of the crRNA-DNA sequence matching during DNA interference is relatively low, in contrast to some more streamlined CRISPR-Cas systems of bacteria. Despite this, there is evidence for complex and differential regulation of expression of the diverse functional modules in response to viral infection. Recent work also supports critical roles for non-core Cas proteins, especially during Type III-directed interference, consistent with these proteins tending to coevolve with core Cas proteins. Various novel aspects of CRISPR-Cas systems of the Sulfolobales are considered, including an alternative spacer acquisition mechanism, reversible spacer acquisition, the formation and significance of antisense CRISPR RNAs, and a novel mechanism for avoidance of CRISPR-Cas defense. Finally, questions regarding the basis for the complexity, diversity, and apparent redundancy of the intracellular CRISPR-Cas systems are discussed. PMID:25764276
Kaňková, Sárka; Sulc, Jan; Křivohlavá, Romana; Kuběna, Aleš; Flegr, Jaroslav
2012-11-01
Toxoplasmosis, a zoonosis caused by the protozoan Toxoplasma gondii, is probably the most widespread human parasitosis in developed countries. Pregnant women with latent toxoplasmosis have seemingly younger fetuses, especially in the 16th week of gestation, which suggests that fetuses of Toxoplasma-infected mothers develop more slowly in the first trimester of pregnancy. In the present retrospective cohort study, we analyzed data on the postnatal motor development of infants from 331 questionnaire respondents, including 53 Toxoplasma-infected mothers, to search for signs of early postnatal developmental disorders. During the first year of life, slower postnatal motor development was observed in infants of mothers with latent toxoplasmosis. These infants developed the ability to control head position (p=0.039) and to roll from supine to prone position (p=0.022) significantly later, and began crawling slightly later (p=0.059). Our results are compatible with the hypothesis that the difference in the rates of prenatal and early postnatal development between children of Toxoplasma-negative and Toxoplasma-positive mothers might be caused by a decreased stringency of embryo quality control in partly immunosuppressed Toxoplasma-positive mothers, resulting in a higher proportion of infants with genetic or developmental disorders among their offspring. However, because of the relatively low return rate of the questionnaires and an associated risk of a sieve effect, our results should be considered preliminary, and a large-scale prospective study is critically needed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Hanratty, Barbara; Lowson, Elizabeth; Holmes, Louise; Grande, Gunn; Addington-Hall, Julia; Payne, Sheila; Seymour, Jane
2012-05-01
This study explores the views of older adults who are receiving health and social care at the end of their lives on how services should be funded, and describes their health-related expenditure. Design: qualitative interview study. Setting: North West England. Participants: 30 people aged 69-93 years, diagnosed with lung cancer, heart failure or stroke and judged by health professionals to be in their last year of life; sixteen participants lived in disadvantaged areas. Main outcome: views of older adults on the funding of services. Participants expressed a belief in an earned entitlement to services funded from taxation, based on a broad sense of being a good citizen. Irrespective of social background, older people felt that those who could afford to pay for social care should do so. Sale of assets and use of children's inheritance to fund care were widely perceived as an injustice. The costs of living with illness are a burden, and families are filling many of the gaps left by welfare provision. People who had worked in low-wage occupations were most concerned to justify their current acceptance of services and to distance themselves from what they described as welfare 'spongers' or 'layabouts.' There is a gap between the health and social care system that older adults expect and what may be provided by a reformed welfare state at a time of financial stringencies. The values that underpinned the views expressed--mutuality, care for the most needy, and the importance of working to contribute to society--are an important contribution to the debate on welfare funding.
Matthews, R J; Cahir, E D; Thomas, M L
1990-01-01
Protein-tyrosine-phosphatases (protein-tyrosine-phosphate phosphohydrolase, EC 3.1.3.48) have been implicated in the regulation of cell growth; however, to date few tyrosine phosphatases have been characterized. To identify additional family members, the cDNA for the human tyrosine phosphatase leukocyte common antigen (LCA; CD45) was used to screen, under low stringency, a mouse pre-B-cell cDNA library. Two cDNA clones were isolated, and sequence analysis predicts a protein of 793 amino acids. We have named the molecule LRP (LCA-related phosphatase). RNA transfer analysis indicates that the cDNAs were derived from a 3.2-kilobase mRNA. The LRP mRNA is transcribed in a wide variety of tissues. The predicted protein structure can be divided into the following structural features: a short 19-amino acid leader sequence, an exterior domain of 123 amino acids that is predicted to be highly glycosylated, a 24-amino acid membrane-spanning region, and a 627-amino acid cytoplasmic region. The cytoplasmic region contains two approximately 260-amino acid domains, each with homology to the tyrosine phosphatase family. One of the cDNA clones differed in that it had a 108-base-pair insertion that, while preserving the reading frame, would disrupt the first protein-tyrosine-phosphatase domain. Analysis of genomic DNA indicates that the insertion is due to an alternatively spliced exon. LRP appears to be evolutionarily conserved, as a putative homologue has been identified in the invertebrate Styela plicata. PMID:2162042
Computer-based, Jeopardy™-like game in general chemistry for engineering majors
NASA Astrophysics Data System (ADS)
Ling, S. S.; Saffre, F.; Kadadha, M.; Gater, D. L.; Isakovic, A. F.
2013-03-01
We report on the design of a Jeopardy™-like computer game for enhancing the learning of general chemistry by engineering majors. While we examine several parameters of student achievement and attitude, our primary concern is the motivation of students, which tends to be low in traditionally run chemistry lectures. The effect of game-playing is tested by comparing a paper-based game quiz, which constitutes the control group, with a computer-based game quiz, constituting the treatment group. The computer-based game quizzes are Java™-based applications that students run once a week in the second part of the last lecture of the week. Overall effectiveness of the semester-long program is measured through pretest-posttest conceptual testing of general chemistry. The objective of this research is to determine to what extent this "gamification" of the course delivery and course evaluation processes may be beneficial to undergraduates' learning of science in general, and chemistry in particular. We present data addressing gender-specific differences in performance, as well as background (pre-college) levels of general science and chemistry preparation. We outline a plan for extending this approach to general physics courses and to modern-science-driven electives, and we offer live, in-lecture examples of our computer gaming experience. We acknowledge support from Khalifa University, Abu Dhabi.
Improving finite element results in modeling heart valve mechanics.
Earl, Emily; Mohammadi, Hadi
2018-06-01
Finite element analysis is a well-established computational tool which can be used for the analysis of soft tissue mechanics. Due to the structural complexity of heart valve leaflet tissue, currently available finite element models do not adequately represent it. One way of addressing this issue is to implement computationally expensive finite element models characterized by precise constitutive models and high-order, high-density meshes. In this study, we introduce a novel numerical technique that enhances the results obtained from coarse-mesh finite element models to provide accuracy comparable to that of fine-mesh models while maintaining a relatively low computational cost. The method reduces the computational expense required to solve the linear and nonlinear constitutive models commonly used in heart valve mechanics simulations while continuing to account for both large and infinitesimal deformations. The continuum model is developed from a least-squares algorithm coupled with the finite difference method, under the assumption that the components of the strain tensor are available at all nodes of the finite element mesh. The suggested numerical technique is easy to implement, practically efficient, and requires less computational time than currently available commercial finite element packages such as ANSYS and/or ABAQUS.
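Although the paper's formulation is more elaborate, the flavor of the approach can be conveyed by a local least-squares recovery of a nodal field. The sketch below is our reading with invented details (a quadratic basis fitted over each node's neighbours), not the authors' algorithm; it smooths a coarse-mesh strain component assumed known at all nodes.

```python
import numpy as np

def mls_recover(nodes, field, radius):
    """Local least-squares (quadratic) recovery of a nodal field.
    nodes: (n, 2) coordinates; field: (n,) strain component at the nodes."""
    out = np.empty_like(field)
    for i, x0 in enumerate(nodes):
        nb = np.linalg.norm(nodes - x0, axis=1) < radius
        dx = nodes[nb] - x0
        # Quadratic basis [1, x, y, x^2, xy, y^2] centred on the node.
        A = np.c_[np.ones(nb.sum()), dx,
                  dx[:, 0]**2, dx[:, 0] * dx[:, 1], dx[:, 1]**2]
        coef, *_ = np.linalg.lstsq(A, field[nb], rcond=None)
        out[i] = coef[0]                 # recovered value at the node itself
    return out

# Noisy strain-like field on a coarse 10x10 grid of nodes.
nodes = np.stack(np.meshgrid(np.arange(10.0), np.arange(10.0)), -1).reshape(-1, 2)
noisy = np.sin(nodes[:, 0] / 3.0) + 0.05 * np.random.default_rng(3).normal(size=100)
print(mls_recover(nodes, noisy, radius=2.5)[:5])
```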
Lu, Xiaofeng; Song, Li; Shen, Sumin; He, Kang; Yu, Songyu; Ling, Nam
2013-07-17
Hough Transform has been widely used for straight line detection in low-definition and still images, but it suffers from long execution times and high resource requirements. Field Programmable Gate Arrays (FPGAs) provide a competitive alternative for hardware acceleration to reap tremendous computing performance. In this paper, we propose a novel parallel Hough Transform (PHT) and FPGA architecture-associated framework for real-time straight line detection in high-definition videos. A resource-optimized Canny edge detection method with enhanced non-maximum suppression conditions is presented to suppress most false edges and obtain more accurate candidate edge pixels for subsequent accelerated computation. Then, a novel PHT algorithm exploiting spatial angle-level parallelism is proposed to upgrade computational accuracy by improving the minimum computational step. Moreover, the FPGA-based multi-level pipelined PHT architecture optimized by spatial parallelism ensures real-time computation for 1,024 × 768 resolution videos without any off-chip memory consumption. This framework is evaluated on the ALTERA DE2-115 FPGA evaluation platform at a maximum frequency of 200 MHz, and it can calculate straight line parameters in 15.59 ms on average for one frame. Qualitative and quantitative evaluation results have validated the system performance regarding data throughput, memory bandwidth, resource, speed and robustness. PMID:23867746
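A software analogue of the angle-level parallelism is easy to state, even though the paper's contribution is an FPGA pipeline: for each edge pixel, all theta bins of the accumulator are updated in one vectorised step. A minimal sketch (ours, not the hardware design):

```python
import numpy as np

def hough_lines(edges, n_theta=180):
    """Accumulate (rho, theta) votes; all theta bins of a pixel at once."""
    h, w = edges.shape
    thetas = np.deg2rad(np.arange(n_theta))
    diag = int(np.ceil(np.hypot(h, w)))
    acc = np.zeros((2 * diag + 1, n_theta), dtype=np.int32)
    cos_t, sin_t = np.cos(thetas), np.sin(thetas)
    for y, x in zip(*np.nonzero(edges)):
        rho = np.round(x * cos_t + y * sin_t).astype(int) + diag
        acc[rho, np.arange(n_theta)] += 1   # angle-level "parallel" update
    return acc, thetas

edges = np.zeros((64, 64), dtype=bool)
edges[np.arange(64), np.arange(64)] = True   # the line y = x
acc, thetas = hough_lines(edges)
rho_i, th_i = np.unravel_index(acc.argmax(), acc.shape)
print(np.rad2deg(thetas[th_i]))   # 135 deg: normal form of y = x has rho = 0
```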
NASA Technical Reports Server (NTRS)
Dlugach, Janna M.; Mishchenko, Michael I.; Liu, Li; Mackowski, Daniel W.
2011-01-01
Direct computer simulations of electromagnetic scattering by discrete random media have become an active area of research. In this progress review, we summarize and analyze our main results obtained by means of numerically exact computer solutions of the macroscopic Maxwell equations. We consider finite scattering volumes with a range of size parameters, composed of varying numbers of randomly distributed particles with different refractive indices. The main objective of our analysis is to examine whether all backscattering effects predicted by the low-density theory of coherent backscattering (CB) also take place in the case of densely packed media. Based on our extensive numerical data, we arrive at the following conclusions: (i) all backscattering effects predicted by the asymptotic theory of CB can also take place in the case of densely packed media; (ii) in the case of very large particle packing density, scattering characteristics of discrete random media can exhibit behavior not predicted by the low-density theories of CB and radiative transfer; and (iii) increasing the absorptivity of the constituent particles can either enhance or suppress typical manifestations of CB, depending on the particle packing density and the real part of the refractive index. Our numerical data strongly suggest that the spectacular backscattering effects identified in laboratory experiments and observed for a class of high-albedo Solar System objects are caused by CB.
Glatt, Vaida; Bartnikowski, Nicole; Quirk, Nicholas; Schuetz, Michael; Evans, Christopher
2016-01-01
Background: Reverse dynamization is a technology for enhancing the healing of osseous defects. With use of an external fixator, the axial stiffness across the defect is initially set low and subsequently increased. The purpose of the study described in this paper was to explore the efficacy of reverse dynamization under different conditions. Methods: Rat femoral defects were stabilized with external fixators that allowed the stiffness to be modulated on living animals. Recombinant human bone morphogenetic protein-2 (rhBMP-2) was implanted into the defects on a collagen sponge. Following a dose-response experiment, 5.5 μg of rhBMP-2 was placed into the defect under conditions of very low (25.4-N/mm), low (114-N/mm), medium (185-N/mm), or high (254-N/mm) stiffness. Reverse dynamization was evaluated with 2 different starting stiffnesses: low (114 N/mm) and very low (25.4 N/mm). In both cases, high stiffness (254 N/mm) was imposed after 2 weeks. Healing was assessed with radiographs, micro-computed tomography (μCT), histological analysis, and mechanical testing. Results: In the absence of dynamization, the medium-stiffness fixators provided the best healing. Reverse dynamization starting with very low stiffness was detrimental to healing. However, with low initial stiffness, reverse dynamization considerably improved healing with minimal residual cartilage, enhanced cortication, increased mechanical strength, and smaller callus. Histological analysis suggested that, in all cases, healing provoked by rhBMP-2 occurred by endochondral ossification. Conclusions: These data confirm the potential utility of reverse dynamization as a way of improving bone healing but indicate that the stiffness parameters need to be selected carefully. Clinical Relevance: Reverse dynamization may reduce the amount of rhBMP-2 needed to induce healing of recalcitrant osseous lesions, reduce the time to union, and decrease the need for prolonged external fixation. PMID:27098327
Computer-aided classification of breast masses using contrast-enhanced digital mammograms
NASA Astrophysics Data System (ADS)
Danala, Gopichandh; Aghaei, Faranak; Heidari, Morteza; Wu, Teresa; Patel, Bhavika; Zheng, Bin
2018-02-01
By taking advantage of both mammography and breast MRI, contrast-enhanced digital mammography (CEDM) has emerged as a new promising imaging modality to improve the efficacy of breast cancer screening and diagnosis. The primary objective of this study is to develop and evaluate a new computer-aided detection and diagnosis (CAD) scheme of CEDM images to classify between malignant and benign breast masses. A CEDM dataset consisting of 111 patients (33 benign and 78 malignant) was retrospectively assembled. Each case includes two types of images, namely low-energy (LE) and dual-energy subtracted (DES) images. First, the CAD scheme applied a hybrid segmentation method to automatically segment masses depicted on LE and DES images separately. Optimal segmentation results from DES images were also mapped to LE images and vice versa. Next, a set of 109 quantitative image features related to mass shape and density heterogeneity was computed. Last, four multilayer perceptron-based machine learning classifiers, integrated with a correlation-based feature subset evaluator and a leave-one-case-out cross-validation method, were built to classify mass regions depicted on LE and DES images, respectively. When the CAD scheme was applied to the original segmentations of DES and LE images, the areas under the ROC curves were 0.7585+/-0.0526 and 0.7534+/-0.0470, respectively. After optimal segmentation mapping from DES to LE images, the AUC value of the CAD scheme significantly increased to 0.8477+/-0.0376 (p<0.01). Since DES images eliminate the overlapping effect of dense breast tissue on lesions, segmentation accuracy was significantly improved compared with regular mammograms. The study demonstrated that computer-aided classification of breast masses using CEDM images yields higher performance.
Numerical aspects in modeling high Deborah number flow and elastic instability
NASA Astrophysics Data System (ADS)
Kwon, Youngdon
2014-05-01
Investigating highly nonlinear viscoelastic flow in a 2D domain, we explore problems as well as properties possibly inherent in the streamline upwinding (SUPG) technique, and then present various results on elastic instability. The mathematically stable Leonov model written in tensor-logarithmic formulation is employed in the framework of the finite element method for spatial discretization of several representative problem domains. To enhance computation speed, a decoupled integration scheme is applied for shear-thinning and Boger-type fluids. From the analysis of 4:1 contraction flow at low and moderate values of the Deborah number (De), the solution with the SUPG method does not show a noticeable difference from the one computed without upwinding. On the other hand, in the flow regime of high De, especially in the state of elastic instability, SUPG significantly distorts the flow field and the result differs considerably from the solution acquired straightforwardly. When the strength of elastic flow, and thus the nonlinearity, further increases, the computational scheme with upwinding fails to converge and an evolutionary solution is no longer obtainable. All this suggests that extreme care has to be taken when upwinding is applied, and one must first prove the validity of this algorithm in the case of high nonlinearity. On the contrary, straightforward computation with no upwinding can efficiently model representative phenomena of elastic instability in such benchmark problems as 4:1 contraction flow, flow over a circular cylinder, and flow over an asymmetric array of cylinders. Asymmetry of the flow field occurring in the symmetric domain, enhanced spatial and temporal fluctuation of dynamic variables, and flow effects caused by extension hardening are properly described in this study.
Heterogeneous concurrent computing with exportable services
NASA Technical Reports Server (NTRS)
Sunderam, Vaidy
1995-01-01
Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data-driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experience has demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.
NASA Astrophysics Data System (ADS)
Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang
2015-07-01
This study is a follow-up to one published in Computer Science Education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening-participation-in-computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment, and for providing diverse students with many personal and professional development benefits.
Publishing an "imej" Journal for Computer-Enhanced Learning.
ERIC Educational Resources Information Center
Burg, Jennifer; Wong, Yue-Ling; Pfeifer, Dan; Boyle, Anne; Yip, Ching-Wan
Interactive multimedia electronic journals, or IMEJ journals, are a publication medium particularly suited for research in computer-enhanced learning. This paper describes the challenges and potential rewards in publishing such a journal; presents ideas for design and layout; and discusses issues of collaboration, copyrighting, and archiving that…
Xpatch prediction improvements to support multiple ATR applications
NASA Astrophysics Data System (ADS)
Andersh, Dennis J.; Lee, Shung W.; Moore, John T.; Sullivan, Douglas P.; Hughes, Jeff A.; Ling, Hao
1998-08-01
This paper describes an electromagnetic computer prediction code for generating radar cross section (RCS), time-domain signatures, and synthetic aperture radar (SAR) images of realistic 3D vehicles. The vehicle, typically an airplane or a ground vehicle, is represented by a computer-aided design (CAD) file with triangular facets, IGES curved surfaces, or solid geometries. The computer code, Xpatch, based on the shooting-and-bouncing-ray technique, is used to calculate the polarimetric radar return from the vehicles represented by these different CAD files. Xpatch computes the first-bounce physical optics (PO) plus the physical theory of diffraction (PTD) contributions, and calculates the multi-bounce ray contributions using geometric optics and PO for complex vehicles with materials. It has been found that without the multi-bounce calculations, the radar return is typically 10 to 15 dB too low. Examples of predicted range profiles, SAR imagery, and RCS for several different geometries are compared with measured data to demonstrate the quality of the predictions. Recent enhancements to Xpatch include improvements for millimeter-wave applications, hybridization with the finite element method for small geometric features, and support for additional IGES entities, including trimmed and untrimmed surfaces.
XPATCH: a high-frequency electromagnetic scattering prediction code using shooting and bouncing rays
NASA Astrophysics Data System (ADS)
Hazlett, Michael; Andersh, Dennis J.; Lee, Shung W.; Ling, Hao; Yu, C. L.
1995-06-01
This paper describes an electromagnetic computer prediction code for generating radar cross section (RCS), time domain signatures, and synthetic aperture radar (SAR) images of realistic 3-D vehicles. The vehicle, typically an airplane or a ground vehicle, is represented by a computer-aided design (CAD) file with triangular facets, curved surfaces, or solid geometries. The computer code, XPATCH, based on the shooting and bouncing ray technique, is used to calculate the polarimetric radar return from the vehicles represented by these different CAD files. XPATCH computes the first-bounce physical optics plus the physical theory of diffraction contributions and the multi-bounce ray contributions for complex vehicles with materials. It has been found that the multi-bounce contributions are crucial for many aspect angles of all classes of vehicles. Without the multi-bounce calculations, the radar return is typically 10 to 15 dB too low. Examples of predicted range profiles, SAR imagery, and radar cross sections (RCS) for several different geometries are compared with measured data to demonstrate the quality of the predictions. The comparisons are from the UHF through the Ka frequency ranges. Recent enhancements to XPATCH for MMW applications and target Doppler predictions are also presented.
Yao, Jianhua; Burns, Joseph E.; Sanoria, Vic; Summers, Ronald M.
2017-01-01
Bone metastases are a frequent occurrence with cancer, and early detection can guide the patient's treatment regimen. Metastatic bone disease can present in density extremes as sclerotic (high density) and lytic (low density), or in a continuum with an admixture of both sclerotic and lytic components. We design a framework to detect and characterize the varying spectrum of presentation of spine metastasis on positron emission tomography/computed tomography (PET/CT) data. A technique is proposed to synthesize CT and PET images to enhance lesion appearance for computer detection. A combination of watershed, graph cut, and level set algorithms is first run to obtain initial detections. Detections are then sent to multiple classifiers for sclerotic, lytic, and mixed lesions. The system was tested on 44 cases with 225 sclerotic, 139 lytic, and 92 mixed lesions. The results showed that sensitivity (false positives per patient) was 0.81 (2.1), 0.81 (1.3), and 0.76 (2.1) for sclerotic, lytic, and mixed lesions, respectively. The study also demonstrates that using PET/CT data significantly improves computer-aided detection performance over using CT alone. PMID:28612036
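The synthesis step can be approximated by a simple weighted blend of the registered volumes. The sketch below is a generic stand-in with an invented weight, not the paper's method:

```python
import numpy as np

def fuse(ct, pet, w_pet=0.5):
    """Blend registered, [0, 1]-normalised CT and PET volumes."""
    norm = lambda v: (v - v.min()) / (np.ptp(v) + 1e-9)
    return (1.0 - w_pet) * norm(ct) + w_pet * norm(pet)

rng = np.random.default_rng(4)
ct, pet = rng.normal(size=(2, 16, 64, 64))   # placeholder registered volumes
print(fuse(ct, pet).shape)
```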
Reinforcement learning for resource allocation in LEO satellite networks.
Usaha, Wipawee; Barria, Javier A
2007-06-01
In this paper, we develop and assess online decision-making algorithms for call admission and routing for low Earth orbit (LEO) satellite networks. It has been shown in a recent paper that, in a LEO satellite system, a semi-Markov decision process formulation of the call admission and routing problem can achieve better performance in terms of an average revenue function than existing routing methods. However, the conventional dynamic programming (DP) numerical solution becomes prohibitive as the problem size increases. In this paper, two solution methods based on reinforcement learning (RL) are proposed in order to circumvent the computational burden of DP. The first method is based on an actor-critic method with temporal-difference (TD) learning. The second method is based on a critic-only method, called optimistic TD learning. The algorithms improve performance in terms of storage requirements, computational complexity, and computation time, as well as in terms of an overall long-term average revenue function that penalizes blocked calls. Numerical studies are carried out, and the results obtained show that the RL framework can achieve up to 56% higher average revenue than existing routing methods used in LEO satellite networks, with reasonable storage and computational requirements.
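The critic update at the heart of both methods is the tabular TD(0) rule. The sketch below runs it on a toy chain with an invented revenue signal; the actual admission-control state space and rewards of the LEO network model are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, alpha, gamma = 5, 0.1, 0.95
V = np.zeros(n_states)                    # critic's state-value estimates

s = 0
for _ in range(10_000):
    s_next = min(s + int(rng.integers(0, 2)), n_states - 1)
    r = 1.0 if s_next == n_states - 1 else 0.0        # toy revenue signal
    V[s] += alpha * (r + gamma * V[s_next] - V[s])    # TD(0) update
    s = 0 if s_next == n_states - 1 else s_next

print(np.round(V, 3))
```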
Modeling Responses of Naturally Fractured Geothermal Reservoir to Low-Pressure Stimulation
Fu, Pengcheng; Carrigan, Charles R.
2012-01-01
Hydraulic shearing is an appealing reservoir stimulation strategy for Enhanced Geothermal Systems. It is believed that hydro-shearing is likely to stimulate a fracture network that covers a relatively large volume of the reservoir, whereas hydro-fracturing tends to create a small number of fractures. In this paper, we examine the geomechanical and hydraulic behaviors of natural fracture systems subjected to hydro-shearing stimulation and develop a coupled numerical model within the framework of discrete fracture network modeling. We found that in the low-pressure hydro-shearing regime, the coupling between the fluid phase and the rock solid phase is relatively simple, and the numerical model is computationally efficient. Using this modified model, we study the behavior of a random fracture network subjected to hydro-shearing stimulation.
NASA Astrophysics Data System (ADS)
Lim, Sara N.; Pradhan, Anil K.; Nahar, Sultana N.; Barth, Rolf F.; Yang, Weilian; Nakkula, Robin J.; Palmer, Alycia; Turro, Claudia
2013-06-01
High-energy X-rays in the MeV range are generally employed in conventional radiation therapy from linear accelerators (LINACs) to ensure sufficient penetration depth. However, lower-energy X-rays in the keV range may be more effective when coupled with heavy-element (high-Z or HZ) radiosensitizers. Numerical simulations of X-ray energy deposition in tumor phantoms sensitized with HZ radiosensitizers were performed using the Monte Carlo code Geant4. The results showed enhanced energy deposition in radiosensitized phantoms relative to unsensitized phantoms for low-energy X-rays in the keV range; in contrast, minimal enhancement was seen with high-energy X-rays in the MeV range. Dose enhancement factors (DEFs) were computed and showed radiosensitization only in the low energy range < 200 keV, far lower than the energy of the majority of photons in the LINAC energy range. In vitro studies were carried out to demonstrate the tumoricidal effects of irradiation on HZ-sensitized F98 rat glioma cells using both a low-energy 160 kV and a high-energy 6 MV X-ray source. The platinum compound pyridine terpyridine Pt(II) nitrate was used initially because it was sevenfold less toxic than an equivalent amount of carboplatin in in vitro studies, allowing the radiotoxic and chemotoxic effects of HZ sensitizers to be separated. Results from this study showed a 10-fold, dose-dependent reduction in the surviving fraction (SF) of radiosensitized cells treated with low-energy 160 kV X-rays compared with those treated with 6 MV X-rays, in agreement with our simulations showing increased dose deposition in radiosensitized tumors for low-energy X-rays. Due to unforeseen in vivo toxicity, however, another in vitro study was performed using the widely used Pt-based chemotherapeutic drug carboplatin, which confirmed the earlier results. This lays the groundwork for a planned in vivo study using F98 glioma-bearing rats. The study demonstrates that while high-energy X-rays are commonly used in cancer radiotherapy, low-energy keV X-rays may be much more effective with HZ radiosensitization.
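The dose enhancement factor itself is a simple ratio; a hedged sketch of that bookkeeping is below, with the dose arrays standing in for per-energy Geant4 tallies (the simulation setup itself is not reproduced here).

```python
import numpy as np

def dose_enhancement_factor(dose_sensitized: np.ndarray,
                            dose_plain: np.ndarray) -> np.ndarray:
    """Per-energy DEF: dose in the sensitized phantom over dose in the
    unsensitized phantom; values > 1 indicate radiosensitization."""
    return dose_sensitized / np.maximum(dose_plain, 1e-12)
```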
Enhancing atlas based segmentation with multiclass linear classifiers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sdika, Michaël, E-mail: michael.sdika@creatis.insa-lyon.fr
Purpose: To present a method to enrich atlases for atlas based segmentation. Such enriched atlases can then be used as a single atlas or within a multiatlas framework. Methods: In this paper, machine learning techniques have been used to enhance the atlas based segmentation approach. The enhanced atlas defined in this work is a pair composed of a gray level image alongside an image of multiclass classifiers, with one classifier per voxel. Each classifier embeds local information from the whole training dataset, which allows for the correction of some systematic errors in the segmentation and accounts for possible local registration errors. The authors also propose to use these images of classifiers within a multiatlas framework: results produced by a set of such local classifier atlases can be combined using a label fusion method. Results: Experiments have been made on the in vivo images of the IBSR dataset, and a comparison has been made with several state-of-the-art methods such as FreeSurfer and the multiatlas nonlocal patch based method of Coupé or Rousseau. These experiments show that their method is competitive with state-of-the-art methods while having a low computational cost. Further enhancement has also been obtained with a multiatlas version of their method. It is also shown that, in this case, nonlocal fusion is unnecessary, so the multiatlas fusion can be done efficiently. Conclusions: The single atlas version has quality similar to state-of-the-art multiatlas methods but with the computational cost of a naive single atlas segmentation. The multiatlas version offers an improvement in quality and can be done efficiently without a nonlocal strategy.
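A minimal sketch of how such an enriched atlas might be applied follows; the patch size, classifier interface, and interior-voxel restriction are assumptions for illustration, since the abstract does not prescribe a specific classifier type.

```python
import numpy as np

def segment_with_enriched_atlas(registered_img, voxel_classifiers, radius=1):
    """Label a target image already registered to the atlas template.

    voxel_classifiers maps (z, y, x) -> a fitted multiclass classifier
    (any scikit-learn-style object with .predict). Only interior voxels
    are handled, to keep the patch extraction simple.
    """
    labels = np.zeros(registered_img.shape, dtype=np.int32)
    r = radius
    for (z, y, x), clf in voxel_classifiers.items():
        patch = registered_img[z - r:z + r + 1,
                               y - r:y + r + 1,
                               x - r:x + r + 1].ravel()
        labels[z, y, x] = clf.predict(patch[None, :])[0]  # one classifier per voxel
    return labels
```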
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waye, S.; Narumanchi, S.; Moreno, G.
Jet impingement is one means to improve thermal management for power electronics in electric-drive traction vehicles, and jet impingement on microfin-enhanced surfaces further augments heat transfer and thermal performance. A channel flow heat exchanger from a commercial inverter was characterized as a baseline system for comparison with two new prototype designs using liquid jet impingement on plain and microfinned surfaces. The submerged jets can target the areas with the highest heat flux, such as those under insulated-gate bipolar transistors and diode devices, to provide local cooling. Low-power experiments, in which four powered diodes dissipated 105 W of heat, were used to validate computational fluid dynamics modeling of the baseline and prototype designs. Experiments and modeling used typical automotive flow rates with a water-ethylene glycol coolant (50%-50% by volume). The computational fluid dynamics model was then used to predict heat dissipation at full inverter power. The channel flow and jet impingement configurations were tested at full inverter power of 40 to 100 kW (output power) on a dynamometer, translating to an approximate heat dissipation of 1 to 2 kW. With jet impingement, the cold plate material is not critical to the thermal pathway: a high-temperature plastic was used that could eventually be injection molded or formed, with the jets formed from a basic aluminum plate whose orifices act as nozzles. Long-term reliability of the jet nozzles and impingement on enhanced surfaces was examined. For jet impingement on microfinned surfaces, thermal performance increased 17%. Along with a weight reduction of approximately 3 kg, the specific power (kW/kg) increased by 36% and the power density (kW/L) by 12% compared with the baseline channel flow configuration.
Naidu, Sailen G; Kriegshauser, J Scott; Paden, Robert G; He, Miao; Wu, Qing; Hara, Amy K
2014-12-01
An ultra-low-dose radiation protocol reconstructed with model-based iterative reconstruction was compared with our standard-dose protocol. This prospective study evaluated 20 men undergoing surveillance-enhanced computed tomography after endovascular aneurysm repair. All patients underwent standard-dose and ultra-low-dose venous phase imaging; images were compared after reconstruction with filtered back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction. Objective measures of aortic contrast attenuation and image noise were averaged. Images were subjectively assessed (1 = worst, 5 = best) for diagnostic confidence, image noise, and vessel sharpness. Aneurysm sac diameter and endoleak detection were compared. Quantitative image noise was 26% less with ultra-low-dose model-based iterative reconstruction than with standard-dose adaptive statistical iterative reconstruction and 58% less than with ultra-low-dose adaptive statistical iterative reconstruction. Average subjective noise scores were not different between ultra-low-dose model-based iterative reconstruction and standard-dose adaptive statistical iterative reconstruction (3.8 vs. 4.0, P = .25). Subjective scores for diagnostic confidence were better with standard-dose adaptive statistical iterative reconstruction than with ultra-low-dose model-based iterative reconstruction (4.4 vs. 4.0, P = .002). Vessel sharpness was decreased with ultra-low-dose model-based iterative reconstruction compared with standard-dose adaptive statistical iterative reconstruction (3.3 vs. 4.1, P < .0001). Ultra-low-dose model-based iterative reconstruction and standard-dose adaptive statistical iterative reconstruction aneurysm sac diameters were not significantly different (4.9 vs. 4.9 cm); concordance for the presence of endoleak was 100% (P < .001). Compared with a standard-dose technique, an ultra-low-dose model-based iterative reconstruction protocol provides comparable image quality and diagnostic assessment at a 73% lower radiation dose.
Cognitive Enhancement and Education
ERIC Educational Resources Information Center
Buchanan, Allen
2011-01-01
Cognitive enhancement--augmenting normal cognitive capacities--is not new. Literacy, numeracy, computers, and the practices of science are all cognitive enhancements. Science is now making new cognitive enhancements possible. Biomedical cognitive enhancements (BCEs) include the administration of drugs, implants of genetically engineered or…
78 FR 39617 - Data Practices, Computer III Further Remand: BOC Provision of Enhanced Services
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-02
... Docket No. 10-132; FCC 13-69] Data Practices, Computer III Further Remand: BOC Provision of Enhanced... eliminates comparably efficient interconnection (CEI) and open network architecture (ONA) narrowband... disseminates data, including by altering or eliminating collections that are no longer useful or necessary to...
Enhancing Instruction through Constructivism, Cooperative Learning, and Cloud Computing
ERIC Educational Resources Information Center
Denton, David W.
2012-01-01
Cloud computing technologies, such as Google Docs and Microsoft Office Live, have the potential to enhance instructional methods predicated on constructivism and cooperative learning. Cloud-based application features like file sharing and online publishing are prompting departments of education across the nation to adopt these technologies.…
ERIC Educational Resources Information Center
What Works Clearinghouse, 2012
2012-01-01
"Technology Enhanced Elementary and Middle School Science" ("TEEMSS") is a physical science curriculum for grades 3-8 that utilizes computers, sensors, and interactive models to support investigations of real-world phenomena. Through 15 inquiry-based instructional units, students interact with computers, gather and analyze…
Enhancing Learning Outcomes in Computer-Based Training via Self-Generated Elaboration
ERIC Educational Resources Information Center
Cuevas, Haydee M.; Fiore, Stephen M.
2014-01-01
The present study investigated the utility of an instructional strategy known as the "query method" for enhancing learning outcomes in computer-based training. The query method involves an embedded guided, sentence generation task requiring elaboration of key concepts in the training material that encourages learners to "stop and…
The correlation study of parallel feature extractor and noise reduction approaches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dewi, Deshinta Arrova; Sundararajan, Elankovan; Prabuwono, Anton Satria
2015-05-15
This paper reviews techniques for developing a parallel feature extractor (PFE) and examines its correlation with noise reduction approaches for low-light-intensity images. Low-light-intensity images typically appear dark and exhibit low contrast. Without proper handling, such images often lead to misperception of objects and textures and to an inability to segment them; the resulting visual illusions frequently cause disorientation, user fatigue, and poor detection and classification performance for both humans and computer algorithms. Noise reduction (NR) is therefore an essential pre-processing step for other image processing stages such as edge detection, image segmentation, and image compression. The PFE, which captures the visual content of images, involves partitioning images into segments, detecting any overlaps between them, and controlling the distribution and redistribution of segments during feature extraction. When operating on low-light-intensity images, the PFE faces challenges and depends closely on the quality of its pre-processing steps. Many papers have proposed well-established NR and PFE strategies, but few have examined the correlation between them. This paper reviews the best NR and PFE approaches with a detailed explanation of the suggested correlation, a finding that may inform PFE development strategies. With the help of knowledge-based reasoning, computational approaches, and algorithms, we present a correlation study between NR and PFE that can be useful for the development and enhancement of existing PFE systems.
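A toy NR-then-PFE pipeline is sketched below under stated assumptions: non-local means stands in for the surveyed noise reduction methods, ORB descriptors for the extracted features, and row-band splitting for the segment partitioning; on some platforms the process pool requires an `if __name__ == "__main__"` guard.

```python
import numpy as np
import cv2  # OpenCV
from concurrent.futures import ProcessPoolExecutor

def denoise(img):
    # Non-local means is one common choice for low-light sensor noise.
    return cv2.fastNlMeansDenoising(img, h=10)

def extract_descriptors(segment):
    orb = cv2.ORB_create()
    _, desc = orb.detectAndCompute(segment, None)
    return desc  # descriptors only, so results pickle cleanly across processes

def pfe(img, n_segments=4):
    # NR first, then partition into horizontal bands and extract in parallel.
    segments = np.array_split(denoise(img), n_segments, axis=0)
    with ProcessPoolExecutor() as pool:
        return list(pool.map(extract_descriptors, segments))
```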
A Fog Computing and Cloudlet Based Augmented Reality System for the Industry 4.0 Shipyard.
Fernández-Caramés, Tiago M; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Vilar-Montesinos, Miguel
2018-06-02
Augmented Reality (AR) is one of the key technologies pointed out by Industry 4.0 as a tool for enhancing the next generation of automated and computerized factories. AR can also help shipbuilding operators, since they usually need to interact with information (e.g., product datasheets, instructions, maintenance procedures, quality control forms) that could be handled easily and more efficiently through AR devices. This is the reason why Navantia, one of the 10 largest shipbuilders in the world, is studying the application of AR (among other technologies) in different shipyard environments in a project called "Shipyard 4.0". This article presents Navantia's industrial AR (IAR) architecture, which is based on cloudlets and on the fog computing paradigm. Both technologies are ideal for supporting physically-distributed, low-latency and QoS-aware applications that decrease the network traffic and the computational load of traditional cloud computing systems. The proposed IAR communications architecture is evaluated in real-world scenarios with payload sizes according to demanding Microsoft HoloLens applications and when using a cloud, a cloudlet and a fog computing system. The results show that, in terms of response delay, the fog computing system is the fastest when transferring small payloads (less than 128 KB), while for larger file sizes, the cloudlet solution is faster than the others. Moreover, under high loads (with many concurrent IAR clients), the cloudlet in some cases is more than four times faster than the fog computing system in terms of response delay.
Stephenson, Aoife; McDonough, Suzanne M; Murphy, Marie H; Nugent, Chris D; Mair, Jacqueline L
2017-08-11
High levels of sedentary behaviour (SB) are associated with negative health consequences. Technology enhanced solutions such as mobile applications, activity monitors, prompting software, texts, emails and websites are being harnessed to reduce SB. The aim of this paper is to evaluate the effectiveness of such technology enhanced interventions aimed at reducing SB in healthy adults and to examine the behaviour change techniques (BCTs) used. Five electronic databases were searched to identify randomised-controlled trials (RCTs), published up to June 2016. Interventions using computer, mobile or wearable technologies to facilitate a reduction in SB, using a measure of sedentary time as an outcome, were eligible for inclusion. Risk of bias was assessed using the Cochrane Collaboration's tool and interventions were coded using the BCT Taxonomy (v1). Meta-analysis of 15/17 RCTs suggested that computer, mobile and wearable technology tools resulted in a mean reduction of -41.28 min per day (min/day) of sitting time (95% CI -60.99, -21.58, I2 = 77%, n = 1402), in favour of the intervention group at end point follow-up. The pooled effects showed mean reductions at short (≤ 3 months), medium (>3 to 6 months), and long-term follow-up (>6 months) of -42.42 min/day, -37.23 min/day and -1.65 min/day, respectively. Overall, 16/17 studies were deemed as having a high or unclear risk of bias, and 1/17 was judged to be at a low risk of bias. A total of 46 BCTs (14 unique) were coded for the computer, mobile and wearable components of the interventions. The most frequently coded were "prompts and cues", "self-monitoring of behaviour", "social support (unspecified)" and "goal setting (behaviour)". Interventions using computer, mobile and wearable technologies can be effective in reducing SB. Effectiveness appeared most prominent in the short-term and lessened over time. A range of BCTs have been implemented in these interventions. Future studies need to improve reporting of BCTs within interventions and address the methodological flaws identified within the review through the use of more rigorously controlled study designs with longer-term follow-ups, objective measures of SB and the incorporation of strategies to reduce attrition. The review protocol was registered with PROSPERO: CRD42016038187.
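For readers unfamiliar with the pooling step, the sketch below shows inverse-variance pooling of per-study mean differences, the basic building block of such a meta-analysis; the review itself reports substantial heterogeneity (I² = 77%), for which a random-effects model is the appropriate refinement.

```python
import numpy as np

def pooled_mean_difference(effects, variances):
    """Fixed-effect (inverse-variance) pooled estimate with a 95% CI."""
    w = 1.0 / np.asarray(variances, dtype=float)   # weight = 1 / study variance
    est = np.sum(w * np.asarray(effects, dtype=float)) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, (est - 1.96 * se, est + 1.96 * se)
```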
NASA Astrophysics Data System (ADS)
Linn, Marcia C.
1995-06-01
Designing effective curricula for complex topics and incorporating technological tools is an evolving process. One important way to foster effective design is to synthesize successful practices. This paper describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering. One course enhancement, the LISP Knowledge Integration Environment, improved learning and resulted in more gender-equitable outcomes. The second course enhancement, the spatial reasoning environment, addressed spatial reasoning in an introductory engineering course. This enhancement minimized the importance of prior knowledge of spatial reasoning and helped students develop a more comprehensive repertoire of spatial reasoning strategies. Taken together, the instructional research programs reinforce the value of the scaffolded knowledge integration framework and suggest directions for future curriculum reformers.
[Coil Embolization of Bronchial Artery Aneurysm;Report of a Case].
Hagiwara, Kenichi; Moriya, Hiroshi; Sato, Yoshiyuki
2018-05-01
An 82-year-old male was admitted due to mild chest discomfort. Enhanced computed tomography showed a large bronchial artery aneurysm (BAA) of 26×27 mm at the left hilum. To avoid rupture of the BAA, coil embolization alone was performed, and the BAA has shown no enlargement over the 4 years since. In general, coil embolization alone should be indicated in patients whose BAA has a stalk, because thoracic endovascular aortic repair (TEVAR) is off-label for this indication and has poor cost performance. TEVAR would be considered only as a last resort in case of BAA enlargement even after coil embolization.
High-brightness displays in integrated weapon sight systems
NASA Astrophysics Data System (ADS)
Edwards, Tim; Hogan, Tim
2014-06-01
In the past several years Kopin has demonstrated the ability to provide ultra-high-brightness, low-power display solutions in VGA, SVGA, SXGA, and 2k x 2k display formats. This paper reviews various approaches for integrating high-brightness overlay displays with existing direct-view rifle sights to augment their precision aiming and targeting capability. Examples of overlay display system solutions are presented and discussed. The paper also reviews significant capability enhancements that are possible when augmenting the real-world view through a rifle sight with other soldier system equipment, including laser range finders, ballistic computers, and sensor systems.
Central neurocytoma presenting with gigantism: case report.
Araki, Y; Sakai, N; Andoh, T; Yoshimura, S; Yamada, H
1992-08-01
We report a case of central neurocytoma presenting with gigantism. The patient was a 19-year-old man with a 2-year history of rapid growth. Computed tomography revealed a round, slightly enhancing calcified tumor in the septal region. This lesion was resected, and postoperative radiotherapy was given. The preoperative serum growth hormone level was 20.7 ng/mL, and postoperatively this fell to 0.9 ng/mL. Pituitary dysfunction was not noted either before or after the operation. A low level of production of growth hormone releasing factor was detected when tumor cells obtained during surgery were cultured.
Real-time motion artifacts compensation of ToF sensors data on GPU
NASA Astrophysics Data System (ADS)
Lefloch, Damien; Hoegg, Thomas; Kolb, Andreas
2013-05-01
Over the last decade, ToF sensors have attracted many computer vision and graphics researchers. Nevertheless, ToF devices suffer from severe motion artifacts in dynamic scenes as well as low-resolution depth data, which strongly motivates a valid correction. To counterbalance these effects, a pre-processing approach is introduced that greatly improves range image data for dynamic scenes. We first demonstrate the robustness of our approach on simulated data and then validate the method on real sensor range data. Our GPU-based processing pipeline enhances range data reliability in real time.
Solutions for medical databases optimal exploitation
Branescu, I; Purcarea, VL; Dobrescu, R
2014-01-01
The paper discusses methods for applying OLAP techniques to multidimensional databases that leverage an existing performance-enhancing technique known as practical pre-aggregation, making this technique relevant to a much wider range of medical applications as logistical support for data warehousing. The transformations have low computational complexity in practice and may be implemented using standard relational database technology. The paper also describes how to integrate the transformed hierarchies into current OLAP systems, transparently to the user, and proposes a flexible, "multimodel" federated system for extending OLAP querying to external object databases. PMID:24653769
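A toy illustration of practical pre-aggregation follows, with invented table and column names: a summary is materialized once at a fine level of the dimension hierarchy, and coarser OLAP roll-ups are then answered from that small summary rather than the raw fact table.

```python
import pandas as pd

visits = pd.DataFrame({
    "department": ["cardiology", "cardiology", "oncology", "oncology"],
    "ward":       ["A", "B", "C", "C"],
    "cost":       [120.0, 80.0, 200.0, 150.0],
})

# Materialize the pre-aggregate once at (department, ward) granularity.
by_ward = visits.groupby(["department", "ward"], as_index=False)["cost"].sum()

# Roll up to department level from the pre-aggregate, not the raw data.
by_department = by_ward.groupby("department", as_index=False)["cost"].sum()
```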
Enhanced fault-tolerant quantum computing in d-level systems.
Campbell, Earl T
2014-12-05
Error-correcting codes protect quantum information and form the basis of fault-tolerant quantum computing. Leading proposals for fault-tolerant quantum computation require codes with an exceedingly rare property, a transversal non-Clifford gate. Codes with the desired property are presented for d-level qudit systems with prime d. The codes use n=d-1 qudits and can detect up to ∼d/3 errors. We quantify the performance of these codes for one approach to quantum computation known as magic-state distillation. Unlike prior work, we find performance is always enhanced by increasing d.
Thermally assisted adiabatic quantum computation.
Amin, M H S; Love, Peter J; Truncik, C J S
2008-02-15
We study the effect of a thermal environment on adiabatic quantum computation using the Bloch-Redfield formalism. We show that in certain cases the environment can enhance the performance in two different ways: (i) by introducing a time scale for thermal mixing near the anticrossing that is smaller than the adiabatic time scale, and (ii) by relaxation after the anticrossing. The former can enhance the scaling of computation when the environment is super-Ohmic, while the latter can only provide a prefactor enhancement. We apply our method to the case of adiabatic Grover search and show that performance better than classical is possible with a super-Ohmic environment, with no a priori knowledge of the energy spectrum.
Hybrid discrete/continuum algorithms for stochastic reaction networks
Safta, Cosmin; Sargsyan, Khachik; Debusschere, Bert; ...
2014-10-22
Direct solutions of the Chemical Master Equation (CME) governing Stochastic Reaction Networks (SRNs) are generally prohibitively expensive due to the excessive number of possible discrete states in such systems. To enhance computational efficiency we develop a hybrid approach where the evolution of states with low molecule counts is treated with the discrete CME model while that of states with large molecule counts is modeled by the continuum Fokker-Planck equation. The Fokker-Planck equation is discretized using a second-order finite volume approach with appropriate treatment of flux components to avoid negative probability values. The numerical construction at the interface between the discrete and continuum regions implements the transfer of probability reaction by reaction according to the stoichiometry of the system. The performance of this novel hybrid approach is explored for a two-species circadian model, with computational efficiency gains of about one order of magnitude.
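As a sketch of the discrete half of the hybrid, the function below takes one explicit Euler step of the CME for a one-species birth-death process; the time stepping, rate callables, and truncation at the top state are illustrative assumptions, and in the hybrid method high-count states would instead live on the finite-volume Fokker-Planck grid.

```python
import numpy as np

def cme_step(p, birth, death, dt):
    """One Euler step of dp_n/dt = b(n-1)p(n-1) + d(n+1)p(n+1) - (b(n)+d(n))p(n).

    p is the probability vector over molecule counts 0..N; birth/death are
    propensity callables; probability leaking past N is ignored (truncation).
    """
    n = np.arange(p.size)
    dp = np.zeros_like(p)
    dp[1:] += birth(n[:-1]) * p[:-1]   # inflow from n-1 via a birth
    dp[:-1] += death(n[1:]) * p[1:]    # inflow from n+1 via a death
    dp -= (birth(n) + death(n)) * p    # outflow from n
    return p + dt * dp
```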
Systems Engineering and Integration (SE and I)
NASA Technical Reports Server (NTRS)
Chevers, ED; Haley, Sam
1990-01-01
The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems that can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements include: interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; and advanced avionics laboratories and rapid prototyping. This presentation is represented by viewgraphs only.
NASA Technical Reports Server (NTRS)
Rousseau, J.; Hwang, K. C.
1975-01-01
Investigations aimed at the optimization of a baseline Rankine cycle solar powered air conditioner and the development of a preliminary system specification were conducted. Efforts encompassed the following: (1) investigations of the use of recuperators/regenerators to enhance the performance of the baseline system, (2) development of an off-design computer program for system performance prediction, (3) optimization of the turbocompressor design to cover a broad range of conditions and permit operation at low heat source water temperatures, (4) generation of parametric data describing system performance (COP and capacity), (5) development and evaluation of candidate system augmentation concepts and selection of the optimum approach, (6) generation of auxiliary power requirement data, (7) development of a complete solar collector-thermal storage-air conditioner computer program, (8) evaluation of the baseline Rankine air conditioner over a five day period simulating the NASA solar house operation, and (9) evaluation of the air conditioner as a heat pump.
LCAMP: Location Constrained Approximate Message Passing for Compressed Sensing MRI
Sung, Kyunghyun; Daniel, Bruce L; Hargreaves, Brian A
2016-01-01
Iterative thresholding methods have been extensively studied as faster alternatives to convex optimization methods for solving large-sized problems in compressed sensing. A novel iterative thresholding method called LCAMP (Location Constrained Approximate Message Passing) is presented for reducing computational complexity and improving reconstruction accuracy when a nonzero location (or sparse support) constraint can be obtained from view shared images. LCAMP modifies the existing approximate message passing algorithm by replacing the thresholding stage with a location constraint, which avoids adjusting regularization parameters or thresholding levels. This work is first compared with other conventional reconstruction methods using random 1D signals and then applied to dynamic contrast-enhanced breast MRI to demonstrate the excellent reconstruction accuracy (less than 2% absolute difference) and low computation time (5 - 10 seconds using Matlab) with highly undersampled 3D data (244 × 128 × 48; overall reduction factor = 10). PMID:23042658
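A conceptual sketch of the LCAMP idea appears below, assuming a generic AMP-style iteration in which the thresholding step is replaced by projection onto a known support; A, y, the boolean support mask, and the iteration count are placeholders, and the published algorithm's details (e.g., its handling of the MRI sampling operator) are not reproduced.

```python
import numpy as np

def lcamp_like(A, y, support, n_iter=30):
    """AMP-style recovery with a location (support) constraint.

    A       : (m, n) sensing matrix, columns roughly unit-norm
    y       : (m,) measurements
    support : (n,) boolean mask of allowed nonzero locations
    """
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        r = x + A.T @ z                    # pseudo-data (matched-filter step)
        x_new = np.where(support, r, 0.0)  # location constraint replaces thresholding
        b = support.mean() * n / m         # Onsager term: fraction kept / undersampling
        z = y - A @ x_new + b * z
        x = x_new
    return x
```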