Sample records for generic code content

  1. Similarities and differences in dream content at the cross-cultural, gender, and individual levels.

    PubMed

    Domhoff, G. William; Schneider, Adam

    2008-12-01

    The similarities and differences in dream content at the cross-cultural, gender, and individual levels provide one starting point for carrying out studies that attempt to discover correspondences between dream content and various types of waking cognition. Hobson and Kahn's (Hobson, J. A., & Kahn, D. (2007). Dream content: Individual and generic aspects. Consciousness and Cognition, 16, 850-858.) conclusion that dream content may be more generic than most researchers realize, and that individual differences are less salient than usually thought, provides the occasion for a review of findings based on the Hall and Van de Castle (Hall, C., & Van de Castle, R. (1966). The content analysis of dreams. New York: Appleton-Century-Crofts.) coding system for the study of dream content. Then new findings based on a computationally intensive randomization strategy are presented to show the minimum sample sizes needed to detect gender and individual differences in dream content. Generally speaking, sample sizes of 100-125 dream reports are needed because most dream elements appear in less than 50% of dream reports and the magnitude of the differences usually is not large.

  2. Establishing a Link Between Prescription Drug Abuse and Illicit Online Pharmacies: Analysis of Twitter Data.

    PubMed

    Katsuki, Takeo; Mackey, Tim Ken; Cuomo, Raphael

    2015-12-16

    Youth and adolescent non-medical use of prescription medications (NUPM) has become a national epidemic. However, little is known about the association between promotion of NUPM behavior and access via the popular social media microblogging site, Twitter, which is currently used by a third of all teens. In order to better assess NUPM behavior online, this study conducts surveillance and analysis of Twitter data to characterize the frequency of NUPM-related tweets and also identifies illegal access to drugs of abuse via online pharmacies. Tweets were collected over a 2-week period from April 1-14, 2015, by applying NUPM keyword filters for both generic/chemical and street names associated with drugs of abuse using the Twitter public streaming application programming interface. Tweets were then analyzed for relevance to NUPM and whether they promoted illegal online access to prescription drugs using a protocol of content coding and supervised machine learning. A total of 2,417,662 tweets were collected and analyzed for this study. Tweets filtered for generic drug names comprised 232,108 tweets, including 22,174 unique associated uniform resource locators (URLs), and tweets filtered for street names comprised 2,185,554 tweets (376,304 unique URLs). Applying an iterative process of manual content coding and supervised machine learning, 81.72% of the generic and 12.28% of the street NUPM datasets were predicted as having content relevant to NUPM. By examining hyperlinks associated with NUPM-relevant content for the generic Twitter dataset, we discovered that 75.72% of the tweets with URLs included a hyperlink to an online marketing affiliate that directly linked to an illicit online pharmacy advertising the sale of Valium without a prescription. This study examined the association between Twitter content, NUPM behavior promotion, and online access to drugs using a broad set of prescription drug keywords.
Initial results are concerning, as our study found over 45,000 tweets that directly promoted NUPM by providing a URL that actively marketed the illegal online sale of prescription drugs of abuse. Additional research is needed to further establish the link between Twitter content and NUPM, as well as to help inform future technology-based tools, online health promotion activities, and public policy to combat NUPM online.
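
    The collection-and-coding pipeline this abstract describes (keyword filters for generic/chemical vs. street drug names, then URL extraction for downstream content coding) can be sketched as follows. The keyword lists, tweets, and function names here are illustrative assumptions, not the study's actual filters, and the real study consumed the Twitter public streaming API rather than an in-memory list.

```python
# Sketch of the keyword-filtering step: bucket tweets into "generic" and
# "street" datasets by drug-name keyword, then collect the URLs that would
# be passed on to content coding. All names and data here are hypothetical.
import re

GENERIC_NAMES = {"diazepam", "oxycodone", "alprazolam"}  # generic/chemical names
STREET_NAMES = {"xannies", "oxys", "bars"}               # street names

URL_RE = re.compile(r"https?://\S+")

def filter_tweets(tweets):
    """Split a tweet stream into generic-name and street-name datasets."""
    datasets = {"generic": [], "street": []}
    for text in tweets:
        words = set(text.lower().split())
        if words & GENERIC_NAMES:
            datasets["generic"].append(text)
        if words & STREET_NAMES:
            datasets["street"].append(text)
    return datasets

def extract_urls(tweets):
    """Collect unique URLs for downstream content coding."""
    urls = set()
    for text in tweets:
        urls.update(URL_RE.findall(text))
    return urls
```

    In the study itself, the filtered datasets were then labeled by manual coding and a supervised classifier; the sketch stops at the filtering and URL-extraction stage.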

  3. Establishing a Link Between Prescription Drug Abuse and Illicit Online Pharmacies: Analysis of Twitter Data

    PubMed Central

    Cuomo, Raphael

    2015-01-01

    Background Youth and adolescent non-medical use of prescription medications (NUPM) has become a national epidemic. However, little is known about the association between promotion of NUPM behavior and access via the popular social media microblogging site, Twitter, which is currently used by a third of all teens. Objective In order to better assess NUPM behavior online, this study conducts surveillance and analysis of Twitter data to characterize the frequency of NUPM-related tweets and also identifies illegal access to drugs of abuse via online pharmacies. Methods Tweets were collected over a 2-week period from April 1-14, 2015, by applying NUPM keyword filters for both generic/chemical and street names associated with drugs of abuse using the Twitter public streaming application programming interface. Tweets were then analyzed for relevance to NUPM and whether they promoted illegal online access to prescription drugs using a protocol of content coding and supervised machine learning. Results A total of 2,417,662 tweets were collected and analyzed for this study. Tweets filtered for generic drug names comprised 232,108 tweets, including 22,174 unique associated uniform resource locators (URLs), and tweets filtered for street names comprised 2,185,554 tweets (376,304 unique URLs). Applying an iterative process of manual content coding and supervised machine learning, 81.72% of the generic and 12.28% of the street NUPM datasets were predicted as having content relevant to NUPM. By examining hyperlinks associated with NUPM-relevant content for the generic Twitter dataset, we discovered that 75.72% of the tweets with URLs included a hyperlink to an online marketing affiliate that directly linked to an illicit online pharmacy advertising the sale of Valium without a prescription. Conclusions This study examined the association between Twitter content, NUPM behavior promotion, and online access to drugs using a broad set of prescription drug keywords.
Initial results are concerning, as our study found over 45,000 tweets that directly promoted NUPM by providing a URL that actively marketed the illegal online sale of prescription drugs of abuse. Additional research is needed to further establish the link between Twitter content and NUPM, as well as to help inform future technology-based tools, online health promotion activities, and public policy to combat NUPM online. PMID:26677966

  4. Relations Between Narrative Coherence, Identity, and Psychological Well-being in Emerging Adulthood

    PubMed Central

    Waters, Theodore E. A.; Fivush, Robyn

    2014-01-01

    Objective The hypothesis that the ability to construct a coherent account of personal experience is reflective, or predictive, of psychological adjustment cuts across numerous domains of psychological science. It has been argued that coherent accounts of identity are especially adaptive. We tested these hypotheses by examining relations between narrative coherence of personally significant autobiographical memories and three psychological well-being components (Purpose and Meaning; Positive Self View; Positive Relationships). We also examined the potential moderation of the relations between coherence and well-being by assessing the identity content of each narrative. Method We collected two autobiographical narratives of personally significant events from 103 undergraduate students and coded them for coherence and identity content. Two additional narratives about generic/recurring events were also collected and coded for coherence. Results We confirmed the prediction that constructing coherent autobiographical narratives is related to psychological well-being. Further, we found that this relation was moderated by the narratives’ relevance to identity and that this moderation held after controlling for narrative ability more generally (i.e. coherence of generic/recurring events). Conclusion These data lend strong support to the coherent narrative identity hypothesis and the prediction that unique events are a critical feature of identity construction in emerging adulthood. PMID:25110125

  5. Synthesizing Safety Conditions for Code Certification Using Meta-Level Programming

    NASA Technical Reports Server (NTRS)

    Eusterbrock, Jutta

    2004-01-01

    In code certification the code consumer publishes a safety policy and the code producer generates a proof that the produced code complies with the published safety policy. This paper takes a novel, reuse-oriented view of implementing a framework for code certification. It adopts ingredients from Necula's approach to proof-carrying code, but here safety properties can be analyzed at a higher code level than assembly language instructions. The framework consists of three parts: (1) The specification language is extended to include generic preconditions that ensure safety at all states reachable during program execution. Actual safety requirements can be expressed by providing domain-specific definitions for the generic predicates, which act as an interface to the environment. (2) The Floyd-Hoare inductive assertion method is refined to obtain proof rules that allow the derivation of proof obligations in terms of the generic safety predicates. (3) A meta-interpreter is designed and experimentally implemented that automatically synthesizes proof obligations for submitted programs by applying the modified Floyd-Hoare rules. The proof obligations have two separate conjuncts, one for functional correctness and another for the generic safety obligations. Proving the generic obligations, with the actual safety definitions provided as context, ensures domain-specific safety of program execution in a particular environment and is simpler than full program verification.
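
    The division of labor the abstract describes, a generic safety predicate defined by the code consumer and required to hold at every state reached during execution, can be illustrated with a small sketch. For concreteness this sketch checks the predicate dynamically; the paper's framework discharges the obligations statically via modified Floyd-Hoare rules, and every name below is invented for illustration.

```python
# Hypothetical sketch: safety is a generic predicate supplied by the
# consumer; the checker demands it hold at the initial state and after
# every assignment in a straight-line program.
def run_certified(program, state, safe):
    """Execute a list of (var, fn) assignments, checking the generic
    safety predicate `safe` at every intermediate state."""
    if not safe(state):
        raise ValueError(f"initial state violates safety policy: {state}")
    for var, fn in program:
        state = {**state, var: fn(state)}  # functional update of the state
        if not safe(state):
            raise ValueError(f"safety violated after assigning {var}: {state}")
    return state

# Domain-specific instantiation of the generic predicate: every variable
# must stay within an (invented) array bound of 10.
in_bounds = lambda s: all(0 <= v <= 10 for v in s.values())
```

    Swapping in a different `safe` predicate changes the safety policy without touching the checker, which is the reuse the generic-predicate interface is meant to buy.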

  6. Rocketdyne/Westinghouse nuclear thermal rocket engine modeling

    NASA Technical Reports Server (NTRS)

    Glass, James F.

    1993-01-01

    The topics are presented in viewgraph form and include the following: systems approach needed for nuclear thermal rocket (NTR) design optimization; generic NTR engine power balance codes; Rocketdyne nuclear thermal system code; software capabilities; steady-state model; NTR engine optimizer code logic; reactor power calculation logic; sample multi-component configuration; NTR design code output; generic NTR code at Rocketdyne; Rocketdyne NTR model; and nuclear thermal rocket modeling directions.

  7. The SENSEI Generic In Situ Interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayachit, Utkarsh; Whitlock, Brad; Wolf, Matthew

    The SENSEI generic in situ interface is an API that promotes code portability and reusability. From the simulation view, a developer can instrument their code with the SENSEI API and then make use of any number of in situ infrastructures. From the method view, a developer can write an in situ method using the SENSEI API, then expect it to run in any number of in situ infrastructures, or be invoked directly from a simulation code, with little or no modification. This paper presents the design principles underlying the SENSEI generic interface, along with some simplified coding examples.

  8. A generic coding approach for the examination of meal patterns.

    PubMed

    Woolhead, Clara; Gibney, Michael J; Walsh, Marianne C; Brennan, Lorraine; Gibney, Eileen R

    2015-08-01

    Meal pattern analysis can be complex because of the large variability in meal consumption. The use of aggregated, generic meal data may address some of these issues. The objective was to develop a meal coding system and use it to explore meal patterns. Dietary data were used from the National Adult Nutrition Survey (2008-2010), which collected 4-d food diary information from 1500 healthy adults. Self-recorded meal types were listed for each food item. Common food group combinations were identified to generate a number of generic meals for each meal type: breakfast, light meals, main meals, snacks, and beverages. Mean nutritional compositions of the generic meals were determined and substituted into the data set to produce a generic meal data set. Statistical comparisons were performed against the original National Adult Nutrition Survey data. Principal component analysis was carried out by using these generic meals to identify meal patterns. A total of 21,948 individual meals were reduced to 63 generic meals. Good agreement was seen for nutritional comparisons (original vs. generic data sets, mean ± SD), such as fat (75.7 ± 29.4 and 71.7 ± 12.9 g, respectively, P = 0.243) and protein (83.3 ± 26.9 and 80.1 ± 13.4 g, respectively, P = 0.525). Similarly, Bland-Altman plots demonstrated good agreement (<5% outside limits of agreement) for many nutrients, including protein, saturated fat, and polyunsaturated fat. Twelve meal patterns were identified from the principal component analysis, differing in which meal types were included or excluded, in energy density, and in meal constituents. A novel meal coding system was developed, and dietary intake data were recoded by using generic meal consumption data. Analysis revealed that the generic meal coding system may be appropriate when examining nutrient intakes in the population. Furthermore, such a coding system was shown to be suitable for use in determining meal-based dietary patterns.
© 2015 American Society for Nutrition.

  9. Newspaper Articles Related to the Not Criminally Responsible on Account of Mental Disorder (NCRMD) Designation: A Comparative Analysis.

    PubMed

    Whitley, Rob; Wang, JiaWei; Carmichael, Victoria; Wellen, Ruth

    2017-10-01

    The not criminally responsible on account of mental disorder (NCRMD) designation remains widely misunderstood by the public. Such misunderstandings may also be reflected in the media. As such, the aim of this study is to conduct a preliminary examination of the tone and content of recent Canadian newspaper articles where NCRMD is a major theme, comparing these to generic articles about mental illness. Articles about mental illness were gathered from major Canadian newspapers. These were then divided into two categories: 1) articles where NCRMD was a major theme and 2) articles where NCRMD was not a major theme. Articles were then coded for the presence or absence of 1) a negative tone, 2) stigmatising tone/content, 3) recovery/rehabilitation as a theme, and 4) shortage of resources/poor quality of care as a theme. The retrieval strategy resulted in 940 articles. Fourteen percent (n = 131) of all articles had NCRMD as a major theme. In comparison to generic articles about mental illness, articles with NCRMD as a major theme were significantly more likely to have a negative tone (P < 0.001) and stigmatising tone/content (P < 0.001) and significantly less likely to have recovery/rehabilitation (P < 0.001) or shortage of resources/poor quality of care as a theme (P < 0.001). Articles with NCRMD as a theme were overwhelmingly negative and almost never focused on recovery or rehabilitation, in stark comparison to generic articles about mental illness.

  10. Newspaper Articles Related to the Not Criminally Responsible on Account of Mental Disorder (NCRMD) Designation: A Comparative Analysis

    PubMed Central

    Wang, JiaWei; Carmichael, Victoria; Wellen, Ruth

    2017-01-01

    Objective: The not criminally responsible on account of mental disorder (NCRMD) designation remains widely misunderstood by the public. Such misunderstandings may also be reflected in the media. As such, the aim of this study is to conduct a preliminary examination of the tone and content of recent Canadian newspaper articles where NCRMD is a major theme, comparing these to generic articles about mental illness. Methods: Articles about mental illness were gathered from major Canadian newspapers. These were then divided into two categories: 1) articles where NCRMD was a major theme and 2) articles where NCRMD was not a major theme. Articles were then coded for the presence or absence of 1) a negative tone, 2) stigmatising tone/content, 3) recovery/rehabilitation as a theme, and 4) shortage of resources/poor quality of care as a theme. Results: The retrieval strategy resulted in 940 articles. Fourteen percent (n = 131) of all articles had NCRMD as a major theme. In comparison to generic articles about mental illness, articles with NCRMD as a major theme were significantly more likely to have a negative tone (P < 0.001) and stigmatising tone/content (P < 0.001) and significantly less likely to have recovery/rehabilitation (P < 0.001) or shortage of resources/poor quality of care as a theme (P < 0.001). Conclusions: Articles with NCRMD as a theme were overwhelmingly negative and almost never focused on recovery or rehabilitation, in stark comparison to generic articles about mental illness. PMID:28697626

  11. Services provided by community pharmacies in Wayne County, Michigan: a comparison by ZIP code characteristics.

    PubMed

    Erickson, Steven R; Workman, Paul

    2014-01-01

    To document the availability of selected pharmacy services and out-of-pocket cost of medication throughout a diverse county in Michigan and to assess possible associations between availability of services and price of medication and characteristics of residents of the ZIP codes in which the pharmacies were located. Cross-sectional telephone survey of pharmacies coupled with ZIP code-level census data. 503 pharmacies throughout the 63 ZIP codes of Wayne County, MI. The out-of-pocket cost for a 30 days' supply of levothyroxine 50 mcg and brand-name atorvastatin (Lipitor-Pfizer) 20 mg, availability of discount generic drug programs, home delivery of medications, hours of pharmacy operation, and availability of pharmacy-based immunization services. Census data aggregated at the ZIP code level included race, annual household income, age, and number of residents per pharmacy. The overall results per ZIP code showed that the average cost was $10.01 ± $2.29 for levothyroxine and $140.45 ± $14.70 for Lipitor. Per ZIP code, the mean (± SD) percentage of pharmacies offering discount generic drug programs was 66.9% ± 15.0%; home delivery of medications, 44.5% ± 22.7%; and immunization for influenza, 46.7% ± 24.3%. The mean (± SD) hours of operation per pharmacy per ZIP code was 67.0 ± 25.2. ZIP codes with higher household income as well as a higher percentage of white residents had lower levothyroxine prices, a greater percentage of pharmacies offering discount generic drug programs, more hours of operation per week, and more pharmacy-based immunization services. The cost of Lipitor was not associated with any ZIP code characteristic. Disparities in the cost of generic levothyroxine, the availability of services such as discount generic drug programs, hours of operation, and pharmacy-based immunization services are evident based on race and household income within this diverse metropolitan county.

  12. Pharmaceutical quality of generic isotretinoin products, compared with Roaccutane.

    PubMed

    Taylor, Peter W; Keenan, Michael H J

    2006-03-01

    Isotretinoin is the drug of choice for the management of severe recalcitrant nodular acne. Several generic products are available. However, their pharmaceutical quality, in particular particle size distribution, which may affect safety and efficacy, is unknown. Hence, prescribing of some generic products may be problematic. To assess the pharmaceutical quality of 14 generic isotretinoin products compared with Roaccutane (F. Hoffmann-La Roche Ltd). Tests were performed according to Roche standard procedures and European and US pharmacopoeia specifications. Tests included isotretinoin content, identity and amount of impurities and degradation products, effect of accelerated shelf-life studies on stability, particle size distribution, and composition of non-active ingredients. The 14 isotretinoin products differed by 30-fold in median particle size and showed variation in their non-active ingredients. The average isotretinoin content of Acnotin and Acne-Tretin fell outside the 95-105% Roche specifications. Following accelerated shelf-life tests, only four products retained isotretinoin content within Roche specifications, whilst Acne-Tretin (the only powder formulation) lost 72.5% of its isotretinoin content. Two generic products exceeded the ±2% specification (Ph. Eur.) and a further three exceeded the ±1% specification (USP) for tretinoin content; eight exceeded the 2.54% specification for total impurities and six contained ≥5 unknown impurities. Isotretinoin-5,6-epoxide content exceeded the 1.04% specification in five generic products. Thirteen generic products failed to match Roaccutane in one or more tests and 11 failed in three or more tests. It cannot be assumed that all generic isotretinoin products are as therapeutically effective or safe as Roaccutane.

  13. Content Validity of Patient-Reported Outcome Instruments used with Pediatric Patients with Facial Differences: A Systematic Review.

    PubMed

    Wickert, Natasha M; Wong Riff, Karen W Y; Mansour, Mark; Forrest, Christopher R; Goodacre, Timothy E E; Pusic, Andrea L; Klassen, Anne F

    2018-01-01

    Objective The aim of this systematic review was to identify patient-reported outcome (PRO) instruments used in research with children/youth with conditions associated with facial differences to identify the health concepts measured. Design MEDLINE, EMBASE, CINAHL, and PsycINFO were searched from 2004 to 2016 to identify PRO instruments used in acne vulgaris, birthmarks, burns, ear anomalies, facial asymmetries, and facial paralysis patients. We performed a content analysis whereby the items were coded to identify concepts and categorized as positive or negative content or phrasing. Results A total of 7,835 articles were screened; 6 generic and 11 condition-specific PRO instruments were used in 96 publications. Condition-specific instruments were for acne (four), oral health (two), dermatology (one), facial asymmetries (two), microtia (one), and burns (one). The PRO instruments provided 554 items (295 generic; 259 condition specific) that were sorted into 4 domains, 11 subdomains, and 91 health concepts. The most common domain was psychological (n = 224 items). Of the identified items, 76% had negative content or phrasing (e.g., "Because of the way my face looks I wish I had never been born"). Given the small number of items measuring facial appearance (n = 19) and function (n = 22), the PRO instruments reviewed lacked content validity for patients whose condition impacted facial function and/or appearance. Conclusions Treatments can change facial appearance and function. This review draws attention to a problem with content validity in existing PRO instruments. Our team is now developing a new PRO instrument called FACE-Q Kids to address this problem.

  14. Knowledge and abilities catalog for nuclear power plant operators: Boiling water reactors, Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-08-01

    The Knowledge and Abilities Catalog for Nuclear Power Plant Operators: Boiling-Water Reactors (BWRs) (NUREG-1123, Revision 1) provides the basis for the development of content-valid licensing examinations for reactor operators (ROs) and senior reactor operators (SROs). The examinations developed using the BWR Catalog, along with the Operator Licensing Examiner Standards (NUREG-1021) and the Examiner's Handbook for Developing Operator Licensing Written Examinations (NUREG/BR-0122), will cover the topics listed under Title 10, Code of Federal Regulations, Part 55 (10 CFR 55). The BWR Catalog contains approximately 7,000 knowledge and ability (K/A) statements for ROs and SROs at BWRs. The catalog is organized into six major sections: Organization of the Catalog, Generic Knowledge and Ability Statements, Plant Systems Grouped by Safety Functions, Emergency and Abnormal Plant Evolutions, Components, and Theory. Revision 1 to the BWR Catalog represents a modification in form and content of the original catalog. The K/As were linked to their applicable 10 CFR 55 item numbers. SRO-level K/As were identified by 10 CFR 55.43 item numbers. The plant-wide generic and system generic K/As were combined in one section with approximately one hundred new K/As. Component Cooling Water and Instrument Air Systems were added to the Systems section. Finally, High Containment Hydrogen Concentration and Plant Fire On Site evolutions were added to the Emergency and Abnormal Plant Evolutions section.

  15. Generic Kalman Filter Software

    NASA Technical Reports Server (NTRS)

    Lisano, Michael E., II; Crues, Edwin Z.

    2005-01-01

    The Generic Kalman Filter (GKF) software provides a standard basis for the development of application-specific Kalman-filter programs. Historically, Kalman filters have been implemented by customized programs that must be written, coded, and debugged anew for each unique application, then tested and tuned with simulated or actual measurement data. Total development times for typical Kalman-filter application programs have ranged from weeks to months. The GKF software can simplify the development process and reduce the development time by eliminating the need to re-create the fundamental implementation of the Kalman filter for each new application. The GKF software is written in the ANSI C programming language. It contains a generic Kalman-filter-development directory that, in turn, contains a code for a generic Kalman filter function; more specifically, it contains a generically designed and generically coded implementation of linear, linearized, and extended Kalman filtering algorithms, including algorithms for state- and covariance-update and -propagation functions. The mathematical theory that underlies the algorithms is well known and has been reported extensively in the open technical literature. Also contained in the directory are a header file that defines generic Kalman-filter data structures and prototype functions and template versions of application-specific subfunction and calling navigation/estimation routine code and headers. Once the user has provided a calling routine and the required application-specific subfunctions, the application-specific Kalman-filter software can be compiled and executed immediately. During execution, the generic Kalman-filter function is called from a higher-level navigation or estimation routine that preprocesses measurement data and post-processes output data.
The generic Kalman-filter function uses the aforementioned data structures and five implementation-specific subfunctions, which have been developed by the user on the basis of the aforementioned templates. The GKF software can be used to develop many different types of unfactorized Kalman filters. A developer can choose to implement either a linearized or an extended Kalman filter algorithm, without having to modify the GKF software. Control dynamics can be taken into account or neglected in the filter-dynamics model. Filter programs developed by use of the GKF software can be made to propagate equations of motion for linear or nonlinear dynamical systems that are deterministic or stochastic. In addition, filter programs can be made to operate in user-selectable "covariance analysis" and "propagation-only" modes that are useful in design and development stages.
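
    The design the abstract describes, a generic filter core that calls user-supplied model subfunctions so only the application-specific parts change between filters, can be sketched in miniature. This sketch is in Python rather than the GKF's ANSI C, uses a scalar linear filter, and all names and the constant-state example are assumptions, not part of the GKF.

```python
# Hypothetical scalar Kalman filter with injectable subfunctions.
# `propagate` and `h` play the role of the GKF's application-specific
# subfunctions (state propagation and measurement model); the gain below
# assumes a direct scalar measurement, h(x) ~ x.
def kalman_step(x, P, z, propagate, h, q, r):
    """One predict/update cycle: returns the new estimate and variance."""
    # Predict: propagate state; process noise q inflates uncertainty.
    x_pred = propagate(x)
    P_pred = P + q
    # Update: fuse measurement z via the Kalman gain.
    K = P_pred / (P_pred + r)
    x_new = x_pred + K * (z - h(x_pred))
    P_new = (1 - K) * P_pred
    return x_new, P_new

# Application-specific subfunctions: a constant, directly observed state.
x, P = 0.0, 1.0
for z in (1.0, 1.0, 1.0):
    x, P = kalman_step(x, P, z,
                       propagate=lambda s: s, h=lambda s: s,
                       q=0.0, r=1.0)
```

    Swapping in different `propagate`/`h` callables (a linearized or nonlinear model) leaves the core untouched, which mirrors the template-plus-subfunction structure the GKF directory provides.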

  16. Comparison of Code Predictions to Test Measurements for Two Orifice Compensated Hydrostatic Bearings at High Reynolds Numbers

    NASA Technical Reports Server (NTRS)

    Keba, John E.

    1996-01-01

    Rotordynamic coefficients obtained from testing two different hydrostatic bearings are compared to values predicted by two different computer programs. The first set of test data is from a relatively long (L/D=1) orifice-compensated hydrostatic bearing tested in water by Texas A&M University (TAMU Bearing No. 9). The second bearing is a shorter (L/D=.37) bearing and was tested in a lower-viscosity fluid by the Rocketdyne Division of Rockwell (Rocketdyne 'Generic' Bearing) at similar rotating speeds and pressures. Computed predictions of bearing rotordynamic coefficients were obtained from the cylindrical seal code 'ICYL', one of the industrial seal codes developed for NASA-LeRC by Mechanical Technology Inc., and from the hydrodynamic bearing code 'HYDROPAD'. The comparison highlights the effect the bearing has on the accuracy of the predictions. The TAMU Bearing No. 9 test data are closely matched by the predictions obtained from the HYDROPAD code (except for added-mass terms), whereas significant differences exist between the data from the Rocketdyne 'Generic' bearing and the code predictions. The results suggest that some aspects of the fluid behavior in the shorter, higher-Reynolds-number 'Generic' bearing may not be modeled accurately in the codes. The ICYL code predictions for flowrate and direct stiffness approximately equal those of HYDROPAD. Significant differences in cross-coupled stiffness and the damping terms were obtained relative to HYDROPAD and both sets of test data. Several observations are included concerning application of the ICYL code.

  17. Pharmaceutical advertisements in prescribing software: an analysis.

    PubMed

    Harvey, Ken J; Vitry, Agnes I; Roughead, Elizabeth; Aroni, Rosalie; Ballenden, Nicola; Faggotter, Ralph

    2005-07-18

    To assess pharmaceutical advertisements in prescribing software, their adherence to code standards, and the opinions of general practitioners regarding the advertisements. Content analysis of advertisements displayed by Medical Director version 2.81 (Health Communication Network, Sydney, NSW) in early 2005; thematic analysis of a debate on this topic held on the General Practice Computer Group email forum (GPCG_talk) during December 2004. Placement, frequency and type of advertisements; their compliance with the Medicines Australia Code of Conduct, and the views of GPs. 24 clinical functions in Medical Director contained advertisements. These included 79 different advertisements for 41 prescription products marketed by 17 companies, including one generic manufacturer. 57 of 60 (95%) advertisements making a promotional claim appeared noncompliant with one or more requirements of the Code. 29 contributors, primarily GPs, posted 174 emails to GPCG_talk; there was little support for these advertisements, but some concern that the price of software would increase if they were removed. We suggest that pharmaceutical promotion in prescribing software should be banned, and inclusion of independent therapeutic information be mandated.

  18. A generic efficient adaptive grid scheme for rocket propulsion modeling

    NASA Technical Reports Server (NTRS)

    Mo, J. D.; Chow, Alan S.

    1993-01-01

    The objective of this research is to develop an efficient, time-accurate numerical algorithm to discretize the Navier-Stokes equations for the prediction of internal one-dimensional, two-dimensional, and axisymmetric flows. A generic, efficient, elliptic adaptive grid generator is implicitly coupled with the Lower-Upper factorization scheme in the development of the ALUNS computer code. Calculations of one-dimensional shock tube wave propagation and two-dimensional shock wave capture, wave-wave interactions, and shock wave-boundary interactions show that the developed scheme is stable, accurate, and extremely robust. The adaptive grid generator produced a very favorable grid network by a grid speed technique. This generic adaptive grid generator is also applied in the PARC and FDNS codes, and the computational results for solid rocket nozzle flowfield and crystal growth modeling by those codes will also be presented at the conference. This research work is being supported by NASA/MSFC.
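
    The core idea of adaptive gridding, redistributing points according to local solution behavior so resolution concentrates where it is needed, can be illustrated in one dimension with a simple equidistribution sketch. The monitor function and all names here are assumptions for illustration; the elliptic generator and its coupling to the LU factorization scheme in the abstract are considerably more involved.

```python
# Hypothetical 1-D grid adaptation: redistribute n points so that the
# monitor w = 1 + |du/dx| is equidistributed over the piecewise-linear
# solution (x, u), clustering points where u varies fastest.
import bisect

def adapt_grid(x, u, n):
    """Return n grid points equidistributing the monitor over (x, u)."""
    # Cumulative monitor integral over each cell.
    cum = [0.0]
    for i in range(len(x) - 1):
        dx = x[i + 1] - x[i]
        w = 1.0 + abs((u[i + 1] - u[i]) / dx)
        cum.append(cum[-1] + w * dx)
    total = cum[-1]
    # Invert the cumulative monitor at n equally spaced levels.
    new_x = []
    for k in range(n):
        target = total * k / (n - 1)
        j = max(min(bisect.bisect_left(cum, target), len(cum) - 1), 1)
        frac = (target - cum[j - 1]) / (cum[j] - cum[j - 1])
        new_x.append(x[j - 1] + frac * (x[j] - x[j - 1]))
    return new_x
```

    For a linear solution the grid stays uniform; near a steep front the inverted monitor pulls points into the high-gradient region, which is the behavior a shock-capturing scheme relies on.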

  19. Considering quality of life for children with cancer: a systematic review of patient-reported outcome measures and the development of a conceptual model.

    PubMed

    Anthony, Samantha J; Selkirk, Enid; Sung, Lillian; Klaassen, Robert J; Dix, David; Scheinemann, Katrin; Klassen, Anne F

    2014-04-01

An appraisal of pediatric cancer-specific quality-of-life (QOL) instruments revealed a lack of clarity about what constitutes QOL in this population. This study addresses this concern by identifying the concepts that underpin the construct of QOL as determined by a content analysis of all patient-reported outcome (PRO) instruments used in childhood cancer research. A systematic review was performed of key databases (i.e., MEDLINE, CINAHL, PsycINFO) to identify studies of QOL in children with cancer. A content analysis process was used to code and categorize all items from generic and cancer-specific PRO instruments. Our objective was to provide clarification regarding the conceptual underpinnings of these instruments, as well as to help inform the development of theory and contribute to building a conceptual framework of QOL for children with cancer. A total of 6,013 English language articles were screened, identifying 148 studies. Ten generic and ten cancer-specific PRO instruments provided 957 items. Content analysis led to the identification of four major domains of QOL (physical, psychological, social, and general health), with 11 subdomains covering 98 different concepts. While all instruments reflected items relating to the broader domains of QOL, there was substantial heterogeneity in terms of the content and variability in the distribution of items. This systematic review and the proposed model represent a useful starting point in the critical appraisal of the conceptual underpinnings of PRO instruments used in pediatric oncology and respond to the need to place such tools under a critical, yet reflective and analytical lens.

  20. Radiation therapy for people with cancer: what do written information materials tell them?

    PubMed

    Smith, S K; Yan, B; Milross, C; Dhillon, H M

    2016-07-01

This study aimed to compare and contrast the contents of different types of written patient information about radiotherapy, namely (1) hospital radiotherapy departments vs. cancer control organisations and (2) generic vs. tumour-specific materials. A coding framework, informed by the existing literature on patients' information needs, was developed and applied to 54 radiotherapy information resources. The framework comprised 12 broad themes: cancer diagnosis, general information about radiotherapy, treatment planning, daily treatment, side effects, self-care management, external radiotherapy, internal radiotherapy, impact on daily activities, post-treatment, psychosocial health and other content, such as a glossary. Materials produced by cancer organisations contained significantly more information than hospital resources on diagnosis, general radiotherapy information, internal radiotherapy and psychosocial health. However, hospital materials provided more information about treatment planning, daily treatment and the impact on daily activities. Compared to generic materials, tumour-specific resources were superior in providing information about diagnosis, daily treatment, side effects, post-treatment and psychosocial health. Information about internal radiotherapy, prognosis and chronic side effects was poorly covered by most resources. Collectively, hospital and cancer organisation resources complement each other in meeting patients' information needs. Identifying ways to consolidate different information sources could help comprehensively address patients' medical and psychosocial information needs about radiotherapy. © 2015 John Wiley & Sons Ltd.

  1. Experimental and computational surface and flow-field results for an all-body hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Lockman, William K.; Lawrence, Scott L.; Cleary, Joseph W.

    1990-01-01

    The objective of the present investigation is to establish a benchmark experimental data base for a generic hypersonic vehicle shape for validation and/or calibration of advanced computational fluid dynamics computer codes. This paper includes results from the comprehensive test program conducted in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel for a generic all-body hypersonic aircraft model. Experimental and computational results on flow visualization, surface pressures, surface convective heat transfer, and pitot-pressure flow-field surveys are presented. Comparisons of the experimental results with computational results from an upwind parabolized Navier-Stokes code developed at Ames demonstrate the capabilities of this code.

  2. Adapting the coping in deliberation (CODE) framework: a multi-method approach in the context of familial ovarian cancer risk management.

    PubMed

    Witt, Jana; Elwyn, Glyn; Wood, Fiona; Rogers, Mark T; Menon, Usha; Brain, Kate

    2014-11-01

To test whether the coping in deliberation (CODE) framework can be adapted to a specific preference-sensitive medical decision: risk-reducing bilateral salpingo-oophorectomy (RRSO) in women at increased risk of ovarian cancer. We performed a systematic literature search to identify issues important to women during deliberations about RRSO. Three focus groups with patients (most were pre-menopausal and untested for genetic mutations) and 11 interviews with health professionals were conducted to determine which issues mattered in the UK context. Data were used to adapt the generic CODE framework. The literature search yielded 49 relevant studies, which highlighted various issues and coping options important during deliberations, including mutation status, risks of surgery, family obligations, physician recommendation, peer support and reliable information sources. Consultations with UK stakeholders confirmed most of these factors as pertinent influences on deliberations. Questions in the generic framework were adapted to reflect the issues and coping options identified. The generic CODE framework was readily adapted to a specific preference-sensitive medical decision, showing that coping and deliberation are linked when women deliberate about RRSO. Adapted versions of the CODE framework may be used to develop tailored decision support methods and materials in order to improve patient-centred care. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  3. Trading Speed and Accuracy by Coding Time: A Coupled-circuit Cortical Model

    PubMed Central

    Standage, Dominic; You, Hongzhi; Wang, Da-Hui; Dorris, Michael C.

    2013-01-01

    Our actions take place in space and time, but despite the role of time in decision theory and the growing acknowledgement that the encoding of time is crucial to behaviour, few studies have considered the interactions between neural codes for objects in space and for elapsed time during perceptual decisions. The speed-accuracy trade-off (SAT) provides a window into spatiotemporal interactions. Our hypothesis is that temporal coding determines the rate at which spatial evidence is integrated, controlling the SAT by gain modulation. Here, we propose that local cortical circuits are inherently suited to the relevant spatial and temporal coding. In simulations of an interval estimation task, we use a generic local-circuit model to encode time by ‘climbing’ activity, seen in cortex during tasks with a timing requirement. The model is a network of simulated pyramidal cells and inhibitory interneurons, connected by conductance synapses. A simple learning rule enables the network to quickly produce new interval estimates, which show signature characteristics of estimates by experimental subjects. Analysis of network dynamics formally characterizes this generic, local-circuit timing mechanism. In simulations of a perceptual decision task, we couple two such networks. Network function is determined only by spatial selectivity and NMDA receptor conductance strength; all other parameters are identical. To trade speed and accuracy, the timing network simply learns longer or shorter intervals, driving the rate of downstream decision processing by spatially non-selective input, an established form of gain modulation. Like the timing network's interval estimates, decision times show signature characteristics of those by experimental subjects. Overall, we propose, demonstrate and analyse a generic mechanism for timing, a generic mechanism for modulation of decision processing by temporal codes, and we make predictions for experimental verification. PMID:23592967

  4. 40 CFR 1033.110 - Emission diagnostics-general requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... engine operation. (d) Record and store in computer memory any diagnostic trouble codes showing a... and understand the diagnostic trouble codes stored in the onboard computer with generic tools and...

  5. Patient, physician, pharmacy, and pharmacy benefit design factors related to generic medication use.

    PubMed

    Shrank, William H; Stedman, Margaret; Ettner, Susan L; DeLapp, Dee; Dirstine, June; Brookhart, M Alan; Fischer, Michael A; Avorn, Jerry; Asch, Steven M

    2007-09-01

Increased use of generic medications conserves insurer and patient financial resources and may increase patient adherence. The objective of the study is to evaluate whether physician, patient, pharmacy benefit design, or pharmacy characteristics influence the likelihood that patients will use generic drugs. Observational analysis of 2001-2003 pharmacy claims from a large health plan in the Western United States. We evaluated claims for 5,399 patients who filled a new prescription in at least 1 of 5 classes of chronic medications with generic alternatives. We identified patients initiated on generic drugs and those started on branded medications who switched to generic drugs in the subsequent year. We used generalized estimating equations to perform separate analyses assessing the relationship between independent variables and the probability that patients were initiated on or switched to generic drugs. Of the 5,399 new prescriptions filled, 1,262 (23.4%) were generics. Of those initiated on branded medications, 606 (14.9%) switched to a generic drug in the same class in the subsequent year. After regression adjustment, patients residing in high-income zip codes were more likely to initiate treatment with a generic than patients in low-income regions (RR = 1.29; 95% C.I. 1.04-1.60); medical subspecialists (RR = 0.82; 0.69-0.95) and obstetrician/gynecologists (RR = 0.81; 0.69-0.98) were less likely than generalist physicians to initiate generics. Pharmacy benefit design and pharmacy type were not associated with initiation of generic medications. However, patients were over 2.5 times more likely to switch from branded to generic medications if they were enrolled in 3-tier pharmacy plans (95% C.I. 1.12-6.09), and patients who used mail-order pharmacies were 60% more likely to switch to a generic (95% C.I. 1.18-2.30) after initiating treatment with a branded drug.
Physician and patient factors have an important influence on generic drug initiation, with the patients who live in the poorest zip codes paradoxically receiving generic drugs least often. While tiered pharmacy benefit designs and mail-order pharmacies helped steer patients towards generic medications once the first prescription has been filled, they had little effect on initial prescriptions. Providing patients and physicians with information about generic alternatives may reduce costs and lead to more equitable care.

  6. 3D video coding: an overview of present and upcoming standards

    NASA Astrophysics Data System (ADS)

    Merkle, Philipp; Müller, Karsten; Wiegand, Thomas

    2010-07-01

An overview of existing and upcoming 3D video coding standards is given. Various different 3D video formats are available, each with individual pros and cons. The 3D video formats can be separated into two classes: video-only formats (such as stereo and multiview video) and depth-enhanced formats (such as video plus depth and multiview video plus depth). Since all these formats consist of at least two video sequences and possibly additional depth data, efficient compression is essential for the success of 3D video applications and technologies. For the video-only formats, the H.264 family of coding standards already provides efficient and widely established compression algorithms: H.264/AVC simulcast, H.264/AVC stereo SEI message, and H.264/MVC. For the depth-enhanced formats, standardized coding algorithms are currently being developed. New and specially adapted coding approaches are necessary, as the depth or disparity information included in these formats has significantly different characteristics than video and is not displayed directly, but used for rendering. Motivated by evolving market needs, MPEG has started an activity to develop a generic 3D video standard within the 3DVC ad-hoc group. Key features of the standard are efficient and flexible compression of depth-enhanced 3D video representations and decoupling of content creation and display requirements.

  7. Enabling Data Intensive Science through Service Oriented Science: Virtual Laboratories and Science Gateways

    NASA Astrophysics Data System (ADS)

    Lescinsky, D. T.; Wyborn, L. A.; Evans, B. J. K.; Allen, C.; Fraser, R.; Rankine, T.

    2014-12-01

We present collaborative work on a generic, modular infrastructure for virtual laboratories (VLs, similar to science gateways) that combine online access to data, scientific code, and computing resources as services that support multiple data intensive scientific computing needs across a wide range of science disciplines. We are leveraging access to 10+ PB of earth science data on Lustre filesystems at Australia's National Computational Infrastructure (NCI) Research Data Storage Infrastructure (RDSI) node, co-located with NCI's 1.2 PFlop Raijin supercomputer and a 3000 CPU core research cloud. The development, maintenance and sustainability of VLs are best accomplished through modularisation and standardisation of interfaces between components. Our approach has been to break up tightly-coupled, specialised application packages into modules, with identified best techniques and algorithms repackaged either as data services or scientific tools that are accessible across domains. The data services can be used to manipulate, visualise and transform multiple data types whilst the scientific tools can be used in concert with multiple scientific codes. We are currently designing a scalable generic infrastructure that will handle scientific code as modularised services and thereby enable the rapid and easy deployment of new codes or versions of codes. The goal is to build open source libraries/collections of scientific tools, scripts and modelling codes that can be combined in specially designed deployments. Additional services in development include: provenance, publication of results, monitoring, workflow tools, etc. The generic VL infrastructure will be hosted at NCI, but can access alternative computing infrastructures (i.e., public/private cloud, HPC). The Virtual Geophysics Laboratory (VGL) was developed as a pilot project to demonstrate the underlying technology.
This base is now being redesigned and generalised to develop a Virtual Hazards Impact and Risk Laboratory (VHIRL); any enhancements and new capabilities will be incorporated into the generic VL infrastructure. At the same time, we are scoping seven new VLs and, in the process, identifying other common components to prioritise and focus development.

  8. Performance Analysis of GAME: A Generic Automated Marking Environment

    ERIC Educational Resources Information Center

    Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram

    2008-01-01

    This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…

  9. Generic Critical Thinking Infusion and Course Content Learning in Introductory Psychology

    ERIC Educational Resources Information Center

    Solon, Tom

    2007-01-01

    One group of introductory psychology students received a moderate infusion of generic critical thinking material. The other group did not. Otherwise both groups had the same course content, and took the same pretests and posttests of their critical thinking ability and their knowledge of psychology. The experimental group improved its critical…

  10. A mixed-method study on the generic and ostomy-specific quality of life of cancer and non-cancer ostomy patients.

    PubMed

    Jansen, Femke; van Uden-Kraan, Cornelia F; Braakman, J Annemieke; van Keizerswaard, Paulina M; Witte, Birgit I; Verdonck-de Leeuw, Irma M

    2015-06-01

The aim of this study is to compare the generic and ostomy-specific quality of life (QoL) between cancer and non-cancer ostomy patients using a mixed-method design. All patients with an ostomy participating in the Stomapanel of the Dutch Ostomy Association were asked to complete a generic (RAND-36) and ostomy-specific (Stoma-QoL) QoL questionnaire. In addition, open-ended questions on symptoms, restrictions or adaptations influencing daily life were included. The generic and ostomy-specific QoL between cancer and non-cancer ostomy patients were compared using linear regression analyses. Qualitative responses were analysed using content analysis. In total, 668 patients were included: 379 cancer patients (80 % colorectal, 17 % bladder and 3 % other) and 289 non-cancer patients (38 % ulcerative colitis, 22 % Crohn's disease and 40 % other) with a colostomy (55 %), ileostomy (31 %) and/or urostomy (16 %). Adjusted for gender, age, type of ostomy and time elapsed since ostomy surgery, cancer ostomy patients scored higher (better) on the Stoma-QoL (β = 2.1) and all RAND-36 domains (9.1 < β ≤ 19.5) except mental health compared to non-cancer ostomy patients. Of the 33 themes coded for in the content analysis, fatigue or sleeplessness, leakages, pain, bladder or bowel complaints, physical functioning or activity, travelling or being away from home, other daily activities (including work), clothing and diet were among the 10 most frequently reported themes, although ranking differed between the two patient groups. In addition, cancer ostomy patients frequently reported on the impact on (engaging in a) relationship or sexual intimacy, whereas non-cancer ostomy patients frequently reported being relieved of symptoms and restrictions in daily life. Cancer patients reported better generic and ostomy-specific QoL than non-cancer ostomy patients.
In both cancer and non-cancer ostomy patients, fatigue or sleeplessness, leakages, pain, bladder or bowel complaints, physical functioning or activity, travelling or being away from home, other daily activities (including work), clothing and diet were among the 10 most commonly reported themes influencing daily life. However, the ranking of these 10 themes differed between the two patient groups.

  11. 21 CFR 314.440 - Addresses for applications and abbreviated applications.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... mail code for the Office of Generic Drugs is HFD-600, the mail codes for the Divisions of Chemistry I... leukapheresis; (3) Blood component processing solutions and shelf life extenders; and (4) Oxygen carriers. [50...

  12. 21 CFR 314.440 - Addresses for applications and abbreviated applications.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... mail code for the Office of Generic Drugs is HFD-600, the mail codes for the Divisions of Chemistry I... leukapheresis; (3) Blood component processing solutions and shelf life extenders; and (4) Oxygen carriers. [50...

  13. 21 CFR 314.440 - Addresses for applications and abbreviated applications.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... mail code for the Office of Generic Drugs is HFD-600, the mail codes for the Divisions of Chemistry I... leukapheresis; (3) Blood component processing solutions and shelf life extenders; and (4) Oxygen carriers. [50...

  14. 21 CFR 314.440 - Addresses for applications and abbreviated applications.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... mail code for the Office of Generic Drugs is HFD-600, the mail codes for the Divisions of Chemistry I... leukapheresis; (3) Blood component processing solutions and shelf life extenders; and (4) Oxygen carriers. [50...

  15. Faunus: An object oriented framework for molecular simulation

    PubMed Central

    Lund, Mikael; Trulsson, Martin; Persson, Björn

    2008-01-01

Background We present a C++ class library for Monte Carlo simulation of molecular systems, including proteins in solution. The design is generic and highly modular, enabling multiple developers to easily implement additional features. The statistical mechanical methods are documented by extensive use of code comments that are subsequently collected to automatically build a web-based manual. Results We show how an object oriented design can be used to create an intuitively appealing coding framework for molecular simulation. This is exemplified in a minimalistic C++ program that can calculate protein protonation states. We further discuss performance issues related to high-level coding abstraction. Conclusion C++ and the Standard Template Library (STL) provide a high-performance platform for generic molecular modeling. Automatic generation of code documentation from inline comments has proven particularly useful in that no separate manual needs to be maintained. PMID:18241331
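    The generic, modular design the abstract describes, a simulation engine written against an abstract energy model rather than a concrete one, can be sketched in a few lines of C++. This is an illustrative sketch only; the class and function names below are invented for this example and are not part of the Faunus API:

    ```cpp
    #include <cassert>
    #include <cmath>
    #include <random>

    // A Metropolis Monte Carlo sampler templated on its energy model, in the
    // spirit of a generic, modular simulation framework. Any callable with
    // signature double(double) can serve as the energy function.
    template <typename Energy>
    class MetropolisSampler {
        Energy energy_;
        std::mt19937 rng_;
        std::uniform_real_distribution<double> unit_{0.0, 1.0};
        std::normal_distribution<double> move_{0.0, 0.5};
    public:
        explicit MetropolisSampler(Energy e, unsigned seed = 42)
            : energy_(e), rng_(seed) {}

        // Propose a random displacement and accept it with the Metropolis
        // criterion (kT = 1); returns the possibly unchanged coordinate.
        double step(double x) {
            double trial = x + move_(rng_);
            double dE = energy_(trial) - energy_(x);
            return (dE <= 0.0 || unit_(rng_) < std::exp(-dE)) ? trial : x;
        }
    };

    // Estimate <x^2> for the harmonic energy U(x) = x^2/2 at kT = 1;
    // the exact answer is 1.
    double mean_x2_harmonic(int nsteps = 200000) {
        auto harmonic = [](double x) { return 0.5 * x * x; };
        MetropolisSampler<decltype(harmonic)> mc(harmonic);
        double x = 0.0, sum = 0.0;
        for (int i = 0; i < nsteps; ++i) {
            x = mc.step(x);
            sum += x * x;
        }
        return sum / nsteps;
    }

    int main() {
        double m = mean_x2_harmonic();
        assert(m > 0.8 && m < 1.2);  // statistical estimate of the exact value 1
        return 0;
    }
    ```

    The point of the template parameter is the modularity claimed in the abstract: swapping in a different energy model requires no change to the sampler itself.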

  16. 78 FR 37848 - ASME Code Cases Not Approved for Use

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-24

    ...The U.S. Nuclear Regulatory Commission (NRC) is issuing for public comment draft regulatory guide (DG), DG-1233, ``ASME Code Cases not Approved for Use.'' This regulatory guide lists the American Society of Mechanical Engineers (ASME) Code Cases that the NRC has determined not to be acceptable for use on a generic basis.

  17. Using a behaviour change techniques taxonomy to identify active ingredients within trials of implementation interventions for diabetes care.

    PubMed

    Presseau, Justin; Ivers, Noah M; Newham, James J; Knittle, Keegan; Danko, Kristin J; Grimshaw, Jeremy M

    2015-04-23

    Methodological guidelines for intervention reporting emphasise describing intervention content in detail. Despite this, systematic reviews of quality improvement (QI) implementation interventions continue to be limited by a lack of clarity and detail regarding the intervention content being evaluated. We aimed to apply the recently developed Behaviour Change Techniques Taxonomy version 1 (BCTTv1) to trials of implementation interventions for managing diabetes to assess the capacity and utility of this taxonomy for characterising active ingredients. Three psychologists independently coded a random sample of 23 trials of healthcare system, provider- and/or patient-focused implementation interventions from a systematic review that included 142 such studies. Intervention content was coded using the BCTTv1, which describes 93 behaviour change techniques (BCTs) grouped within 16 categories. We supplemented the generic coding instructions within the BCTTv1 with decision rules and examples from this literature. Less than a quarter of possible BCTs within the BCTTv1 were identified. For implementation interventions targeting providers, the most commonly identified BCTs included the following: adding objects to the environment, prompts/cues, instruction on how to perform the behaviour, credible source, goal setting (outcome), feedback on outcome of behaviour, and social support (practical). For implementation interventions also targeting patients, the most commonly identified BCTs included the following: prompts/cues, instruction on how to perform the behaviour, information about health consequences, restructuring the social environment, adding objects to the environment, social support (practical), and goal setting (behaviour). The BCTTv1 mapped well onto implementation interventions directly targeting clinicians and patients and could also be used to examine the impact of system-level interventions on clinician and patient behaviour. 
The BCTTv1 can be used to characterise the active ingredients in trials of implementation interventions and provides specificity of content beyond what is given by broader intervention labels. Identification of BCTs may provide a more helpful means of accumulating knowledge on the content used in trials of implementation interventions, which may help to better inform replication efforts. In addition, prospective use of a behaviour change techniques taxonomy for developing and reporting intervention content would further aid in building a cumulative science of effective implementation interventions.

  18. Generic detection of poleroviruses using an RT-PCR assay targeting the RdRp coding sequence.

    PubMed

    Lotos, Leonidas; Efthimiou, Konstantinos; Maliogka, Varvara I; Katis, Nikolaos I

    2014-03-01

In this study a two-step RT-PCR assay was developed for the generic detection of poleroviruses. The RdRp coding region was selected as the primers' target, since it differs significantly from that of other members in the family Luteoviridae and its sequence can be more informative than other regions in the viral genome. Species-specific RT-PCR assays targeting the same region were also developed for the detection of the six most widespread poleroviral species (Beet mild yellowing virus, Beet western yellows virus, Cucurbit aphid-borne yellows virus, Carrot red leaf virus, Potato leafroll virus and Turnip yellows virus) in Greece and the collection of isolates. These isolates along with other characterized ones were used for the evaluation of the generic PCR's detection range. The developed assay efficiently amplified a 593 bp RdRp fragment from 46 isolates of 10 different Polerovirus species. Phylogenetic analysis using the generic PCR's amplicon sequence showed that although it cannot accurately infer evolutionary relationships within the genus, it can differentiate poleroviruses at the species level. Overall, the described generic assay could be applied for the reliable detection of Polerovirus infections and, in combination with the specific PCRs, for the identification of new and uncharacterized species in the genus. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. GsTL: the geostatistical template library in C++

    NASA Astrophysics Data System (ADS)

    Remy, Nicolas; Shtuka, Arben; Levy, Bruno; Caers, Jef

    2002-10-01

The development of geostatistics has been mostly accomplished by application-oriented engineers in the past 20 years. The focus on concrete applications gave birth to many algorithms and computer programs designed to address different issues, such as estimating or simulating a variable while possibly accounting for secondary information such as seismic data, or integrating geological and geometrical data. At the core of any geostatistical data integration methodology is a well-designed algorithm. Yet, despite their obvious differences, all these algorithms share many commonalities on which to build a geostatistics programming library, lest the resulting library be poorly reusable and difficult to expand. Building on this observation, we design a comprehensive, yet flexible and easily reusable library of geostatistics algorithms in C++. The recent advent of the generic programming paradigm allows us to elegantly express the commonalities of the geostatistical algorithms in computer code. Generic programming, also referred to as "programming with concepts", provides a high level of abstraction without loss of efficiency. This last point is a major gain over object-oriented programming, which often trades efficiency for abstraction. It is not enough for a numerical library to be reusable, it also has to be fast. Because generic programming is "programming with concepts", the essential step in the library design is the careful identification and thorough definition of these concepts shared by most geostatistical algorithms. Building on these definitions, a generic and expandable code can be developed. To show the advantages of such a generic library, we use GsTL to build two sequential simulation programs working on two different types of grids—a surface with faults and an unstructured grid—without requiring any change to the GsTL code.
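    The "programming with concepts" idea described in the abstract can be illustrated compactly: a single algorithm written against an iterator concept runs unchanged on two different data structures. The function below is a hypothetical example, not part of GsTL:

    ```cpp
    #include <cassert>
    #include <list>
    #include <vector>

    // One generic algorithm, written once against the ForwardIterator
    // concept: it requires only dereference, increment, and comparison.
    template <typename ForwardIt>
    double mean_value(ForwardIt first, ForwardIt last) {
        double sum = 0.0;
        std::size_t n = 0;
        for (ForwardIt it = first; it != last; ++it, ++n)
            sum += *it;
        return n ? sum / n : 0.0;
    }

    int main() {
        // The same code serves two "grid" representations: a contiguous
        // array of node values and a linked list of node values.
        std::vector<double> structured{1.0, 2.0, 3.0, 4.0};
        std::list<double>   unstructured{1.0, 2.0, 3.0, 4.0};
        assert(mean_value(structured.begin(), structured.end()) == 2.5);
        assert(mean_value(unstructured.begin(), unstructured.end()) == 2.5);
        return 0;
    }
    ```

    Because the template is instantiated at compile time, there is no virtual-dispatch overhead, which is the "abstraction without loss of efficiency" advantage over object-oriented designs that the abstract highlights.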

  20. Working at the Nexus of Generic and Content-Specific Teaching Practices: An Exploratory Study Based on TIMSS Secondary Analyses

    ERIC Educational Resources Information Center

    Charalambous, Charalambos Y.; Kyriakides, Ermis

    2017-01-01

    For years scholars have attended to either generic or content-specific teaching practices attempting to understand instructional quality and its effects on student learning. Drawing on the TIMSS 2007 and 2011 databases, this exploratory study empirically tests the hypothesis that attending to both types of practices can help better explain student…

  1. 16 CFR 303.41 - Use of fiber trademarks and generic names in advertising.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... use of a fiber trademark shall require a full disclosure of the fiber content information required by... or generic name is used in non-required information in advertising, such fiber trademark or generic... advertising. 303.41 Section 303.41 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC...

  2. Tools for Rapid Understanding of Malware Code

    DTIC Science & Technology

    2015-05-07

cloaking techniques. We used three malware detectors, covering a wide spectrum of detection technologies, for our experiments: VirusTotal, an online ...Analysis and Manipulation ( SCAM ), 2014. [9] Babak Yadegari, Brian Johannesmeyer, Benjamin Whitely, and Saumya Debray. A generic approach to automatic...

  3. CRUNCH_PARALLEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shumaker, Dana E.; Steefel, Carl I.

The code CRUNCH_PARALLEL is a parallel version of the CRUNCH code. CRUNCH code version 2.0 was previously released by LLNL (UCRL-CODE-200063). CRUNCH is a general-purpose reactive transport code developed by Carl Steefel and Yabusake (Steefel and Yabusake, 1996). The code handles non-isothermal transport and reaction in one, two, and three dimensions. The reaction algorithm is generic in form, handling an arbitrary number of aqueous and surface complexation reactions as well as mineral dissolution/precipitation. A standardized database is used containing thermodynamic and kinetic data. The code includes advective, dispersive, and diffusive transport.

  4. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part I: Template-Based Generic Programming

    DOE PAGES

    Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.

    2012-01-01

    An approach for incorporating embedded simulation and analysis capabilities in complex simulation codes through template-based generic programming is presented. This approach relies on templating and operator overloading within the C++ language to transform a given calculation into one that can compute a variety of additional quantities that are necessary for many state-of-the-art simulation and analysis algorithms. An approach for incorporating these ideas into complex simulation codes through general graph-based assembly is also presented. These ideas have been implemented within a set of packages in the Trilinos framework and are demonstrated on a simple problem from chemical engineering.
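    The technique this abstract describes, writing a calculation once as a template and letting operator overloading compute additional quantities such as derivatives, can be sketched as follows. The `Dual` type and `residual` function are invented for illustration and are not Trilinos/Sacado code:

    ```cpp
    #include <cassert>

    // A minimal "dual number": carries a value and its derivative with
    // respect to one seeded input. Overloaded operators propagate both.
    struct Dual {
        double val;  // function value
        double der;  // derivative w.r.t. the seeded input
    };

    Dual operator+(Dual a, Dual b) { return {a.val + b.val, a.der + b.der}; }
    Dual operator*(Dual a, Dual b) {
        return {a.val * b.val, a.der * b.val + a.val * b.der};  // product rule
    }

    // The "simulation" kernel is written once, for any scalar type T.
    template <typename T>
    T residual(T x) { return x * x * x + x; }  // f(x) = x^3 + x

    int main() {
        // Plain evaluation: instantiate the template with double.
        assert(residual(2.0) == 10.0);

        // Same template instantiated with Dual yields f and f' together.
        Dual x{2.0, 1.0};        // seed dx/dx = 1
        Dual f = residual(x);
        assert(f.val == 10.0);   // f(2)  = 2^3 + 2
        assert(f.der == 13.0);   // f'(2) = 3*2^2 + 1
        return 0;
    }
    ```

    The key point is that the kernel's source is unchanged between the two instantiations; the extra analysis quantity falls out of the overloaded scalar type, which is what makes the approach attractive for large existing simulation codes.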

  5. Long-Term Exposure to American and European Movies and Television Series Facilitates Caucasian Face Perception in Young Chinese Watchers.

    PubMed

    Wang, Yamin; Zhou, Lu

    2016-10-01

    Most young Chinese people now learn about Caucasian individuals via media, especially American and European movies and television series (AEMT). The current study aimed to explore whether long-term exposure to AEMT facilitates Caucasian face perception in young Chinese watchers. Before the experiment, we created Chinese, Caucasian, and generic average faces (generic average face was created from both Chinese and Caucasian faces) and tested participants' ability to identify them. In the experiment, we asked AEMT watchers and Chinese movie and television series (CMT) watchers to complete a facial norm detection task. This task was developed recently to detect norms used in facial perception. The results indicated that AEMT watchers coded Caucasian faces relative to a Caucasian face norm better than they did to a generic face norm, whereas no such difference was found among CMT watchers. All watchers coded Chinese faces by referencing a Chinese norm better than they did relative to a generic norm. The results suggested that long-term exposure to AEMT has the same effect as daily other-race face contact in shaping facial perception. © The Author(s) 2016.

  6. Breaking and Fixing Origin-Based Access Control in Hybrid Web/Mobile Application Frameworks.

    PubMed

    Georgiev, Martin; Jana, Suman; Shmatikov, Vitaly

    2014-02-01

    Hybrid mobile applications (apps) combine the features of Web applications and "native" mobile apps. Like Web applications, they are implemented in portable, platform-independent languages such as HTML and JavaScript. Like native apps, they have direct access to local device resources-file system, location, camera, contacts, etc. Hybrid apps are typically developed using hybrid application frameworks such as PhoneGap. The purpose of the framework is twofold. First, it provides an embedded Web browser (for example, WebView on Android) that executes the app's Web code. Second, it supplies "bridges" that allow Web code to escape the browser and access local resources on the device. We analyze the software stack created by hybrid frameworks and demonstrate that it does not properly compose the access-control policies governing Web code and local code, respectively. Web code is governed by the same origin policy, whereas local code is governed by the access-control policy of the operating system (for example, user-granted permissions in Android). The bridges added by the framework to the browser have the same local access rights as the entire application, but are not correctly protected by the same origin policy. This opens the door to fracking attacks, which allow foreign-origin Web content included into a hybrid app (e.g., ads confined in iframes) to drill through the layers and directly access device resources. Fracking vulnerabilities are generic: they affect all hybrid frameworks, all embedded Web browsers, all bridge mechanisms, and all platforms on which these frameworks are deployed. We study the prevalence of fracking vulnerabilities in free Android apps based on the PhoneGap framework. Each vulnerability exposes sensitive local resources-the ability to read and write contacts list, local files, etc.-to dozens of potentially malicious Web domains. 
We also analyze the defenses deployed by hybrid frameworks to prevent resource access by foreign-origin Web content and explain why they are ineffectual. We then present NoFrak, a capability-based defense against fracking attacks. NoFrak is platform-independent, compatible with any framework and embedded browser, requires no changes to the code of the existing hybrid apps, and does not break their advertising-supported business model.

  7. Virtual Frame Buffer Interface Program

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas L.

    1990-01-01

    The Virtual Frame Buffer Interface program makes all frame buffers appear as a generic frame buffer with a specified set of characteristics, allowing programmers to write code that runs unmodified on all supported hardware. It converts generic commands to actual device commands. The program consists of a definition of capabilities and FORTRAN subroutines called by application programs. Developed in FORTRAN 77 for the DEC VAX 11/780 or DEC VAX 11/750 computer under VMS 4.X.

  8. "This Is Spiderman's Mask." "No, It's Green Goblin's": Shared Meanings during Boys' Pretend Play with Superhero and Generic Toys

    ERIC Educational Resources Information Center

    Parsons, Amy; Howe, Nina

    2013-01-01

    Preschool boys' pretense and coconstruction of shared meanings during two play sessions (superhero and generic toys) were investigated with 58 middle-class boys ("M" age = 54.95 mos.). The frequency of dyadic pretense and the coconstruction of shared meanings in the play were coded. The frequency of pretense did not vary across the two…

  9. PHARMACEUTICAL QUALITY OF GENERIC ATORVASTATIN PRODUCTS COMPARED WITH THE INNOVATOR PRODUCT: A NEED FOR REVISING PRICING POLICY IN PALESTINE.

    PubMed

    Shawahna, Ramzi; Hroub, Abdel Kareem; Abed, Eliama; Jibali, Sondos; Al-Saghir, Ruba; Zaid, Abdel Naser

    2016-01-01

    Atorvastatin reduces morbidity and mortality due to cardiovascular events. This study was conducted to assess the prices and pharmaceutical quality of innovator atorvastatin 20 mg with its locally available generics in Palestine and to assess the suitability of their interchangeability. The prices of innovator and generic atorvastatin 20 mg were determined and compared. Innovator atorvastatin and four generic products were tested for their pharmaceutical quality. Tablets were tested for their drug contents, weight uniformity, hardness, disintegration and dissolution. Three out of four generics were less expensive than the innovator. Pharmaceutical quality assessments were satisfactory and within limits for all atorvastatin tested products. The average weight ranged from 206.6 ± 8.40 to 330 ± 3.92 mg and the %RSDs were within the permitted limits as per USP. Tablet hardness ranged from 102 ± 1.41 to 197.4 ± 6.88 kg and drug contents ranged from 92.2% to 105.3%. All products disintegrated within permitted time limits and showed very rapid dissolution. Products released more than 85% of their drug contents in less than 15 min. Our results showed that all tested innovator and generic atorvastatin products were of good pharmaceutical quality. Despite the lack of in vivo evaluation, our results indicate that these products are equivalent in vitro. Considering the in vitro release characteristics, these products might be used interchangeably. However, regulatory authorities permit the use of in vitro data in establishing similarity between immediate release oral dosage forms containing biopharmaceutical classification system class I and III drugs only.

  10. ISO-IEC MPEG-2 software video codec

    NASA Astrophysics Data System (ADS)

    Eckart, Stefan; Fogg, Chad E.

    1995-04-01

    Part 5 of the International Standard ISO/IEC 13818 `Generic Coding of Moving Pictures and Associated Audio' (MPEG-2) is a Technical Report: a sample software implementation of the procedures in parts 1, 2, and 3 of the standard (systems, video, and audio). This paper focuses on the video software, which provides an example of a fully compliant implementation of the standard and of a good-quality video encoder, and serves as a tool for compliance testing. The implementation and some development aspects of the codec are described. The encoder is based on Test Model 5 (TM5), one of the best published non-proprietary coding models, which was used during the MPEG-2 collaborative stage to evaluate proposed algorithms and to verify the syntax. The most important part of the Test Model is control of the quantization parameter based on image content and bit-rate constraints, under both signal-to-noise and psycho-optical criteria. The decoder has been successfully tested for compliance with the MPEG-2 standard, using the ISO/IEC MPEG verification and compliance bitstream test suites as stimuli.
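
    The quantizer-control idea at the heart of TM5 can be sketched very roughly as follows. The reaction-parameter formula follows the published Test Model 5 outline (step 2 of its rate control), but the function name and the simplifications are ours; the real model also adapts the target bit allocation per picture type and applies spatial activity masking.

```python
def tm5_quant_scale(d_j, bit_rate, picture_rate):
    """Simplified TM5-style rate control: map virtual-buffer
    fullness d_j (bits) to a quantization scale in 1..31.

    A fuller buffer (too many bits already spent) yields a larger
    quantizer, coarsening quantization to meet the bit-rate target.
    """
    r = 2.0 * bit_rate / picture_rate   # TM5 "reaction parameter"
    q = int(d_j * 31.0 / r)
    return max(1, min(31, q))           # clamp to the legal range

# A fuller virtual buffer forces a coarser quantizer:
q_low  = tm5_quant_scale(10_000,  bit_rate=4_000_000, picture_rate=25.0)
q_high = tm5_quant_scale(200_000, bit_rate=4_000_000, picture_rate=25.0)
```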

  11. (390) Proposal to preclude homonymy of generic names with names of intergeneric graft-hybrids (chimaeras)

    USDA-ARS?s Scientific Manuscript database

    The International Code of Nomenclature for algae, fungi and plants is revised every six years to incorporate decisions of the Nomenclature Section of successive International Botanical Congresses (IBC) on proposals to amend the Code. The proposal in this paper will be considered at the IBC in Shenzh...

  12. PelePhysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-05-17

    PelePhysics is a suite of physics packages that provides functionality of use to reacting hydrodynamics CFD codes. The initial release includes an interface to reaction rate mechanism evaluation, transport coefficient evaluation, and a generalized equation of state (EOS) facility. Both generic evaluators and interfaces to code from externally available tools (Fuego for chemical rates, EGLib for transport coefficients) are provided.

  13. A generic archive protocol and an implementation

    NASA Technical Reports Server (NTRS)

    Jordan, J. M.; Jennings, D. G.; Mcglynn, T. A.; Ruggiero, N. G.; Serlemitsos, T. A.

    1992-01-01

    Archiving vast amounts of data has become a major part of every scientific space mission today. The Generic Archive/Retrieval Services Protocol (GRASP) addresses the question of how to archive the data collected in an environment where the underlying hardware archives may be rapidly changing. GRASP is a device independent specification defining a set of functions for storing and retrieving data from an archive, as well as other support functions. GRASP is divided into two levels: the Transfer Interface and the Action Interface. The Transfer Interface is computer/archive independent code while the Action Interface contains code which is dedicated to each archive/computer addressed. Implementations of the GRASP specification are currently available for DECstations running Ultrix, Sparcstations running SunOS, and microVAX/VAXstation 3100's. The underlying archive is assumed to function as a standard Unix or VMS file system. The code, written in C, is a single suite of files. Preprocessing commands define the machine unique code sections in the device interface. The implementation was written, to the greatest extent possible, using only ANSI standard C functions.
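
    The two-level split described above can be illustrated with a minimal sketch: a device-independent Transfer layer that delegates all archive-specific work to an injected Action implementation. Class and method names here are hypothetical illustrations, not GRASP's actual C interface.

```python
from abc import ABC, abstractmethod

class ActionInterface(ABC):
    """Device-specific layer: one implementation per
    archive/computer pair (names are illustrative)."""
    @abstractmethod
    def write_block(self, name, data): ...
    @abstractmethod
    def read_block(self, name): ...

class DictFileAction(ActionInterface):
    """An in-memory dict standing in for the Unix/VMS file system
    that GRASP assumes underlies the archive."""
    def __init__(self):
        self.store = {}
    def write_block(self, name, data):
        self.store[name] = bytes(data)
    def read_block(self, name):
        return self.store[name]

class TransferInterface:
    """Computer/archive-independent layer: contains no device
    knowledge, only calls through the Action layer."""
    def __init__(self, action: ActionInterface):
        self.action = action
    def archive(self, name, data):
        self.action.write_block(name, data)
    def retrieve(self, name):
        return self.action.read_block(name)

grasp = TransferInterface(DictFileAction())
grasp.archive("obs001", b"telemetry")
```

    Porting to a new archive then means writing only a new `ActionInterface` implementation, which is the portability property the GRASP specification is after.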

  14. JACOB: an enterprise framework for computational chemistry.

    PubMed

    Waller, Mark P; Dresselhaus, Thomas; Yang, Jack

    2013-06-15

    Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.

  15. Architecture-driven reuse of code in KASE

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay

    1993-01-01

    In order to support the synthesis of large, complex software systems, we need to focus on issues pertaining to the architectural design of a system in addition to algorithm and data structure design. An approach that is based on abstracting the architectural design of a set of problems in the form of a generic architecture, and providing tools that can be used to instantiate the generic architecture for specific problem instances is presented. Such an approach also facilitates reuse of code between different systems belonging to the same problem class. An application of our approach on a realistic problem is described; the results of the exercise are presented; and how our approach compares to other work in this area is discussed.

  16. Generic Algorithms for Estimating Foliar Pigment Content

    NASA Astrophysics Data System (ADS)

    Gitelson, Anatoly; Solovchenko, Alexei

    2017-09-01

    Foliar pigment contents and composition are main factors governing absorbed photosynthetically active radiation, photosynthetic activity, and the physiological status of vegetation. In this study, the performance of nondestructive techniques based on leaf reflectance was tested for estimating chlorophyll (Chl) and anthocyanin (AnC) contents in species with widely variable leaf structure, pigment content, and composition. Only three spectral bands (green, red edge, and near-infrared) are required for nondestructive Chl and AnC estimation with normalized root-mean-square errors (NRMSE) below 4.5% and 6.1%, respectively. The algorithms developed are generic, requiring no reparameterization for each species, and thus allow accurate nondestructive Chl and AnC estimation using simple handheld field/lab instrumentation. They also have potential for the interpretation of airborne and satellite data.
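
    Three-band indices of the general form used in this line of work (e.g., the green and red-edge chlorophyll indices) can be sketched as below. The study's exact band positions and calibration coefficients are not reproduced here; the example reflectance values are illustrative.

```python
def chl_index_green(r_green, r_nir):
    """Green chlorophyll index, CI_green = R_NIR / R_green - 1.
    Inputs are band-averaged leaf reflectances in 0..1."""
    return r_nir / r_green - 1.0

def chl_index_red_edge(r_red_edge, r_nir):
    """Red-edge chlorophyll index, CI_RE = R_NIR / R_RE - 1."""
    return r_nir / r_red_edge - 1.0

# Chl absorbs in the green, so higher Chl content lowers R_green
# and raises the index, while R_NIR stays largely unaffected:
low_chl  = chl_index_green(r_green=0.20, r_nir=0.50)
high_chl = chl_index_green(r_green=0.05, r_nir=0.50)
```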

  17. An object oriented generic controller using CLIPS

    NASA Technical Reports Server (NTRS)

    Nivens, Cody R.

    1990-01-01

    In today's applications, the need to separate code and data has driven the growth of object-oriented programming. This philosophy gives software engineers greater control over the environment of an application. Yet the use of object-oriented design does not remove the application's need for greater understanding of what the controller is doing. Such understanding is only possible by using expert systems. A controller capable of controlling an object using rule-based expertise would expedite the use of both object-oriented design and expert knowledge of the dynamics of an environment in modern controllers. This project presents a model of a controller that uses the CLIPS expert system and objects in C++ to create a generic controller. The polymorphic abilities of C++ allow for the design of a generic component stored in individual data files. Accompanying each component is a set of rules written in CLIPS which provide the following: control of individual components, input of sensory data from components, and the ability to query the status of a given component. Along with the data describing the application, a set of inference rules written in CLIPS allows the application to make use of sensory facts and of these status and control abilities. As a demonstration of this ability, the control of the environment of a house is provided. The demonstration includes data files describing the rooms and their contents, such as devices, windows, and doors. The rules for the home cover the flow of people in the house and the control of devices by the homeowner.

  18. Storybooks aren't just for fun: narrative and non-narrative picture books foster equal amounts of generic language during mother-toddler book sharing

    PubMed Central

    Nyhout, Angela; O'Neill, Daniela K.

    2014-01-01

    Parents and children encounter a variety of animals and objects in the early picture books they share, but little is known about how the context in which these entities are presented influences talk about them. The present study investigated how the presence or absence of a visual narrative context influences mothers' tendency to refer to animals as individual characters or as members of a kind when sharing picture books with their toddlers (mean age 21.3 months). Mother-child dyads shared both a narrative and a non-narrative book, each featuring six animals and matched in terms of length and quantity of text. Mothers made more specific (individual-referring) statements about animals in the narrative books, whereas they provided more labels for animals in the non-narrative books. But, of most interest, the frequency and proportion of mothers' use of generic (kind-referring) utterances did not differ across the two different types of books. Further coding of the content of the utterances revealed that mothers provided more story-specific descriptions of states and actions of the animals when sharing narrative books and more physical descriptions of animals when sharing non-narrative books. However, the two books did not differ in terms of their elicitation of natural facts about the animals. Overall, although the two types of books encouraged different types of talk from mothers, they stimulated generic language and talk about natural facts to an equal degree. Implications for learning from picture storybooks and book genre selection in classrooms and home reading are discussed. PMID:24795675

  19. Storybooks aren't just for fun: narrative and non-narrative picture books foster equal amounts of generic language during mother-toddler book sharing.

    PubMed

    Nyhout, Angela; O'Neill, Daniela K

    2014-01-01

    Parents and children encounter a variety of animals and objects in the early picture books they share, but little is known about how the context in which these entities are presented influences talk about them. The present study investigated how the presence or absence of a visual narrative context influences mothers' tendency to refer to animals as individual characters or as members of a kind when sharing picture books with their toddlers (mean age 21.3 months). Mother-child dyads shared both a narrative and a non-narrative book, each featuring six animals and matched in terms of length and quantity of text. Mothers made more specific (individual-referring) statements about animals in the narrative books, whereas they provided more labels for animals in the non-narrative books. But, of most interest, the frequency and proportion of mothers' use of generic (kind-referring) utterances did not differ across the two different types of books. Further coding of the content of the utterances revealed that mothers provided more story-specific descriptions of states and actions of the animals when sharing narrative books and more physical descriptions of animals when sharing non-narrative books. However, the two books did not differ in terms of their elicitation of natural facts about the animals. Overall, although the two types of books encouraged different types of talk from mothers, they stimulated generic language and talk about natural facts to an equal degree. Implications for learning from picture storybooks and book genre selection in classrooms and home reading are discussed.

  20. Formal proof of the AVM-1 microprocessor using the concept of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    A microprocessor designated AVM-1 was designed to demonstrate the use of generic interpreters in verifying hierarchically decomposed microprocessor specifications. This report is intended to document the high-order language (HOL) code verifying AVM-1. The organization of the proof is discussed and some technical details concerning the execution of the proof scripts in HOL are presented. The proof scripts used to verify AVM-1 are also presented.

  1. Euler technology assessment for preliminary aircraft design employing OVERFLOW code with multiblock structured-grid method

    NASA Technical Reports Server (NTRS)

    Treiber, David A.; Muilenburg, Dennis A.

    1995-01-01

    The viability of applying a state-of-the-art Euler code to calculate the aerodynamic forces and moments through maximum lift coefficient for a generic sharp-edge configuration is assessed. The OVERFLOW code, a method employing overset (Chimera) grids, was used to conduct mesh refinement studies, a wind-tunnel wall sensitivity study, and a 22-run computational matrix of flow conditions, including sideslip runs and geometry variations. The subject configuration was a generic wing-body-tail geometry with a chined forebody, swept wing leading edge, and deflected part-span leading-edge flap. The analysis showed that the Euler method is adequate for capturing some of the nonlinear aerodynamic effects resulting from leading-edge and forebody vortices produced at high angle of attack through C(sub Lmax). Computed forces and moments, as well as surface pressures, match well enough that useful preliminary design information can be extracted. Vortex burst effects and vortex interactions with the configuration are also investigated.

  2. Experimental and numerical results for a generic axisymmetric single-engine afterbody with tails at transonic speeds

    NASA Technical Reports Server (NTRS)

    Burley, J. R., II; Carlson, J. R.; Henderson, W. P.

    1986-01-01

    Static pressure measurements were made on the afterbody, nozzle, and tails of a generic single-engine axisymmetric fighter configuration. Data were recorded at Mach numbers of 0.6, 0.9, and 1.2. Nozzle pressure ratio (NPR) was varied from 1.0 to 8.0, and angle of attack was varied from -3 deg to 9 deg. Experimental data were compared with numerical results from two state-of-the-art computer codes.

  3. Breaking and Fixing Origin-Based Access Control in Hybrid Web/Mobile Application Frameworks

    PubMed Central

    Georgiev, Martin; Jana, Suman; Shmatikov, Vitaly

    2014-01-01

    Hybrid mobile applications (apps) combine the features of Web applications and “native” mobile apps. Like Web applications, they are implemented in portable, platform-independent languages such as HTML and JavaScript. Like native apps, they have direct access to local device resources—file system, location, camera, contacts, etc. Hybrid apps are typically developed using hybrid application frameworks such as PhoneGap. The purpose of the framework is twofold. First, it provides an embedded Web browser (for example, WebView on Android) that executes the app's Web code. Second, it supplies “bridges” that allow Web code to escape the browser and access local resources on the device. We analyze the software stack created by hybrid frameworks and demonstrate that it does not properly compose the access-control policies governing Web code and local code, respectively. Web code is governed by the same origin policy, whereas local code is governed by the access-control policy of the operating system (for example, user-granted permissions in Android). The bridges added by the framework to the browser have the same local access rights as the entire application, but are not correctly protected by the same origin policy. This opens the door to fracking attacks, which allow foreign-origin Web content included into a hybrid app (e.g., ads confined in iframes) to drill through the layers and directly access device resources. Fracking vulnerabilities are generic: they affect all hybrid frameworks, all embedded Web browsers, all bridge mechanisms, and all platforms on which these frameworks are deployed. We study the prevalence of fracking vulnerabilities in free Android apps based on the PhoneGap framework. Each vulnerability exposes sensitive local resources—the ability to read and write contacts list, local files, etc.—to dozens of potentially malicious Web domains. 
We also analyze the defenses deployed by hybrid frameworks to prevent resource access by foreign-origin Web content and explain why they are ineffectual. We then present NoFrak, a capability-based defense against fracking attacks. NoFrak is platform-independent, compatible with any framework and embedded browser, requires no changes to the code of the existing hybrid apps, and does not break their advertising-supported business model. PMID:25485311

  4. Mathematical Modeling Of Life-Support Systems

    NASA Technical Reports Server (NTRS)

    Seshan, Panchalam K.; Ganapathi, Balasubramanian; Jan, Darrell L.; Ferrall, Joseph F.; Rohatgi, Naresh K.

    1994-01-01

    Generic hierarchical model of life-support system developed to facilitate comparisons of options in design of system. Model represents combinations of interdependent subsystems supporting microbes, plants, fish, and land animals (including humans). Generic model enables rapid configuration of variety of specific life support component models for tradeoff studies culminating in single system design. Enables rapid evaluation of effects of substituting alternate technologies and even entire groups of technologies and subsystems. Used to synthesize and analyze life-support systems ranging from relatively simple, nonregenerative units like aquariums to complex closed-loop systems aboard submarines or spacecraft. Model, called Generic Modular Flow Schematic (GMFS), coded in such chemical-process-simulation languages as Aspen Plus and expressed as three-dimensional spreadsheet.

  5. Population exposure to smoking and tobacco branding in the UK reality show 'Love Island'.

    PubMed

    Barker, Alexander B; Opazo Breton, Magdalena; Cranwell, Jo; Britton, John; Murray, Rachael L

    2018-02-05

    Reality television shows are popular with children and young adults; inclusion of tobacco imagery in these programmes is likely to cause smoking in these groups. Series 3 of the UK reality show Love Island, broadcast in 2017, attracted widespread media criticism for high levels of smoking depicted. We have quantified this tobacco content and estimated the UK population exposure to generic and branded tobacco imagery generated by the show. We used 1-min interval coding to quantify actual or implied tobacco use, tobacco paraphernalia or branding, in alternate episodes of series 3 of Love Island, and Census data and viewing figures from Kantar Media to estimate gross and per capita tobacco impressions. We coded 21 episodes comprising 1001 min of content. Tobacco imagery occurred in 204 (20%) intervals; the frequency of appearances fell significantly after media criticism. An identifiable cigarette brand, Lucky Strike Double Click, appeared in 16 intervals. The 21 episodes delivered an estimated 559 million gross tobacco impressions to the UK population, predominantly to women, including 47 million to children aged <16 and 44 million gross impressions of Lucky Strike branding, including 4 million to children <16. Despite advertising legislation and broadcasting regulations intended to protect children from smoking imagery in UK television, series 3 of Love Island delivered millions of general and branded tobacco impressions both to children and adults in the UK. More stringent controls on tobacco content in television programmes are urgently needed. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
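
    The gross-impressions arithmetic used in such exposure studies (tobacco intervals multiplied by audience, summed over episodes) can be sketched as follows. The figures below are illustrative, not the study's data.

```python
def gross_impressions(intervals_with_tobacco, viewers_per_episode):
    """Gross impressions: for each episode, the number of 1-min
    intervals containing tobacco content times that episode's
    audience, summed over all coded episodes."""
    return sum(n * v for n, v in zip(intervals_with_tobacco,
                                     viewers_per_episode))

# e.g. three hypothetical episodes:
total = gross_impressions([10, 8, 12],
                          [2_500_000, 2_400_000, 2_600_000])
```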

  6. Linkage of additional contents to moving objects and video shots in a generic media framework for interactive television

    NASA Astrophysics Data System (ADS)

    Lopez, Alejandro; Noe, Miquel; Fernandez, Gabriel

    2004-10-01

    The GMF4iTV project (Generic Media Framework for Interactive Television) is an IST European project that delivers an end-to-end broadcasting platform providing interactivity on heterogeneous multimedia devices such as set-top boxes and PCs, according to the Multimedia Home Platform (MHP) standard from DVB. The platform allows content providers to create enhanced audiovisual content that is interactive at the level of individual moving objects or shot changes within a video. The end user can then interact with moving objects or individual shots to access additional content associated with them (MHP applications, HTML pages, JPEG, MPEG-4 files...). This paper focuses on the issues of metadata and content transmission, synchronization, signaling, and bit-rate allocation in the GMF4iTV project.

  7. DoD Electronic Data Interchange (EDI) Convention: ASC X12 Transaction Set 859 Generic Freight Invoice (Version 003020)

    DTIC Science & Technology

    1993-04-01

    FREIGHT INVOICE (VERSION 003020) FORMATTING INVOICE INFORMATION FOR THE DoD TRANSPORTATION PAYMENT SYSTEM USING THE X12.55 TRANSACTION SET 859 GENERIC... GENERIC FREIGHT INVOICE EDI CONVENTION 859.003020 * Contents... transportation invoice using the ASC X12.55 Transaction Set 859 Generic Freight Invoice (003020). It contains information for the design of interface

  8. Unitals and ovals of symmetric block designs in LDPC and space-time coding

    NASA Astrophysics Data System (ADS)

    Andriamanalimanana, Bruno R.

    2004-08-01

    An approach to the design of LDPC (low density parity check) error-correction and space-time modulation codes involves starting with known mathematical and combinatorial structures, and deriving code properties from structure properties. This paper reports on an investigation of unital and oval configurations within generic symmetric combinatorial designs, not just classical projective planes, as the underlying structure for classes of space-time LDPC outer codes. Of particular interest are the encoding and iterative (sum-product) decoding gains that these codes may provide. Various small-length cases have been numerically implemented in Java and Matlab for a number of channel models.
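
    The design-to-code construction can be illustrated with the smallest symmetric design, the Fano plane (the (7,3,1) design): its point-line incidence matrix serves directly as a small parity-check matrix H. This is a generic illustration of the approach, not the specific configurations studied in the paper.

```python
import numpy as np

# Lines of the Fano plane, points labeled 0..6:
lines = [(0, 1, 2), (0, 3, 4), (0, 5, 6), (1, 3, 5),
         (1, 4, 6), (2, 3, 6), (2, 4, 5)]

# Point-line incidence matrix used as an LDPC parity-check matrix H:
H = np.zeros((7, 7), dtype=int)
for r, line in enumerate(lines):
    for p in line:
        H[r, p] = 1

# Regularity inherited from the design: every row and column has
# weight 3, and any two distinct rows share exactly one common 1
# (lambda = 1), which limits short cycles in the Tanner graph used
# by iterative sum-product decoding.
row_weights = H.sum(axis=1)
col_weights = H.sum(axis=0)
overlaps = H @ H.T   # diagonal 3, off-diagonal entries all 1
```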

  9. Code Samples Used for Complexity and Control

    NASA Astrophysics Data System (ADS)

    Ivancevic, Vladimir G.; Reid, Darryn J.

    2015-11-01

    The following sections are included: * MathematicaⓇ Code * Generic Chaotic Simulator * Vector Differential Operators * NLS Explorer * 2C++ Code * C++ Lambda Functions for Real Calculus * Accelerometer Data Processor * Simple Predictor-Corrector Integrator * Solving the BVP with the Shooting Method * Linear Hyperbolic PDE Solver * Linear Elliptic PDE Solver * Method of Lines for a Set of the NLS Equations * C# Code * Iterative Equation Solver * Simulated Annealing: A Function Minimum * Simple Nonlinear Dynamics * Nonlinear Pendulum Simulator * Lagrangian Dynamics Simulator * Complex-Valued Crowd Attractor Dynamics * Freeform Fortran Code * Lorenz Attractor Simulator * Complex Lorenz Attractor * Simple SGE Soliton * Complex Signal Presentation * Gaussian Wave Packet * Hermitian Matrices * Euclidean L2-Norm * Vector/Matrix Operations * Plain C-Code: Levenberg-Marquardt Optimizer * Free Basic Code: 2D Crowd Dynamics with 3000 Agents

  10. Composite Load Spectra for Select Space Propulsion Structural Components

    NASA Technical Reports Server (NTRS)

    Ho, Hing W.; Newell, James F.

    1994-01-01

    Generic load models with multiple levels of progressive sophistication are described for simulating the composite (combined) load spectra (CLS) induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, and liquid oxygen (LOX) posts. These generic (coupled) models combine deterministic models for composite load simulation (dynamic, acoustic, high-pressure, and high-rotational-speed loads, etc.) with statistically varying coefficients. These coefficients are then determined using advanced probabilistic simulation methods, with and without strategically selected experimental data. The entire simulation process is implemented in a CLS computer code. Applications of the computer code to various components, in conjunction with the PSAM (Probabilistic Structural Analysis Method), to perform probabilistic load evaluation and life prediction are also described to illustrate the effectiveness of the coupled-model approach.

  11. Longitudinal aerodynamic characteristics of a generic fighter model with a wing designed for sustained transonic maneuver conditions

    NASA Technical Reports Server (NTRS)

    Ferris, J. C.

    1986-01-01

    A wind-tunnel investigation was made to determine the longitudinal aerodynamic characteristics of a fixed-wing generic fighter model with a wing designed for sustained transonic maneuver conditions. The airfoil sections on the wing were designed with a two-dimensional nonlinear computer code, and the root and tip sections were modified with a three-dimensional code. The wing geometric characteristics were as follows: a leading-edge sweep of 45 degrees, a taper ratio of 0.2141, an aspect ratio of 3.30, and a thickness ratio of 0.044. The model was investigated at Mach numbers from 0.600 to 1.200, at Reynolds numbers, based on the model reference length, from 2,560,000 to 3,970,000, and through a model angle-of-attack range from -5 to +18 degrees.

  12. Content specificity of attention bias to threat in anxiety disorders: a meta-analysis.

    PubMed

    Pergamin-Hight, Lee; Naim, Reut; Bakermans-Kranenburg, Marian J; van IJzendoorn, Marinus H; Bar-Haim, Yair

    2015-02-01

    Despite the established evidence for threat-related attention bias in anxiety, the mechanisms underlying this bias remain unclear. One important unresolved question is whether disorder-congruent threats capture attention to a greater extent than do more general or disorder-incongruent threat stimuli. Evidence for attention bias specificity in anxiety would implicate involvement of previous learning and memory processes in threat-related attention bias, whereas lack of content specificity would point to perturbations in more generic attention processes. Enhanced clarity of mechanism could have clinical implications for the stimuli types used in Attention Bias Modification Treatments (ABMT). Content specificity of threat-related attention bias in anxiety and potential moderators of this effect were investigated. A systematic search identified 37 samples from 29 articles (N=866). Relevant data were extracted based on specific coding rules, and Cohen's d effect size was used to estimate bias specificity effects. The results indicate greater attention bias toward disorder-congruent relative to disorder-incongruent threat stimuli (d=0.28, p<0.0001). This effect was not moderated by age, type of anxiety disorder, visual attention tasks, or type of disorder-incongruent stimuli. No evidence of publication bias was observed. Implications for threat bias in anxiety and ABMT are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
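    The pooled effect reported above (d=0.28) is a standardized mean difference. As a minimal illustration of how such a bias-specificity effect is computed for a single sample, the sketch below derives Cohen's d from two sets of attention-bias scores; the scores are invented for illustration and are not data from the meta-analysis.

```python
from statistics import mean, stdev

def cohens_d(group1, group2):
    """Standardized mean difference between two samples (Cohen's d),
    using the pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    s1, s2 = stdev(group1), stdev(group2)
    pooled = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (mean(group1) - mean(group2)) / pooled

# Invented attention-bias scores (ms) for illustration only
congruent = [32, 41, 28, 35, 30, 38]    # bias toward disorder-congruent threat
incongruent = [25, 30, 22, 28, 27, 24]  # bias toward disorder-incongruent threat
d = cohens_d(congruent, incongruent)
```

A positive d here corresponds to greater attention bias toward disorder-congruent stimuli, the direction of the pooled effect in the meta-analysis.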

  13. An expert system based software sizing tool, phase 2

    NASA Technical Reports Server (NTRS)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code, with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge-based system, with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence, knowledge elicited from human experts, and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules, and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.
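    The abstract quotes a bias factor and a fluctuation factor without defining them. One common formulation for multiplicative size errors, sketched below under that assumption, takes the bias as the geometric mean of the predicted-to-actual ratios and the fluctuation as the multiplicative scatter around it; the line counts are hypothetical.

```python
import math

def bias_and_fluctuation(predicted, actual):
    """Geometric-mean bias of size predictions and a multiplicative
    scatter ('fluctuation') factor around that bias. This is one common
    formulation; the report may define these factors differently."""
    logs = [math.log(p / a) for p, a in zip(predicted, actual)]
    mean_log = sum(logs) / len(logs)
    var_log = sum((x - mean_log) ** 2 for x in logs) / len(logs)
    return math.exp(mean_log), math.exp(math.sqrt(var_log))

# Hypothetical predicted vs. actual line counts, for illustration
predicted = [1200, 5400, 800, 15000]
actual = [1000, 6000, 700, 12000]
bias, fluct = bias_and_fluctuation(predicted, actual)
```

A bias of 1.0 means no systematic over- or under-prediction; a fluctuation of 1.77 would mean a typical prediction is off by a factor of about 1.77 in either direction.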

  14. Verification of Gyrokinetic codes: theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia

    2016-10-01

    In fusion plasmas the strong magnetic field allows the fast gyromotion to be systematically removed from the description of the dynamics, resulting in a considerable simplification of the model and a gain in computational time. Nowadays, gyrokinetic (GK) codes play a major role in the understanding of the development and saturation of turbulence and in the prediction of the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools, such as the variational formulation of dynamics, to systematize the basic equations of GK codes and access the limits of their applicability. Indirect verification of the numerical scheme is proposed via a benchmarking process. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC), and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation. Then, we derive the models implemented in ORB5 and GENE and place them within this hierarchy. At the computational level, detailed verification of global electromagnetic test cases based on the CYCLONE base case is considered, including a parametric β-scan covering the transition from ITG to KBM modes and the spectral properties at the nominal β value.

  15. Generating Customized Verifiers for Automatically Generated Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2008-01-01

    Program verification using Hoare-style techniques requires many logical annotations. We have previously developed a generic annotation inference algorithm that weaves in all annotations required to certify safety properties for automatically generated code. It uses patterns to capture generator- and property-specific code idioms and property-specific meta-program fragments to construct the annotations. The algorithm is customized by specifying the code patterns and integrating them with the meta-program fragments for annotation construction. However, this is difficult since it involves tedious and error-prone low-level term manipulations. Here, we describe an annotation schema compiler that largely automates this customization task using generative techniques. It takes a collection of high-level declarative annotation schemas tailored towards a specific code generator and safety property, and generates all customized analysis functions and glue code required for interfacing with the generic algorithm core, thus effectively creating a customized annotation inference algorithm. The compiler raises the level of abstraction and simplifies schema development and maintenance. It also takes care of some more routine aspects of formulating patterns and schemas, in particular handling of irrelevant program fragments and irrelevant variance in the program structure, which reduces the size, complexity, and number of different patterns and annotation schemas that are required. The improvements described here make it easier and faster to customize the system to a new safety property or a new generator, and we demonstrate this by customizing it to certify frame safety of space flight navigation code that was automatically generated from Simulink models by MathWorks' Real-Time Workshop.

  16. Adapting Technological Pedagogical Content Knowledge Framework to Teach Mathematics

    ERIC Educational Resources Information Center

    Getenet, Seyum Tekeher

    2017-01-01

    The technological pedagogical content knowledge framework is increasingly used by educational technology researchers as a generic description of the knowledge requirements for teachers using technology in all subjects. This study describes the development of a mathematics-specific variety of the technological pedagogical content knowledge…

  17. "It Is Complicated!": Practices and Challenges of Generic Skills Assessment in Vietnamese Universities

    ERIC Educational Resources Information Center

    Nghia, Tran Le Huu

    2018-01-01

    Addressing the lack of studies related to generic skills (GS) assessment, especially in non-Western university contexts, this article reports a study that explored the practices and challenges of assessing students' GS in the Business Administration programmes of six Vietnamese universities. Content analysis of interviews with 41 teachers of skills…

  18. Generic Software Architecture for Launchers

    NASA Astrophysics Data System (ADS)

    Carre, Emilien; Gast, Philippe; Hiron, Emmanuel; Leblanc, Alain; Lesens, David; Mescam, Emmanuelle; Moro, Pierre

    2015-09-01

    The definition and reuse of generic software architecture for launchers is not common, for several reasons: the number of European launcher families is very small (Ariane 5 and Vega over the last decades); the real-time constraints (reactivity and determinism needs) are very hard; and low levels of versatility are required (often implying an ad hoc development for the launcher mission). In comparison, satellites are often built on a generic platform made up of reusable hardware building blocks (processors, star trackers, gyroscopes, etc.) and reusable software building blocks (middleware, TM/TC, On-Board Control Procedures, etc.). Although some of these reasons remain valid (e.g. the limited number of developments), the increase in available CPU power today makes achievable an approach based on a generic time-triggered middleware (ensuring the full determinism of the system) and a centralised mission and vehicle management (offering more flexibility in the design and facilitating long-term maintenance). This paper presents an example of a generic software architecture that could be envisaged for future launchers, based on the principles described above and supported by model-driven engineering and automatic code generation.

  19. Generic reactive transport codes as flexible tools to integrate soil organic matter degradation models with water, transport and geochemistry in soils

    NASA Astrophysics Data System (ADS)

    Jacques, Diederik; Gérard, Fréderic; Mayer, Uli; Simunek, Jirka; Leterme, Bertrand

    2016-04-01

    A large number of organic matter degradation, CO2 transport and dissolved organic matter models have been developed during the last decades. However, organic matter degradation models are in many cases strictly hard-coded in terms of organic pools, degradation kinetics and dependency on environmental variables. The scientific input of the model user is typically limited to the adjustment of input parameters. In addition, coupling with geochemical soil processes, including aqueous speciation, pH-dependent sorption and colloid-facilitated transport, is not incorporated in many of these models, strongly limiting the scope of their application. Furthermore, the most comprehensive organic matter degradation models are combined with simplified representations of flow and transport processes in the soil system. We illustrate the capability of generic reactive transport codes to overcome these shortcomings. The formulations of reactive transport codes include a physics-based continuum representation of flow and transport processes, while biogeochemical reactions can be described as equilibrium processes constrained by thermodynamic principles and/or kinetic reaction networks. The flexibility of these types of codes allows for straightforward extension of reaction networks, permits the inclusion of new model components (e.g., organic matter pools, rate equations, parameter dependency on environmental conditions) and in such a way facilitates an application-tailored implementation of organic matter degradation models and related processes. A numerical benchmark involving two reactive transport codes (HPx and MIN3P) demonstrates how the process-based simulation of transient variably saturated water flow (Richards equation), solute transport (advection-dispersion equation), heat transfer and diffusion in the gas phase can be combined with a flexible implementation of a soil organic matter degradation model.
The benchmark includes the production of leachable organic matter and inorganic carbon in the aqueous and gaseous phases, as well as different decomposition functions with first-order, linear dependence or nonlinear dependence on a biomass pool. In addition, we show how processes such as local bioturbation (bio-diffusion) can be included implicitly through a Fickian formulation of transport of soil organic matter. Coupling soil organic matter models with generic and flexible reactive transport codes offers a valuable tool to enhance insights into coupled physico-chemical processes at different scales within the scope of C-biogeochemical cycles, possibly linked with other chemical elements such as plant nutrients and pollutants.
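    One of the decomposition variants in the benchmark, first-order decay with a linear dependence on a biomass pool, can be sketched with a simple explicit time-stepping scheme. The pool structure, parameter names, and values below are illustrative assumptions, not the HPx or MIN3P implementation.

```python
def simulate_pools(c0, b0, k, y, dt, steps):
    """Forward-Euler sketch of one benchmark variant: an organic matter
    pool C decays with a rate that is first-order in C and linear in a
    biomass pool B; a fraction y of the decayed carbon is assimilated
    into B and the remainder is mineralized to CO2."""
    c, b, co2 = c0, b0, 0.0
    for _ in range(steps):
        rate = k * c * b              # first-order in C, linear in biomass B
        c -= rate * dt                # organic matter consumed
        b += y * rate * dt            # fraction y assimilated into biomass
        co2 += (1.0 - y) * rate * dt  # remainder released as CO2
    return c, b, co2

# Illustrative run; total carbon (C + B + CO2) is conserved by construction
c, b, co2 = simulate_pools(c0=100.0, b0=1.0, k=0.001, y=0.3, dt=0.1, steps=1000)
```

In a reactive transport code this kinetic network would be one reaction among many, coupled to the flow, solute transport and gas diffusion equations mentioned above.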

  20. Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Penn, John

    2014-01-01

    This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.

  1. The Alba ray tracing code: ART

    NASA Astrophysics Data System (ADS)

    Nicolas, Josep; Barla, Alessandro; Juanhuix, Jordi

    2013-09-01

    The Alba ray tracing code (ART) is a suite of Matlab functions and tools for the ray tracing simulation of x-ray beamlines. The code is structured in different layers, which allow its usage as part of optimization routines as well as easy control from a graphical user interface. Additional tools for slope error handling and for grating efficiency calculations are also included. Generic characteristics of ART include the accumulation of rays to improve statistics without memory limitations, while still providing normalized values of flux and resolution in physically meaningful units.

  2. Air carrier operations system model

    DOT National Transportation Integrated Search

    2001-03-01

    Representatives from the Federal Aviation Administration (FAA) and several 14 Code of Federal Regulations (CFR) Part 121 air carriers met several times during 1999-2000 to develop a system engineering model of the generic functions of air carrier ope...

  3. ImgLib2--generic image processing in Java.

    PubMed

    Pietzsch, Tobias; Preibisch, Stephan; Tomancák, Pavel; Saalfeld, Stephan

    2012-11-15

    ImgLib2 is an open-source Java library for n-dimensional data representation and manipulation with focus on image processing. It aims at minimizing code duplication by cleanly separating pixel-algebra, data access and data representation in memory. Algorithms can be implemented for classes of pixel types and generic access patterns by which they become independent of the specific dimensionality, pixel type and data representation. ImgLib2 illustrates that an elegant high-level programming interface can be achieved without sacrificing performance. It provides efficient implementations of common data types, storage layouts and algorithms. It is the data model underlying ImageJ2, the KNIME Image Processing toolbox and an increasing number of Fiji-Plugins. ImgLib2 is licensed under BSD. Documentation and source code are available at http://imglib2.net and in a public repository at https://github.com/imagej/imglib. Supplementary data are available at Bioinformatics Online. saalfeld@mpi-cbg.de

  4. Relative stability of DNA as a generic criterion for promoter prediction: whole genome annotation of microbial genomes with varying nucleotide base composition.

    PubMed

    Rangannan, Vetriselvi; Bansal, Manju

    2009-12-01

    The rapid increase in genome sequence information has necessitated the annotation of functional elements, particularly those occurring in the non-coding regions, in the genomic context. The promoter region is the key regulatory region, which enables the gene to be transcribed or repressed, but it is difficult to determine experimentally. Hence, in silico identification of promoters is crucial in order to guide experimental work and to pinpoint the key region that controls the transcription initiation of a gene. In this analysis, we demonstrate that while promoter regions are in general less stable than the flanking regions, their average free energy varies depending on the GC composition of the flanking genomic sequence. We have therefore obtained a set of free energy threshold values for genomic DNA with varying GC content and used them as a generic criterion for predicting promoter regions in several microbial genomes, using the in-house developed tool PromPredict. On applying it to predict promoter regions corresponding to the 1144 and 612 experimentally validated TSSs in E. coli (50.8% GC) and B. subtilis (43.5% GC), sensitivities of 99% and 95% and precision values of 58% and 60%, respectively, were achieved. For the limited data set of 81 TSSs available for M. tuberculosis (65.6% GC), a sensitivity of 100% and a precision of 49% were obtained.
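    The criterion described above scores sliding windows of genomic DNA and flags those whose stability falls below a GC-dependent threshold. The sketch below illustrates the idea only: it substitutes GC fraction for the true nearest-neighbour free-energy sums a tool like PromPredict uses, and the window size, step, and threshold are arbitrary.

```python
def window_stability(seq, window=100, step=10):
    """Score sliding windows by GC fraction, a crude stand-in for
    nearest-neighbour free-energy sums: GC-poor windows melt more
    easily, i.e. are less stable."""
    scores = []
    for i in range(0, len(seq) - window + 1, step):
        w = seq[i:i + window]
        scores.append((i, (w.count('G') + w.count('C')) / window))
    return scores

def predict_promoters(scores, threshold):
    """Flag window start positions whose stability proxy falls below a
    (GC-content-dependent) threshold: candidate promoter regions."""
    return [i for i, score in scores if score < threshold]

# Synthetic demo: an AT-rich (less stable) segment flanked by GC-rich DNA
seq = 'GC' * 100 + 'AT' * 100 + 'GC' * 100
hits = predict_promoters(window_stability(seq, window=100, step=50), threshold=0.5)
```

On this synthetic sequence the flagged windows fall inside the AT-rich segment, mirroring the observation that promoter regions are on average less stable than their flanks.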

  5. Active and passive surveillance of enoxaparin generics: a case study relevant to biosimilars.

    PubMed

    Grampp, Gustavo; Bonafede, Machaon; Felix, Thomas; Li, Edward; Malecki, Michael; Sprafka, J Michael

    2015-03-01

    This retrospective analysis assessed the capability of active and passive safety surveillance systems to track product-specific safety events in the USA for branded and generic enoxaparin, a complex injectable subject to immune-related and other adverse events (AEs). Analysis of heparin-induced thrombocytopenia (HIT) incidence was performed on benefit claims for commercial and Medicare supplemental-insured individuals newly treated with enoxaparin under pharmacy benefit (1 January 2009 - 30 June 2012). Additionally, spontaneous reports from the FDA AE Reporting System were reviewed to identify incidence and attribution of enoxaparin-related reports to specific manufacturers. Specific, dispensed products were identifiable from National Drug Codes only in pharmacy-benefit databases, permitting sensitive comparison of HIT incidence in nearly a third of patients treated with brand or generic enoxaparin. After originator medicine's loss of exclusivity, only 5% of spontaneous reports were processed by generic manufacturers; reports attributable to specific generics were approximately ninefold lower than expected based on market share. Claims data were useful for active surveillance of enoxaparin generics dispensed under pharmacy benefits but not for products administered under medical benefits. These findings suggest that the current spontaneous reporting system will not distinguish product-specific safety signals for products distributed by multiple manufacturers, including biosimilars.

  6. Past and Future Directions in Content Area Literacies

    ERIC Educational Resources Information Center

    Bean, Tom; O'Brien, David

    2013-01-01

    In this column, content area literacy scholars Tom Bean and David O'Brien challenge the older "infusion" model of content area literacy with its emphasis on generic strategies. Rather, they argue for and provide examples of projects that draw on the unique dimensions of various disciplines like history, science, and English, particularly in light…

  7. Finite difference time domain grid generation from AMC helicopter models

    NASA Technical Reports Server (NTRS)

    Cravey, Robin L.

    1992-01-01

    A simple technique is presented which forms a cubic grid model of a helicopter from an Aircraft Modeling Code (AMC) input file. The AMC input file defines the helicopter fuselage as a series of polygonal cross sections. The cubic grid model is used as an input to a Finite Difference Time Domain (FDTD) code to obtain predictions of antenna performance on a generic helicopter model. The predictions compare reasonably well with measured data.

  8. Generic substitution of antihypertensive drugs: does it affect adherence?

    PubMed

    Van Wijk, Boris L G; Klungel, Olaf H; Heerdink, Eibert R; de Boer, Anthonius

    2006-01-01

    Generic substitution is an important opportunity to reduce the costs of pharmaceutical care. However, pharmacists and physicians often find that patients and brand-name manufacturers have doubt about the equivalence of the substituted drug. This may be reflected by decreased adherence to therapy. To assess the association between generic substitution and nonadherence to antihypertensive drugs. We conducted a matched cohort study between January 1, 1999, and December 31, 2002. Data were obtained from PHARMO, a record linkage system containing drug-dispensing records from community pharmacies and linked hospital discharge records of approximately 950,000 people in The Netherlands. Residents of 30 medium-sized cities who initiated antihypertensive drug therapy were potential subjects. Refill adherence with antihypertensive drugs after substitution was determined; those with refill adherence below 80% were considered nonadherent. Four hundred sixty-three patients with a substitution in therapy and 565 controls, matched on age, gender, therapy start date, duration of use, and generic product code, were identified. Of the patients who switched from brand-name to generic formulations ("substituted"), 13.6% were nonadherent, and of the non-substituted patients (those who did not switch to generic), 18.7% were nonadherent (OR 0.68; 95% CI 0.48 to 0.96). The association was absent in males. None of the patients discontinued the medication. No differences in hospitalizations for cardiovascular disease in the 6 months after the substitution were observed. Generic substitution of antihypertensive drugs does not lead to lower adherence or more discontinuation and cardiovascular disease-related hospitalizations compared with brand-name therapy. When a less-expensive antihypertensive generic equivalent becomes available, generic substitution should be considered to achieve economic benefits.
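    The reported odds ratio can be reproduced from the figures in the abstract (13.6% of 463 substituted patients versus 18.7% of 565 controls were non-adherent). The sketch below reconstructs the counts from those percentages; it is a crude back-of-the-envelope check, not the study's matched analysis.

```python
def odds_ratio(exposed_events, exposed_total, control_events, control_total):
    """Crude (unmatched) odds ratio of an event in the exposed group
    relative to the control group."""
    odds_exposed = exposed_events / (exposed_total - exposed_events)
    odds_control = control_events / (control_total - control_events)
    return odds_exposed / odds_control

# Counts reconstructed from the reported percentages: 13.6% of 463
# substituted patients and 18.7% of 565 controls were non-adherent
substituted_nonadherent = round(0.136 * 463)  # 63 patients
control_nonadherent = round(0.187 * 565)      # 106 patients
or_nonadherence = odds_ratio(substituted_nonadherent, 463,
                             control_nonadherent, 565)
```

The result rounds to the 0.68 quoted in the abstract, i.e. substituted patients had lower odds of non-adherence.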

  9. Technology to Enhance Mathematics and Science Instruction: Changes in Teacher Perceptions after Participating in a Yearlong Professional Development Program

    ERIC Educational Resources Information Center

    Kersaint, Gladis; Ritzhaupt, Albert D.; Liu, Feng

    2014-01-01

    The purpose of this study is to examine the extent to which teachers of mathematics or science who were engaged in a year-long initiative to help them integrate technological tools were (a) familiar with generic and mathematics- or science-specific technology, (b) comfortable integrating generic and content-specific technology, (c) believe that…

  10. Generic, Extensible, Configurable Push-Pull Framework for Large-Scale Science Missions

    NASA Technical Reports Server (NTRS)

    Foster, Brian M.; Chang, Albert Y.; Freeborn, Dana J.; Crichton, Daniel J.; Woollard, David M.; Mattmann, Chris A.

    2011-01-01

    The push-pull framework was developed to provide an infrastructure that can connect to virtually any remote site and, given a set of restrictions, download files from that site based on those restrictions. The Cataloging and Archiving Service (CAS) has recently been re-architected and re-factored in its canonical services, including file management, workflow management, and resource management. Additionally, a generic CAS Crawling Framework was built, motivated by Apache's open-source search engine project, Nutch. Nutch is an Apache effort to provide search engine services (akin to Google), including crawling, parsing, content analysis, and indexing. It has produced several stable software releases, and is currently used in production services at companies such as Yahoo, and at NASA's Planetary Data System. The CAS Crawling Framework supports many of the Nutch Crawler's generic services, including metadata extraction, crawling, and ingestion. However, one service that was not ported over from Nutch is a generic protocol layer service that allows the Nutch crawler to obtain content using protocol plug-ins that download content using implementations of remote protocols, such as HTTP, FTP, WinNT file system, HTTPS, etc. Such a generic protocol layer would greatly aid the CAS Crawling Framework, as it would allow the framework to generically obtain content (i.e., data products) from remote sites using protocols such as FTP and others. Augmented with this capability, the Orbiting Carbon Observatory (OCO) and NPP (NPOESS Preparatory Project) Sounder PEATE (Product Evaluation and Analysis Tools Elements) would be provided with an infrastructure to support generic FTP-based pull access to remote data products, obviating the need for any specialized software outside the context of their existing process control systems.
This extensible configurable framework was created in Java, and allows the use of different underlying communication middleware (at present, both XMLRPC, and RMI). In addition, the framework is entirely suitable in a multi-mission environment and is supporting both NPP Sounder PEATE and the OCO Mission. Both systems involve tasks such as high-throughput job processing, terabyte-scale data management, and science computing facilities. NPP Sounder PEATE is already using the push-pull framework to accept hundreds of gigabytes of IASI (infrared atmospheric sounding interferometer) data, and is in preparation to accept CRIMS (Cross-track Infrared Microwave Sounding Suite) data. OCO will leverage the framework to download MODIS, CloudSat, and other ancillary data products for use in the high-performance Level 2 Science Algorithm. The National Cancer Institute is also evaluating the framework for use in sharing and disseminating cancer research data through its Early Detection Research Network (EDRN).
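    The generic protocol layer described above amounts to a registry of protocol plug-ins behind one common fetch interface, so the crawler never needs to know the transport. The class and registry names below are hypothetical illustrations, not the actual CAS or Nutch APIs.

```python
from abc import ABC, abstractmethod

class ProtocolPlugin(ABC):
    """One plug-in per remote-access protocol, all behind a common
    interface (hypothetical sketch, not the real CAS/Nutch class names)."""

    @abstractmethod
    def get(self, url: str) -> bytes:
        """Download the content behind `url` and return it as bytes."""

class FileProtocol(ProtocolPlugin):
    """Plug-in for local file:// URLs."""

    def get(self, url: str) -> bytes:
        with open(url.removeprefix('file://'), 'rb') as f:
            return f.read()

class MemProtocol(ProtocolPlugin):
    """In-memory stand-in for a remote protocol, handy for testing."""

    STORE = {'mem://greeting': b'hello'}

    def get(self, url: str) -> bytes:
        return self.STORE[url]

# The crawler consults a scheme -> plug-in registry; supporting FTP, HTTP,
# etc. would only mean registering another ProtocolPlugin subclass.
REGISTRY = {'file': FileProtocol, 'mem': MemProtocol}

def fetch(url: str) -> bytes:
    scheme = url.split('://', 1)[0]
    return REGISTRY[scheme]().get(url)

greeting = fetch('mem://greeting')
```

With such a layer, adding generic FTP-based pull access for missions like OCO or NPP Sounder PEATE would be a matter of registering one more plug-in rather than writing mission-specific download software.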

  11. Research on Formation of Microsatellite Communication with Genetic Algorithm

    PubMed Central

    Wu, Guoqiang; Bai, Yuguang; Sun, Zhaowei

    2013-01-01

    For the formation of three microsatellites which fly in the same orbit and perform three-dimensional solid mapping of the Earth, this paper proposes an optimized design method for the space circular formation order based on an improved genetic algorithm and provides an intersatellite direct spread spectrum communication system. The link equation for LEO formation-flying intersatellite links is derived from the special requirements of formation-flying microsatellite intersatellite links, and the transmitter power is confirmed through simulation. The method of optimizing the space circular formation order with the improved genetic algorithm is given, and it can keep the formation order steady for a long time under various perturbing forces. The intersatellite direct spread spectrum communication system is also described. It is found that, when the distance is 1 km and the data rate is 1 Mbps, the output waveform matches the input waveform well, and LDPC coding improves the communication performance: the error-correcting capability of the (512, 256) LDPC code is distinctly better than that of the (2, 1, 7) convolutional code. The designed system can satisfy the communication requirements of the microsatellites, so the presented method provides a significant theoretical foundation for formation flying and intersatellite communication. PMID:24078796

  12. Research on formation of microsatellite communication with genetic algorithm.

    PubMed

    Wu, Guoqiang; Bai, Yuguang; Sun, Zhaowei

    2013-01-01

    For the formation of three microsatellites which fly in the same orbit and perform three-dimensional solid mapping of the Earth, this paper proposes an optimized design method for the space circular formation order based on an improved genetic algorithm and provides an intersatellite direct spread spectrum communication system. The link equation for LEO formation-flying intersatellite links is derived from the special requirements of formation-flying microsatellite intersatellite links, and the transmitter power is confirmed through simulation. The method of optimizing the space circular formation order with the improved genetic algorithm is given, and it can keep the formation order steady for a long time under various perturbing forces. The intersatellite direct spread spectrum communication system is also described. It is found that, when the distance is 1 km and the data rate is 1 Mbps, the output waveform matches the input waveform well, and LDPC coding improves the communication performance: the error-correcting capability of the (512, 256) LDPC code is distinctly better than that of the (2, 1, 7) convolutional code. The designed system can satisfy the communication requirements of the microsatellites, so the presented method provides a significant theoretical foundation for formation flying and intersatellite communication.

  13. Can Cases Carry Pedagogical Content Knowledge? Yes, But We've Got Signs of a "Matthew Effect."

    ERIC Educational Resources Information Center

    Kleinfeld, Judith

    Teacher educators have come to appreciate that the teaching of difficult subjects combines knowledge of content and knowledge of generic teaching methods. A study was conducted, therefore, to explore the potential of cases to carry pedagogical content knowledge. A case was developed describing how an expert teacher goes about teaching…

  14. The influence of generic substitution on the content of patient-pharmacist communication in Swedish community pharmacies.

    PubMed

    Olsson, Erika; Wallach-Kildemoes, Helle; Ahmed, Ban; Ingman, Pontus; Kaae, Susanne; Kälvemark Sporrong, Sofia

    2017-08-01

    The objective was to study the relationship between the length and content of patient-pharmacist communication in community pharmacies, and generic substitution. The study was conducted in six community pharmacies in Sweden. Non-participant observations with audio recordings and short structured interviews were conducted. Out of 32 pharmacists 29 agreed to participate (90.6%), as did 282 out of 407 patients (69.3%). Logistic regression analysis was applied to calculate odds ratio for occurrence of generic substitution. Linear regression (β-coefficients) was applied to test for differences in time spent on different categories. In encounters where generic substitution occurred more time (19.2 s) was spent on non-medical (for instance administrative or economical) issues (P = 0.01, 95% confidence interval 4.8-33.6). However, the total time of the encounter was not significantly longer. The amount of time spent on non-medical issues increased with age of patient (age 60+: β, 33 s, P < 0.001). The results indicate that more time was spent on medical issues with patients who have a higher education (high school: β, 10.8 s, P = 0.07, university: β, 10.2 s, P = 0.11) relative to those with only elementary school education. Occurrence of generic substitution was correlated with more time spent on communicating on non-medical, but not on medical, issues. No extra time was spent on medical information for the groups normally overrepresented among those with low health literacy. This study suggests that pharmacists need to further embrace their role in promoting rational use of medicines, not least when generic substitution occurs. © 2016 Royal Pharmaceutical Society.

  15. The Effects of Text Message Content on the Use of an Internet-Based Physical Activity Intervention in Hong Kong Chinese Adolescents.

    PubMed

    Lau, Erica Y; Lau, Patrick W C; Cai, Bo; Archer, Edward

    2015-01-01

    This study examined the effects of text message content (generic vs. culturally tailored) on the login rate of an Internet physical activity program in Hong Kong Chinese adolescent school children. A convenience sample of 252 Hong Kong secondary school adolescents (51% female, 49% male; M age = 13.17 years, SD = 1.28 years) were assigned to one of 3 treatments for 8 weeks. The control group consisted of an Internet physical activity program. The Internet plus generic text message group consisted of the same Internet physical activity program and included daily generic text messages. The Internet plus culturally tailored text message group consisted of the Internet physical activity program and included daily culturally tailored text messages. Zero-inflated Poisson mixed models showed that the overall effect of the treatment group on the login rates varied significantly across individuals. The login rates over time were significantly higher in the Internet plus culturally tailored text message group than the control group (β = 46.06, 95% CI 13.60, 156.02; p = .002) and the Internet plus generic text message group (β = 15.80, 95% CI 4.81, 51.9; p = .021) after adjusting for covariates. These findings suggest that culturally tailored text messages may be more advantageous than generic text messages on improving adolescents' website login rate, but effects varied significantly across individuals. Our results support the inclusion of culturally tailored messaging in future online physical activity interventions.

  16. Development of real-time software environments for NASA's modern telemetry systems

    NASA Technical Reports Server (NTRS)

    Horner, Ward; Sabia, Steve

    1989-01-01

    An effort has been made to maintain maximum performance and flexibility for NASA-Goddard's VLSI telemetry system elements through the development of two real-time systems: (1) the Base System Environment, which supports generic system integration and furnishes the basic porting of various manufacturers' cards, and (2) the Modular Environment for Data Systems, which supports application-specific developments and furnishes designers with a set of tested generic library functions that can be employed to speed up the development of such application-specific real-time codes. The performance goals and design rationale for these two systems are discussed.

  17. An implementation of a reference symbol approach to generic modulation in fading channels

    NASA Technical Reports Server (NTRS)

    Young, R. J.; Lodge, J. H.; Pacola, L. C.

    1990-01-01

    As mobile satellite communications systems evolve over the next decade, they will have to adapt to a changing tradeoff between bandwidth and power. This paper presents a flexible approach to digital modulation and coding that will accommodate both wideband and narrowband schemes. This architecture could be the basis for a family of modems, each satisfying a specific power and bandwidth constraint, yet all having a large number of common signal processing blocks. The implementation of this generic approach, with general purpose digital processors for transmission of 4.8 kilobits per sec. digitally encoded speech, is described.

  18. Simulation of spacecraft attitude dynamics using TREETOPS and model-specific computer codes

    NASA Technical Reports Server (NTRS)

    Cochran, John E.; No, T. S.; Fitz-Coy, Norman G.

    1989-01-01

    The simulation of spacecraft attitude dynamics and control using the generic, multi-body code called TREETOPS and other codes written especially to simulate particular systems is discussed. Differences in the methods used to derive equations of motion--Kane's method for TREETOPS and the Lagrangian and Newton-Euler methods, respectively, for the other two codes--are considered. Simulation results from the TREETOPS code are compared with those from the other two codes for two example systems. One system is a chain of rigid bodies; the other consists of two rigid bodies attached to a flexible base body. Since the computer codes were developed independently, consistent results serve as a verification of the correctness of all the programs. Differences in the results are discussed. Results for the two-rigid-body, one-flexible-body system are useful also as information on multi-body, flexible, pointing payload dynamics.

  19. Coupling between a multi-physics workflow engine and an optimization framework

    NASA Astrophysics Data System (ADS)

    Di Gallo, L.; Reux, C.; Imbeaux, F.; Artaud, J.-F.; Owsiak, M.; Saoutic, B.; Aiello, G.; Bernardi, P.; Ciraolo, G.; Bucalossi, J.; Duchateau, J.-L.; Fausser, C.; Galassi, D.; Hertout, P.; Jaboulay, J.-C.; Li-Puma, A.; Zani, L.

    2016-03-01

    A generic coupling method between a multi-physics workflow engine and an optimization framework is presented in this paper. The coupling architecture has been developed to preserve the integrity of the two frameworks: a framework, a workflow, or an optimizer can be replaced by another without changing the whole coupling procedure or modifying the main content of either framework. The coupling is achieved by using a socket-based communication library to exchange data between the two frameworks. Among the algorithms provided by optimization frameworks, Genetic Algorithms (GAs) have demonstrated their efficiency on single- and multiple-criteria optimization. In addition to their robustness, GAs can handle the non-valid data which may appear during the optimization, so they work in the most general cases. A parallelized framework has been developed to reduce the time spent on optimizations and on the evaluation of large samples; a test showed good scaling efficiency for this parallelized framework. The coupling method has been applied to SYCOMORE (SYstem COde for MOdeling tokamak REactor), a system code developed in the form of a modular workflow for designing magnetic fusion reactors. Coupling SYCOMORE with the optimization platform URANIE enables design optimization along various figures of merit and constraints.
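
    The GA-based optimization loop described above can be sketched in miniature. This is a hypothetical stand-in, assuming a toy one-variable figure of merit in place of a real multi-physics workflow; the `workflow` function and all parameters are illustrative. The key point it demonstrates is that non-valid evaluations (returning `None`) are tolerated by ranking them as worst-case fitness.

```python
import random

# Minimal genetic-algorithm sketch of the coupling loop described above.
# The real system evaluates a SYCOMORE workflow over a socket link; here
# a stand-in objective with a maximum at x = 3 plays that role.
def workflow(x):
    """Stand-in figure of merit; returns None for non-valid designs."""
    if not 0.0 <= x <= 10.0:
        return None              # the GA must tolerate invalid evaluations
    return -(x - 3.0) ** 2

def fitness(x):
    f = workflow(x)
    return f if f is not None else float("-inf")

def evolve(pop, generations=200, sigma=0.5, seed=1):
    rng = random.Random(seed)
    for _ in range(generations):
        # Keep the better half as parents (elitism), refill by mutation.
        parents = sorted(pop, key=fitness, reverse=True)[: len(pop) // 2]
        pop = parents + [p + rng.gauss(0, sigma) for p in parents]
    return max(pop, key=fitness)

rng0 = random.Random(0)
best = evolve([rng0.uniform(0.0, 10.0) for _ in range(20)])
print(best)  # should lie near the optimum x = 3
```

    In the real coupling, each `workflow` call would be dispatched over the socket link to the workflow engine, so the optimizer never needs to know how the figure of merit is computed.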

  20. A Generic Structural Integrity Assurance Technology Program for the Army

    DTIC Science & Technology

    1989-11-01

    and Pressure Vessel Code, American Society of Mechanical Engineers, 1986. DEFINITIONS AND ACRONYMS Definitions A-Basis: At least 99 percent of the...Aluminum Bridge and Other Highway Structures, 1976. Aluminum Association Specifications for Aluminum Structures, Third Edition, 1976. ASME ASME Boiler

  1. Site-specific seismic ground motion analyses for transportation infrastructure in the New Madrid seismic zone.

    DOT National Transportation Integrated Search

    2012-11-01

    Generic, code-based design procedures cannot account for the anticipated short-period attenuation and long-period amplification of earthquake ground motions in the deep, soft sediments of the Mississippi Embayment within the New Madrid Seismic Zone (...

  2. Associations Between Thematic Content and Industry Self-Regulation Code Violations in Beer Advertising Broadcast During the U.S. NCAA Basketball Tournament.

    PubMed

    Noel, Jonathan K; Xuan, Ziming; Babor, Thomas F

    2017-07-03

    Beer marketing in the United States is controlled through self-regulation, whereby the beer industry has created a marketing code and enforces its use. We performed a thematic content analysis of beer ads broadcast during a U.S. college athletic event and determined which themes are associated with violations of a self-regulated alcohol marketing code. A total of 289 beer ads broadcast during the U.S. NCAA Men's and Women's 1999-2008 basketball tournaments were assessed for the presence of 23 thematic content areas. Associations between themes and violations of the U.S. Beer Institute's Marketing and Advertising Code were determined using generalized linear models. Humor (61.3%), taste (61.0%), masculinity (49.2%), and enjoyment (36.5%) were the most prevalent content areas. Nine content areas (i.e., conformity, ethnicity, sensation seeking, sociability, romance, special occasions, text responsibility messages, tradition, and individuality) were positively associated with code violations (p < 0.001-0.042), and significantly more content areas were positively than negatively associated with code violations (p < 0.001). The results can inform existing efforts to revise self-regulated alcohol marketing codes to ensure better protection of vulnerable populations. The use of several of these themes is concerning in relation to adolescent alcohol use and health disparities.

  3. Potential flow theory and operation guide for the panel code PMARC

    NASA Technical Reports Server (NTRS)

    Ashby, Dale L.; Dudley, Michael R.; Iguchi, Steve K.; Browne, Lindsey; Katz, Joseph

    1991-01-01

    The theoretical basis for PMARC, a low-order potential-flow panel code for modeling complex three-dimensional geometries, is outlined. Several of the advanced features currently included in the code, such as internal flow modeling, a simple jet model, and a time-stepping wake model, are discussed in some detail. The code is written using adjustable size arrays so that it can be easily redimensioned for the size problem being solved and the computer hardware being used. An overview of the program input is presented, with a detailed description of the input available in the appendices. Finally, PMARC results for a generic wing/body configuration are compared with experimental data to demonstrate the accuracy of the code. The input file for this test case is given in the appendices.

  4. Automated data capture from free-text radiology reports to enhance accuracy of hospital inpatient stroke codes.

    PubMed

    Flynn, Robert W V; Macdonald, Thomas M; Schembri, Nicola; Murray, Gordon D; Doney, Alexander S F

    2010-08-01

    Much potentially useful clinical information for pharmacoepidemiological research is contained in unstructured free-text documents and is not readily available for analysis. Routine health data such as Scottish Morbidity Records (SMR01) frequently use generic 'stroke' codes. Free-text Computerised Radiology Information System (CRIS) reports have potential to provide this missing detail. We aimed to increase the number of stroke-type-specific diagnoses by augmenting SMR01 with data derived from CRIS reports and to assess the accuracy of this methodology. SMR01 codes describing first-ever-stroke admissions in Tayside, Scotland from 1994 to 2005 were linked to CRIS CT-brain scan reports occurring within 14 days of admission. Software was developed to parse the text and elicit details of stroke type using keyword matching. An algorithm was iteratively developed to differentiate intracerebral haemorrhage (ICH) from ischaemic stroke (IS) against a training set of reports with pathophysiologically precise SMR01 codes. This algorithm was then applied to CRIS reports associated with generic SMR01 codes. To establish the accuracy of the algorithm, a sample of 150 ICH and 150 IS reports were independently classified by a stroke physician. There were 8419 SMR01-coded first-ever strokes. The proportion of patients with pathophysiologically clear diagnoses doubled, from 2745 (32.6%) to 5614 (66.7%). The positive predictive value was 94.7% (95%CI 89.8-97.3) for IS and 76.7% (95%CI 69.3-82.7) for haemorrhagic stroke. A free-text processing approach was acceptably accurate at identifying IS, but not ICH. This approach could be adapted to other studies where radiology reports may be informative. © 2010 John Wiley & Sons, Ltd.
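
    The keyword-matching idea can be sketched as follows. The keyword lists and reports below are illustrative, not the study's validated algorithm, which was iteratively tuned against a coded training set.

```python
# Sketch of keyword matching for stroke type in free-text CT reports.
# Term lists are illustrative examples, not the study's algorithm.
ICH_TERMS = ("haemorrhage", "hemorrhage", "haematoma", "hematoma")
IS_TERMS = ("infarct", "ischaemic", "ischemic", "low attenuation")

def classify_report(text):
    """Return 'ICH', 'IS', or 'unknown' for a CT-brain report."""
    text = text.lower()
    if any(term in text for term in ICH_TERMS):
        return "ICH"
    if any(term in text for term in IS_TERMS):
        return "IS"
    return "unknown"

def positive_predictive_value(predictions, truths, label):
    """PPV = true positives / all positive predictions for one label."""
    positives = [t for p, t in zip(predictions, truths) if p == label]
    return sum(t == label for t in positives) / len(positives)

reports = ["Acute infarct in left MCA territory.",
           "Large intracerebral haematoma with mass effect.",
           "No acute abnormality."]
print([classify_report(r) for r in reports])  # ['IS', 'ICH', 'unknown']
```

    The study's headline figures are PPVs of this kind, computed against a stroke physician's independent classification of a validation sample.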

  5. Effects of generic language on category content and structure.

    PubMed

    Gelman, Susan A; Ware, Elizabeth A; Kleinberg, Felicia

    2010-11-01

    We hypothesized that generic noun phrases ("Bears climb trees") would provide important input to children's developing concepts. In three experiments, four-year-olds and adults learned a series of facts about a novel animal category, in one of three wording conditions: generic (e.g., "Zarpies hate ice cream"), specific-label (e.g., "This zarpie hates ice cream"), or no-label (e.g., "This hates ice cream"). Participants completed a battery of tasks assessing the extent to which they linked the category to the properties expressed, and the extent to which they treated the category as constituting an essentialized kind. As predicted, for adults, generics training resulted in tighter category-property links and more category essentialism than both the specific-label and no-label training. Children also showed effects of generic wording, though the effects were weaker and required more extensive input. We discuss the implications for language-thought relations, and for the acquisition of essentialized categories. Copyright 2010 Elsevier Inc. All rights reserved.

  6. Pump CFD code validation tests

    NASA Technical Reports Server (NTRS)

    Brozowski, L. A.

    1993-01-01

    Pump CFD code validation tests were accomplished by obtaining nonintrusive flow characteristic data at key locations in generic current liquid rocket engine turbopump configurations. Data were obtained with a laser two-focus (L2F) velocimeter at scaled design flow. Three components were surveyed: a 1970's-designed impeller, a 1990's-designed impeller, and a four-bladed unshrouded inducer. Two-dimensional velocities were measured upstream and downstream of the two impellers. Three-dimensional velocities were measured upstream, downstream, and within the blade row of the unshrouded inducer.

  7. Development and Evaluation of an Order-N Formulation for Multi-Flexible Body Space Systems

    NASA Technical Reports Server (NTRS)

    Ghosh, Tushar K.; Quiocho, Leslie J.

    2013-01-01

    This paper presents development of a generic recursive Order-N algorithm for systems with rigid and flexible bodies, in tree or closed-loop topology, with N being the number of bodies of the system. Simulation results are presented for several test cases to verify and evaluate the performance of the code compared to an existing efficient dense mass matrix-based code. The comparison brought out situations where Order-N or mass matrix-based algorithms could be useful.

  8. IGGy: An interactive environment for surface grid generation

    NASA Technical Reports Server (NTRS)

    Prewitt, Nathan C.

    1992-01-01

    A graphically interactive derivative of the EAGLE boundary code is presented. This code allows the user to interactively build and execute commands and immediately see the results. Strong ties with a batch oriented script language are maintained. A generalized treatment of grid definition parameters allows a more generic definition of the grid generation process and allows the generation of command scripts which can be applied to topologically similar configurations. The use of the graphical user interface is outlined and example applications are presented.

  9. FDA's proposed rules on patent listing requirements for new drug and 30-month stays on ANDA approval (proposed Oct. 24, 2002).

    PubMed

    Hui, Yuk Fung

    2003-01-01

    In order to close a loophole in the generic drug approval process that allows a brand-name drug patent holder to delay or defeat a generic drug application merely on a technicality, the FDA recently proposed to modify its regulations. The proposals affect the patent listing requirements of a new drug application and the length of time that a generic drug application can be put on hold in the event of a patent infringement suit. With the modified rules, the FDA expects to see an increase in the availability of generic drugs, which eventually will lead to lower drug costs. Ms. Hui discusses the contents of the proposed regulations and provides an analysis of the proposed rules' legal authority, implications for patent rights, and impact on the pharmaceutical industry.

  10. Regulating alcohol advertising: content analysis of the adequacy of federal and self-regulation of magazine advertisements, 2008-2010.

    PubMed

    Smith, Katherine C; Cukier, Samantha; Jernigan, David H

    2014-10-01

    We analyzed beer, spirits, and alcopop magazine advertisements to determine adherence to federal and voluntary advertising standards. We assessed the efficacy of these standards in curtailing potentially damaging content and protecting public health. We obtained data from a content analysis of a census of 1795 unique advertising creatives for beer, spirits, and alcopops placed in nationally available magazines between 2008 and 2010. We coded creatives for manifest content and adherence to federal regulations and industry codes. Advertisements largely adhered to existing regulations and codes. We assessed only 23 ads as noncompliant with federal regulations and 38 with industry codes. Content consistent with the codes was, however, often culturally positive in terms of aspirational depictions. In addition, creatives included degrading and sexualized images, promoted risky behavior, and made health claims associated with low-calorie content. Existing codes and regulations are largely followed regarding content but do not adequately protect against content that promotes unhealthy and irresponsible consumption and degrades potentially vulnerable populations in its depictions. Our findings suggest further limitations and enhanced federal oversight may be necessary to protect public health.

  11. Numerical study of supersonic combustors by multi-block grids with mismatched interfaces

    NASA Technical Reports Server (NTRS)

    Moon, Young J.

    1990-01-01

    A three-dimensional, finite-rate chemistry Navier-Stokes code was extended to a multi-block code with mismatched interfaces for practical calculations of supersonic combustors. To ensure global conservation, a conservative algorithm was used to treat the mismatched interfaces. The extended code was checked against one test case, a generic supersonic combustor with transverse fuel injection, to examine solution accuracy, convergence, and local mass-flux error. After testing, the code was used to simulate the chemically reacting flow fields in a scramjet combustor with parallel fuel injectors (unswept and swept ramps). Computational results were compared with experimental shadowgraph and pressure measurements. Fuel-air mixing characteristics of the unswept and swept ramps were compared and investigated.
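
    The conservative treatment of a mismatched interface can be illustrated in one dimension. This is a sketch under simplifying assumptions (piecewise-constant fluxes on overlapping 1-D faces), not the combustor code's actual 3-D algorithm: donor fluxes are redistributed in proportion to face overlap, so the total flux transferred across the interface is preserved exactly.

```python
# 1-D sketch of a conservative mismatched-interface treatment: each
# donor face's flux is split among receiver faces by overlap fraction,
# so the sum over the receiver side equals the sum over the donor side.
def conservative_remap(donor_edges, donor_flux, receiver_edges):
    recv = [0.0] * (len(receiver_edges) - 1)
    for i in range(len(donor_flux)):
        d0, d1 = donor_edges[i], donor_edges[i + 1]
        for j in range(len(recv)):
            r0, r1 = receiver_edges[j], receiver_edges[j + 1]
            overlap = max(0.0, min(d1, r1) - max(d0, r0))
            recv[j] += donor_flux[i] * overlap / (d1 - d0)
    return recv

donor = [0.0, 0.5, 1.0]            # two donor faces
flux = [2.0, 4.0]                  # flux through each donor face
receiver = [0.0, 0.25, 0.75, 1.0]  # three mismatched receiver faces
out = conservative_remap(donor, flux, receiver)
print(sum(out) == sum(flux))  # True: global conservation holds
```

    A non-conservative interpolation (e.g., pointwise sampling at receiver face centers) would generally lose or create flux at the interface, which is why the conservative form matters for a reacting-flow calculation.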

  12. Power-on performance predictions for a complete generic hypersonic vehicle configuration

    NASA Technical Reports Server (NTRS)

    Bennett, Bradford C.

    1991-01-01

    The Compressible Navier-Stokes (CNS) code was developed to compute external hypersonic flow fields. It has been applied to various hypersonic external flow applications. Here, the CNS code was modified to compute hypersonic internal flow fields. Calculations were performed on a Mach 18 sidewall compression inlet and on the Lewis Mach 5 inlet. The use of the ARC3D diagonal algorithm was evaluated for internal flows on the Mach 5 inlet flow. The initial modifications to the CNS code involved generalization of the boundary conditions and the addition of viscous terms in the second crossflow direction and modifications to the Baldwin-Lomax turbulence model for corner flows.

  13. Naming and outline of Dothideomycetes-2014 including proposals for the protection or suppression of generic names.

    PubMed

    Wijayawardene, Nalin N; Crous, Pedro W; Kirk, Paul M; Hawksworth, David L; Boonmee, Saranyaphat; Braun, Uwe; Dai, Dong-Qin; D'souza, Melvina J; Diederich, Paul; Dissanayake, Asha; Doilom, Mingkhuan; Hongsanan, Singang; Jones, E B Gareth; Groenewald, Johannes Z; Jayawardena, Ruvishika; Lawrey, James D; Liu, Jian-Kui; Lücking, Robert; Madrid, Hugo; Manamgoda, Dimuthu S; Muggia, Lucia; Nelsen, Matthew P; Phookamsak, Rungtiwa; Suetrong, Satinee; Tanaka, Kazuaki; Thambugala, Kasun M; Wanasinghe, Dhanushka N; Wikee, Saowanee; Zhang, Ying; Aptroot, André; Ariyawansa, H A; Bahkali, Ali H; Bhat, D Jayarama; Gueidan, Cécile; Chomnunti, Putarak; De Hoog, G Sybren; Knudsen, Kerry; Li, Wen-Jing; McKenzie, Eric H C; Miller, Andrew N; Phillips, Alan J L; Piątek, Marcin; Raja, Huzefa A; Shivas, Roger S; Slippers, Bernad; Taylor, Joanne E; Tian, Qing; Wang, Yong; Woudenberg, Joyce H C; Cai, Lei; Jaklitsch, Walter M; Hyde, Kevin D

    2014-11-01

    Article 59.1 of the International Code of Nomenclature for Algae, Fungi, and Plants (ICN; Melbourne Code), which addresses the nomenclature of pleomorphic fungi, became effective on 30 July 2011. Since that date, each fungal species can have one nomenclaturally correct name in a particular classification; all other previously used names for the species are considered synonyms. The older generic epithet takes priority over the younger name. Any widely used younger names proposed for use must comply with Art. 57.2, and their usage should be approved by the Nomenclature Committee for Fungi (NCF). In this paper, we list all genera currently accepted by us in Dothideomycetes (belonging to 23 orders and 110 families), including pleomorphic and nonpleomorphic genera. In the case of pleomorphic genera, we follow the rulings of the current ICN and propose single generic names for future usage. The taxonomic placements of 1261 genera are listed as an outline. Protected names and suppressed names for 34 pleomorphic genera are listed separately. Notes and justifications are provided for possible proposed names after the list of genera. Notes are also provided on recent advances in our understanding of asexual and sexual morph linkages in Dothideomycetes. A phylogenetic tree based on four-gene analyses supported 23 orders and 75 families, while 35 families still lack molecular data.

  14. Computational study of generic hypersonic vehicle flow fields

    NASA Technical Reports Server (NTRS)

    Narayan, Johnny R.

    1994-01-01

    The geometric data of the generic hypersonic vehicle configuration included body definitions and preliminary grids for the forebody (nose cone excluded), midsection (propulsion system excluded), and afterbody sections. This data was to be augmented by the nose section geometry (blunt conical section mated with the noncircular cross section of the forebody initial plane) along with a grid and a detailed supersonic combustion ramjet (scramjet) geometry (inlet and combustor) which should be merged with the nozzle portion of the afterbody geometry. The solutions were to be obtained by using a Navier-Stokes (NS) code such as TUFF for the nose portion, a parabolized Navier-Stokes (PNS) solver such as the UPS and STUFF codes for the forebody, a NS solver with finite rate hydrogen-air chemistry capability such as TUFF and SPARK for the scramjet and a suitable solver (NS or PNS) for the afterbody and external nozzle flows. The numerical simulation of the hypersonic propulsion system for the generic hypersonic vehicle is the major focus of this entire work. Supersonic combustion ramjet is such a propulsion system, hence the main thrust of the present task has been to establish a solution procedure for the scramjet flow. The scramjet flow is compressible, turbulent, and reacting. The fuel used is hydrogen and the combustion process proceeds at a finite rate. As a result, the solution procedure must be capable of addressing such flows.

  15. Naming and outline of Dothideomycetes–2014 including proposals for the protection or suppression of generic names

    PubMed Central

    Wijayawardene, Nalin N.; Crous, Pedro W.; Kirk, Paul M.; Hawksworth, David L.; Boonmee, Saranyaphat; Braun, Uwe; Dai, Dong-Qin; D’souza, Melvina J.; Diederich, Paul; Dissanayake, Asha; Doilom, Mingkhuan; Hongsanan, Singang; Jones, E. B.Gareth; Groenewald, Johannes Z.; Jayawardena, Ruvishika; Lawrey, James D.; Liu, Jian-Kui; Lücking, Robert; Madrid, Hugo; Manamgoda, Dimuthu S.; Muggia, Lucia; Nelsen, Matthew P.; Phookamsak, Rungtiwa; Suetrong, Satinee; Tanaka, Kazuaki; Thambugala, Kasun M.; Wanasinghe, Dhanushka N.; Wikee, Saowanee; Zhang, Ying; Aptroot, André; Ariyawansa, H. A.; Bahkali, Ali H.; Bhat, D. Jayarama; Gueidan, Cécile; Chomnunti, Putarak; De Hoog, G. Sybren; Knudsen, Kerry; Li, Wen-Jing; McKenzie, Eric H. C.; Miller, Andrew N.; Phillips, Alan J. L.; Piątek, Marcin; Raja, Huzefa A.; Shivas, Roger S.; Slippers, Bernad; Taylor, Joanne E.; Tian, Qing; Wang, Yong; Woudenberg, Joyce H. C.; Cai, Lei; Jaklitsch, Walter M.

    2016-01-01

    Article 59.1 of the International Code of Nomenclature for Algae, Fungi, and Plants (ICN; Melbourne Code), which addresses the nomenclature of pleomorphic fungi, became effective on 30 July 2011. Since that date, each fungal species can have one nomenclaturally correct name in a particular classification; all other previously used names for the species are considered synonyms. The older generic epithet takes priority over the younger name. Any widely used younger names proposed for use must comply with Art. 57.2, and their usage should be approved by the Nomenclature Committee for Fungi (NCF). In this paper, we list all genera currently accepted by us in Dothideomycetes (belonging to 23 orders and 110 families), including pleomorphic and nonpleomorphic genera. In the case of pleomorphic genera, we follow the rulings of the current ICN and propose single generic names for future usage. The taxonomic placements of 1261 genera are listed as an outline. Protected names and suppressed names for 34 pleomorphic genera are listed separately. Notes and justifications are provided for possible proposed names after the list of genera. Notes are also provided on recent advances in our understanding of asexual and sexual morph linkages in Dothideomycetes. A phylogenetic tree based on four-gene analyses supported 23 orders and 75 families, while 35 families still lack molecular data. PMID:27284275

  16. Modern Foreign Language Content Standards. Generic Standards. Levels I-IV.

    ERIC Educational Resources Information Center

    Scarborough, Rebecca H., Ed.

    Delaware's standards for modern language curriculum content in public schools are provided for teachers' use in coordinating instruction. Teachers are encouraged to use communicative, student-centered classroom activities and to reinforce and expand the material in successive instructional units. The guide consists of introductory sections on the…

  17. Burner liner thermal/structural load modeling: TRANCITS program user's manual

    NASA Technical Reports Server (NTRS)

    Maffeo, R.

    1985-01-01

    TRANCITS, the Transfer Analysis Code to Interface Thermal/Structural Problems, is discussed. TRANCITS satisfies all the objectives for transferring thermal data between heat transfer and structural models of combustor liners, and it can be used as a generic thermal translator between heat transfer and stress models of any component, regardless of geometry. TRANCITS can accurately and efficiently convert the temperature distributions predicted by heat transfer programs into those required by stress codes. It can be used with both linear and nonlinear structural codes and can produce nodal temperatures, elemental centroid temperatures, or elemental Gauss point temperatures. The thermal output of both the MARC and SINDA heat transfer codes can be interfaced directly with TRANCITS, which will automatically produce stress-model input formatted for NASTRAN and MARC. Any other thermal program and structural program can be interfaced by using the neutral input and output formats supported by TRANCITS.

  18. Comparative Evaluation of U.S. Brand and Generic Intravenous Sodium Ferric Gluconate Complex in Sucrose Injection: Physicochemical Characterization

    PubMed Central

    Sun, Dajun; Rouse, Rodney; Patel, Vikram; Wu, Yong; Zheng, Jiwen; Karmakar, Alokita; Patri, Anil K.; Keire, David; Ma, Jia; Jiang, Wenlei

    2018-01-01

    The objective of this study was to evaluate physicochemical equivalence between brand (i.e., Ferrlecit) and generic sodium ferric gluconate (SFG) in sucrose injection by conducting a series of comparative in vitro characterizations using advanced analytical techniques. The elemental iron and carbon content, thermal properties, viscosity, particle size, zeta potential, sedimentation coefficient, and molecular weight were determined. There was no noticeable difference between brand and generic SFG in sucrose injection for the physical parameters evaluated, except for the sedimentation coefficient determined by sedimentation velocity analytical ultracentrifugation (SV-AUC) and the molecular weight determined by asymmetric field flow fractionation-multi-angle light scattering (AFFF-MALS). In addition, brand and generic SFG complex products showed comparable molecular weight distributions when determined by gel permeation chromatography (GPC). The observed minor differences between brand and generic SFG, such as the sedimentation coefficient, did not impact their biological activities in separate studies of in vitro cellular uptake and rat biodistribution. Coupled with an ongoing clinical study comparing labile iron levels in healthy volunteers, these FDA-funded post-market studies are intended to illustrate the comprehensive surveillance efforts ensuring the safety and efficacy of generic SFG complex in sucrose injection, and to shed new light on the approval standards for generic parenteral iron colloidal products. PMID:29303999

  19. 40 CFR 52.2027 - Approval status of Pennsylvania's Generic NOX and VOC RACT Rules.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... through 129.95 (see § 52.2020 (c)(129)) as those regulations apply to the Pittsburgh-Beaver Valley area... its approval of 25 PA Code of Regulations, Chapter 129.91 through 129.95 [see § 52.2020 (c)(129)] as...

  20. 40 CFR 52.2027 - Approval status of Pennsylvania's Generic NOX and VOC RACT Rules.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... through 129.95 (see § 52.2020 (c)(129)) as those regulations apply to the Pittsburgh-Beaver Valley area... its approval of 25 PA Code of Regulations, Chapter 129.91 through 129.95 [see § 52.2020 (c)(129)] as...

  1. Personalized Guideline-Based Treatment Recommendations Using Natural Language Processing Techniques.

    PubMed

    Becker, Matthias; Böckmann, Britta

    2017-01-01

    Clinical guidelines and clinical pathways are accepted and proven instruments for quality assurance and process optimization. Today, electronic representations of clinical guidelines exist as unstructured text but are not well integrated with patient-specific information from electronic health records. Consequently, the generic content of a clinical guideline is accessible, but it is not possible to visualize the patient's position on the clinical pathway, and decision support for the next treatment step cannot be provided by personalized guidelines. The Systematized Nomenclature of Medicine - Clinical Terms (SNOMED CT) provides a common reference terminology as well as the semantic link for combining the pathways with the patient-specific information. This paper proposes a model-based approach to support the development of guideline-compliant pathways combined with patient-specific structured and unstructured information using SNOMED CT. To identify SNOMED CT concepts, software was developed to extract SNOMED CT codes from structured and unstructured German data and to map them to clinical pathways annotated in accordance with the systematized nomenclature.
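
    The extraction step can be sketched as a dictionary lookup over free text. The term-to-code mapping below is a tiny illustrative sample chosen for this sketch; the actual software handles German inflection, structured fields, and a far larger terminology.

```python
# Minimal sketch of extracting SNOMED CT concept identifiers from
# free text by keyword matching. The term-to-code dictionary is an
# illustrative sample, not the paper's mapping.
TERM_TO_SCTID = {
    "mammakarzinom": "254837009",    # malignant neoplasm of breast
    "chemotherapie": "367336001",    # chemotherapy (procedure)
    "diabetes mellitus": "73211009",
}

def extract_codes(text):
    """Return SNOMED CT codes for every dictionary term found in text."""
    text = text.lower()
    return sorted({code for term, code in TERM_TO_SCTID.items() if term in text})

note = "Patientin mit Mammakarzinom, Beginn der Chemotherapie geplant."
print(extract_codes(note))  # ['254837009', '367336001']
```

    The extracted codes can then be matched against the SNOMED CT annotations on the pathway to locate the patient's position and suggest the next guideline-compliant step.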

  2. NASA Tech Briefs, February 2008

    NASA Technical Reports Server (NTRS)

    2008-01-01

Topics discussed include: Optical Measurement of Mass Flow of a Two-Phase Fluid; Selectable-Tip Corrosion-Testing Electrochemical Cell; Piezoelectric Bolt Breakers and Bolt Fatigue Testers; Improved Measurement of B(sub 22) of Macromolecules in a Flow Cell; Measurements by a Vector Network Analyzer at 325 to 508 GHz; Using Light to Treat Mucositis and Help Wounds Heal; Increasing Discharge Capacities of Li-(CF)(sub n) Cells; Dot-in-Well Quantum-Dot Infrared Photodetectors; Integrated Microbatteries for Implantable Medical Devices; Oxidation Behavior of Carbon Fiber-Reinforced Composites; GIDEP Batching Tool; Generic Spacecraft Model for Real-Time Simulation; Parallel-Processing Software for Creating Mosaic Images; Software for Verifying Image-Correlation Tie Points; Flexcam Image Capture Viewing and Spot Tracking; Low-Pt-Content Anode Catalyst for Direct Methanol Fuel Cells; Graphite/Cyanate Ester Face Sheets for Adaptive Optics; Atomized BaF2-CaF2 for Better-Flowing Plasma-Spray Feedstock; Nanophase Nickel-Zirconium Alloys for Fuel Cells; Vacuum Packaging of MEMS With Multiple Internal Seal Rings; Compact Two-Dimensional Spectrometer Optics; and Fault-Tolerant Coding for State Machines.

  3. LANDSAT 3 return beam vidicon response artifacts: A report on RBV photographic product characteristics and quality coding system

    NASA Technical Reports Server (NTRS)

    Clark, B. P.

    1981-01-01

Analysis of large volumes of LANDSAT 3 RBV digital data that were converted to photographic form led to the firm identification of several visible artifacts (objects or structures not normally present, but produced by an external agency or action) in the imagery. These artifacts were identified, categorized, and traced directly to specific sensor response characteristics. None of these artifacts is easily removed, and all cases remain under active study for possible future enhancement. The seven generic categories of sensor response artifacts identified to date include: (1) shading and stairsteps; (2) corners out of focus; (3) missing reseaus; (4) reseau distortion and data distortion; (5) black vertical line; (6) grain effect; and (7) faceplate contamination. An additional category under study, but not yet determined to be caused by sensor response, is a geometric anomaly. Examples of affected imagery are presented to assist in distinguishing between image content and innate defects caused by the sensor system.

  4. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    PubMed

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  5. Predicting Regulatory Compliance in Beer Advertising on Facebook.

    PubMed

    Noel, Jonathan K; Babor, Thomas F

    2017-11-01

    The prevalence of alcohol advertising has been growing on social media platforms. The purpose of this study was to evaluate alcohol advertising on Facebook for regulatory compliance and thematic content. A total of 50 Budweiser and Bud Light ads posted on Facebook within 1 month of the 2015 NFL Super Bowl were evaluated for compliance with a self-regulated alcohol advertising code and for thematic content. An exploratory sensitivity/specificity analysis was conducted to determine if thematic content could predict code violations. The code violation rate was 82%, with violations prevalent in guidelines prohibiting the association of alcohol with success (Guideline 5) and health benefits (Guideline 3). Overall, 21 thematic content areas were identified. Displaying the product (62%) and adventure/sensation seeking (52%) were the most prevalent. There was perfect specificity (100%) for 10 content areas for detecting any code violation (animals, negative emotions, positive emotions, games/contests/promotions, female characters, minorities, party, sexuality, night-time, sunrise) and high specificity (>80%) for 10 content areas for detecting violations of guidelines intended to protect minors (animals, negative emotions, famous people, friendship, games/contests/promotions, minorities, responsibility messages, sexuality, sunrise, video games). The high prevalence of code violations indicates a failure of self-regulation to prevent potentially harmful content from appearing in alcohol advertising, including explicit code violations (e.g. sexuality). Routine violations indicate an unwillingness to restrict advertising content for public health purposes, and statutory restrictions may be necessary to sufficiently deter alcohol producers from repeatedly violating marketing codes. Violations of a self-regulated alcohol advertising code are prevalent in a sample of beer ads published on Facebook near the US National Football League's Super Bowl. 
Overall, 16 thematic content areas demonstrated high specificity for code violations. Alcohol advertising codes should be updated to expressly prohibit the use of such content. © The Author 2017. Medical Council on Alcohol and Oxford University Press. All rights reserved.
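
    The sensitivity/specificity analysis described in this abstract reduces to counting over the coded ads: for each thematic content area, specificity is the fraction of ads without a code violation that also lack that content. A minimal illustrative sketch (the flag encoding is an assumption for this example, not the authors' actual coding scheme):

```python
def specificity(content_flags, violation_flags):
    """Specificity of a thematic content area as a predictor of code
    violations: among ads with no violation (actual negatives), the
    fraction that also lack the content, i.e. TN / (TN + FP)."""
    tn = sum(1 for c, v in zip(content_flags, violation_flags) if not c and not v)
    fp = sum(1 for c, v in zip(content_flags, violation_flags) if c and not v)
    return tn / (tn + fp)
```

    A content area with 100% specificity, as reported above, means the content never appeared in an ad that was free of violations.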

  6. Regulating Alcohol Advertising: Content Analysis of the Adequacy of Federal and Self-Regulation of Magazine Advertisements, 2008–2010

    PubMed Central

    Cukier, Samantha; Jernigan, David H.

    2014-01-01

    Objectives. We analyzed beer, spirits, and alcopop magazine advertisements to determine adherence to federal and voluntary advertising standards. We assessed the efficacy of these standards in curtailing potentially damaging content and protecting public health. Methods. We obtained data from a content analysis of a census of 1795 unique advertising creatives for beer, spirits, and alcopops placed in nationally available magazines between 2008 and 2010. We coded creatives for manifest content and adherence to federal regulations and industry codes. Results. Advertisements largely adhered to existing regulations and codes. We assessed only 23 ads as noncompliant with federal regulations and 38 with industry codes. Content consistent with the codes was, however, often culturally positive in terms of aspirational depictions. In addition, creatives included degrading and sexualized images, promoted risky behavior, and made health claims associated with low-calorie content. Conclusions. Existing codes and regulations are largely followed regarding content but do not adequately protect against content that promotes unhealthy and irresponsible consumption and degrades potentially vulnerable populations in its depictions. Our findings suggest further limitations and enhanced federal oversight may be necessary to protect public health. PMID:24228667

  7. 75 FR 16737 - Proposed Information Collection; Comment Request; Generic Clearance for Questionnaire Pretesting...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-02

    ... and procedures, reduce respondent burden, and ultimately increase the quality of data collected in the... procedure: Cognitive interviews, focus groups, respondent debriefing, behavior coding of respondent... used: Mail, telephone, face-to- face, paper-and-pencil, CATI, CAPI, Internet, or IVR. III. Data OMB...

  8. Theoretical Roots and Pedagogical Implications for Contextual Evaluation.

    ERIC Educational Resources Information Center

    Ewald, Helen Rothschild

There are three types of contexts subject to evaluation in student writing: the textual context, which influences grammatical acceptability and the rhetorical effectiveness of a sentence; the coded context, or cultural constraints such as generic and stylistic conventions; and pragmatic contexts, which unite form, function, and setting in a…

  9. ISART: A Generic Framework for Searching Books with Social Information

    PubMed Central

    Cui, Xiao-Ping; Qu, Jiao; Geng, Bin; Zhou, Fang; Song, Li; Hao, Hong-Wei

    2016-01-01

Effective book search has been discussed for decades and is still future-proof in areas as diverse as computer science, informatics, e-commerce and even culture and arts. A variety of social information contents (e.g., ratings, tags and reviews) emerge with the huge number of books on the Web, but how they are utilized for searching and finding books is seldom investigated. Here we develop an Integrated Search And Recommendation Technology (IsArt), which breaks new ground by providing a generic framework for searching books with rich social information. IsArt comprises a search engine to rank books with book contents and professional metadata, a Generalized Content-based Filtering model to thereafter rerank books with user-generated social contents, and a learning-to-rank technique to finally combine a wide range of diverse reranking results. Experiments show that this technology permits embedding social information to promote book search effectiveness, and IsArt, by making use of it, has the best performance on CLEF/INEX Social Book Search Evaluation datasets of all 4 years (from 2011 to 2014), compared with some other state-of-the-art methods. PMID:26863545

  10. ISART: A Generic Framework for Searching Books with Social Information.

    PubMed

    Yin, Xu-Cheng; Zhang, Bo-Wen; Cui, Xiao-Ping; Qu, Jiao; Geng, Bin; Zhou, Fang; Song, Li; Hao, Hong-Wei

    2016-01-01

Effective book search has been discussed for decades and is still future-proof in areas as diverse as computer science, informatics, e-commerce and even culture and arts. A variety of social information contents (e.g., ratings, tags and reviews) emerge with the huge number of books on the Web, but how they are utilized for searching and finding books is seldom investigated. Here we develop an Integrated Search And Recommendation Technology (IsArt), which breaks new ground by providing a generic framework for searching books with rich social information. IsArt comprises a search engine to rank books with book contents and professional metadata, a Generalized Content-based Filtering model to thereafter rerank books with user-generated social contents, and a learning-to-rank technique to finally combine a wide range of diverse reranking results. Experiments show that this technology permits embedding social information to promote book search effectiveness, and IsArt, by making use of it, has the best performance on CLEF/INEX Social Book Search Evaluation datasets of all 4 years (from 2011 to 2014), compared with some other state-of-the-art methods.

  11. NARMER-1: a photon point-kernel code with build-up factors

    NASA Astrophysics Data System (ADS)

    Visonneau, Thierry; Pangault, Laurence; Malouch, Fadhel; Malvagi, Fausto; Dolci, Florence

    2017-09-01

This paper presents an overview of NARMER-1, the new generation of photon point-kernel code developed by the Reactor Studies and Applied Mathematics Unit (SERMA) at the CEA Saclay Center. After a short introduction giving some historical background and the current context of development of the code, the paper presents the principles implemented in the calculation and the physical quantities computed, and surveys the generic features: programming language, computer platforms, geometry package, sources description, etc. Moreover, specific and recent features are also detailed: exclusion spheres, tetrahedral meshes, and parallel operations. Some points about verification and validation are then presented. Finally, we present some tools that can help the user with operations such as visualization and pre-treatment.
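
    The point-kernel method that codes of this family implement can be sketched from first principles: the flux from an isotropic point source falls off geometrically as 1/(4πr²), is attenuated exponentially along the line of sight, and is corrected with a build-up factor to account for scattered photons. A minimal illustrative sketch (not NARMER-1's actual implementation; the linear build-up form is a simplifying assumption):

```python
import math

def point_kernel_dose_rate(source_strength, mu, r, buildup=lambda mux: 1.0 + mux):
    """Point-kernel estimate at distance r (cm) from an isotropic point
    source in a medium with attenuation coefficient mu (1/cm).
    The build-up factor B(mu*r) corrects the uncollided flux for
    scattered photons; a linear form is used here for illustration."""
    mux = mu * r                                        # optical thickness
    geometric = source_strength / (4.0 * math.pi * r**2)  # 1/r^2 spreading
    return geometric * math.exp(-mux) * buildup(mux)
```

    Production codes use tabulated build-up factors (e.g. Taylor or Geometric-Progression fits) rather than the toy linear form above.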

  12. Experimental and computational flow-field results for an all-body hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Cleary, Joseph W.

    1989-01-01

    A comprehensive test program is defined which is being implemented in the NASA/Ames 3.5 foot Hypersonic Wind Tunnel for obtaining data on a generic all-body hypersonic vehicle for computational fluid dynamics (CFD) code validation. Computational methods (approximate inviscid methods and an upwind parabolized Navier-Stokes code) currently being applied to the all-body model are outlined. Experimental and computational results on surface pressure distributions and Pitot-pressure surveys for the basic sharp-nose model (without control surfaces) at a free-stream Mach number of 7 are presented.

  13. Porting LAMMPS to GPUs.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, William Michael; Plimpton, Steven James; Wang, Peng

    2010-03-01

    LAMMPS is a classical molecular dynamics code, and an acronym for Large-scale Atomic/Molecular Massively Parallel Simulator. LAMMPS has potentials for soft materials (biomolecules, polymers) and solid-state materials (metals, semiconductors) and coarse-grained or mesoscopic systems. It can be used to model atoms or, more generically, as a parallel particle simulator at the atomic, meso, or continuum scale. LAMMPS runs on single processors or in parallel using message-passing techniques and a spatial-decomposition of the simulation domain. The code is designed to be easy to modify or extend with new functionality.
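
    The spatial decomposition mentioned above assigns each particle to the processor that owns the subdomain containing it. A toy one-dimensional sketch of that binning step (illustrative only, not LAMMPS source code):

```python
def assign_to_subdomains(positions, box_len, nx):
    """Toy 1-D spatial decomposition: split a periodic box of length
    box_len into nx equal slabs and bin each particle (by x-coordinate)
    to the rank that owns its slab."""
    slab = box_len / nx
    owners = {rank: [] for rank in range(nx)}
    for i, x in enumerate(positions):
        # wrap into the box, then clamp to guard against x == box_len
        rank = min(int((x % box_len) / slab), nx - 1)
        owners[rank].append(i)
    return owners
```

    In the real code the decomposition is three-dimensional and each rank additionally exchanges "ghost" particles near slab boundaries with its neighbors via message passing.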

  14. Subsumption principles underlying medical concept systems and their formal reconstruction.

    PubMed Central

    Bernauer, J.

    1994-01-01

Conventional medical concept systems represent generic concept relations by hierarchical coding principles. Often, these coding principles constrain the concept system and reduce the potential for automatic derivation of subsumption. Formal reconstruction of medical concept systems is an approach based on the conceptual representation of meanings that allows for the application of formal criteria for subsumption. Those criteria must reflect the intuitive principles of subordination underlying conventional medical concept systems. In particular, these are: the subordinate concept results (1) from adding a specializing criterion to the superordinate concept, (2) from refining the primary category, or a criterion of the superordinate concept, by a concept that is less general, (3) from adding a partitive criterion to a criterion of the superordinate, (4) from refining a criterion by a concept that is less comprehensive, and finally (5) from coordinating the superordinate concept, or one of its criteria. This paper introduces a formalism called BERNWARD that aims at the formal reconstruction of medical concept systems according to these intuitive principles. The automatic derivation of hierarchical relations is supported primarily by explicit generic and explicit partitive hierarchies of concepts, and secondly by two formal criteria based on the structure of concept descriptions and explicit hierarchical relations between their elements, namely formal subsumption and part-sensitive subsumption. Formal subsumption takes only generic relations into account; part-sensitive subsumption additionally regards partitive relations between criteria. This approach seems to be flexible enough to cope with unforeseeable effects of partitive criteria on subsumption. PMID:7949907
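
    The first intuitive principle (subordination by adding a specializing criterion) can be sketched as a set-inclusion test over concept descriptions. This is an illustrative toy, not the BERNWARD formalism, and the example criteria are hypothetical:

```python
def subsumes(general, specific):
    """Toy formal subsumption over concept descriptions modelled as sets
    of criteria: the general concept subsumes the specific one if every
    criterion of the general concept also appears in the specific
    concept, which may add further specializing criteria."""
    return set(general) <= set(specific)
```

    Part-sensitive subsumption, as described above, would additionally have to compare partitive criteria against an explicit part-of hierarchy rather than by plain set inclusion.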

  15. A false positive food chain error associated with a generic predator gut content ELISA

    USDA-ARS?s Scientific Manuscript database

    Conventional prey-specific gut content ELISA and PCR assays are useful for identifying predators of insect pests in nature. However, these assays are prone to yielding certain types of food chain errors. For instance, it is possible that prey remains can pass through the food chain as the result of ...

  16. An overview of selected NASP aeroelastic studies at the NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Spain, Charles V.; Soistmann, David L.; Parker, Ellen C.; Gibbons, Michael D.; Gilbert, Michael G.

    1990-01-01

Following an initial discussion of the NASP flight environment, the results of recent aeroelastic testing of NASP-type highly swept delta-wing models in Langley's Transonic Dynamics Tunnel (TDT) are summarized. Subsonic and transonic flutter characteristics of a variety of these models are described, and several analytical codes used to predict flutter of these models are evaluated. These codes generally provide good, but conservative predictions of subsonic and transonic flutter. Also, test results are presented on a nonlinear transonic phenomenon known as aileron buzz which occurred in the wind tunnel on highly swept delta wings with full-span ailerons. An analytical procedure which assesses the effects of hypersonic heating on aeroelastic instabilities (aerothermoelasticity) is also described. This procedure accurately predicted flutter of a heated aluminum wing on which experimental data exists. Results are presented on the application of this method to calculate the flutter characteristics of a finite-element model of a generic NASP configuration. Finally, it is demonstrated analytically that active controls can be employed to improve the aeroelastic stability and ride quality of a generic NASP vehicle flying at hypersonic speeds.

  17. Quality appraisal of generic self-reported instruments measuring health-related productivity changes: a systematic review

    PubMed Central

    2014-01-01

Background Health impairments can result in disability and changed work productivity, imposing considerable costs for the employee, employer and society as a whole. A large number of instruments exist to measure health-related productivity changes; however their methodological quality remains unclear. This systematic review critically appraised the measurement properties of generic self-reported instruments that measure health-related productivity changes to recommend appropriate instruments for use in occupational and economic health practice. Methods PubMed, PsycINFO, Econlit and Embase were systematically searched for studies in which: (i) instruments measured health-related productivity changes; (ii) the aim was to evaluate instrument measurement properties; (iii) instruments were generic; (iv) ratings were self-reported; (v) full-texts were available. Next, methodological quality appraisal was based on COSMIN elements: (i) internal consistency; (ii) reliability; (iii) measurement error; (iv) content validity; (v) structural validity; (vi) hypotheses testing; (vii) cross-cultural validity; (viii) criterion validity; and (ix) responsiveness. Recommendations are based on evidence syntheses. Results This review included 25 articles assessing the reliability, validity and responsiveness of 15 different generic self-reported instruments measuring health-related productivity changes. Most studies evaluated criterion validity, none evaluated cross-cultural validity, and information on measurement error is lacking. The Work Limitation Questionnaire (WLQ) was most frequently evaluated, with moderate and strong positive evidence for content and structural validity, respectively, and negative evidence for reliability, hypotheses testing and responsiveness. Less frequently evaluated, the Stanford Presenteeism Scale (SPS) showed strong positive evidence for internal consistency and structural validity, and moderate positive evidence for hypotheses testing and criterion validity. 
The Productivity and Disease Questionnaire (PRODISQ) yielded strong positive evidence for content validity; evidence for other properties is lacking. The other instruments resulted in mostly fair-to-poor quality ratings with limited evidence. Conclusions Decisions based on the content of the instrument, usage purpose, target country and population, and available evidence are recommended. Until high-quality studies are in place to accurately assess the measurement properties of the currently available instruments, the WLQ and, in a Dutch context, the PRODISQ are cautiously preferred based on their strong positive evidence for content validity. Based on its strong positive evidence for internal consistency and structural validity, the SPS is cautiously recommended. PMID:24495301

  18. Quality Scalability Aware Watermarking for Visual Content.

    PubMed

    Bhowmik, Deepayan; Abhayaratne, Charith

    2016-11-01

Scalable coding-based content adaptation poses serious challenges to traditional watermarking algorithms, which do not consider the scalable coding structure and hence cannot guarantee correct watermark extraction in the media consumption chain. In this paper, we propose a novel concept of scalable blind watermarking that ensures more robust watermark extraction at various compression ratios while not affecting the visual quality of the host media. The proposed algorithm generates a scalable and robust watermarked image code-stream that allows the user to constrain embedding distortion for target content adaptations. The watermarked image code-stream consists of hierarchically nested joint distortion-robustness coding atoms. The code-stream is generated by a new wavelet domain blind watermarking algorithm guided by a quantization-based binary tree. The code-stream can be truncated at any distortion-robustness atom to generate the watermarked image with the desired distortion-robustness requirements. A blind extractor is capable of extracting watermark data from the watermarked images. The algorithm is further extended to incorporate a bit-plane discarding-based quantization model used in scalable coding-based content adaptation, e.g., JPEG2000. This improves the robustness against quality scalability of JPEG2000 compression. The simulation results verify the feasibility of the proposed concept, its applications, and its improved robustness against quality scalable content adaptation. Our proposed algorithm also outperforms existing methods, showing a 35% improvement. In terms of robustness to quality scalable video content adaptation using Motion JPEG2000 and wavelet-based scalable video coding, the proposed method shows major improvement for video watermarking.

  19. Pharmaceutical quality of seven generic Levodopa/Benserazide products compared with original Madopar® / Prolopa®.

    PubMed

    Gasser, Urs E; Fischer, Anton; Timmermans, Jan P; Arnet, Isabelle

    2013-04-23

By definition, a generic product is considered interchangeable with the innovator brand product. Controversy exists about interchangeability, and attention is predominantly directed to contaminants. In particular, for chronic, degenerative conditions such as Parkinson's disease (PD), generic substitution remains debated among physicians, patients and pharmacists. The objective of this study was to compare the pharmaceutical quality of seven generic levodopa/benserazide hydrochloride combination products marketed in Germany with the original product (Madopar® / Prolopa® 125, Roche, Switzerland) in order to evaluate the potential impact of Madopar® generics versus branded products for PD patients and clinicians. Madopar® / Prolopa® 125 tablets and capsules were used as reference material. The generic products tested (all 100 mg/25 mg formulations) included four tablet and three capsule formulations. Colour, appearance of powder (capsules), disintegration and dissolution, mass of tablets and fill mass of capsules, content, identity and amounts of impurities were assessed along with standard physical and chemical laboratory tests developed and routinely practiced at Roche facilities. Results were compared to the original "shelf-life" specifications in use by Roche. Each of the seven generic products had one or two parameters outside the specifications. Deviations for the active ingredients ranged from +8.4% (benserazide) to -7.6% (levodopa) in two tablet formulations. Degradation products were measured in marked excess (+26.5%) in one capsule formulation. Disintegration time and dissolution for levodopa and benserazide hydrochloride at 30 min were within specifications for all seven generic samples analysed, though with some outliers. Deviations for the active ingredients may go unnoticed by a new user of the generic product, but may entail clinical consequences when switching from original to generic during a long-term therapy. 
Degradation products may pose a safety concern. Our results should prompt caution when prescribing a generic of Madopar®/Prolopa®, and also invite further investigation toward a more comprehensive approach, both pharmaceutical and clinical.

  20. Speech coding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ravishankar, C., Hughes Network Systems, Germantown, MD

Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely based on a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the coding techniques are equally applicable to any voice signal, whether or not it carries any intelligible information, as the term speech implies. Other terms that are commonly used are speech compression and voice compression, since the fundamental idea behind speech coding is to reduce (compress) the transmission rate (or equivalently the bandwidth) and/or reduce storage requirements. In this document the terms speech and voice shall be used interchangeably.
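
    As a concrete illustration of waveform-domain speech coding (an example added here, not drawn from this report), μ-law companding as standardized in ITU-T G.711 compresses each sample logarithmically before quantization, so that quiet passages keep proportionally more resolution:

```python
import math

MU = 255.0  # companding constant used by North American/Japanese G.711

def mu_law_compress(x):
    """Compress a sample in [-1, 1] with mu-law companding."""
    return math.copysign(math.log1p(MU * abs(x)) / math.log1p(MU), x)

def mu_law_expand(y):
    """Invert the companding to recover the original sample."""
    return math.copysign(math.expm1(abs(y) * math.log1p(MU)) / MU, y)
```

    In a real codec the compressed value would then be quantized to 8 bits; the compress/expand pair above is lossless, and the quantization step is where the rate reduction actually happens.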

  1. Comparison of ORSAT and SCARAB Reentry Analysis Tools for a Generic Satellite Test Case

    NASA Technical Reports Server (NTRS)

    Kelley, Robert L.; Hill, Nicole M.; Rochelle, W. C.; Johnson, Nicholas L.; Lips, T.

    2010-01-01

Reentry analysis is essential to understanding the consequences of the full life cycle of a spacecraft. Since reentry is a key factor in spacecraft development, NASA and ESA have separately developed tools to assess the survivability of objects during reentry. Criteria such as debris casualty area and impact energy are particularly important to understanding the risks posed to people on Earth. Therefore, NASA and ESA have undertaken a series of comparison studies of their respective reentry codes for verification and improvements in accuracy. The NASA Object Reentry Survival Analysis Tool (ORSAT) and the ESA Spacecraft Atmospheric Reentry and Aerothermal Breakup (SCARAB) reentry analysis tools serve as standard codes for reentry survivability assessment of satellites. These programs predict whether an object will demise during reentry and calculate the debris casualty area of objects determined to survive, establishing the reentry risk posed to the Earth's population by surviving debris. A series of test cases have been studied for comparison; the most recent uses "Testsat," a conceptual satellite composed of generic parts, defined to use numerous simple shapes and various materials for a better comparison of the predictions of these two codes. This study is an improvement on the others in this series because of increased consistency in modeling techniques and variables. The overall comparison demonstrated that the two codes arrive at similar results: most objects modeled resulted in close agreement between the two codes, and where the difference was significant, the variance could be explained as a case of semantics in the model definitions. This paper presents the main results of ORSAT and SCARAB for the Testsat case and discusses the sources of any discovered differences. The results of previous comparisons are also discussed to summarize differences between the codes and lessons learned from this series of tests.

  2. Multi-processing on supercomputers for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; Mehta, Unmeel B.

    1990-01-01

    The MIMD concept is applied, through multitasking, with relatively minor modifications to an existing code for a single processor. This approach maps the available memory to multiple processors, exploiting the C-FORTRAN-Unix interface. An existing single processor algorithm is mapped without the need for developing a new algorithm. The procedure of designing a code utilizing this approach is automated with the Unix stream editor. A Multiple Processor Multiple Grid (MPMG) code is developed as a demonstration of this approach. This code solves the three-dimensional, Reynolds-averaged, thin-layer and slender-layer Navier-Stokes equations with an implicit, approximately factored and diagonalized method. This solver is applied to a generic, oblique-wing aircraft problem on a four-processor computer using one process for data management and nonparallel computations and three processes for pseudotime advance on three different grid systems.

  3. Social media messaging in pregnancy: comparing content of Text4baby to content of free smart phone applications of pregnancy.

    PubMed

    Lewkowitz, Adam K; O'Donnell, Betsy E; Nakagawa, Sanae; Vargas, Juan E; Zlatnik, Marya G

    2016-03-01

    Text4baby is the only free text-message program for pregnancy available. Our objective was to determine whether content differed between Text4baby and popular pregnancy smart phone applications (apps). Researchers enrolled in Text4baby in 2012 and downloaded the four most popular free pregnancy smart phone apps in July 2013; content was re-extracted in February 2014. Messages were assigned thematic codes. Two researchers coded messages independently before reviewing all the codes jointly to ensure consistency. Logistic regression modeling determined statistical differences between Text4baby and smart phone apps. A total of 1,399 messages were delivered. Of these, 333 messages had content related to more than one theme and were coded as such, resulting in 1,820 codes analyzed. Compared to smart phone apps, Text4baby was significantly more likely to have content regarding Postpartum Planning, Seeking Care, Recruitment and Prevention, and significantly less likely to mention Normal Pregnancy Symptoms. No messaging program included content regarding postpartum contraception. To improve content without increasing text message number, Text4baby could replace messages on recruitment with messages regarding normal pregnancy symptoms, fetal development and postpartum contraception.
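    With a single binary predictor (message source), the logistic regression comparison described above is equivalent to an odds ratio for theme presence. A minimal sketch, with entirely hypothetical counts:

    ```python
    import math

    def odds_ratio(theme_a, total_a, theme_b, total_b):
        """Odds ratio for a theme appearing in source A versus source B messages."""
        a, b = theme_a, total_a - theme_a   # theme present / absent in source A
        c, d = theme_b, total_b - theme_b   # theme present / absent in source B
        return (a * d) / (b * c)

    def or_ci95(a, b, c, d):
        """95% confidence interval for the odds ratio (Woolf's log method)."""
        log_or = math.log((a * d) / (b * c))
        se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        return math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se)

    # Hypothetical counts: a theme appears in 40 of 267 Text4baby codes
    # and in 60 of 1553 app codes (numbers invented for illustration).
    est = odds_ratio(40, 267, 60, 1553)
    lo, hi = or_ci95(40, 267 - 40, 60, 1553 - 60)
    print(f"OR = {est:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
    ```

    An odds ratio whose confidence interval excludes 1 corresponds to the "significantly more/less likely" statements in the abstract.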

  4. Overview of NASA Lewis Research Center free-piston Stirling engine activities

    NASA Technical Reports Server (NTRS)

    Slaby, J. G.

    1984-01-01

    A generic free-piston Stirling technology project is being conducted to develop technologies generic to both space power and terrestrial heat pump applications in a cooperative, cost-shared effort. The generic technology effort includes extensive parametric testing of a 1 kW free-piston Stirling engine (RE-1000), development of a free-piston Stirling performance computer code, design and fabrication under contract of a hydraulic output modification for RE-1000 engine tests, and a 1000-hour endurance test, under contract, of a 3 kWe free-piston Stirling/alternator engine. A newly initiated space power technology feasibility demonstration effort addresses the capability of scaling a free-piston Stirling/alternator system to about 25 kWe; developing thermodynamic cycle efficiency equal to or greater than 70 percent of Carnot at temperature ratios on the order of 1.5 to 2.0; achieving a power conversion unit specific weight of 6 kg/kWe; operating with noncontacting gas bearings; and dynamically balancing the system. Planned engine and component design and test efforts are described.

  5. Problems with numerical techniques: Application to mid-loop operation transients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bryce, W.M.; Lillington, J.N.

    1997-07-01

    There has been an increasing need to consider accidents at shutdown, which have been shown in some PSAs to provide a significant contribution to overall risk. In the UK, experience has been gained at three levels: (1) Assessment of codes against experiments; (2) Plant studies specifically for Sizewell B; and (3) Detailed review of modelling to support the plant studies for Sizewell B. The work has largely been carried out using various versions of RELAP5 and SCDAP/RELAP5. The paper details some of the problems that have needed to be addressed. It is believed by the authors that these kinds of problems are probably generic to most of the present generation of system thermal-hydraulic codes for the conditions present in mid-loop transients. Thus, as far as possible, these problems and solutions are proposed in generic terms. The areas addressed include: condensables at low pressure, poor time step calculation detection, water packing, inadequate physical modelling, numerical heat transfer, and mass errors. In general, single code modifications have been proposed to solve the problems. These have been very much concerned with means of improving existing models rather than with formulating a completely new approach. They have been produced after a particular problem has arisen. Thus, and this has been borne out in practice, the danger is that when new transients are attempted, new problems arise which then also require patching.

  6. Evidence for Natural Selection in Nucleotide Content Relationships Based on Complete Mitochondrial Genomes: Strong Effect of Guanine Content on Separation between Terrestrial and Aquatic Vertebrates.

    PubMed

    Sorimachi, Kenji; Okayasu, Teiji

    2015-01-01

    The complete vertebrate mitochondrial genome consists of 13 coding genes. We used this genome to investigate the existence of natural selection in vertebrate evolution. From the complete mitochondrial genomes, we predicted nucleotide contents and then separated these values into coding and non-coding regions. When the nucleotide contents of a coding or non-coding region were plotted against the nucleotide content of the complete mitochondrial genomes, we obtained linear regression lines only between homonucleotides and their analogs. On every plot using G or A (purine) content, the G content in aquatic vertebrates was higher than that in terrestrial vertebrates, while the A content in aquatic vertebrates was lower than that in terrestrial vertebrates. Based on these relationships, vertebrates were separated into two groups, terrestrial and aquatic. However, using C or T (pyrimidine) content, a clear separation between these two groups was not obtained. The hagfish (Eptatretus burgeri) was further separated from both terrestrial and aquatic vertebrates. Based on these results, the nucleotide content relationships predicted from the complete vertebrate mitochondrial genomes reveal the existence of natural selection in the evolutionary separation between the terrestrial and aquatic vertebrate groups. In addition, we propose that the separation of the two groups might be linked to ammonia detoxification based on high G and low A contents, which encode Glu-rich and Lys-poor proteins.
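    The analysis described above rests on two elementary computations: nucleotide content fractions per region, and ordinary least-squares regression of region content against whole-genome content. A minimal sketch with hypothetical G-content values (not the authors' data):

    ```python
    from collections import Counter

    def base_content(seq):
        """Fraction of each nucleotide in a sequence."""
        counts = Counter(seq.upper())
        n = len(seq)
        return {b: counts[b] / n for b in "ACGT"}

    def linear_fit(xs, ys):
        """Ordinary least-squares slope and intercept."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx = sum((x - mx) ** 2 for x in xs)
        slope = sxy / sxx
        return slope, my - slope * mx

    # Hypothetical genome-wide vs coding-region G fractions for five species
    genome_G = [0.13, 0.14, 0.15, 0.16, 0.17]
    coding_G = [0.125, 0.137, 0.148, 0.161, 0.170]
    slope, intercept = linear_fit(genome_G, coding_G)
    print(f"slope = {slope:.3f}, intercept = {intercept:.4f}")
    ```

    The paper's "linear regression lines between homonucleotides" correspond to fits of exactly this shape, one per base, with group separation then read off the scatter around the purine lines.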

  7. Health and nutrition content claims on Australian fast-food websites.

    PubMed

    Wellard, Lyndal; Koukoumas, Alexandra; Watson, Wendy L; Hughes, Clare

    2017-03-01

    To determine the extent that Australian fast-food websites contain nutrition content and health claims, and whether these claims are compliant with the new provisions of the Australia New Zealand Food Standards Code ('the Code'). Systematic content analysis of all web pages to identify nutrition content and health claims. Nutrition information panels were used to determine whether products with claims met Nutrient Profiling Scoring Criteria (NPSC) and qualifying criteria, and to compare them with the Code to determine compliance. Australian websites of forty-four fast-food chains including meals, bakery, ice cream, beverage and salad chains. Any products marketed on the websites using health or nutrition content claims. Of the forty-four fast-food websites, twenty (45 %) had at least one claim. A total of 2094 claims were identified on 371 products, including 1515 nutrition content (72 %) and 579 health claims (28 %). Five fast-food products with health (5 %) and 157 products with nutrition content claims (43 %) did not meet the requirements of the Code to allow them to carry such claims. New provisions in the Code came into effect in January 2016 after a 3-year transition. Food regulatory agencies should review fast-food websites to ensure compliance with the qualifying criteria for nutrition content and health claim regulations. This would prevent consumers from viewing unhealthy foods as healthier choices. Healthy choices could be facilitated by applying NPSC to nutrition content claims. Fast-food chains should be educated on the requirements of the Code regarding claims.

  8. Development of Multi-physics (Multiphase CFD + MCNP) simulation for generic solution vessel power calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Seung Jun; Buechler, Cynthia Eileen

    The current study aims to predict the steady state power of a generic solution vessel and to develop a corresponding heat transfer coefficient correlation for a Moly99 production facility by conducting a fully coupled multi-physics simulation. A prediction of steady state power for the current application is inherently interconnected between thermal hydraulic characteristics (i.e. multiphase computational fluid dynamics solved by ANSYS-Fluent 17.2) and the corresponding neutronic behavior (i.e. particle transport solved by MCNP6.2) in the solution vessel. Thus, the development of a coupling methodology is vital to understanding the system behavior under a variety of system designs and postulated operating scenarios. In this study, we report on the k-effective (keff) calculation for the baseline solution vessel configuration with a selected solution concentration using MCNP K-code modeling. The associated correlations of thermal properties (e.g. density, viscosity, thermal conductivity, specific heat) at the selected solution concentration are developed based on existing experimental measurements in the open literature. The numerical coupling methodology between multiphase CFD and MCNP is successfully demonstrated, and the detailed coupling procedure is documented. In addition, improved coupling methods capturing realistic physics in the solution vessel thermal-neutronic dynamics are proposed and tested further (i.e. dynamic height adjustment, multi-cell approach). As a key outcome of the current study, a multi-physics coupling methodology between MCFD and MCNP is demonstrated and tested for four different operating conditions. Those operating conditions are determined based on the neutron source strength at a fixed geometry condition. The steady state powers for the generic solution vessel at various operating conditions are reported, and a generalized correlation of the heat transfer coefficient for the current application is discussed. The assessment of the multi-physics methodology and preliminary results from the various coupled calculations (power prediction and heat transfer coefficient) can be further utilized for system code validation and generic solution vessel design improvement.
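    The coupling loop described above can be caricatured as a fixed-point (Picard) iteration between a thermal solve and a neutronic solve, each updating the other's inputs until the exchanged power stabilizes. The surrogates below are deliberately trivial linear models with invented coefficients; they only illustrate the exchange pattern, not the ANSYS-Fluent/MCNP6.2 interface:

    ```python
    def thermal_solve(power_kw):
        """Stand-in for the CFD step: bulk solution temperature rises with power."""
        t_inlet = 300.0  # K
        return t_inlet + 0.8 * power_kw  # invented linear response

    def neutronic_solve(temp_k, source_strength):
        """Stand-in for the MCNP step: power with invented negative temperature feedback."""
        return source_strength * (1.0 - 0.001 * (temp_k - 300.0))

    def picard_coupling(source_strength, tol=1e-8, max_iter=200):
        """Iterate thermal and neutronic solves until the exchanged power stabilizes."""
        power = source_strength  # initial guess
        for _ in range(max_iter):
            temp = thermal_solve(power)
            new_power = neutronic_solve(temp, source_strength)
            if abs(new_power - power) < tol:
                return new_power, temp
            power = new_power
        raise RuntimeError("coupling did not converge")

    # Four operating points set by neutron source strength, echoing the study's setup
    for s in (10.0, 20.0, 30.0, 40.0):
        p, t = picard_coupling(s)
        print(f"source={s:5.1f}  power={p:7.3f} kW  temperature={t:7.2f} K")
    ```

    The negative temperature feedback makes the iteration a contraction, so it converges without relaxation; real coupled CFD-neutronics loops often need under-relaxation of the exchanged fields.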

  9. Creating Domain Independent Adaptive E-Learning Systems Using the Sharable Content Object Reference Model

    ERIC Educational Resources Information Center

    Watson, Jason; Ahmed, Pervaiz K.; Hardaker, Glenn

    2007-01-01

    Purpose: This research aims to investigate how a generic web-based ITS can be created which will adapt the training content in real time, to the needs of the individual trainee across any domain. Design/methodology/approach: After examining the various alternatives SCORM was adopted in this project because it provided an infrastructure that makes…

  10. Comparative effectiveness and costs of generic and brand-name gabapentin and venlafaxine in patients with neuropathic pain or generalized anxiety disorder in Spain.

    PubMed

    Sicras-Mainar, Antoni; Rejas-Gutiérrez, Javier; Navarro-Artieda, Ruth

    2015-01-01

    To explore adherence/persistence with generic gabapentin/venlafaxine versus brand-name gabapentin/venlafaxine (Neurontin(®)/Vandral(®)) in peripheral neuropathic pain (pNP) or generalized anxiety disorder (GAD), respectively, and whether it is translated into different costs and patient outcomes in routine medical practice. A retrospective, new-user cohort study was designed. Electronic medical records (EMR) of patients included in the health plan of Badalona Serveis Assistencials SA, Barcelona, Spain were exhaustively extracted for analysis. Participants were beneficiaries aged 18+ years, followed between 2008 and 2012, with a pNP/GAD International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) code, who initiated treatment with generic or brand-name gabapentin or venlafaxine. Assessments included 1-year treatment persistence and adherence (medication possession ratio), health care costs, and reduction in severity of pain and anxiety symptoms. A total of 2,210 EMR were analyzed; 1,369 on gabapentin (brand 400; generic 969) and 841 on venlafaxine (brand 370; generic 471). Brand-name gabapentin and venlafaxine were both significantly associated with longer persistence than generic: 7.3 versus 6.3 months, P<0.001; and 8.8 versus 8.1 months, P<0.05, respectively. Brand-name was associated with higher adherence: 86.5% versus 81.3%, P<0.001; and 82.1% versus 79.0%, P<0.05, respectively. Adjusted average costs were higher with generic compared with brand: €1,277 versus €1,057 (difference of €220 per patient; P<0.001) for gabapentin; and €1,110 versus €928 (difference of €182 per patient; P=0.020) for venlafaxine, in both cases because of greater use of medical visits and concomitant medication. Compared with generic, brand-name was associated with higher reduction in pain (7.8%; P<0.001) and anxiety (13.2%; P<0.001). 
Patients initiating brand-name gabapentin or venlafaxine were more likely to adhere and persist on treatment of pNP or GAD, have lower health care costs, and show further reduction of pain and anxiety symptoms than with generic drugs in routine medical practice.
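    The adherence measure used in the study, the medication possession ratio (MPR), has a simple fixed-window form: total days' supply dispensed divided by the days in the observation period. A minimal sketch with a hypothetical refill history (the study's exact operationalization may differ):

    ```python
    def medication_possession_ratio(days_supplied, period_days=365):
        """Fixed-window MPR: total days' supply dispensed / days observed, capped at 1.0."""
        return min(sum(days_supplied) / period_days, 1.0)

    # Hypothetical one-year refill history: days' supply dispensed at each fill
    fills = [30, 30, 90, 90, 60]
    mpr = medication_possession_ratio(fills)
    print(f"MPR = {mpr:.1%}")
    print("adherent" if mpr >= 0.80 else "non-adherent")  # a common (assumed) cutoff
    ```

    Persistence, by contrast, is usually measured as the time until a gap between refills exceeds a grace period, which is why the study reports it in months rather than as a ratio.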

  11. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-08-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  12. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-06-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  13. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-09-20

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  14. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-06-20

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  15. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-01-18

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  16. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-08-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  17. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-12-20

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  18. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2007-02-15

    The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  19. CH-TRU Waste Content Codes (CH-TRUCON)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2006-09-15

The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).
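The lookup structure these tables describe, where each content code carries one or more shipping categories and the applicable shipping period depends on the shipment type, can be sketched as a simple mapping. The codes and categories below are invented for illustration, not taken from the actual CH-TRUCON tables:

```python
# Hypothetical sketch of the content-code -> shipping-category lookup
# summarized in Tables 2A-2C. Codes and categories are invented.

# A single content code may be assigned multiple shipping categories.
CONTENT_CODES = {
    "LA-101": ["CAT-A", "CAT-B"],
    "RF-220": ["CAT-C"],
}

# Shipping period (days) by shipment type, as described in the text.
SHIPPING_PERIODS = {
    "general": 60,          # Table 2A: General Case
    "close_proximity": 20,  # Table 2B: within ~1,000-mile radius
    "controlled": 10,       # Table 2C: Appendix 3.6 controls
}

def categories_for(code: str, shipment_type: str):
    """Return the shipping categories for a code and the applicable period."""
    return CONTENT_CODES[code], SHIPPING_PERIODS[shipment_type]

cats, period = categories_for("LA-101", "close_proximity")
print(cats, period)  # ['CAT-A', 'CAT-B'] 20
```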

  20. CH-TRU Waste Content Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Washington TRU Solutions LLC

    2008-01-16

The CH-TRU Waste Content Codes (CH-TRUCON) document describes the inventory of the U.S. Department of Energy (DOE) CH-TRU waste within the transportation parameters specified by the Contact-Handled Transuranic Waste Authorized Methods for Payload Control (CH-TRAMPAC). The CH-TRAMPAC defines the allowable payload for the Transuranic Package Transporter-II (TRUPACT-II) and HalfPACT packagings. This document is a catalog of TRUPACT-II and HalfPACT authorized contents and a description of the methods utilized to demonstrate compliance with the CH-TRAMPAC. A summary of currently approved content codes by site is presented in Table 1. The CH-TRAMPAC describes "shipping categories" that are assigned to each payload container. Multiple shipping categories may be assigned to a single content code. A summary of approved content codes and corresponding shipping categories is provided in Table 2, which consists of Tables 2A, 2B, and 2C. Table 2A provides a summary of approved content codes and corresponding shipping categories for the "General Case," which reflects the assumption of a 60-day shipping period as described in the CH-TRAMPAC and Appendix 3.4 of the CH-TRU Payload Appendices. For shipments to be completed within an approximately 1,000-mile radius, a shorter shipping period of 20 days is applicable as described in the CH-TRAMPAC and Appendix 3.5 of the CH-TRU Payload Appendices. For shipments to WIPP from Los Alamos National Laboratory (LANL), Nevada Test Site, and Rocky Flats Environmental Technology Site, a 20-day shipping period is applicable. Table 2B provides a summary of approved content codes and corresponding shipping categories for "Close-Proximity Shipments" (20-day shipping period). For shipments implementing the controls specified in the CH-TRAMPAC and Appendix 3.6 of the CH-TRU Payload Appendices, a 10-day shipping period is applicable. Table 2C provides a summary of approved content codes and corresponding shipping categories for "Controlled Shipments" (10-day shipping period).

  1. A Comprehensive Validation Approach Using The RAVEN Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J

    2015-06-01

The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. State-of-the-art validation approaches use neither exploration of the input space through sampling strategies nor the comprehensive variety of metrics needed to interpret the code responses with respect to experimental data. The RAVEN code addresses both of these gaps. In the following sections, the employed methodology and its application to the newly developed thermal-hydraulic code RELAP-7 are reported. The validation approach has been applied to an integral effect experiment, representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.
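The validation idea sketched in this abstract (sample the input space, run the system code, and score responses against experimental data with a metric) can be illustrated in miniature. The toy "system code" and data below are invented stand-ins for RAVEN's much richer drivers and metrics:

```python
import math
import random

# Toy sketch of sampling-based code validation. The linear "system code"
# and the biased "experimental" data are invented for illustration.

def system_code(power: float) -> float:
    """Stand-in for a thermal-hydraulic code response (e.g. a peak temperature)."""
    return 300.0 + 0.5 * power

def sample_inputs(n: int, lo: float, hi: float, seed: int = 0):
    """Explore the input space with simple uniform Monte Carlo sampling."""
    rng = random.Random(seed)
    return [rng.uniform(lo, hi) for _ in range(n)]

def rms_error(responses, measured):
    """One possible validation metric: RMS discrepancy code vs. experiment."""
    return math.sqrt(
        sum((r - m) ** 2 for r, m in zip(responses, measured)) / len(measured)
    )

powers = sample_inputs(100, 90.0, 110.0)
responses = [system_code(p) for p in powers]
measured = [system_code(p) + 1.0 for p in powers]  # "experiment" with a 1.0 bias
print(round(rms_error(responses, measured), 3))  # 1.0
```

A real workflow would combine several such metrics over many response quantities rather than a single RMS score.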

  2. Critical attributes of transdermal drug delivery system (TDDS)--a generic product development review.

    PubMed

    Ruby, P K; Pathak, Shriram M; Aggarwal, Deepika

    2014-11-01

Bioequivalence testing of transdermal drug delivery systems (TDDS) has always been a subject of high concern for generic companies due to the formulation complexity and the fact that these systems are sensitive to even minor manufacturing differences; hence they should be clearly qualified in terms of quality, safety and efficacy. In recent times, bioequivalence testing of transdermal patches has gained global attention, and many regulatory authorities worldwide have issued recommendations to set a specific framework for demonstrating equivalence between two products. These current regulatory procedures demand a complete characterization of the generic formulation in terms of its physicochemical sameness, pharmacokinetic disposition, residual content and/or skin irritation/sensitization testing with respect to the reference formulation. This paper intends to highlight critical in vitro tests for assessing the therapeutic equivalence of products and also outlines their valuable applications in generic product success. Understanding these critical in vitro parameters can help to decode complex bioequivalence outcomes, directing generic companies to optimize the formulation design in reduced time. It is difficult to summarize a common platform that covers all possible transdermal products; hence a few case studies based on this approach have been presented in this review.

  3. A 3D-CFD code for accurate prediction of fluid flows and fluid forces in seals

    NASA Technical Reports Server (NTRS)

    Athavale, M. M.; Przekwas, A. J.; Hendricks, R. C.

    1994-01-01

Current and future turbomachinery requires advanced seal configurations to control leakage, inhibit mixing of incompatible fluids, and control the rotodynamic response. In recognition of a deficiency in the existing predictive methodology for seals, a seven-year effort was established in 1990 by NASA's Office of Aeronautics, Exploration and Technology, under the Earth-to-Orbit Propulsion program, to develop validated Computational Fluid Dynamics (CFD) concepts, codes and analyses for seals. The effort will provide NASA and the U.S. Aerospace Industry with advanced CFD scientific codes and industrial codes for analyzing and designing turbomachinery seals. An advanced 3D CFD cylindrical seal code has been developed, incorporating state-of-the-art computational methodology for flow analysis in straight, tapered and stepped seals. Relevant computational features of the code include: stationary/rotating coordinates, cylindrical and general Body Fitted Coordinates (BFC) systems, high order differencing schemes, colocated variable arrangement, advanced turbulence models, incompressible/compressible flows, and moving grids. This paper presents the current status of code development, code demonstration for predicting rotordynamic coefficients, numerical parametric study of entrance loss coefficients for generic annular seals, and plans for code extensions to labyrinth, damping, and other seal configurations.

  4. Mapping a Domain Model and Architecture to a Generic Design

    DTIC Science & Technology

    1994-05-01

…this step is a record (in no specific form) of the selected features. This record (list, highlighted features diagram, or other media) is used

  5. "Metropolis," The Lights Fantastic: Semiotic Analysis of Lighting Codes in Relation to Character and Theme.

    ERIC Educational Resources Information Center

    Roth, Lane

    Fritz Lang's "Metropolis" (1927) is a seminal film because of its concern, now generic, with the profound impact technological progress has on mankind's social and spiritual progress. As in many later science fiction films, the ascendancy of artifact over nature is depicted not as liberating human beings, but as subjecting and corrupting…

  6. Label swapper device for spectral amplitude coded optical packet networks monolithically integrated on InP.

    PubMed

    Muñoz, P; García-Olcina, R; Habib, C; Chen, L R; Leijtens, X J M; de Vries, T; Robbins, D; Capmany, J

    2011-07-04

In this paper the design, fabrication and experimental characterization of a spectral amplitude coded (SAC) optical label swapper monolithically integrated on Indium Phosphide (InP) is presented. The device has a footprint of 4.8 × 1.5 mm² and is able to perform the label swapping operations required in SAC at a speed of 155 Mbps. The device was manufactured in InP using a multiple-purpose generic integration scheme. Compared to previous SAC label swapper demonstrations, which used discrete component assembly, this label swapper chip operates two orders of magnitude faster.

  7. Topological Qubits from Valence Bond Solids

    NASA Astrophysics Data System (ADS)

    Wang, Dong-Sheng; Affleck, Ian; Raussendorf, Robert

    2018-05-01

Topological qubits based on SU(N)-symmetric valence-bond solid models are constructed. A logical topological qubit is the ground subspace with twofold degeneracy, which is due to the spontaneous breaking of a global parity symmetry. A logical Z rotation by an angle 2π/N, for any integer N > 2, is provided by a global twist operation, which is of a topological nature and protected by the energy gap. A general concatenation scheme with standard quantum error-correction codes is also proposed, which can lead to better codes. Generic error-correction properties of symmetry-protected topological order are also demonstrated.

  8. EUGENE'HOM: A generic similarity-based gene finder using multiple homologous sequences.

    PubMed

    Foissac, Sylvain; Bardou, Philippe; Moisan, Annick; Cros, Marie-Josée; Schiex, Thomas

    2003-07-01

    EUGENE'HOM is a gene prediction software for eukaryotic organisms based on comparative analysis. EUGENE'HOM is able to take into account multiple homologous sequences from more or less closely related organisms. It integrates the results of TBLASTX analysis, splice site and start codon prediction and a robust coding/non-coding probabilistic model which allows EUGENE'HOM to handle sequences from a variety of organisms. The current target of EUGENE'HOM is plant sequences. The EUGENE'HOM web site is available at http://genopole.toulouse.inra.fr/bioinfo/eugene/EuGeneHom/cgi-bin/EuGeneHom.pl.

  9. Final generic environmental statement on the use of recycle plutonium in mixed oxide fuel in light water cooled reactors. Volume 5. Public comments and Nuclear Regulatory Commission responses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1976-08-01

    Copies of 69 letters are presented commenting on the Draft Generic Environmental Statement (GESMO) WASH-1327 and the NRC's responses to the comments received from Federal, State and local agencies; environmental and public interest groups, members of the academic and industrial communities, and individual citizens. An index to these letters indicating the number assigned to each letter, the author, and organization represented, is provided in the Table of Contents.

  10. An audit of alcohol brand websites.

    PubMed

    Gordon, Ross

    2011-11-01

    The study investigated the nature and content of alcohol brand websites in the UK. The research involved an audit of the websites of the 10 leading alcohol brands by sales in the UK across four categories: lager, spirits, Flavoured Alcoholic Beverages and cider/perry. Each site was visited twice over a 1-month period with site features and content recorded using a pro-forma. The content of websites was then reviewed against the regulatory codes governing broadcast advertising of alcohol. It was found that 27 of 40 leading alcohol brands had a dedicated website. Sites featured sophisticated content, including sports and music sections, games, downloads and competitions. Case studies of two brand websites demonstrate the range of content features on such sites. A review of the application of regulatory codes covering traditional advertising found some content may breach the codes. Study findings illustrate the sophisticated range of content accessible on alcohol brand websites. When applying regulatory codes covering traditional alcohol marketing channels it is apparent that some content on alcohol brand websites would breach the codes. This suggests the regulation of alcohol brand websites may be an issue requiring attention from policymakers. Further research in this area would help inform this process. © 2010 Australasian Professional Society on Alcohol and other Drugs.

  11. Dimensional scaling for impact cratering and perforation

    NASA Technical Reports Server (NTRS)

    Watts, Alan J.; Atkinson, Dale

    1995-01-01

    POD Associates have revisited the issue of generic scaling laws able to adequately predict (within better than 20 percent) cratering in semi-infinite targets and perforations through finite thickness targets. The approach used was to apply physical logic for hydrodynamics in a consistent manner able to account for chunky-body impacts such that the only variables needed are those directly related to known material properties for both the impactor and target. The analyses were compared and verified versus CTH hydrodynamic code calculations and existing data. Comparisons with previous scaling laws were also performed to identify which (if any) were good for generic purposes. This paper is a short synopsis of the full report available through the NASA Langley Research Center, LDEF Science Office.

  12. Benchmarks of simple, generic, shaped plates for validation of low-frequency electromagnetic computational codes

    NASA Technical Reports Server (NTRS)

    Deshpande, M. D.; Cockrell, C. R.; Beck, F. B.; Nguyen, T. X.

    1993-01-01

The validation of low-frequency measurements and electromagnetic (EM) scattering computations for several simple, generic shapes, such as an equilateral-triangular plate, an equilateral-triangular plate with a concentric equilateral-triangular hole, and diamond- and hexagonal-shaped plates, is discussed. The plates were constructed from a thin aluminum sheet with a thickness of 0.08 cm. EM scattering by the planar plates was measured in the experimental test range (ETR) facility of NASA Langley Research Center. The dimensions of the plates were selected such that, over the frequency range of interest, they spanned one to three free-space wavelengths (λ0 to 3λ0). In addition, the triangular plate with a triangular hole was selected to study internal-hole resonances.
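The sizing rule above (plate dimensions of one to three free-space wavelengths over the band) can be checked with a one-line conversion from frequency to wavelength. The 3 GHz figure below is purely illustrative, not taken from the report:

```python
# Free-space wavelength check for electrically-sized test plates.
# The frequency used here is an invented example.

C0 = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength(freq_hz: float) -> float:
    """Free-space wavelength lambda0 = c0 / f."""
    return C0 / freq_hz

# At 3 GHz, lambda0 is ~0.1 m, so a plate 10-30 cm across spans
# lambda0 to 3*lambda0 as described in the abstract.
print(round(wavelength(3e9), 4))  # 0.0999
```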

  13. Impact of the introduction of mandatory generic substitution in South Africa: private sector sales of generic and originator medicines for chronic diseases.

    PubMed

    Gray, Andrew Lofts; Santa-Ana-Tellez, Yared; J Wirtz, Veronika

    2016-12-01

    To assess the impact of mandatory offer of generic substitution, introduced in South Africa in May 2003, on private sector sales of generic and originator medicines for chronic diseases. Private sector sales data (June 2001 to May 2005) were obtained from IMS Health for proton pump inhibitors (PPIs; ATC code A02BC), HMG-CoA reductase inhibitors (statins; C10AA), dihydropyridine calcium antagonists (C08CA), angiotensin-converting enzyme inhibitors (ACE-I; C09AA) and selective serotonin reuptake inhibitors (SSRIs; N06AB). Monthly sales were expressed as defined daily doses per 1000 insured population per month (DDD/TIM). Interrupted time-series models were used to estimate the changes in slope and level of medicines use after the policy change. ARIMA models were used to correct for autocorrelation and stationarity. Only the SSRIs saw a significant rise in level of generic utilisation (0.2 DDD/TIM; P < 0.001) and a fall in originator usage (-0.1 DDD/TIM; P < 0.001) after the policy change. Utilisation of generic PPIs fell (level 0.06 DDD/TIM, P = 0.048; slope 0.01 DDD/TIM, P = 0.043), but utilisation of originator products also grew (level 0.05 DDD/TIM, P < 0.001; slope 0.003, P = 0.001). Generic calcium antagonists and ACE-I showed an increase in slope (0.01 DDD/TIM, P = 0.016; 0.02 DDD/TIM, P < 0.001), while the originators showed a decrease in slope (-0.003 DDD/TIM, P = 0.046; -0.01 DDD/TIM, P < 0.001). There were insufficient data on generic statin use before the policy change to allow for analysis. The mandatory offer of generic substitution appeared to have had a quantifiable effect on utilisation patterns in the 2 years after May 2003. Managed care interventions that were already in place before the intervention may have blunted the extent of the changes seen in this period. Generic policies are an important enabling provision for cost-containment efforts. 
However, decisions taken outside of official policy may anticipate or differ from that policy, with important consequences. © 2016 John Wiley & Sons Ltd.
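The segmented ("interrupted") time-series regression used in the study can be sketched as an ordinary least-squares fit with level-change and slope-change terms at the policy date. The synthetic DDD/TIM series below is invented, and this sketch omits the ARIMA error modeling the authors used to handle autocorrelation:

```python
import numpy as np

# Minimal interrupted time-series sketch: fit
#   y_t = b0 + b1*t + b2*step_t + b3*(t - t0)*step_t
# where step_t switches on at the intervention month t0.
# Data are synthetic, not the IMS Health sales series.

def its_fit(y, t0):
    """OLS estimates: [intercept, pre-slope, level change, slope change]."""
    t = np.arange(len(y), dtype=float)
    step = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, step, (t - t0) * step])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

# Synthetic 48-month series: baseline slope 0.01 DDD/TIM per month,
# with a +0.2 DDD/TIM level jump at month 24 (the "policy change").
t = np.arange(48)
y = 1.0 + 0.01 * t + 0.2 * (t >= 24)
beta = its_fit(y, t0=24)
print(round(float(beta[2]), 3))  # recovered level change: 0.2
```

With noise-free synthetic data the fit recovers the level change exactly; real series additionally need autocorrelation-aware errors, as the study's ARIMA models provide.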

  14. Experimental aerothermodynamic research of hypersonic aircraft

    NASA Technical Reports Server (NTRS)

    Cleary, Joseph W.

    1987-01-01

The 2-D and 3-D advanced computer codes being developed for use in the design of such hypersonic aircraft as the National Aero-Space Plane require comparison of the computational results with a broad spectrum of experimental data to fully assess the validity of the codes. This is particularly true for complex flow fields with control surfaces present and for flows with separation, such as leeside flow. Therefore, the objective is to provide the hypersonic experimental database required for validation of advanced computational fluid dynamics (CFD) computer codes and for development of a more thorough understanding of the flow physics necessary for these codes. This is being done by implementing a comprehensive test program for a generic all-body hypersonic aircraft model in the NASA/Ames 3.5-foot Hypersonic Wind Tunnel over a broad range of test conditions to obtain pertinent surface and flowfield data. Results from the flow visualization portion of the investigation are presented.

  15. Improvements on non-equilibrium and transport Green function techniques: The next-generation TRANSIESTA

    NASA Astrophysics Data System (ADS)

    Papior, Nick; Lorente, Nicolás; Frederiksen, Thomas; García, Alberto; Brandbyge, Mads

    2017-03-01

We present novel methods implemented within the non-equilibrium Green function (NEGF) code TRANSIESTA based on density functional theory (DFT). Our flexible, next-generation DFT-NEGF code handles devices with one or multiple electrodes (Ne ≥ 1) with individual chemical potentials and electronic temperatures. We describe its novel methods for electrostatic gating, contour optimizations, and assertion of charge conservation, as well as the newly implemented algorithms for optimized and scalable matrix inversion, performance-critical pivoting, and hybrid parallelization. Additionally, a generic NEGF "post-processing" code (TBTRANS/PHTRANS) for electron and phonon transport is presented with several novelties such as Hamiltonian interpolations, Ne ≥ 1 electrode capability, bond-currents, a generalized interface for user-defined tight-binding transport, transmission projection using eigenstates of a projected Hamiltonian, and fast inversion algorithms for large-scale simulations easily exceeding 10^6 atoms on workstation computers. The new features of both codes are demonstrated and benchmarked for relevant test systems.

  16. A Generic Modeling Process to Support Functional Fault Model Development

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
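The failure-effect propagation an FFM encodes, with failure modes propagating along directed paths until they reach observation points, can be sketched as a small graph search. The component and sensor names below are invented for illustration:

```python
from collections import deque

# Toy functional-fault-model graph: directed edges carry failure
# effects from failure modes toward observation points. Names invented.

EDGES = {
    "valve_stuck":   ["low_flow"],
    "pump_degraded": ["low_flow"],
    "low_flow":      ["low_pressure_sensor"],
}

OBSERVATION_POINTS = {"low_pressure_sensor"}

def observable_effects(failure_mode):
    """Breadth-first search for observation points reachable from a failure mode."""
    seen = {failure_mode}
    queue = deque([failure_mode])
    hits = set()
    while queue:
        node = queue.popleft()
        if node in OBSERVATION_POINTS:
            hits.add(node)
        for nxt in EDGES.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return hits

print(observable_effects("valve_stuck"))  # {'low_pressure_sensor'}
```

A diagnostic engine inverts this map: an observed symptom is explained by every failure mode whose propagation paths reach it (here both the stuck valve and the degraded pump).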

  17. Development of a New Branded UK Food Composition Database for an Online Dietary Assessment Tool

    PubMed Central

    Carter, Michelle C.; Hancock, Neil; Albar, Salwa A.; Brown, Helen; Greenwood, Darren C.; Hardie, Laura J.; Frost, Gary S.; Wark, Petra A.; Cade, Janet E.

    2016-01-01

The current UK food composition tables are limited, containing ~3300 mostly generic food and drink items. To reflect the wide range of food products available to British consumers and to potentially improve the accuracy of dietary assessment, a large UK-specific electronic food composition database (FCDB) has been developed. A mapping exercise has been conducted that matched micronutrient data from generic food codes to “Back of Pack” data from branded food products using a semi-automated process. After cleaning and processing, version 1.0 of the new FCDB contains 40,274 generic and branded items with associated data for 120 macronutrients and micronutrients, and 5669 items with portion images. Over 50% of food and drink items were individually mapped to within 10% agreement with the generic food item for energy. Several quality-checking procedures were applied after mapping, including identifying foods above and below the expected range for a particular nutrient within that food group and cross-checking the mapping of items such as concentrated and raw/dried products. The new electronic FCDB has substantially increased the size of the current, publicly available UK food tables. The FCDB has been incorporated into myfood24, a new fully automated online dietary assessment tool, and a smartphone application for weight loss. PMID:27527214

  18. Development of a New Branded UK Food Composition Database for an Online Dietary Assessment Tool.

    PubMed

    Carter, Michelle C; Hancock, Neil; Albar, Salwa A; Brown, Helen; Greenwood, Darren C; Hardie, Laura J; Frost, Gary S; Wark, Petra A; Cade, Janet E

    2016-08-05

The current UK food composition tables are limited, containing ~3300 mostly generic food and drink items. To reflect the wide range of food products available to British consumers and to potentially improve the accuracy of dietary assessment, a large UK-specific electronic food composition database (FCDB) has been developed. A mapping exercise has been conducted that matched micronutrient data from generic food codes to "Back of Pack" data from branded food products using a semi-automated process. After cleaning and processing, version 1.0 of the new FCDB contains 40,274 generic and branded items with associated data for 120 macronutrients and micronutrients, and 5669 items with portion images. Over 50% of food and drink items were individually mapped to within 10% agreement with the generic food item for energy. Several quality-checking procedures were applied after mapping, including identifying foods above and below the expected range for a particular nutrient within that food group and cross-checking the mapping of items such as concentrated and raw/dried products. The new electronic FCDB has substantially increased the size of the current, publicly available UK food tables. The FCDB has been incorporated into myfood24, a new fully automated online dietary assessment tool, and a smartphone application for weight loss.
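The ±10% energy-agreement criterion used in the mapping exercise can be sketched as a simple check. The calorie values below are invented, not drawn from the FCDB:

```python
# Sketch of the +/-10% energy-agreement check applied when matching a
# branded product to a generic food code. Values are illustrative only.

def within_agreement(branded_kcal: float, generic_kcal: float,
                     tolerance: float = 0.10) -> bool:
    """True if the branded item's energy lies within tolerance of the generic code's."""
    return abs(branded_kcal - generic_kcal) <= tolerance * generic_kcal

print(within_agreement(105.0, 100.0))  # True  (5% difference)
print(within_agreement(115.0, 100.0))  # False (15% difference)
```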

  19. Establishing confidence in complex physics codes: Art or science?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trucano, T.

    1997-12-31

The ALEGRA shock wave physics code, currently under development at Sandia National Laboratories and partially supported by the US Advanced Strategic Computing Initiative (ASCI), is generic to a certain class of physics codes: large, multi-application, intended to support a broad user community on the latest generation of massively parallel supercomputers, and in a continual state of formal development. To say that the author has "confidence" in the results of ALEGRA is to say something different than that he believes ALEGRA is "predictive." It is the purpose of this talk to illustrate the distinction between these two concepts. The author elects to perform this task in a somewhat historical manner. He will summarize certain older approaches to code validation. He views these methods as aiming to establish the predictive behavior of the code. These methods are distinguished by their emphasis on local information. He will conclude that these approaches are more art than science.

  20. Pharmaceutical quality of seven generic Levodopa/Benserazide products compared with original Madopar® / Prolopa®

    PubMed Central

    2013-01-01

Background By definition, a generic product is considered interchangeable with the innovator brand product. Controversy exists about interchangeability, and attention is predominantly directed to contaminants. In particular, for chronic, degenerative conditions such as Parkinson’s disease (PD), generic substitution remains debated among physicians, patients and pharmacists. The objective of this study was to compare the pharmaceutical quality of seven generic levodopa/benserazide hydrochloride combination products marketed in Germany with the original product (Madopar® / Prolopa® 125, Roche, Switzerland) in order to evaluate the potential impact of Madopar® generics versus branded products for PD patients and clinicians. Methods Madopar® / Prolopa® 125 tablets and capsules were used as reference material. The generic products tested (all 100 mg/25 mg formulations) included four tablet and three capsule formulations. Colour, appearance of powder (capsules), disintegration and dissolution, mass of tablets and fill mass of capsules, content, identity and amounts of impurities were assessed along with standard physical and chemical laboratory tests developed and routinely practiced at Roche facilities. Results were compared to the original “shelf-life” specifications in use by Roche. Results Each of the seven generic products had one or two parameters outside the specifications. Deviations for the active ingredients ranged from +8.4% (benserazide) to −7.6% (levodopa) in two tablet formulations. Degradation products were measured in marked excess (+26.5%) in one capsule formulation. Disintegration time and dissolution for levodopa and benserazide hydrochloride at 30 min were within specifications for all seven generic samples analysed, though with some outliers. 
Conclusions Deviations for the active ingredients may go unnoticed by a new user of the generic product, but may entail clinical consequences when switching from original to generic during a long-term therapy. Degradation products may pose a safety concern. Our results should prompt caution when prescribing a generic of Madopar®/Prolopa®, and also invite further investigation in view of a more comprehensive approach, both pharmaceutical and clinical. PMID:23617953

  1. Obtaining higher-accuracy estimates of water-rich rocks and water-poor sand dunes on Mars in active neutron experiments

    NASA Astrophysics Data System (ADS)

    Gabriel, T. S. J.; Hardgrove, C.; Litvak, M. L.; Nowicki, S.; Mitrofanov, I. G.; Boynton, W. V.; Fedosov, F.; Golovin, D.; Jun, I.; Mischna, M.; Tate, C. G.; Moersch, J.; Harshman, K.; Kozyrev, A.; Malakhov, A. V.; Mokrousov, M.; Nikiforov, S.; Sanin, A. B.; Vostrukhin, A.; Thompson, L. M.

    2017-12-01

The Dynamic Albedo of Neutrons (DAN) experiment on the Mars Science Laboratory Curiosity rover delivers high-energy (14.1 MeV) pulses of neutrons into the surface when operating in "active" mode. Neutrons are moderated in the subsurface and return to two detectors to provide a time-of-flight profile in 64 time bins in the epithermal and thermal energy ranges. Results are compared to simulations of the experiment in the Monte Carlo N-Particle Transport Code in which several aspects are modeled, including the DAN detectors, neutron source, rover components, and underlying rock. Models can be improved by increasing the fidelity of the rock geochemistry as informed by instruments including the Alpha Particle X-Ray Spectrometer (APXS). Furthermore, increasing the fidelity of the rock morphology in models is enabled by the suite of imaging instruments on the rover. To rapidly interpret DAN data, a set of pre-simulated generic rock density and bulk geochemistry models is compared to several DAN active observations. While, to first order, this methodology provides an indication of significant geochemical changes in the subsurface, higher-fidelity models should be used to provide accurate constraints on water content, depth of geologic layers, or abundance of neutron absorbers. For example, in high-silicon, low-iron rocks observed along the rover's traverse, generic models can differ by several wt% H2O from models that use APXS measurements of nearby drill samples. Accurate measurements of high-silicon targets are necessary in outlining the extent of aqueous alteration and hydrothermal activity in Gale Crater. Additionally, we find that for DAN active experiments over sand dunes, best-fit models can differ by greater than 0.5 wt% H2O when the upper-layer density is reduced by 0.6 g/cm3 to account for the low bulk density of sand. In areas where the rock geochemistry differs little from the generic models, the difference in results is, as expected, smaller. 
We report refined wt% H2O values for high-silicon, aqueously altered rock and comparatively dry sand dunes along the rover traverse. We also outline the methodology for providing accurate geochemical and morphological constraints using DAN active measurements.
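Selecting a best-fit subsurface model from a library of pre-simulated responses, as described above, can be sketched as a chi-square comparison over time bins. The four-bin curves and model labels below are invented stand-ins for the real 64-bin DAN die-away data:

```python
# Sketch of best-fit model selection for an active neutron measurement:
# score each pre-simulated model against the observed time-binned counts
# with a Pearson chi-square statistic. All numbers are invented.

def chi_square(observed, simulated):
    """Pearson chi-square over time bins (simulated counts as expectation)."""
    return sum((o - s) ** 2 / s for o, s in zip(observed, simulated))

# Tiny 4-bin stand-in for the 64-bin epithermal die-away curve.
observed = [120.0, 95.0, 60.0, 30.0]
models = {
    "1 wt% H2O": [130.0, 100.0, 55.0, 25.0],
    "3 wt% H2O": [118.0, 96.0, 61.0, 29.0],
}
best = min(models, key=lambda name: chi_square(observed, models[name]))
print(best)  # 3 wt% H2O
```

A higher-fidelity version would replace the generic model library with simulations built from APXS-informed geochemistry, as the abstract recommends.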

  2. Comparison of cotton and acrylic socks using a generic cushion sole design for runners.

    PubMed

    Herring, K M; Richie, D H

    1993-09-01

    A longitudinal single-blind study was conducted to test the friction blister prevention properties of synthetic acrylic socks in a generic construction. This study serves as a comparison with the authors' previous work comparing acrylic and cotton socks in a patented padded construction. Twenty-seven long-distance runners provided data regarding dampness, temperature, friction blister incidence, severity, and size. Two different socks were tested; each was identical in every aspect of construction except the fiber content. One test sock was composed of 100% synthetic acrylic fibers, and the other was composed of 100% natural cotton fibers. These results were unsuccessful at demonstrating any superiority of cotton or acrylic fibers when knitting produced a generic "cushion sole" sock. The superiority of acrylic fibers has thus far been demonstrated only when sock knitting provides adequate anatomical padding [corrected].

  3. An Empirical Evaluation of the US Beer Institute’s Self-Regulation Code Governing the Content of Beer Advertising

    PubMed Central

    Xuan, Ziming; Damon, Donna; Noel, Jonathan

    2013-01-01

    Objectives. We evaluated advertising code violations using the US Beer Institute guidelines for responsible advertising. Methods. We applied the Delphi rating technique to all beer ads (n = 289) broadcast in national markets between 1999 and 2008 during the National Collegiate Athletic Association basketball tournament games. Fifteen public health professionals completed ratings using quantitative scales measuring the content of alcohol advertisements (e.g., perceived actor age, portrayal of excessive drinking) according to 1997 and 2006 versions of the Beer Institute Code. Results. Depending on the code version, exclusion criteria, and scoring method, expert raters found that between 35% and 74% of the ads had code violations. There were significant differences among producers in the frequency with which ads with violations were broadcast, but not in the proportions of unique ads with violations. Guidelines most likely to be violated included the association of beer drinking with social success and the use of content appealing to persons younger than 21 years. Conclusions. The alcohol industry’s current self-regulatory framework is ineffective at preventing content violations but could be improved by the use of new rating procedures designed to better detect content code violations. PMID:23947318

  4. An empirical evaluation of the US Beer Institute's self-regulation code governing the content of beer advertising.

    PubMed

    Babor, Thomas F; Xuan, Ziming; Damon, Donna; Noel, Jonathan

    2013-10-01

    We evaluated advertising code violations using the US Beer Institute guidelines for responsible advertising. We applied the Delphi rating technique to all beer ads (n = 289) broadcast in national markets between 1999 and 2008 during the National Collegiate Athletic Association basketball tournament games. Fifteen public health professionals completed ratings using quantitative scales measuring the content of alcohol advertisements (e.g., perceived actor age, portrayal of excessive drinking) according to 1997 and 2006 versions of the Beer Institute Code. Depending on the code version, exclusion criteria, and scoring method, expert raters found that between 35% and 74% of the ads had code violations. There were significant differences among producers in the frequency with which ads with violations were broadcast, but not in the proportions of unique ads with violations. Guidelines most likely to be violated included the association of beer drinking with social success and the use of content appealing to persons younger than 21 years. The alcohol industry's current self-regulatory framework is ineffective at preventing content violations but could be improved by the use of new rating procedures designed to better detect content code violations.
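The finding that violation rates swing from 35% to 74% with the scoring method can be illustrated with a toy tally. The ads, rater flags, and majority-rule threshold below are hypothetical; the study's actual Delphi procedure and quantitative scales are more elaborate.

```python
from statistics import mean

# Hypothetical ratings: for each ad, each rater flags a code violation (1) or not (0).
ratings = {
    "ad_001": [1, 1, 0, 1, 1],
    "ad_002": [0, 0, 0, 1, 0],
    "ad_003": [1, 1, 1, 1, 1],
}

def has_violation(flags, threshold=0.5):
    """One possible scoring method: an ad 'violates' if more than `threshold`
    of raters flag it. Changing the threshold changes the headline rate."""
    return mean(flags) > threshold

violation_rate = mean(has_violation(f) for f in ratings.values())
print(f"{violation_rate:.0%} of ads flagged")
```

Raising `threshold` toward unanimity, or counting any single flag, reproduces the kind of spread the authors report across scoring methods.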

  5. Framing access to medicines in developing countries: an analysis of media coverage of Canada's Access to Medicines Regime.

    PubMed

    Esmail, Laura C; Phillips, Kaye; Kuek, Victoria; Cosio, Andrea Perez; Kohler, Jillian Clare

    2010-01-04

    In September 2003, the Canadian government committed to developing legislation that would facilitate greater access to affordable medicines for developing countries. Over the course of eight months, the legislation, now known as Canada's Access to Medicines Regime (CAMR), went through a controversial policy development process and the newspaper media was one of the major venues in which the policy debates took place. The purpose of this study was to examine how the media framed CAMR to determine how policy goals were conceptualized, which stakeholder interests controlled the public debate and how these variables related to the public policy process. We conducted a qualitative content analysis of newspaper coverage of the CAMR policy and implementation process from 2003-2008. The primary theoretical framework for this study was framing theory. A total of 90 articles from 11 Canadian newspapers were selected for inclusion in our analysis. A team of four researchers coded the articles for themes relating to access to medicines and which stakeholders' voice figured more prominently on each issue. Stakeholders examined included: the research-based industry, the generic industry, civil society, the Canadian government, and developing country representatives. The most frequently mentioned themes across all documents were the issues of drug affordability, intellectual property, trade agreements and obligations, and development. Issues such as human rights, pharmaceutical innovation, and economic competitiveness got little media representation. Civil society dominated the media contents, followed far behind by the Canadian government, the research-based and generic pharmaceutical industries. Developing country representatives were hardly represented in the media. Media framing obscured the discussion of some of the underlying policy goals in this case and failed to highlight issues which are now significant barriers to the use of the legislation. 
Using the media to engage the public in more in-depth exploration of the policy issues at stake may contribute to a more informed policy development process. The media can be an effective channel for those stakeholders with a weaker voice in policy deliberations to raise public attention to particular issues; however, the political and institutional context must be taken into account as it may outweigh media framing effects.
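The theme and stakeholder frequencies underlying such a framing analysis reduce to counts over coded (stakeholder, theme) pairs. The coded articles below are invented examples, not data from the study.

```python
from collections import Counter

# Hypothetical coded articles: (stakeholder voiced, theme raised).
codes = [
    ("civil society", "drug affordability"),
    ("civil society", "intellectual property"),
    ("government", "trade obligations"),
    ("civil society", "development"),
    ("generic industry", "intellectual property"),
]

theme_counts = Counter(theme for _, theme in codes)
voice_counts = Counter(stakeholder for stakeholder, _ in codes)
print(theme_counts.most_common(1))   # most frequent theme
print(voice_counts.most_common(1))   # most prominent stakeholder voice
```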

  6. Genera in Bionectriaceae, Hypocreaceae, and Nectriaceae (Hypocreales) proposed for acceptance or rejection

    Treesearch

    Amy Y. Rossman; Keith A. Seifert; Gary J. Samuels; Andrew M. Minnis; Hans-Josef Schroers; Lorenzo Lombard; PedroW Crous; Kadri Põldmaa; Paul F. Cannon; Richard C. Summerbell; David M. Geiser; Wen-ying Zhuang; Yuuri Hirooka; Cesar Herrera; Catalina Salgado-Salazar; Priscila Chaverri

    2013-01-01

    With the recent changes concerning pleomorphic fungi in the new International Code of Nomenclature for algae, fungi, and plants (ICN), it is necessary to propose the acceptance or protection of sexual morph-typified or asexual morph-typified generic names that do not have priority, or to propose the rejection or suppression of competing names. In addition, sexual...

  7. ORAC-DR: Astronomy data reduction pipeline

    NASA Astrophysics Data System (ADS)

    Jenness, Tim; Economou, Frossie; Cavanagh, Brad; Currie, Malcolm J.; Gibb, Andy

    2013-10-01

    ORAC-DR is a generic data reduction pipeline infrastructure; it includes specific data processing recipes for a number of instruments. It is used at the James Clerk Maxwell Telescope, United Kingdom Infrared Telescope, AAT, and LCOGT. This pipeline runs at the JCMT Science Archive hosted by CADC to generate near-publication quality data products; the code has been in use since 1998.

  8. Compression performance of HEVC and its format range and screen content coding extensions

    NASA Astrophysics Data System (ADS)

    Li, Bin; Xu, Jizheng; Sullivan, Gary J.

    2015-09-01

    This paper presents a comparison-based test of the objective compression performance of the High Efficiency Video Coding (HEVC) standard, its format range extensions (RExt), and its draft screen content coding extensions (SCC). The current dominant standard, H.264/MPEG-4 AVC, is used as an anchor reference in the comparison. The conditions used for the comparison tests were designed to reflect relevant application scenarios and to enable a fair comparison to the maximum extent feasible - i.e., using comparable quantization settings, reference frame buffering, intra refresh periods, rate-distortion optimization decision processing, etc. It is noted that such PSNR-based objective comparisons generally provide more conservative estimates of HEVC benefit than are found in subjective studies. The experimental results show that, when compared with H.264/MPEG-4 AVC, HEVC version 1 provides a bit rate savings for equal PSNR of about 23% for all-intra coding, 34% for random access coding, and 38% for low-delay coding. This is consistent with prior studies and the general characterization that HEVC can provide a bit rate savings of about 50% for equal subjective quality for most applications. The HEVC format range extensions provide a similar bit rate savings of about 13-25% for all-intra coding, 28-33% for random access coding, and 32-38% for low-delay coding at different bit rate ranges. For lossy coding of screen content, the HEVC screen content coding extensions achieve a bit rate savings of about 66%, 63%, and 61% for all-intra coding, random access coding, and low-delay coding, respectively. For lossless coding, the corresponding bit rate savings are about 40%, 33%, and 32%, respectively.
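Bit-rate savings at equal PSNR, as reported above, are conventionally computed by interpolating each codec's rate-distortion curve and averaging the rate difference over the overlapping quality range (the idea behind Bjøntegaard-delta metrics). A simplified sketch, with made-up RD points rather than the paper's measurements:

```python
import numpy as np

def bitrate_savings(anchor, test, n=100):
    """Average % bit-rate difference at equal PSNR (simplified BD-rate).
    `anchor` and `test` are lists of (bitrate_kbps, psnr_db) points."""
    def log_rate_of_psnr(points, psnr_grid):
        rates, psnrs = zip(*sorted(points, key=lambda p: p[1]))
        return np.interp(psnr_grid, psnrs, np.log(rates))
    lo = max(min(p for _, p in anchor), min(p for _, p in test))
    hi = min(max(p for _, p in anchor), max(p for _, p in test))
    grid = np.linspace(lo, hi, n)
    diff = log_rate_of_psnr(test, grid) - log_rate_of_psnr(anchor, grid)
    return (np.exp(diff.mean()) - 1) * 100  # % rate change at equal quality

# Hypothetical RD points: the test codec needs 35% less rate at every PSNR.
avc = [(1000, 34.0), (2000, 37.0), (4000, 40.0)]
hevc = [(r * 0.65, p) for r, p in avc]
print(round(bitrate_savings(avc, hevc)))  # -35, i.e. a 35% bit-rate savings
```

The standard BD-rate metric uses cubic (or piecewise-cubic) interpolation in log-rate; linear interpolation is kept here for brevity.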

  9. The CARMEN software as a service infrastructure.

    PubMed

    Weeks, Michael; Jessop, Mark; Fletcher, Martyn; Hodge, Victoria; Jackson, Tom; Austin, Jim

    2013-01-28

    The CARMEN platform allows neuroscientists to share data, metadata, services and workflows, and to execute these services and workflows remotely via a Web portal. This paper describes how we implemented a service-based infrastructure into the CARMEN Virtual Laboratory. A Software as a Service framework was developed to allow generic new and legacy code to be deployed as services on a heterogeneous execution framework. Users can submit analysis code typically written in Matlab, Python, C/C++ and R as non-interactive standalone command-line applications and wrap them as services in a form suitable for deployment on the platform. The CARMEN Service Builder tool enables neuroscientists to quickly wrap their analysis software for deployment to the CARMEN platform, as a service without knowledge of the service framework or the CARMEN system. A metadata schema describes each service in terms of both system and user requirements. The search functionality allows services to be quickly discovered from the many services available. Within the platform, services may be combined into more complicated analyses using the workflow tool. CARMEN and the service infrastructure are targeted towards the neuroscience community; however, it is a generic platform, and can be targeted towards any discipline.
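Wrapping a non-interactive command-line analysis as a callable service, as the Service Builder does for user code, can be sketched as below. The wrapper, its template syntax, and the `echo` stand-in for an analysis binary are illustrative assumptions, not the CARMEN API.

```python
import shlex
import subprocess

def wrap_as_service(command_template):
    """Wrap a standalone command-line tool as a callable 'service'.
    `command_template` uses {placeholders} filled from keyword arguments."""
    def service(**params):
        cmd = shlex.split(command_template.format(**params))
        result = subprocess.run(cmd, capture_output=True, text=True, check=True)
        return result.stdout
    return service

# Hypothetical analysis tool: `echo` stands in for a Matlab/Python/C++/R binary.
spike_counter = wrap_as_service("echo counted {n} spikes in {channel}")
print(spike_counter(n=42, channel="ch01").strip())  # counted 42 spikes in ch01
```

A platform like the one described would add metadata describing inputs and outputs so wrapped services can be discovered and chained into workflows.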

  10. TRIQS: A toolbox for research on interacting quantum systems

    NASA Astrophysics Data System (ADS)

    Parcollet, Olivier; Ferrero, Michel; Ayral, Thomas; Hafermann, Hartmut; Krivenko, Igor; Messio, Laura; Seth, Priyanka

    2015-11-01

    We present the TRIQS library, a Toolbox for Research on Interacting Quantum Systems. It is an open-source, computational physics library providing a framework for the quick development of applications in the field of many-body quantum physics, and in particular, strongly-correlated electronic systems. It supplies components to develop codes in a modern, concise and efficient way: e.g. Green's function containers, a generic Monte Carlo class, and simple interfaces to HDF5. TRIQS is a C++/Python library that can be used from either language. It is distributed under the GNU General Public License (GPLv3). State-of-the-art applications based on the library, such as modern quantum many-body solvers and interfaces between density-functional-theory codes and dynamical mean-field theory (DMFT) codes are distributed along with it.

  11. Analysis of internal flows relative to the space shuttle main engine

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Cooperative efforts between the Lockheed-Huntsville Computational Mechanics Group and the NASA-MSFC Computational Fluid Dynamics staff have resulted in improved capabilities for numerically simulating incompressible flows generic to the Space Shuttle Main Engine (SSME). A well established and documented CFD code was obtained, modified, and applied to laminar and turbulent flows of the type occurring in the SSME Hot Gas Manifold. The INS3D code was installed on the NASA-MSFC CRAY-XMP computer system and is currently being used by NASA engineers. Studies to perform a transient analysis of the fuel preburner (FPB) were conducted. The COBRA/TRAC code is recommended for simulating the transient flow of oxygen into the LOX manifold. Property data for modifying the code to represent LOX/GOX flow was collected. The ALFA code was developed and recommended for representing the transient combustion in the preburner. These two codes will couple through the transient boundary conditions to simulate the startup and/or shutdown of the fuel preburner. A study, NAS8-37461, is currently being conducted to implement this modeling effort.

  12. Informational content of official pharmaceutical industry web sites about treatments for erectile dysfunction.

    PubMed

    Waack, Katherine E; Ernst, Michael E; Graber, Mark A

    2004-12-01

    In the last 5 years, several treatments have become available for erectile dysfunction (ED). During this same period, consumer use of the Internet for health information has increased rapidly. In traditional direct-to-consumer advertisements, viewers are often referred to a pharmaceutical company Web site for further information. To evaluate the accessibility and informational content of 5 pharmaceutical company Web sites about ED treatments. Using 10 popular search engines and 1 specialized search engine, the accessibility of the official pharmaceutical company-sponsored Web site was determined by searching under brand and generic names. One company also manufactures an ED device; this site was also included. A structured, explicit review of information found on these sites was conducted. Of 110 searches (1 for each treatment, including corresponding generic drug name, using each search engine), 68 yielded the official pharmaceutical company Web site within the first 10 links. Removal of outliers (for both brand and generic name searches) resulted in 68 of 77 searches producing the pharmaceutical company Web site for the brand-name drug in the top 10 links. Although all pharmaceutical company Web sites contained general information on adverse effects and contraindications to use, only 2 sites gave actual percentages. Three sites provided references for their materials or discussed other treatment or drug options, while 4 of the sites contained profound advertising or emotive content. None mentioned cost of the therapy. The information contained on pharmaceutical company Web sites for ED treatments is superficial and aimed primarily at consumers. It is largely promotional and provides only limited information needed to effectively compare treatment options.

  13. Quality of phenobarbital solid-dosage forms in the urban community of Nouakchott (Mauritania).

    PubMed

    Laroche, Marie-Laure; Traore, Hamidou; Merle, Louis; Gaulier, Jean-Michel; Viana, Marylene; Preux, Pierre-Marie

    2005-08-01

    Epilepsy is a major public-health problem in Africa. The quality of available drugs is a limiting factor for adequate management. The aim of this study was to describe the proportion of poor-quality phenobarbital (PB) solid-dosage forms and to evaluate the factors associated with its quality in Nouakchott (Mauritania). A cross-sectional study was carried out within pharmacies, hospitals, and on the parallel market in March 2003. PB samples were bought by a native person and then assayed by a liquid chromatography method. A package was considered to be of good quality if the active-substance average content was between 85 and 115% of the stated content printed on the packet. Forty-five pharmaceutical stores were visited, enabling us to collect 146 samples of PB. Three brand names were available in Nouakchott. They originated from France, Morocco, Senegal, and Egypt. A prevalence of 13.7% [95% confidence interval (CI), 8.8-20.0] of poor-quality PB was found. All samples from Morocco were underdosed. The generic active content was satisfactory, but saccharose, an excipient with potential side effects, was identified. Two factors were associated with good PB quality: manufacture in France, and loose packaging, as generics conditioned in this way were of good quality. This study shows that the quality of antiepileptic drugs in Africa is still worrying. The setting up of medicine quality control in Mauritania is legitimate. Considering the good quality and lower cost of generic PB, this type of medicine should be promoted in this region.

  14. A Content Analysis of Testosterone Websites: Sex, Muscle, and Male Age-Related Thematic Differences

    PubMed Central

    Ivanov, Nicholas; Vuong, Jimmy; Gray, Peter B.

    2017-01-01

    Male testosterone supplementation is a large and growing industry. How is testosterone marketed to male consumers online? The present exploratory study entailed a content coding analysis of the home pages of 49 websites focused on testosterone supplementation for men in the United States. Four hypotheses concerning anticipated age-related differences in content coding were also tested: more frequent longevity content toward older men, and more frequent social dominance/physical formidability, muscle, and sex content toward younger men. Codes were created based on inductive observations and drawing upon the medical, life history, and human behavioral endocrinology literatures. Approximately half (n = 24) of websites were oriented toward younger men (estimated audience of men 40 years of age or younger) and half (n = 25) toward older men (estimated audience over 40 years of age). Results indicated that the most frequent content codes concerned online sales (e.g., product and purchasing information). Apart from sales information, the most frequent codes concerned, in order, muscle, sex/sexual functioning, low T, energy, fat, strength, aging, and well-being, with all four hypotheses also supported. These findings are interpreted in the light of medical, evolutionary life history, and human behavioral endocrinology approaches. PMID:29025355

  15. A Content Analysis of Testosterone Websites: Sex, Muscle, and Male Age-Related Thematic Differences.

    PubMed

    Ivanov, Nicholas; Vuong, Jimmy; Gray, Peter B

    2018-03-01

    Male testosterone supplementation is a large and growing industry. How is testosterone marketed to male consumers online? The present exploratory study entailed a content coding analysis of the home pages of 49 websites focused on testosterone supplementation for men in the United States. Four hypotheses concerning anticipated age-related differences in content coding were also tested: more frequent longevity content toward older men, and more frequent social dominance/physical formidability, muscle, and sex content toward younger men. Codes were created based on inductive observations and drawing upon the medical, life history, and human behavioral endocrinology literatures. Approximately half ( n = 24) of websites were oriented toward younger men (estimated audience of men 40 years of age or younger) and half ( n = 25) toward older men (estimated audience over 40 years of age). Results indicated that the most frequent content codes concerned online sales (e.g., product and purchasing information). Apart from sales information, the most frequent codes concerned, in order, muscle, sex/sexual functioning, low T, energy, fat, strength, aging, and well-being, with all four hypotheses also supported. These findings are interpreted in the light of medical, evolutionary life history, and human behavioral endocrinology approaches.

  16. A web-based clinical trial management system for a sham-controlled multicenter clinical trial in depression.

    PubMed

    Durkalski, Valerie; Wenle Zhao; Dillon, Catherine; Kim, Jaemyung

    2010-04-01

    Clinical trial investigators and sponsors invest vast amounts of resources and energy into conducting trials and often face daily challenges with data management, project management, and data quality control. Rather than waiting months for study progress reports, investigators need the ability to use real-time data for the coordination and management of study activities across all study team members including site investigators, oversight committees, data and safety monitoring boards, and medical safety monitors. Web-based data management systems are beginning to meet this need but what distinguishes one system from the other are user needs/requirements and cost. To illustrate the development and implementation of a web-based data and project management system for a multicenter clinical trial designed to test the superiority of repeated transcranial magnetic stimulation versus sham for the treatment of patients with major depression. The authors discuss the reasons for not using a commercially available system for this study and describe the approach to developing their own web-based system for the OPT-TMS study. Timelines, effort, system architecture, and lessons learned are shared with the hope that this information will direct clinical trial researchers and software developers towards more efficient, user-friendly systems. The developers use a combination of generic and custom application code to allow for the flexibility to adapt the system to the needs of the study. Features of the system include: central participant registration and randomization; secure data entry at the site; participant progress/study calendar; safety data reporting; device accounting; monitor verification; and user-configurable generic reports and built-in customized reports. Hard coding was more time-efficient to address project-specific issues compared with the effort of creating a generic code application. 
As a consequence of this strategy, the required maintenance of the system is increased and the value of using this system for other trials is reduced. Web-based central computerized systems offer time-saving, secure options for managing clinical trial data. The choice of a commercially available system or an internally developed system is determined by the requirements of the study and users. Pros and cons to both approaches were discussed. If the intention is to use the system for various trials (single and multi-center, phases I-III) across various therapeutic areas, then the overall design should be a generic structure that simplifies the general application with minimal loss of functionality.
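Central randomization, one feature listed above, is commonly implemented as permuted-block allocation so that the arms stay balanced throughout enrollment. A minimal sketch; the arm names, block size, and seed are hypothetical, not the OPT-TMS system's actual scheme:

```python
import random

def block_randomize(n_participants, arms=("active", "sham"), block_size=4, seed=2010):
    """Permuted-block randomization: every block contains each arm equally often,
    so group sizes never drift apart by more than half a block."""
    assert block_size % len(arms) == 0
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_participants:
        block = list(arms) * (block_size // len(arms))
        rng.shuffle(block)              # shuffle within the block only
        allocation.extend(block)
    return allocation[:n_participants]

assignments = block_randomize(12)
print(assignments.count("active"), assignments.count("sham"))  # 6 6
```

A production system would also stratify by site and log each allocation centrally so sites cannot predict upcoming assignments.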

  17. Definition of the unsteady vortex flow over a wing/body configuration

    NASA Technical Reports Server (NTRS)

    Liou, S. G.; Debry, B.; Lenakos, J.; Caplin, J.; Komerath, N. M.

    1991-01-01

    A problem of current interest in computational aerodynamics is the prediction of unsteady vortex flows over aircraft at high angles of attack. A six-month experimental effort was conducted at the John H. Harper Wind Tunnel to acquire qualitative and quantitative information on the unsteady vortex flow over a generic wing-body configuration at high angles of attack. A double-delta flat-plate wing with beveled edges was combined with a slender sharp-nosed body-of-revolution fuselage to form the generic configuration. This configuration produces a strong attached leading edge vortex on the wing, as well as sharply-peaked flow velocity spectra above the wing. While it thus produces flows with several well-defined features of current interest, the model was designed for efficiency of representation in computational codes. A moderate number of surface pressure ports and two unsteady pressure sensors were used to study the pressure distribution over the wing and body surface at high angles of attack; the unsteady pressure sensing did not succeed because of inadequate signal-to-noise ratio. A pulsed copper vapor laser sheet was used to visualize the vortex flow over the model, and vortex trajectories, burst locations, mutual induction of vortex systems from the forebody, strake, and wing, were quantified. Laser Doppler velocimetry was used to quantify all 3 components of the time-average velocity in 3 data planes perpendicular to the freestream direction. Statistics of the instantaneous velocity were used to study intermittency and fluctuation intensity. Hot-film anemometry was used to study the fluctuation energy content in the velocity field, and the spectra of these fluctuations. In addition, a successful attempt was made to measure velocity spectra, component by component, using laser velocimetry, and these were compared with spectra measured by hot-film anemometry at several locations.

  18. A portable MPI-based parallel vector template library

    NASA Technical Reports Server (NTRS)

    Sheffler, Thomas J.

    1995-01-01

    This paper discusses the design and implementation of a polymorphic collection library for distributed address-space parallel computers. The library provides a data-parallel programming model for C++ by providing three main components: a single generic collection class, generic algorithms over collections, and generic algebraic combining functions. Collection elements are the fourth component of a program written using the library and may be either of the built-in types of C or of user-defined types. Many ideas are borrowed from the Standard Template Library (STL) of C++, although a restricted programming model is proposed because of the distributed address-space memory model assumed. Whereas the STL provides standard collections and implementations of algorithms for uniprocessors, this paper advocates standardizing interfaces that may be customized for different parallel computers. Just as the STL attempts to increase programmer productivity through code reuse, a similar standard for parallel computers could provide programmers with a standard set of algorithms portable across many different architectures. The efficacy of this approach is verified by examining performance data collected from an initial implementation of the library running on an IBM SP-2 and an Intel Paragon.
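The library's three components (a generic collection, generic algorithms, and algebraic combining functions) compose into the familiar local-reduce-then-combine pattern. A single-process Python sketch of that pattern, with invented function names (the actual library is C++ over MPI):

```python
from functools import reduce
from typing import Callable, Iterable, TypeVar

T = TypeVar("T")

def parallel_style_reduce(chunks: Iterable[Iterable[T]],
                          combine: Callable[[T, T], T]) -> T:
    """Each 'processor' reduces its local chunk with the combining function,
    then the partial results are combined, as a tree reduction would on MPI ranks."""
    partials = [reduce(combine, chunk) for chunk in chunks]  # local phase
    return reduce(combine, partials)                         # combining phase

# Generic over element type and combining function, as in the template library.
data = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]                # spread over 3 "ranks"
print(parallel_style_reduce(data, lambda a, b: a + b))  # 45
```

The combining function must be associative for the result to be independent of how elements are distributed across processors, which is why the library restricts itself to algebraic combining functions.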

  19. A Portable MPI-Based Parallel Vector Template Library

    NASA Technical Reports Server (NTRS)

    Sheffler, Thomas J.

    1995-01-01

    This paper discusses the design and implementation of a polymorphic collection library for distributed address-space parallel computers. The library provides a data-parallel programming model for C++ by providing three main components: a single generic collection class, generic algorithms over collections, and generic algebraic combining functions. Collection elements are the fourth component of a program written using the library and may be either of the built-in types of C or of user-defined types. Many ideas are borrowed from the Standard Template Library (STL) of C++, although a restricted programming model is proposed because of the distributed address-space memory model assumed. Whereas the STL provides standard collections and implementations of algorithms for uniprocessors, this paper advocates standardizing interfaces that may be customized for different parallel computers. Just as the STL attempts to increase programmer productivity through code reuse, a similar standard for parallel computers could provide programmers with a standard set of algorithms portable across many different architectures. The efficacy of this approach is verified by examining performance data collected from an initial implementation of the library running on an IBM SP-2 and an Intel Paragon.

  20. Simulation of Quantum Many-Body Dynamics for Generic Strongly-Interacting Systems

    NASA Astrophysics Data System (ADS)

    Meyer, Gregory; Machado, Francisco; Yao, Norman

    2017-04-01

    Recent experimental advances have enabled the bottom-up assembly of complex, strongly interacting quantum many-body systems from individual atoms, ions, molecules and photons. These advances open the door to studying dynamics in isolated quantum systems as well as the possibility of realizing novel out-of-equilibrium phases of matter. Numerical studies provide insight into these systems; however, computational time and memory usage limit common numerical methods such as exact diagonalization to relatively small Hilbert spaces of dimension 2^15. Here we present progress toward a new software package for dynamical time evolution of large generic quantum systems on massively parallel computing architectures. By projecting large sparse Hamiltonians into a much smaller Krylov subspace, we are able to compute the evolution of strongly interacting systems with Hilbert space dimension nearing 2^30. We discuss and benchmark different design implementations, such as matrix-free methods and GPU based calculations, using both pre-thermal time crystals and the Sachdev-Ye-Kitaev model as examples. We also include a simple symbolic language to describe generic Hamiltonians, allowing simulation of diverse quantum systems without any modification of the underlying C and Fortran code.
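Projecting the evolution exp(-iHt)|v> into an m-dimensional Krylov subspace, as described, means only a small m-by-m matrix is ever exponentiated. A NumPy sketch of the Arnoldi variant on a random Hermitian "Hamiltonian"; this is illustrative, not the package's C/Fortran implementation:

```python
import numpy as np

def expm_eig(M):
    """Matrix exponential via eigendecomposition (adequate for small matrices)."""
    w, V = np.linalg.eig(M)
    return V @ np.diag(np.exp(w)) @ np.linalg.inv(V)

def krylov_evolve(H, v, t, m=15):
    """Approximate exp(-i t H) @ v in an m-dimensional Krylov subspace (Arnoldi),
    so the full n x n Hamiltonian is only ever applied to vectors."""
    n = len(v)
    m = min(m, n)
    Q = np.zeros((n, m), dtype=complex)   # orthonormal Krylov basis
    h = np.zeros((m, m), dtype=complex)   # projected Hamiltonian
    beta = np.linalg.norm(v)
    Q[:, 0] = v / beta
    for j in range(m):
        w = H @ Q[:, j]
        for i in range(j + 1):            # Gram-Schmidt against earlier vectors
            h[i, j] = np.vdot(Q[:, i], w)
            w = w - h[i, j] * Q[:, i]
        if j + 1 < m:
            nrm = np.linalg.norm(w)
            h[j + 1, j] = nrm
            Q[:, j + 1] = w / nrm
    # Exponentiate only the small projected matrix, then lift back to full space.
    return beta * (Q @ expm_eig(-1j * t * h)[:, 0])

rng = np.random.default_rng(1)
A = rng.standard_normal((64, 64))
H = (A + A.T) / 2                          # random Hermitian "Hamiltonian"
v = rng.standard_normal(64).astype(complex)
w_, U = np.linalg.eigh(H)                  # exact evolution, for comparison
exact = U @ (np.exp(-1j * 0.1 * w_) * (U.conj().T @ v))
print(np.linalg.norm(exact - krylov_evolve(H, v, 0.1)))
```

For Hermitian H the projected matrix is tridiagonal (Lanczos), and matrix-free codes never store H at all, only a routine applying it to a vector; that is what makes dimensions near 2^30 reachable.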

  1. EUGÈNE'HOM: a generic similarity-based gene finder using multiple homologous sequences

    PubMed Central

    Foissac, Sylvain; Bardou, Philippe; Moisan, Annick; Cros, Marie-Josée; Schiex, Thomas

    2003-01-01

    EUGÈNE'HOM is a gene prediction software for eukaryotic organisms based on comparative analysis. EUGÈNE'HOM is able to take into account multiple homologous sequences from more or less closely related organisms. It integrates the results of TBLASTX analysis, splice site and start codon prediction and a robust coding/non-coding probabilistic model which allows EUGÈNE'HOM to handle sequences from a variety of organisms. The current target of EUGÈNE'HOM is plant sequences. The EUGÈNE'HOM web site is available at http://genopole.toulouse.inra.fr/bioinfo/eugene/EuGeneHom/cgi-bin/EuGeneHom.pl. PMID:12824408
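A coding/non-coding probabilistic model of the kind EUGÈNE'HOM integrates can be caricatured as a codon-frequency log-odds score. The toy training sequences and the smoothing floor below are invented; the real model is far richer and is combined with TBLASTX and splice-site evidence:

```python
import math
from collections import Counter

def train_codon_model(seqs):
    """Codon-frequency model: log-probability of each in-frame codon."""
    counts = Counter(c for s in seqs
                     for c in (s[i:i+3] for i in range(0, len(s) - 2, 3)))
    total = sum(counts.values())
    return {c: math.log(n / total) for c, n in counts.items()}

def log_odds(seq, coding, noncoding, floor=math.log(1e-4)):
    """Positive score: the sequence looks more like the coding model.
    Unseen codons fall back to a small smoothing floor."""
    codons = (seq[i:i+3] for i in range(0, len(seq) - 2, 3))
    return sum(coding.get(c, floor) - noncoding.get(c, floor) for c in codons)

# Toy training data: hypothetical coding regions biased toward a few codons,
# non-coding regions biased toward AT-rich triplets.
coding_model = train_codon_model(["ATGGCTGCTAAA", "ATGGCTAAAGCT"])
noncoding_model = train_codon_model(["TTTTTATATTTT", "ATATTTTTATAT"])
print(log_odds("ATGGCTGCTGCT", coding_model, noncoding_model) > 0)  # True
```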

  2. Use of Code-Switching in Multilingual Content Subject and Language Classrooms

    ERIC Educational Resources Information Center

    Gwee, Susan; Saravanan, Vanithamani

    2018-01-01

    Research literature has shown that teachers code-switched to a language which is not the medium of instruction to help students understand subject matter and establish interpersonal relations with them. However, little is known about the extent to which teachers code-switch in content subject classrooms compared to language classrooms. Using…

  3. Environmental Compliance Assessment Army Reserve (ECAAR)

    DTIC Science & Technology

    1993-09-01

    and water Spent mixed acid Spent caustic Spent sulfuric acid Potential Consequences: Heat generation, violent reaction. Group 2-A Group 2-B Aluminum Any...methane reforming furnaces, pulping liquor recovery furnaces, combustion devices used in the recovery of sulfur values from spent sulfuric acid...Industry and USEPA Hazardous Waste Hazard No. Hazardous Waste Code* Generic F001 The spent halogenated solvents used in degreasing: Trichloroethylene, (t

  4. Integrated dynamic analysis simulation of space stations with controllable solar arrays (supplemental data and analyses)

    NASA Technical Reports Server (NTRS)

    Heinrichs, J. A.; Fee, J. J.

    1972-01-01

    Space station and solar array data are presented, along with the analyses performed in support of the integrated dynamic analysis study. The analysis methods and the formulated digital simulation were developed. Control systems for space station attitude control and solar array orientation control include generic-type control systems. These systems have been digitally coded and included in the simulation.

  5. Generic and Automated Runtime Program Repair

    DTIC Science & Technology

    2012-09-01

    Software bugs are ubiquitous, and fixing them remains a difficult, time-consuming, and manual

  6. Generic calculation of two-body partial decay widths at the full one-loop level

    NASA Astrophysics Data System (ADS)

    Goodsell, Mark D.; Liebler, Stefan; Staub, Florian

    2017-11-01

    We describe a fully generic implementation of two-body partial decay widths at the full one-loop level in the SARAH and SPheno framework compatible with most supported models. It incorporates fermionic decays to a fermion and a scalar or a gauge boson as well as scalar decays into two fermions, two gauge bosons, two scalars or a scalar and a gauge boson. We present the relevant generic expressions for virtual and real corrections. Whereas wave-function corrections are determined from on-shell conditions, the parameters of the underlying model are by default renormalised in a \overline{DR} (or \overline{MS}) scheme. However, the user can also define model-specific counter-terms. As an example we discuss the renormalisation of the electric charge in the Thomson limit for top-quark decays in the standard model. One-loop-induced decays are also supported. The framework additionally allows the addition of mass and mixing corrections induced at higher orders for the involved external states. We explain our procedure to cancel infrared divergences for such cases, which is achieved through an infrared counter-term taking into account corrected Goldstone boson vertices. We compare our results for sfermion, gluino and Higgs decays in the minimal supersymmetric standard model (MSSM) against the public codes SFOLD, FVSFOLD and HFOLD and explain observed differences. Radiatively induced gluino and neutralino decays are compared against the original implementation in SPheno in the MSSM. We exactly reproduce the results of the code CNNDecays for decays of neutralinos and charginos in R-parity violating models. The new version SARAH 4.11.0 by default includes the calculation of two-body decay widths at the full one-loop level. Current limitations for certain model classes are described.

  7. Multimodal analysis of pretreated biomass species highlights generic markers of lignocellulose recalcitrance.

    PubMed

    Herbaut, Mickaël; Zoghlami, Aya; Habrant, Anouck; Falourd, Xavier; Foucat, Loïc; Chabbert, Brigitte; Paës, Gabriel

    2018-01-01

    Biomass recalcitrance to enzymatic hydrolysis has been assigned to several structural and chemical factors. However, their relative importance remains challenging to evaluate. Three representative biomass species (wheat straw, poplar and miscanthus) were submitted to four standard pretreatments (dilute acid, hot water, ionic liquid and sodium chlorite) in order to generate a set of contrasted samples. A large array of techniques, including wet chemistry analysis, porosity measurements using NMR spectroscopy, electron and fluorescence microscopy, were used in order to determine possible generic factors of biomass recalcitrance. The pretreatment conditions selected made it possible to obtain samples displaying different susceptibility to enzymatic hydrolysis (from 3 up to 98% of the initial glucose content released after 96 h of saccharification). Generic correlation coefficients were calculated between the measured chemical and structural features and the final saccharification rates. Increases in porosity displayed overall strong positive correlations with saccharification efficiency, but different porosity ranges were concerned depending on the considered biomass. Lignin-related factors displayed highly negative coefficients for all biomasses. Lignin content, which is likely involved in the correlations observed for porosity, was less detrimental to enzymatic hydrolysis than lignin composition. Lignin influence was highlighted by the strong negative correlation with fluorescence intensity, which mainly originates from monolignols in mature tissues. Our results provide a better understanding of the factors responsible for biomass recalcitrance that can reasonably be considered as generic. The correlations with specific porosity ranges are biomass species-dependent, meaning that enzyme cocktails with fitted enzyme sizes are likely to be needed to optimise saccharification depending on the biomass origin. 
Lignin composition, which probably influences its structure, is the most important parameter to overcome to enhance enzyme access to the polysaccharides. Accordingly, fluorescence intensity was found to be a rapid and simple method to assess recalcitrance after pretreatment.

  8. Dialog detection in narrative video by shot and face analysis

    NASA Astrophysics Data System (ADS)

    Kroon, B.; Nesvadba, J.; Hanjalic, A.

    2007-01-01

    The proliferation of captured personal and broadcast content in personal consumer archives necessitates comfortable access to stored audiovisual content. Intuitive retrieval and navigation solutions require, however, a semantic level that cannot be reached by generic multimedia content analysis alone. A fusion with film grammar rules can help to boost the reliability significantly. The current paper describes the fusion of low-level content analysis cues, including face parameters and inter-shot similarities, to segment commercial content into film-grammar-rule-based entities and subsequently classify those sequences into so-called shot reverse shots, i.e. dialog sequences. Moreover, shot-reverse-shot-specific mid-level cues are analyzed, augmenting the shot reverse shot information with dialog-specific descriptions.

  9. The Spatial Vision Tree: A Generic Pattern Recognition Engine- Scientific Foundations, Design Principles, and Preliminary Tree Design

    NASA Technical Reports Server (NTRS)

    Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.

    2010-01-01

    New foundational ideas are used to define a novel approach to generic visual pattern recognition. These ideas proceed from the starting point of the intrinsic equivalence of noise reduction and pattern recognition when noise reduction is taken to its theoretical limit of explicit matched filtering. This led us to think of the logical extension of sparse coding using basis function transforms for both de-noising and pattern recognition to the full pattern specificity of a lexicon of matched filter pattern templates. A key hypothesis is that such a lexicon can be constructed and is, in fact, a generic visual alphabet of spatial vision. Hence it provides a tractable solution for the design of a generic pattern recognition engine. Here we present the key scientific ideas, the basic design principles which emerge from these ideas, and a preliminary design of the Spatial Vision Tree (SVT). The latter is based upon a cryptographic approach whereby we measure a large aggregate estimate of the frequency of occurrence (FOO) for each pattern. These distributions are employed together with Hamming distance criteria to design a two-tier tree. Then using information theory, these same FOO distributions are used to define a precise method for pattern representation. Finally the experimental performance of the preliminary SVT on computer generated test images and complex natural images is assessed.
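
As a toy illustration of the Hamming-distance grouping idea described above, the sketch below greedily groups binary pattern templates whose distance to a group representative falls below a threshold. The 3x3 patterns, the threshold, and the greedy grouping rule are illustrative assumptions, not the actual SVT design:

```python
def hamming(a, b):
    """Number of positions where two equal-length binary patterns differ."""
    return sum(x != y for x, y in zip(a, b))

def group_by_hamming(patterns, threshold):
    """Greedy first-tier grouping: a pattern joins the first group whose
    representative is within `threshold` Hamming distance, else starts a group."""
    groups = []  # each group: (representative, members)
    for p in patterns:
        for rep, members in groups:
            if hamming(rep, p) <= threshold:
                members.append(p)
                break
        else:
            groups.append((p, [p]))
    return groups

# Toy 3x3 binary patterns flattened to 9-bit tuples
patterns = [
    (1, 1, 1, 0, 0, 0, 0, 0, 0),  # horizontal bar
    (1, 1, 1, 0, 0, 0, 0, 0, 1),  # near-duplicate of the bar (distance 1)
    (1, 0, 0, 1, 0, 0, 1, 0, 0),  # vertical bar (distance 4 from the first)
]
groups = group_by_hamming(patterns, threshold=2)
print(len(groups))  # the two bar variants merge; the vertical bar stands alone
```

In a real design the representatives would come from the frequency-of-occurrence statistics rather than insertion order.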

  10. Comparison of Standard Wind Turbine Models with Vendor Models for Power System Stability Analysis: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Honrubia-Escribano, A.; Gomez Lazaro, E.; Jimenez-Buendia, F.

    The International Electrotechnical Commission Standard 61400-27-1 was published in February 2015. This standard deals with the development of generic terms and parameters to specify the electrical characteristics of wind turbines. Generic models of very complex technological systems, such as wind turbines, are thus defined based on the four common configurations available in the market. Due to its recent publication, the comparison of the response of generic models with specific vendor models plays a key role in ensuring the widespread use of this standard. This paper compares the response of a specific Gamesa dynamic wind turbine model to the corresponding generic IEC Type III wind turbine model response when the wind turbine is subjected to a three-phase voltage dip. This Type III model represents the doubly-fed induction generator wind turbine, which is not only one of the most commonly sold and installed technologies in the current market but also a complex variable-speed operation implementation. In fact, active and reactive power transients are observed due to the voltage reduction. Special attention is given to the reactive power injection provided by the wind turbine models because it is a requirement of current grid codes. Further, the boundaries of the generic models associated with transient events that cannot be represented exactly are included in the paper.

  11. WINCOF-I code for prediction of fan compressor unit with water ingestion

    NASA Technical Reports Server (NTRS)

    Murthy, S. N. B.; Mullican, A.

    1990-01-01

    The PURDUE-WINCOF code, which provides a numerical method of obtaining the performance of a fan-compressor unit of a jet engine with water ingestion into the inlet, was modified to take into account: (1) the scoop factor, (2) the time required for the setting-in of a quasi-steady distribution of water, and (3) the heat and mass transfer processes over the time calculated under 2. The modified code, named WINCOF-I was utilized to obtain the performance of a fan-compressor unit of a generic jet engine. The results illustrate the manner in which quasi-equilibrium conditions become established in the machine and the redistribution of ingested water in various stages in the form of a film out of the casing wall, droplets across the span, and vapor due to mass transfer.

  12. Laminar Heating Validation of the OVERFLOW Code

    NASA Technical Reports Server (NTRS)

    Lillard, Randolph P.; Dries, Kevin M.

    2005-01-01

    OVERFLOW, a structured finite difference code, was applied to the solution of hypersonic laminar flow over several configurations assuming perfect-gas chemistry. By testing OVERFLOW's capabilities over several configurations encompassing a variety of flow physics, a validated laminar heating capability was produced. Configurations tested were a flat plate at 0 degrees incidence, a sphere, a compression ramp, and the X-38 re-entry vehicle. This variety of test cases shows the ability of the code to predict boundary layer flow, stagnation heating, laminar separation with re-attachment heating, and complex flow over a three-dimensional body. In addition, grid resolution studies were done to give recommendations for the correct number of off-body points to be applied to generic problems and for wall-spacing values to capture heat transfer and skin friction. Numerical results show good comparison to the test data for all the configurations.

  13. Comparison of SPHC Hydrocode Results with Penetration Equations and Results of Other Codes

    NASA Technical Reports Server (NTRS)

    Evans, Steven W.; Stallworth, Roderick; Stellingwerf, Robert F.

    2004-01-01

    The SPHC hydrodynamic code was used to simulate impacts of spherical aluminum projectiles on a single-wall aluminum plate and on a generic Whipple shield. Simulations were carried out in two and three dimensions. Projectile speeds ranged from 2 kilometers per second to 10 kilometers per second for the single-wall runs, and from 3 kilometers per second to 40 kilometers per second for the Whipple shield runs. Spallation limit results of the single-wall simulations are compared with predictions from five standard penetration equations, and are shown to fall comfortably within the envelope of these analytical relations. Ballistic limit results of the Whipple shield simulations are compared with results from the AUTODYN-2D and PAM-SHOCK-3D codes presented in a paper at the Hypervelocity Impact Symposium 2000 and the Christiansen formulation of 2003.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hichwa, B.P.; Pun, D.D.; Wang, D.

    A multielemental analysis to determine the trace metal content of generic and name-brand aspirins and name-brand lipsticks was done via proton-induced x-ray emission (PIXE) measurements. The Hope College PIXE system is described, as well as the target preparation methods. The trace metal content of twelve brands of aspirin and aspirin substitutes and fourteen brands of lipstick is reported. Detection limits for most elements are in the range of 100 parts per billion (ppb) to 10 parts per million (ppm).

  15. Toward IVHM Prognostics

    NASA Technical Reports Server (NTRS)

    Walsh, Kevin; Venti, Mike

    2007-01-01

    This viewgraph presentation reviews the prognostics of Integrated Vehicle Health Management. The contents include: 1) Aircraft Operations-Today's way of doing business; 2) Prognostics; 3) NASA's instrumentation data-system rack; 4) Data mining for IVHM; 5) NASA GRC's C-MAPSS generic engine model; and 6) Concluding thoughts.

  16. A generic high-dose rate {sup 192}Ir brachytherapy source for evaluation of model-based dose calculations beyond the TG-43 formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ballester, Facundo, E-mail: Facundo.Ballester@uv.es; Carlsson Tedgren, Åsa; Granero, Domingo

    Purpose: In order to facilitate a smooth transition for brachytherapy dose calculations from the American Association of Physicists in Medicine (AAPM) Task Group No. 43 (TG-43) formalism to model-based dose calculation algorithms (MBDCAs), treatment planning systems (TPSs) using a MBDCA require a set of well-defined test case plans characterized by Monte Carlo (MC) methods. This also permits direct dose comparison to TG-43 reference data. Such test case plans should be made available for use in the software commissioning process performed by clinical end users. To this end, a hypothetical, generic high-dose rate (HDR) {sup 192}Ir source and a virtual water phantom were designed, which can be imported into a TPS. Methods: A hypothetical, generic HDR {sup 192}Ir source was designed based on commercially available sources as well as a virtual, cubic water phantom that can be imported into any TPS in DICOM format. The dose distribution of the generic {sup 192}Ir source when placed at the center of the cubic phantom, and away from the center under altered scatter conditions, was evaluated using two commercial MBDCAs [Oncentra{sup ®} Brachy with advanced collapsed-cone engine (ACE) and BrachyVision ACUROS{sup TM}]. Dose comparisons were performed using state-of-the-art MC codes for radiation transport, including ALGEBRA, BrachyDose, GEANT4, MCNP5, MCNP6, and PENELOPE2008. The methodologies adhered to recommendations in the AAPM TG-229 report on high-energy brachytherapy source dosimetry. TG-43 dosimetry parameters, an along-away dose-rate table, and primary and scatter separated (PSS) data were obtained. The virtual water phantom of (201){sup 3} voxels (1 mm sides) was used to evaluate the calculated dose distributions. Two test case plans involving a single position of the generic HDR {sup 192}Ir source in this phantom were prepared: (i) source centered in the phantom and (ii) source displaced 7 cm laterally from the center. 
Datasets were independently produced by different investigators. MC results were then compared against dose calculated using TG-43 and MBDCA methods. Results: TG-43 and PSS datasets were generated for the generic source, the PSS data for use with the ACE algorithm. The dose-rate constant values obtained from seven MC simulations, performed independently using different codes, were in excellent agreement, yielding an average of 1.1109 ± 0.0004 cGy/(h U) (k = 1, Type A uncertainty). MC calculated dose-rate distributions for the two plans were also found to be in excellent agreement, with differences within type A uncertainties. Differences between commercial MBDCA and MC results were test, position, and calculation parameter dependent. On average, however, these differences were within 1% for ACUROS and 2% for ACE at clinically relevant distances. Conclusions: A hypothetical, generic HDR {sup 192}Ir source was designed and implemented in two commercially available TPSs employing different MBDCAs. Reference dose distributions for this source were benchmarked and used for the evaluation of MBDCA calculations employing a virtual, cubic water phantom in the form of a CT DICOM image series. The implementation of a generic source of identical design in all TPSs using MBDCAs is an important step toward supporting univocal commissioning procedures and direct comparisons between TPSs.
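
The quoted average with Type A (statistical) uncertainty corresponds to the mean of the independent MC results and the standard deviation of that mean (k = 1). A minimal sketch with placeholder dose-rate constants (illustrative numbers, not the actual per-code results):

```python
import statistics

# Hypothetical dose-rate constants from seven independent MC codes, cGy/(h U)
values = [1.1105, 1.1112, 1.1108, 1.1111, 1.1107, 1.1110, 1.1109]

mean = statistics.mean(values)
# Type A standard uncertainty of the mean (k = 1): sample std dev / sqrt(n)
u_a = statistics.stdev(values) / len(values) ** 0.5

print(f"{mean:.4f} +/- {u_a:.4f} cGy/(h U)")
```

The averaging is only meaningful because the seven simulations are independent; correlated inputs (e.g. shared cross-section data) would require a combined Type B treatment.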

  17. Hypersonic vehicle simulation model: Winged-cone configuration

    NASA Technical Reports Server (NTRS)

    Shaughnessy, John D.; Pinckney, S. Zane; Mcminn, John D.; Cruz, Christopher I.; Kelley, Marie-Louise

    1990-01-01

    Aerodynamic, propulsion, and mass models for a generic, horizontal-takeoff, single-stage-to-orbit (SSTO) configuration are presented which are suitable for use in point mass as well as batch and real-time six degree-of-freedom simulations. The simulations can be used to investigate ascent performance issues and to allow research, refinement, and evaluation of integrated guidance/flight/propulsion/thermal control systems, design concepts, and methodologies for SSTO missions. Aerodynamic force and moment coefficients are given as functions of angle of attack, Mach number, and control surface deflections. The model data were estimated by using a subsonic/supersonic panel code and a hypersonic local surface inclination code. Thrust coefficient and engine specific impulse were estimated using a two-dimensional forebody, inlet, nozzle code and a one-dimensional combustor code and are given as functions of Mach number, dynamic pressure, and fuel equivalence ratio. Rigid-body mass moments of inertia and center of gravity location are functions of vehicle weight which is in turn a function of fuel flow.

  18. SAADA: Astronomical Databases Made Easier

    NASA Astrophysics Data System (ADS)

    Michel, L.; Nguyen, H. N.; Motch, C.

    2005-12-01

    Many astronomers wish to share datasets with their community but lack the manpower to develop databases with the functionalities required for high-level scientific applications. The SAADA project aims at automating the creation and deployment process of such databases. A generic but scientifically relevant data model has been designed which allows one to build databases by providing only a limited number of product mapping rules. Databases created by SAADA rely on a relational database supporting JDBC and covered by a Java layer including a large amount of generated code. Such databases can simultaneously host spectra, images, source lists and plots. Data are grouped in user-defined collections whose content can be seen as one unique set per data type even if their formats differ. Datasets can be correlated with one another using qualified links. These links help, for example, to handle the nature of a cross-identification (e.g., a distance or a likelihood) or to describe their scientific content (e.g., by associating a spectrum with a catalog entry). The SAADA query engine is based on a language well suited to the data model which can handle constraints on linked data, in addition to classical astronomical queries. These constraints can be applied on the linked objects (number, class and attributes) and/or on the link qualifier values. Databases created by SAADA are accessed through a rich Web interface or a Java API. We are currently developing an interoperability module implementing VO protocols.

  19. [Lack of bioavailability of generic lopinavir/ritonavir not prequalified by WHO marketed in Africa (Congo Brazzaville)].

    PubMed

    Camara, S; Zucman, D; Vasse, M; Goudjo, A; Guillard, E; Peytavin, G

    2015-02-01

    Although second-line generic antiretroviral drugs are of great value in developing countries, there are concerns regarding their quality and safety. This study is a case report and a pharmacological study in healthy volunteers. A French subject of sub-Saharan origin who visited the Republic of Congo received a post-exposure treatment with AZT+3TC and LPV/r (200/50 mg, Arga-L®, India) following unprotected sexual intercourse. Two days later, in France, tests showed that plasma concentrations of lopinavir and ritonavir were undetectable. The WHO prequalification list showed that Arga-L® was not prequalified. A pharmacological study in healthy volunteers evaluated oral bioavailability: plasma concentrations of generic LPV/r Arga-L® and LPV/r Kaletra® (400/100 mg) were measured after a single dose of each, 7 days apart, in four healthy volunteers. Concentrations of Arga-L® at 12 h after intake were considerably lower than those of Kaletra®, revealing very low oral bioavailability of generic lopinavir and ritonavir (<10%) compared to the brand-name drug. We found that Arga-L®, despite having adequate qualitative and quantitative drug contents, had very poor bioavailability compared to Kaletra®. In order to avoid the selection and spread of drug-resistant HIV strains, rigorous pharmacological monitoring of generic antiretroviral drugs that are not prequalified by WHO but are marketed in Africa must be a priority for health authorities.

  20. The diagnosis related groups enhanced electronic medical record.

    PubMed

    Müller, Marcel Lucas; Bürkle, Thomas; Irps, Sebastian; Roeder, Norbert; Prokosch, Hans-Ulrich

    2003-07-01

    The introduction of Diagnosis Related Groups (DRGs) as a basis for hospital payment in Germany announced essential changes in hospital reimbursement practice. A hospital's economical survival will depend vitally on the accuracy and completeness of the documentation of DRG-relevant data such as diagnosis and procedure codes. In order to enhance physicians' coding compliance, an easy-to-use interface integrating coding tasks seamlessly into clinical routine had to be developed. A generic approach should access coding and clinical guidelines from different information sources. Within the Electronic Medical Record (EMR), a user interface ('DRG Control Center') for all DRG-relevant clinical and administrative data has been built. A comprehensive DRG-related web site gives online access to DRG grouping software and an electronic coding expert. Both components are linked together using an application supporting bi-directional communication. Other web-based services, like a guideline search engine, can be integrated as well. With the proposed method, the clinician gains quick access to context-sensitive clinical guidelines for appropriate treatment of his/her patient and administrative guidelines for the adequate coding of diagnoses and procedures. This paper describes the design and current implementation and discusses our experiences.

  1. Composite load spectra for select space propulsion structural components

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Kurth, R. E.; Ho, H.

    1986-01-01

    A multiyear program is being performed with the objective of developing generic load models, with multiple levels of progressive sophistication, to simulate the composite (combined) load spectra that are induced in space propulsion system components representative of Space Shuttle Main Engines (SSME), such as transfer ducts, turbine blades, and liquid oxygen (LOX) posts. Progress of the first year's effort includes completion of a sufficient portion of each task -- probabilistic models, code development, validation, and an initial operational code. From its inception, this code has had an expert-system philosophy allowing it to be extended throughout the program and in the future. The initial operational code is only applicable to turbine-blade-type loadings. The probabilistic model included in the operational code has fitting routines for loads that utilize a modified Discrete Probabilistic Distribution termed RASCAL, a barrier crossing method, and a Monte Carlo method. An initial load model was developed by Battelle and is currently used for the slowly varying duty-cycle-type loading. The intent is to use the model and related codes essentially in their current form for all loads that are based on measured or calculated data following a slowly varying profile.

  2. The VREST learning environment.

    PubMed

    Kunst, E E; Geelkerken, R H; Sanders, A J B

    2005-01-01

    The VREST learning environment is an integrated architecture to improve the education of health care professionals. It is a combination of a learning, content and assessment management system based on virtual reality. The generic architecture is now being built and tested around the Lichtenstein protocol for hernia inguinalis repair.

  3. Model Development for VDE Computations in NIMROD

    NASA Astrophysics Data System (ADS)

    Bunkers, K. J.; Sovinec, C. R.

    2017-10-01

    Vertical displacement events (VDEs) and the disruptions associated with them have potential for causing considerable physical damage to ITER and other tokamak experiments. We report on simulations of generic axisymmetric VDEs and a vertically unstable case from Alcator C-MOD using the NIMROD code. Previous calculations have been done with closures for heat flux and viscous stress. Initial calculations show that halo current width is dependent on temperature boundary conditions, and so transport together with plasma-surface interaction may play a role in determining halo currents in experiments. The behavior of VDEs with Braginskii thermal conductivity and viscosity closures and Spitzer-like resistivity are investigated for both the generic axisymmetric VDE case and the C-MOD case. This effort is supported by the U.S. Dept. of Energy, Award Numbers DE-FG02-06ER54850 and DE-FC02-08ER54975.

  4. Chekhovichia, a new generic replacement name for Rotalites Leleshus 1970 (Anthozoa: Heliolitoidea) non Lamarck 1801 (Protista: Foraminifera).

    PubMed

    Doweld, Alexander B

    2015-10-29

    The genus Rotalites was established by Leleshus (1970: 97) for fossil Upper Silurian heliolitoids (Anthozoa) from Southern Tien Shan. However, the name is preoccupied by Rotalites Lamarck (1801: 401) of Foraminifera (Protista) (cf. Loeblich & Tappan, 1987). In accordance with the International Code of Zoological Nomenclature, Chekhovichia nom. nov. is proposed here as a replacement name for Rotalites Leleshus non Lamarck.

  5. Generic Ada code in the NASA space station command, control and communications environment

    NASA Technical Reports Server (NTRS)

    Mcdougall, D. P.; Vollman, T. E.

    1986-01-01

    The results of efforts to apply powerful Ada constructs to the formatted message handling process are described. The goal of these efforts was to extend the state of technology in message handling while at the same time producing production-quality, reusable code. The first effort was initiated in September 1984 and delivered in April 1985. That product, the Generic Message Handling Facility, met initial goals, was reused, and is available in the Ada Repository on ARPANET. However, it became apparent during its development that the initial approach to building a message handler template was not optimal. As a result of this initial effort, several alternate approaches were identified, and research is now ongoing to identify an improved product. The ultimate goal is to be able to instantly build a message handling system for any message format given a specification of that message format. The problem lies in how to specify the message format and, once that is done, how to use that information to build the message handler. Message handling systems and message types are described. The initial effort, its results, and its shortcomings are detailed. The approach now being taken to build a system which will be significantly easier to implement, and once implemented, easier to use, is described. Finally, conclusions are offered.

  6. A generic minimization random allocation and blinding system on web.

    PubMed

    Cai, Hongwei; Xia, Jielai; Xu, Dezhong; Gao, Donghuai; Yan, Yongping

    2006-12-01

    Minimization is a dynamic randomization method for clinical trials. Although recommended by many researchers, the use of minimization has seldom been reported in randomized trials, mainly because of the controversy surrounding the validity of conventional analyses and its complexity in implementation. However, both the statistical and clinical validity of minimization were demonstrated in recent studies. A minimization random allocation system integrated with a blinding function that could facilitate the implementation of this method in general clinical trials has not been reported. SYSTEM OVERVIEW: The system is a web-based random allocation system using the Pocock and Simon minimization method. It also supports multiple treatment arms within a trial, multiple simultaneous trials, and blinding without further programming. The system was constructed with a generic database schema design method, the Pocock and Simon minimization method, and a blinding method. It was coded in the Microsoft Visual Basic and Active Server Pages (ASP) programming languages, and all datasets were managed with a Microsoft SQL Server database. Some critical programming code is also provided. SIMULATIONS AND RESULTS: Two clinical trials were simulated simultaneously to test the system's applicability. Not only balanced groups but also blinded allocation results were achieved in both trials. Practical considerations for the minimization method, the benefits, general applicability, and drawbacks of the technique implemented in this system are discussed. Promising features of the proposed system are also summarized.
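
The allocation step of Pocock and Simon minimization can be sketched as follows. The factor names, the range-based imbalance measure, and the biased-coin probability below are illustrative choices; the paper's actual ASP/SQL Server implementation may differ:

```python
import random
from collections import defaultdict

def new_counts():
    # counts[factor][level][treatment] -> patients already allocated in that cell
    return defaultdict(lambda: defaultdict(dict))

def minimize_assign(counts, patient, treatments, p_best=0.8, rng=random):
    """Pocock-Simon minimization (sketch): score each candidate treatment by the
    total imbalance it would create across the patient's factor levels, then
    favour the minimizing arm with probability p_best (biased coin)."""
    scores = {}
    for t in treatments:
        imbalance = 0
        for factor, level in patient.items():
            cell = counts[factor][level]
            # hypothetical counts if this patient were assigned to treatment t
            trial = {tr: cell.get(tr, 0) + (tr == t) for tr in treatments}
            imbalance += max(trial.values()) - min(trial.values())  # range measure
        scores[t] = imbalance
    best = min(scores, key=scores.get)
    choice = best if rng.random() < p_best else rng.choice(
        [t for t in treatments if t != best])
    for factor, level in patient.items():
        cell = counts[factor][level]
        cell[choice] = cell.get(choice, 0) + 1
    return choice

# Deterministic demo (p_best=1.0): two identical patients land in opposite arms
counts = new_counts()
print(minimize_assign(counts, {"sex": "F", "age": "old"}, ["A", "B"], p_best=1.0))
print(minimize_assign(counts, {"sex": "F", "age": "old"}, ["A", "B"], p_best=1.0))
```

Setting p_best below 1 keeps the allocation unpredictable, which is what makes a blinding function compatible with minimization.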

  7. Assessing the Viability of Social Media for Disseminating Evidence-Based Nutrition Practice Guideline Through Content Analysis of Twitter Messages and Health Professional Interviews: An Observational Study

    PubMed Central

    Kenne, Deric; Wolfram, Taylor M; Abram, Jenica K; Fleming, Michael

    2016-01-01

    Background Given the high penetration of social media use, social media has been proposed as a method for the dissemination of information to health professionals and patients. This study explored the potential for social media dissemination of the Academy of Nutrition and Dietetics Evidence-Based Nutrition Practice Guideline (EBNPG) for Heart Failure (HF). Objectives The objectives were to (1) describe the existing social media content on HF, including message content, source, and target audience, and (2) describe the attitude of physicians and registered dietitian nutritionists (RDNs) who care for outpatient HF patients toward the use of social media as a method to obtain information for themselves and to share this information with patients. Methods The methods were divided into 2 parts. Part 1 involved conducting a content analysis of tweets related to HF, which were downloaded from Twitonomy and assigned codes for message content (19 codes), source (9 codes), and target audience (9 codes); code frequency was described. A comparison in the popularity of tweets (those marked as favorites or retweeted) based on applied codes was made using t tests. Part 2 involved conducting phone interviews with RDNs and physicians to describe health professionals’ attitude toward the use of social media to communicate general health information and information specifically related to the HF EBNPG. Interviews were transcribed and coded; exemplar quotes representing frequent themes are presented. Results The sample included 294 original tweets with the hashtag “#heartfailure.” The most frequent message content codes were “HF awareness” (166/294, 56.5%) and “patient support” (97/294, 33.0%). The most frequent source codes were “professional, government, patient advocacy organization, or charity” (112/277, 40.4%) and “patient or family” (105/277, 37.9%). The most frequent target audience codes were “unable to identify” (111/277, 40.1%) and “other” (55/277, 19.9%). 
Significant differences were found in the popularity of tweets with (mean 1.0, SD 1.3 favorites) versus without (mean 0.7, SD 1.3 favorites) the content code "HF research" (P=.049). Tweets with the source code "professional, government, patient advocacy organizations, or charities" were significantly more likely to be marked as a favorite (mean 1.2, SD 1.4 vs mean 0.8, SD 1.2; P=.03) and retweeted (mean 1.5, SD 1.8 vs mean 0.9, SD 2.0; P=.03) than those without this source code. Interview participants believed that social media was a useful way to gather professional information. They did not believe that social media was useful for communicating with patients, citing privacy concerns, the need to keep information general rather than tailored to a specific patient, and the belief that their patients did not use social media or technology. Conclusions Existing Twitter content related to HF comes from a combination of patients and evidence-based organizations; however, there is little nutrition content. That gap may present an opportunity for EBNPG dissemination. Health professionals use social media to gather information for themselves but are skeptical of its value when communicating with patients, particularly due to privacy concerns and misconceptions about the characteristics of social media users. PMID:27847349

  8. Assessing the Viability of Social Media for Disseminating Evidence-Based Nutrition Practice Guideline Through Content Analysis of Twitter Messages and Health Professional Interviews: An Observational Study.

    PubMed

    Hand, Rosa K; Kenne, Deric; Wolfram, Taylor M; Abram, Jenica K; Fleming, Michael

    2016-11-15

    Given the high penetration of social media use, social media has been proposed as a method for the dissemination of information to health professionals and patients. This study explored the potential for social media dissemination of the Academy of Nutrition and Dietetics Evidence-Based Nutrition Practice Guideline (EBNPG) for Heart Failure (HF). The objectives were to (1) describe the existing social media content on HF, including message content, source, and target audience, and (2) describe the attitude of physicians and registered dietitian nutritionists (RDNs) who care for outpatient HF patients toward the use of social media as a method to obtain information for themselves and to share this information with patients. The methods were divided into 2 parts. Part 1 involved conducting a content analysis of tweets related to HF, which were downloaded from Twitonomy and assigned codes for message content (19 codes), source (9 codes), and target audience (9 codes); code frequency was described. A comparison in the popularity of tweets (those marked as favorites or retweeted) based on applied codes was made using t tests. Part 2 involved conducting phone interviews with RDNs and physicians to describe health professionals' attitude toward the use of social media to communicate general health information and information specifically related to the HF EBNPG. Interviews were transcribed and coded; exemplar quotes representing frequent themes are presented. The sample included 294 original tweets with the hashtag "#heartfailure." The most frequent message content codes were "HF awareness" (166/294, 56.5%) and "patient support" (97/294, 33.0%). The most frequent source codes were "professional, government, patient advocacy organization, or charity" (112/277, 40.4%) and "patient or family" (105/277, 37.9%). The most frequent target audience codes were "unable to identify" (111/277, 40.1%) and "other" (55/277, 19.9%). 
Significant differences were found in the popularity of tweets with (mean 1.0, SD 1.3 favorites) versus without (mean 0.7, SD 1.3 favorites) the content code "HF research" (P=.049). Tweets with the source code "professional, government, patient advocacy organizations, or charities" were significantly more likely to be marked as a favorite (mean 1.2, SD 1.4 vs mean 0.8, SD 1.2; P=.03) and retweeted (mean 1.5, SD 1.8 vs mean 0.9, SD 2.0; P=.03) than those without this source code. Interview participants believed that social media was a useful way to gather professional information. They did not believe that social media was useful for communicating with patients, citing privacy concerns, the need to keep information general rather than tailored to a specific patient, and the belief that their patients did not use social media or technology. Existing Twitter content related to HF comes from a combination of patients and evidence-based organizations; however, there is little nutrition content. That gap may present an opportunity for EBNPG dissemination. Health professionals use social media to gather information for themselves but are skeptical of its value when communicating with patients, particularly due to privacy concerns and misconceptions about the characteristics of social media users. ©Rosa K Hand, Deric Kenne, Taylor M Wolfram, Jenica K Abram, Michael Fleming. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.11.2016.
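The popularity comparisons above are two-sample t tests on favorite/retweet counts grouped by whether a code was applied. A minimal sketch of that comparison, using Welch's unequal-variance t statistic and entirely hypothetical counts (the abstract reports only summary statistics, and does not specify which t-test variant was used):

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / math.sqrt(va + vb)

# Hypothetical favorite counts for tweets with / without a given source code
with_code = [2, 1, 1, 0, 2]
without_code = [0, 1, 0, 1, 0]
print(round(welch_t(with_code, without_code), 3))  # → 1.789
```

A positive statistic indicates the coded group averaged more favorites; a p-value would then be read from the t distribution with Welch-Satterthwaite degrees of freedom.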

  9. Alcohol marketing on YouTube: exploratory analysis of content adaptation to enhance user engagement in different national contexts.

    PubMed

    Gupta, Himanshu; Lam, Tina; Pettigrew, Simone; Tait, Robert J

    2018-01-16

We know little about how social media alcohol marketing is utilized for alcohol promotion in different national contexts. There does not appear to be any academic work on online exposure to alcohol marketing via social media in India, and most of the limited research in Australia has focused on Facebook. Hence, the present study extends previous research by investigating alcohol promotion conducted on an under-researched form of social media (YouTube) in two contrasting geographic contexts. This study examines and compares the types of strategies used by marketers for the Indian and Australian alcohol brands with the greatest YouTube presence, and the extent to which users engage with these strategies. The 10 alcohol brands per country with the greatest YouTube presence were identified based on the number of 'subscriptions'. The number of videos, views per video, and the type of content within the videos were collected for each brand. The data were analyzed using an inductive coding approach, using NVivo 10. The targeted brands had gathered 98,881 subscriptions (Indian brands: n = 13,868; Australian brands: n = 85,013). The type of marketing strategies utilized by brands were a mix of those that differed by country (e.g. sexually suggestive content in India and posts related to the brand's tradition or heritage in Australia) and generic approaches (e.g. encouraging time- and event-specific drinking; demonstrations of food/cocktail recipes; camaraderie; competitions and prize draws; and brand sponsorship at music, sports, and fashion events). This cross-national comparison demonstrates that YouTube provides alcohol marketers with an advertising platform where they utilize tailored marketing approaches to cater to specific national contexts and build content around the cultural meanings users invoke in their interactions with these strategies. Those exposed to alcohol marketing on YouTube are likely to include those under the legal drinking age.

  10. A review of ice accretion data from a model rotor icing test and comparison with theory

    NASA Technical Reports Server (NTRS)

    Britton, Randall K.; Bond, Thomas H.

    1991-01-01

An experiment was conducted by the Helicopter Icing Consortium (HIC) in the NASA Lewis Icing Research Tunnel (IRT) in which a 1/6 scale fuselage model of a UH-60A Black Hawk helicopter with a generic rotor was subjected to a wide range of icing conditions. The HIC consists of members from NASA, Bell Helicopter, Boeing Helicopter, McDonnell Douglas Helicopters, Sikorsky Aircraft, and Texas A&M University. Data were taken in the form of rotor torque, internal force balance measurements, blade strain gage loading, and two dimensional ice shape tracings. A review of the ice shape data is performed with special attention given to repeatability and correctness of trends in terms of radial variation, rotational speed, icing time, temperature, liquid water content, and volumetric median droplet size. Moreover, an in-depth comparison between the experimental data and the analysis of NASA's ice accretion code LEWICE is given. Finally, conclusions are drawn as to the quality of the ice accretion data and the predictability of the data base as a whole. Recommendations are also given for improving data-taking techniques as well as potential future work.

  11. Verification testing of the compression performance of the HEVC screen content coding extensions

    NASA Astrophysics Data System (ADS)

    Sullivan, Gary J.; Baroncini, Vittorio A.; Yu, Haoping; Joshi, Rajan L.; Liu, Shan; Xiu, Xiaoyu; Xu, Jizheng

    2017-09-01

This paper reports on verification testing of the coding performance of the screen content coding (SCC) extensions of the High Efficiency Video Coding (HEVC) standard (Rec. ITU-T H.265 | ISO/IEC 23008-2 MPEG-H Part 2). The coding performance of HEVC screen content model (SCM) reference software is compared with that of the HEVC test model (HM) without the SCC extensions, as well as with the Advanced Video Coding (AVC) joint model (JM) reference software, for both lossy and mathematically lossless compression using All-Intra (AI), Random Access (RA), and Low-delay B (LB) encoding structures and using similar encoding techniques. Video test sequences in 1920×1080 RGB 4:4:4, YCbCr 4:4:4, and YCbCr 4:2:0 colour sampling formats with 8 bits per sample are tested in two categories: "text and graphics with motion" (TGM) and "mixed" content. For lossless coding, the encodings are evaluated in terms of relative bit-rate savings. For lossy compression, subjective testing was conducted at 4 quality levels for each coding case, and the test results are presented through mean opinion score (MOS) curves. The relative coding performance is also evaluated in terms of Bjøntegaard-delta (BD) bit-rate savings for equal PSNR quality. The perceptual tests and objective metric measurements show a very substantial benefit in coding efficiency for the SCC extensions, and provide consistent results with a high degree of confidence. For TGM video, the estimated bit-rate savings ranged from 60% to 90% relative to the JM and 40% to 80% relative to the HM, depending on the AI/RA/LB configuration category and colour sampling format.

  12. DCG & GTE: Dynamic Courseware Generation with Teaching Expertise.

    ERIC Educational Resources Information Center

    Vassileva, Julita

    1998-01-01

    Discusses the place of GTE (Generic Tutoring Environment) as an approach to bridging the gap between computer-assisted learning and intelligent tutoring systems; describes DCG (dynamic courseware generation) which allows dynamic planning of the contents of an instructional course; and considers combining GTE with DCG. (Author/LRW)

  13. Generic element processor (application to nonlinear analysis)

    NASA Technical Reports Server (NTRS)

    Stanley, Gary

    1989-01-01

    The focus here is on one aspect of the Computational Structural Mechanics (CSM) Testbed: finite element technology. The approach involves a Generic Element Processor: a command-driven, database-oriented software shell that facilitates introduction of new elements into the testbed. This shell features an element-independent corotational capability that upgrades linear elements to geometrically nonlinear analysis, and corrects the rigid-body errors that plague many contemporary plate and shell elements. Specific elements that have been implemented in the Testbed via this mechanism include the Assumed Natural-Coordinate Strain (ANS) shell elements, developed with Professor K. C. Park (University of Colorado, Boulder), a new class of curved hybrid shell elements, developed by Dr. David Kang of LPARL (formerly a student of Professor T. Pian), other shell and solid hybrid elements developed by NASA personnel, and recently a repackaged version of the workhorse shell element used in the traditional STAGS nonlinear shell analysis code. The presentation covers: (1) user and developer interfaces to the generic element processor, (2) an explanation of the built-in corotational option, (3) a description of some of the shell-elements currently implemented, and (4) application to sample nonlinear shell postbuckling problems.

  14. Extracting recurrent scenarios from narrative texts using a Bayesian network: application to serious occupational accidents with movement disturbance.

    PubMed

    Abdat, F; Leclercq, S; Cuny, X; Tissot, C

    2014-09-01

A probabilistic approach has been developed to extract recurrent serious Occupational Accident with Movement Disturbance (OAMD) scenarios from narrative texts within a prevention framework. Relevant data extracted from 143 accounts was initially coded as logical combinations of generic accident factors. A Bayesian Network (BN)-based model was then built for OAMDs using these data and expert knowledge. A data clustering process was subsequently performed to group the OAMDs into similar classes from generic factor occurrence and pattern standpoints. Finally, the Most Probable Explanation (MPE) was evaluated and identified as the associated recurrent scenario for each class. Using this approach, 8 scenarios were extracted to describe 143 OAMDs in the construction and metallurgy sectors. Their recurrent nature is discussed. Probable generic factor combinations provide a fair representation of particularly serious OAMDs, as described in narrative texts. This work represents a real contribution to raising company awareness of the variety of circumstances in which these accidents occur, to progressing in the prevention of such accidents, and to developing an analysis framework dedicated to this kind of accident. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Fourier phase retrieval with a single mask by Douglas-Rachford algorithms.

    PubMed

    Chen, Pengwen; Fannjiang, Albert

    2018-05-01

The Fourier-domain Douglas-Rachford (FDR) algorithm is analyzed for phase retrieval with a single random mask. Since the uniqueness of the phase retrieval solution requires more than a single oversampled coded diffraction pattern, the extra information is imposed in either of the following forms: 1) the sector condition on the object; 2) another oversampled diffraction pattern, coded or uncoded. For both settings, the uniqueness of the projected fixed point is proved, and for setting 2), local geometric convergence is derived with a rate given by a spectral gap condition. Numerical experiments demonstrate global, power-law convergence of FDR from arbitrary initialization for both settings as well as for 3 or more coded diffraction patterns without oversampling. In practice, the geometric convergence can be recovered from the power-law regime by a simple projection trick, resulting in highly accurate reconstruction from generic initialization.
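The Douglas-Rachford iteration underlying FDR alternates a reflection through the Fourier-magnitude constraint with a reflection through the object-domain constraint. A minimal sketch of a generic object-domain DR loop with a support constraint (not the paper's exact Fourier-domain variant or its masked measurement setup, which involve oversampling and random masks):

```python
import numpy as np

def dr_phase_retrieval(b, support, n_iter=300, seed=0):
    """Generic Douglas-Rachford loop for phase retrieval (illustrative).
    b: measured Fourier magnitudes; support: boolean object-support mask.
    P1 replaces the Fourier magnitude with b; P2 enforces a real-valued
    signal on the given support."""
    def P1(y):  # Fourier-magnitude projection
        Y = np.fft.fft(y)
        return np.fft.ifft(b * np.exp(1j * np.angle(Y)))

    def P2(y):  # object-domain (support) projection
        return np.where(support, y.real, 0.0).astype(complex)

    y = np.asarray(np.random.default_rng(seed).standard_normal(b.size),
                   dtype=complex)
    for _ in range(n_iter):
        p1 = P1(y)
        y = y + P2(2 * p1 - y) - p1  # Douglas-Rachford update
    return P1(y)  # the returned iterate satisfies |FFT| = b by construction

# Toy 1D example: short-support signal, magnitudes measured from it
x = np.zeros(32)
x[:6] = [3, 1, 4, 1, 5, 9]
b = np.abs(np.fft.fft(x))
out = dr_phase_retrieval(b, np.arange(32) < 6)
```

Whether the iterates converge to the true object depends on the uniqueness conditions discussed in the abstract; the final magnitude projection only guarantees data consistency.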

  16. Transitioning to a national health system in Cyprus: a stakeholder analysis of pharmaceutical policy reform.

    PubMed

    Wouters, Olivier J; Kanavos, Panos G

    2015-09-01

    To review the pharmaceutical sector in Cyprus in terms of the availability and affordability of medicines and to explore pharmaceutical policy options for the national health system finance reform expected to be introduced in 2016. We conducted semi-structured interviews in April 2014 with senior representatives from seven key national organizations involved in pharmaceutical care. The captured data were coded and analysed using the predetermined themes of pricing, reimbursement, prescribing, dispensing and cost sharing. We also examined secondary data provided by the Cypriot Ministry of Health; these data included the prices and volumes of prescription medicines in 2013. We identified several key issues, including high medicine prices, underuse of generic medicines and high out-of-pocket drug spending. Most stakeholders recommended that the national government review existing pricing policies to ensure medicines within the forthcoming national health system are affordable and available, introduce a national reimbursement system and incentivize the prescribing and dispensing of generic medicines. There were disagreements over how to (i) allocate responsibilities to governmental agencies in the national health system, (ii) reconcile differences in opinion between stakeholders and (iii) raise awareness among patients, physicians and pharmacists about the benefits of greater generic drug use. In Cyprus, if the national health system is going to provide universal health coverage in a sustainable fashion, then the national government must address the current issues in the pharmaceutical sector. Importantly, the country will need to increase the market share of generic medicines to contain drug spending.

  17. Knowledge Data Base for Amorphous Metals

    DTIC Science & Technology

    2007-07-26

not programmatic, updates. Over 100 custom SQL statements that maintain the domain-specific data are attached to the workflow entries in a generic...for the form by populating the SQL and run generation tables. Application data may be prepared in different ways for two steps that invoke the same form...run generation mode). There is a single table of SQL commands. Each record has a user-definable ID, the SQL code, and a comment. The run generation

  18. A Feasibility Study of Life-Extending Controls for Aircraft Turbine Engines Using a Generic Air Force Model (Preprint)

    DTIC Science & Technology

    2006-12-01

intelligent control algorithm embedded in the FADEC. This paper evaluates the LEC, based on critical components research, to demonstrate how an...control action, engine component life usage, and designing an intelligent control algorithm embedded in the FADEC. This paper evaluates the LEC, based on...simulation code for each simulator. One is typically configured to operate as a Full-Authority Digital Electronic Controller (FADEC

  19. Improving throughput for temporal target nomination using existing infrastructure

    NASA Astrophysics Data System (ADS)

    Raeth, Peter G.

    2007-04-01

    Earlier, we reported on predictive anomaly detection (PAD) for nominating targets within data streams generated by persistent sensing and surveillance. This technique is purely temporal and does not directly depend on the physics attendant on the sensed environment. Since PAD adapts to evolving data streams, there are no determinacy assumptions. We showed PAD to be general across sensor types, demonstrating it using synthetic chaotic data and in audio, visual, and infrared applications. Defense-oriented demonstrations included explosions, muzzle flashes, and missile and aircraft detection. Experiments were ground-based and air-to-air. As new sensors come on line, PAD offers immediate data filtering and target nomination. Its results can be taken individually, pixel by pixel, for spectral analysis and material detection/identification. They can also be grouped for shape analysis, target identification, and track development. PAD analyses reduce data volume by around 95%, depending on target number and size, while still retaining all target indicators. While PAD's code is simple when compared to physics codes, PAD tends to build a huge model. A PAD model for 512 x 640 frames may contain 19,660,800 Gaussian basis functions. (PAD models grow linearly with the number of pixels and the frequency content, in the FFT sense, of the sensed scenario's background data). PAD's complexity in terms of computational and data intensity is an example of what one sees in new algorithms now in the R&D pipeline, especially as DoD seeks capability that runs fully automatic, with little to no human interaction. Work is needed to improve algorithms' throughput while employing existing infrastructure, yet allowing for growth in the types of hardware employed. In this present paper, we discuss a generic cluster interface for legacy codes that can be partitioned at the data level. 
The discussion's foundation is the growth of PAD models to accommodate a particular scenario and the need to reduce false alarms while preserving all targets. The discussion closes with a view of future software and hardware opportunities.
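The quoted PAD model size is internally consistent: a 512 × 640 frame has 327,680 pixels, and 19,660,800 basis functions works out to exactly 60 per pixel (the per-pixel count is inferred here from the two figures in the text, not stated by the author). A back-of-envelope check:

```python
# Back-of-envelope check of the quoted PAD model size.
# The 60-basis-functions-per-pixel figure is inferred, not stated in the text.
rows, cols = 512, 640
pixels = rows * cols                  # pixels per frame
total_basis = 19_660_800              # Gaussian basis functions quoted
print(pixels, total_basis // pixels)  # → 327680 60
```

This per-pixel count is what drives the data-level partitioning argument: each pixel's model can be updated independently, so the workload splits cleanly across cluster nodes.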

  20. Automated Assessment of Reviews

    ERIC Educational Resources Information Center

    Ramachandran, Lakshmi

    2013-01-01

    Relevance helps identify to what extent a review's content pertains to that of the submission. Relevance metric helps distinguish generic or vague reviews from the useful ones. Relevance of a review to a submission can be determined by identifying semantic and syntactic similarities between them. Our work introduces the use of a word-order graph…

  1. 21 CFR 864.7500 - Whole blood hemoglobin assays.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Whole blood hemoglobin assays. 864.7500 Section... blood hemoglobin assays. (a) Identification. A whole blood hemoglobin assay is a device consisting or... hemoglobin content of whole blood for the detection of anemia. This generic device category does not include...

  2. Writing in Math: A Disciplinary Literacy Approach

    ERIC Educational Resources Information Center

    Brozo, William G.; Crain, Sarah

    2018-01-01

    Mathematics teachers often resist generic literacy strategies because they do not seem relevant to math learning. Discipline-specific literacy practices that emerge directly from the math content and processes under study are more likely to be embraced by math teachers. Furthermore, national and state-level mathematics standards as well as Common…

  3. 21 CFR 864.7500 - Whole blood hemoglobin assays.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Whole blood hemoglobin assays. 864.7500 Section... blood hemoglobin assays. (a) Identification. A whole blood hemoglobin assay is a device consisting or... hemoglobin content of whole blood for the detection of anemia. This generic device category does not include...

  4. 21 CFR 864.7500 - Whole blood hemoglobin assays.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Whole blood hemoglobin assays. 864.7500 Section... blood hemoglobin assays. (a) Identification. A whole blood hemoglobin assay is a device consisting or... hemoglobin content of whole blood for the detection of anemia. This generic device category does not include...

  5. 21 CFR 864.7500 - Whole blood hemoglobin assays.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Whole blood hemoglobin assays. 864.7500 Section... blood hemoglobin assays. (a) Identification. A whole blood hemoglobin assay is a device consisting or... hemoglobin content of whole blood for the detection of anemia. This generic device category does not include...

  6. 21 CFR 864.7500 - Whole blood hemoglobin assays.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Whole blood hemoglobin assays. 864.7500 Section... blood hemoglobin assays. (a) Identification. A whole blood hemoglobin assay is a device consisting or... hemoglobin content of whole blood for the detection of anemia. This generic device category does not include...

  7. 12 CFR 226.18 - Content of disclosures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

... the other payments in the series. (h) Total of payments. The total of payments, using that term, and a... creditor making the disclosures. (b) Amount financed. The amount financed, using that term, and a brief... identify those persons. 41 The following payees may be described using generic or other general terms...

  8. Towards Evolutional Authoring Support Systems

    ERIC Educational Resources Information Center

    Aroyo, Lora; Mizoguchi, Riichiro

    2004-01-01

    The ultimate aim of this research is to specify and implement a general authoring framework for content and knowledge engineering for Intelligent Educational Systems (IES). In this context we attempt to develop an authoring tool supporting this framework that is powerful in its functionality, generic in its support of instructional strategies and…

  9. Enhancing Teacher Education Students' Generic Skills through Problem-Based Learning

    ERIC Educational Resources Information Center

    Murray-Harvey, Rosalind; Curtis, David D.; Cattley, Georgina; Slee, Phillip T.

    2005-01-01

    Claims made for the value of problem-based learning (PBL) as an effective method for professional education programmes draw on constructivist principles of teaching and learning to achieve essential content knowledge, higher order thinking skills, and a team approach to problem-solving through the interdisciplinary, student-directed study of…

  10. The Pursuit of Understanding in Clinical Reasoning.

    ERIC Educational Resources Information Center

    Feltovich, Paul J.; Patel, Vimla L.

    Trends in emphases in the study of clinical reasoning are examined, with attention to three major branches of research: problem-solving, knowledge engineering, and propositional analysis. There has been a general progression from a focus on the generic form of clinical reasoning to an emphasis on medical content that supports the reasoning…

  11. Measuring Mathematics Teacher Educators' Knowledge of Technology Integrated Teaching: Instrument Development

    ERIC Educational Resources Information Center

    Getenet, Seyum Tekeher; Beswick, Kim

    2013-01-01

    This study describes the construction of a questionnaire instrument to measure mathematics teacher educators' knowledge for technology integrated mathematics teaching. The study was founded on a reconceptualisation of the generic Technological Pedagogical Content Knowledge framework in the specific context of mathematics teaching. Steps in the…

  12. A Comprehensive Observational Coding Scheme for Analyzing Instrumental, Affective, and Relational Communication in Health Care Contexts

    PubMed Central

    SIMINOFF, LAURA A.; STEP, MARY M.

    2011-01-01

    Many observational coding schemes have been offered to measure communication in health care settings. These schemes fall short of capturing multiple functions of communication among providers, patients, and other participants. After a brief review of observational communication coding, the authors present a comprehensive scheme for coding communication that is (a) grounded in communication theory, (b) accounts for instrumental and relational communication, and (c) captures important contextual features with tailored coding templates: the Siminoff Communication Content & Affect Program (SCCAP). To test SCCAP reliability and validity, the authors coded data from two communication studies. The SCCAP provided reliable measurement of communication variables including tailored content areas and observer ratings of speaker immediacy, affiliation, confirmation, and disconfirmation behaviors. PMID:21213170

  13. The nucleotide composition of microbial genomes indicates differential patterns of selection on core and accessory genomes.

    PubMed

    Bohlin, Jon; Eldholm, Vegard; Pettersson, John H O; Brynildsrud, Ola; Snipen, Lars

    2017-02-10

    The core genome consists of genes shared by the vast majority of a species and is therefore assumed to have been subjected to substantially stronger purifying selection than the more mobile elements of the genome, also known as the accessory genome. Here we examine intragenic base composition differences in core genomes and corresponding accessory genomes in 36 species, represented by the genomes of 731 bacterial strains, to assess the impact of selective forces on base composition in microbes. We also explore, in turn, how these results compare with findings for whole genome intragenic regions. We found that GC content in coding regions is significantly higher in core genomes than accessory genomes and whole genomes. Likewise, GC content variation within coding regions was significantly lower in core genomes than in accessory genomes and whole genomes. Relative entropy in coding regions, measured as the difference between observed and expected trinucleotide frequencies estimated from mononucleotide frequencies, was significantly higher in the core genomes than in accessory and whole genomes. Relative entropy was positively associated with coding region GC content within the accessory genomes, but not within the corresponding coding regions of core or whole genomes. The higher intragenic GC content and relative entropy, as well as the lower GC content variation, observed in the core genomes is most likely associated with selective constraints. It is unclear whether the positive association between GC content and relative entropy in the more mobile accessory genomes constitutes signatures of selection or selective neutral processes.
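The two key quantities above, coding-region GC content and relative entropy between observed trinucleotide frequencies and those expected from mononucleotide frequencies, can be sketched directly. The snippet below is an illustrative reading of the measure (a Kullback-Leibler divergence against a zero-order independence model), not the authors' exact pipeline:

```python
from collections import Counter
from math import log2

def gc_content(seq):
    """Fraction of G and C bases in a nucleotide sequence."""
    return sum(1 for b in seq if b in "GC") / len(seq)

def relative_entropy(seq):
    """KL divergence (bits) of observed trinucleotide frequencies from
    those expected under independent mononucleotide frequencies.
    Overlapping trinucleotides are counted, matching the simplest reading
    of 'observed vs expected' in the abstract."""
    n = len(seq)
    mono = Counter(seq)
    tri = Counter(seq[i:i + 3] for i in range(n - 2))
    m = n - 2
    total = 0.0
    for t, count in tri.items():
        observed = count / m
        expected = (mono[t[0]] / n) * (mono[t[1]] / n) * (mono[t[2]] / n)
        total += observed * log2(observed / expected)
    return total

print(gc_content("ATGCGC"))        # → 0.6666666666666666
print(relative_entropy("AAAAAA"))  # → 0.0 (no structure beyond base counts)
```

Higher relative entropy indicates trinucleotide usage that departs from what base composition alone predicts, which is the signal the study associates with selective constraint in core genomes.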

  14. Generic patient-reported outcomes in child health research: a review of conceptual content using World Health Organization definitions.

    PubMed

    Fayed, Nora; de Camargo, Olaf Kraus; Kerr, Elizabeth; Rosenbaum, Peter; Dubey, Ankita; Bostan, Cristina; Faulhaber, Markus; Raina, Parminder; Cieza, Alarcos

    2012-12-01

    Our aims were to (1) describe the conceptual basis of popular generic instruments according to World Health Organization (WHO) definitions of functioning, disability, and health (FDH), and quality of life (QOL) with health-related quality of life (HRQOL) as a subcomponent of QOL; (2) map the instruments to the International Classification of Functioning, Disability and Health (ICF); and (3) provide information on how the analyzed instruments were used in the literature. This should enable users to make valid choices about which instruments have the desired content for a specific context or purpose. Child health-based literature over a 5-year period was reviewed to find research employing health status and QOL/HRQOL instruments. WHO definitions of FDH and QOL were applied to each item of the 15 most used instruments to differentiate measures of FDH and QOL/HRQOL. The ICF was used to describe the health and health-related content (if any) in those instruments. Additional aspects of instrument use were extracted from these articles. Many instruments that were used to measure QOL/HRQOL did not reflect WHO definitions of QOL. The ICF domains within instruments were highly variable with respect to whether body functions, activities and participation, or environment were emphasized. There is inconsistency among researchers about how to measure HRQOL and QOL. Moreover, when an ICF content analysis is applied, there is variability among instruments in the health components included and emphasized. Reviewing content is important for matching instruments to their intended purpose. © The Authors. Developmental Medicine & Child Neurology © 2012 Mac Keith Press.

  15. Adapting a generic coping skills programme for adolescents with type 1 diabetes: a qualitative study.

    PubMed

    Serlachius, A; Northam, E; Frydenberg, E; Cameron, F

    2012-04-01

    Few qualitative studies have examined the views of adolescents with type 1 diabetes mellitus (T1DM) regarding psychosocial programme development and content. We conducted focus groups with 13 adolescents with T1DM to explore stressors and gain feedback on adapting a generic coping skills programme. The following prevalent stressors were identified: parental/adolescent conflict, balancing self-management and daily life, and health concerns. Prevalent views on programme adaptation included enhancing social support and adding diabetes-specific information and skills. Based on these data, the programme was adapted to address stressors and support self-management, thus better meeting the needs of, and appeal to, adolescents with T1DM.

  16. A new method for evaluating compliance with industry self-regulation codes governing the content of alcohol advertising.

    PubMed

    Babor, Thomas F; Xuan, Ziming; Damon, Donna

    2013-10-01

    This study evaluated the use of a modified Delphi technique in combination with a previously developed alcohol advertising rating procedure to detect content violations in the U.S. Beer Institute Code. A related aim was to estimate the minimum number of raters needed to obtain reliable evaluations of code violations in television commercials. Six alcohol ads selected for their likelihood of having code violations were rated by community and expert participants (N = 286). Quantitative rating scales were used to measure the content of alcohol advertisements based on alcohol industry self-regulatory guidelines. The community group participants represented vulnerability characteristics that industry codes were designed to protect (e.g., age <21); experts represented various health-related professions, including public health, human development, alcohol research, and mental health. Alcohol ads were rated on 2 occasions separated by 1 month. After completing Time 1 ratings, participants were randomized to receive feedback from 1 group or the other. Findings indicate that (i) ratings at Time 2 had generally reduced variance, suggesting greater consensus after feedback, (ii) feedback from the expert group was more influential than that of the community group in developing group consensus, (iii) the expert group found significantly fewer violations than the community group, (iv) experts representing different professional backgrounds did not differ among themselves in the number of violations identified, and (v) a rating panel composed of at least 15 raters is sufficient to obtain reliable estimates of code violations. The Delphi technique facilitates consensus development around code violations in alcohol ad content and may enhance the ability of regulatory agencies to monitor the content of alcoholic beverage advertising when combined with psychometric-based rating procedures. Copyright © 2013 by the Research Society on Alcoholism.
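    The minimum-panel-size question can be illustrated with the Spearman-Brown prophecy formula, a standard psychometric tool for projecting the reliability of a k-rater mean from single-rater reliability. The sketch below is our illustration, not necessarily the procedure the authors used:

```python
def spearman_brown(single_rater_r, k):
    """Projected reliability of the mean of k raters, given the average
    single-rater reliability (Spearman-Brown prophecy formula)."""
    return k * single_rater_r / (1 + (k - 1) * single_rater_r)

def min_raters(single_rater_r, target=0.80):
    """Smallest panel size whose projected reliability reaches the target."""
    k = 1
    while spearman_brown(single_rater_r, k) < target:
        k += 1
    return k
```

    For example, if individual raters agree only modestly (r = 0.2), a panel in the mid-teens is needed to reach a conventional 0.80 reliability, which is consistent in spirit with the study's 15-rater finding.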

  17. A New Method for Evaluating Compliance with Industry Self-regulation Codes Governing the Content of Alcohol Advertising

    PubMed Central

    Babor, Thomas F.; Xuan, Ziming; Damon, Donna

    2013-01-01

    Background This study evaluated the use of a modified Delphi technique in combination with a previously developed alcohol advertising rating procedure to detect content violations in the US Beer Institute code. A related aim was to estimate the minimum number of raters needed to obtain reliable evaluations of code violations in television commercials. Methods Six alcohol ads selected for their likelihood of having code violations were rated by community and expert participants (N=286). Quantitative rating scales were used to measure the content of alcohol advertisements based on alcohol industry self-regulatory guidelines. The community group participants represented vulnerability characteristics that industry codes were designed to protect (e.g., age < 21); experts represented various health-related professions, including public health, human development, alcohol research and mental health. Alcohol ads were rated on two occasions separated by one month. After completing Time 1 ratings, participants were randomized to receive feedback from one group or the other. Results Findings indicate that (1) ratings at Time 2 had generally reduced variance, suggesting greater consensus after feedback, (2) feedback from the expert group was more influential than that of the community group in developing group consensus, (3) the expert group found significantly fewer violations than the community group, (4) experts representing different professional backgrounds did not differ among themselves in the number of violations identified; (5) a rating panel composed of at least 15 raters is sufficient to obtain reliable estimates of code violations. Conclusions The Delphi Technique facilitates consensus development around code violations in alcohol ad content and may enhance the ability of regulatory agencies to monitor the content of alcoholic beverage advertising when combined with psychometric-based rating procedures. PMID:23682927

  18. Measuring the impact of cataract surgery on generic and vision-specific quality of life.

    PubMed

    Groessl, Erik J; Liu, Lin; Sklar, Marisa; Tally, Steven R; Kaplan, Robert M; Ganiats, Theodore G

    2013-08-01

    Cataracts are the leading cause of blindness worldwide and cause visual impairment for millions of adults in the United States. We compared the sensitivity of a vision-specific health-related quality of life (HRQOL) measure to that of multiple generic measures of HRQOL before and at 2 time points after cataract surgery. Participants completed 1 vision-specific and 5 generic quality of life measures before cataract surgery, and again 1 and 6 months after surgery. Random effects modeling was used to measure changes over the three assessment points. The NEI-VFQ25 total score and all 11 subscales showed significant improvements during the first interval (baseline and 1 month). During the second interval (1-6 months post-surgery), significant improvements were observed on the total score and 5 of 11 NEI-VFQ25 subscales. There were significant increases in HRQOL during the first interval on some preference-based generic HRQOL measures, though changes during the second interval were mostly non-significant. None of the SF-36v2™ or SF-6D scales changed significantly between any of the assessment periods. The NEI-VFQ25 was sensitive to changes in vision-specific domains of QOL. Some preference-based generic HRQOL measures were also sensitive to change and showed convergence with the NEI-VFQ25, but the effects were small. The SF-36v2™ and SF-6D did not change in a similar manner, possibly reflecting a lack of vision-related content. Studies seeking to document both the vision-specific and generic HRQOL improvements of cataract surgery should consider these results when selecting measures.

  19. C3 generic workstation: Performance metrics and applications

    NASA Technical Reports Server (NTRS)

    Eddy, Douglas R.

    1988-01-01

    The large number of integrated dependent measures available on a command, control, and communications (C3) generic workstation under development are described. In this system, embedded communications tasks will manipulate workload to assess the effects of performance-enhancing drugs (sleep aids and decongestants), work/rest cycles, biocybernetics, and decision support systems on performance. Task performance accuracy and latency will be event coded for correlation with other measures of voice stress and physiological functioning. Sessions will be videotaped to score non-verbal communications. Physiological recordings include spectral analysis of EEG, ECG, vagal tone, and EOG. Subjective measurements include SWAT, fatigue, POMS and specialized self-report scales. The system will be used primarily to evaluate the effects on performance of drugs, work/rest cycles, and biocybernetic concepts. Performance assessment algorithms will also be developed, including those used with small teams. This system provides a tool for integrating and synchronizing behavioral and psychophysiological measures in a complex decision-making environment.

  20. Human life support during interplanetary travel and domicile. II - Generic Modular Flow Schematic modeling

    NASA Technical Reports Server (NTRS)

    Farral, Joseph F.; Seshan, P. K.; Rohatgi, Naresh K.

    1991-01-01

    This paper describes the Generic Modular Flow Schematic (GMFS) architecture capable of encompassing all functional elements of a physical/chemical life support system (LSS). The GMFS can be implemented to synthesize, model, analyze, and quantitatively compare many configurations of LSSs, from a simple, completely open-loop to a very complex closed-loop. The GMFS model is coded in ASPEN, a state-of-the-art chemical process simulation program, to accurately compute the material, heat, and power flow quantities for every stream in each of the subsystem functional elements (SFEs) in the chosen configuration of a life support system. The GMFS approach integrates the various SFEs and subsystems in a hierarchical and modular fashion facilitating rapid substitutions and reconfiguration of a life support system. The comprehensive ASPEN material and energy balance output is transferred to a systems and technology assessment spreadsheet for rigorous system analysis and trade studies.

  1. Work Experience Report

    NASA Technical Reports Server (NTRS)

    Guo, Daniel

    2017-01-01

    The NASA Platform for Autonomous Systems (NPAS) toolkit is currently being used at the NASA John C. Stennis Space Center (SSC) to develop the INSIGHT program, which will autonomously monitor and control the Nitrogen System of the High Pressure Gas Facility (HPGF) on site. The INSIGHT program is in need of generic timing capabilities in order to perform timing-based actions such as pump usage timing and sequence step timing. The purpose of this project was to develop a timing module that could fulfill these requirements and be adaptable for expanded use in the future. The code was written in the Gensym G2 software platform, the same as INSIGHT, and was written generically to ensure compatibility with any G2 program. Currently, the module has two timing capabilities, a stopwatch function and a countdown function. Although the module has gone through some functionality testing, actual integration of the module into NPAS and the INSIGHT program is contingent on the module passing later checks.
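    The module itself is written in Gensym G2, but its two capabilities, a stopwatch and a countdown, can be sketched generically in Python; the class and method names below are illustrative and are not NPAS or G2 APIs:

```python
import time

class StopwatchTimer:
    """Elapsed-time (stopwatch) timer, e.g. for accumulating pump usage."""
    def __init__(self):
        self._start = None
        self._elapsed = 0.0

    def start(self):
        if self._start is None:
            self._start = time.monotonic()

    def stop(self):
        if self._start is not None:
            self._elapsed += time.monotonic() - self._start
            self._start = None

    def elapsed(self):
        """Total accumulated time, including a currently running interval."""
        running = time.monotonic() - self._start if self._start is not None else 0.0
        return self._elapsed + running

class CountdownTimer:
    """Counts down from a fixed duration, e.g. for sequence step timeouts."""
    def __init__(self, duration_s):
        self.duration_s = duration_s
        self._deadline = None

    def start(self):
        self._deadline = time.monotonic() + self.duration_s

    def remaining(self):
        if self._deadline is None:
            return self.duration_s
        return max(0.0, self._deadline - time.monotonic())

    def expired(self):
        return self._deadline is not None and self.remaining() == 0.0
```

    A sequence step could, for example, start a CountdownTimer and abort the step once expired() becomes true.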

  2. Universal and adapted vocabularies for generic visual categorization.

    PubMed

    Perronnin, Florent

    2008-07-01

    Generic Visual Categorization (GVC) is the pattern classification problem which consists in assigning labels to an image based on its semantic content. This is a challenging task as one has to deal with inherent object/scene variations as well as changes in viewpoint, lighting and occlusion. Several state-of-the-art GVC systems use a vocabulary of visual terms to characterize images with a histogram of visual word counts. We propose a novel practical approach to GVC based on a universal vocabulary, which describes the content of all the considered classes of images, and class vocabularies obtained through the adaptation of the universal vocabulary using class-specific data. The main novelty is that an image is characterized by a set of histograms - one per class - where each histogram describes whether the image content is best modeled by the universal vocabulary or the corresponding class vocabulary. This framework is applied to two types of local image features: low-level descriptors such as the popular SIFT and high-level histograms of word co-occurrences in a spatial neighborhood. It is shown experimentally on two challenging datasets (an in-house database of 19 categories and the PASCAL VOC 2006 dataset) that the proposed approach exhibits state-of-the-art performance at a modest computational cost.
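    The core representation, a histogram over the union of universal and class-adapted visual words, can be sketched in pure Python. This is a toy with 2-D "descriptors" and function names of our own choosing; a real system would use SIFT descriptors and far larger vocabularies:

```python
from math import dist  # Euclidean distance (Python 3.8+)

def assign(desc, vocab):
    """Index of the nearest visual word for one local descriptor."""
    return min(range(len(vocab)), key=lambda i: dist(desc, vocab[i]))

def bipartite_histogram(descriptors, universal, adapted):
    """Per-class histogram over the merged vocabulary: counts landing in the
    first len(universal) bins mean the universal vocabulary modeled the patch
    best; counts in the remaining bins favor the class-adapted words."""
    merged = list(universal) + list(adapted)
    hist = [0] * len(merged)
    for d in descriptors:
        hist[assign(d, merged)] += 1
    n = max(1, len(descriptors))
    return [c / n for c in hist]  # normalized word counts
```

    One such histogram is computed per class; a classifier then sees whether the universal or the adapted vocabulary better explains the image content.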

  3. Research and Trends in the Field of Technology-Enhanced Learning from 2006 to 2011: A Content Analysis of Quick Response Code (QR-Code) and Its Application in Selected Studies

    ERIC Educational Resources Information Center

    Hau, Goh Bak; Siraj, Saedah; Alias, Norlidah; Rauf, Rose Amnah Abd.; Zakaria, Abd. Razak; Darusalam, Ghazali

    2013-01-01

    This study provides a content analysis of selected articles in the field of QR code and its application in an educational context that were published in journals and proceedings of international conferences and workshops from 2006 to 2011. These articles were cross-analysed by published years, journal, and research topics. Further analysis was…

  4. 21 CFR 106.90 - Coding.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES Quality Control Procedures for Assuring Nutrient Content of Infant Formulas § 106.90 Coding. The manufacturer shall code all infant formulas in conformity...

  5. Certifying Domain-Specific Policies

    NASA Technical Reports Server (NTRS)

    Lowry, Michael; Pressburger, Thomas; Rosu, Grigore; Koga, Dennis (Technical Monitor)

    2001-01-01

    Proof-checking code for compliance to safety policies potentially enables a product-oriented approach to certain aspects of software certification. To date, previous research has focused on generic, low-level programming-language properties such as memory type safety. In this paper we consider proof-checking higher-level domain-specific properties for compliance to safety policies. The paper first describes a framework related to abstract interpretation in which compliance to a class of certification policies can be efficiently calculated. Membership equational logic is shown to provide a rich logic for carrying out such calculations, including partiality, for certification. The architecture for a domain-specific certifier is described, followed by an implemented case study. The case study considers consistency of abstract variable attributes in code that performs geometric calculations in aerospace systems.

  6. Content-Based Multi-Channel Network Coding Algorithm in the Millimeter-Wave Sensor Network

    PubMed Central

    Lin, Kai; Wang, Di; Hu, Long

    2016-01-01

    With the development of wireless technology, the widespread use of 5G is already an irreversible trend, and millimeter-wave sensor networks are becoming more and more common. However, due to the high degree of complexity and bandwidth bottlenecks, the millimeter-wave sensor network still faces numerous problems. In this paper, we propose a novel content-based multi-channel network coding algorithm, which uses the functions of data fusion, multi-channel and network coding to improve the data transmission; the algorithm is referred to as content-based multi-channel network coding (CMNC). The CMNC algorithm provides a fusion-driven model based on the Dempster-Shafer (D-S) evidence theory to classify the sensor nodes into different classes according to the data content. By using the result of the classification, the CMNC algorithm also provides the channel assignment strategy and uses network coding to further improve the quality of data transmission in the millimeter-wave sensor network. Extensive simulations are carried out and compared to other methods. Our simulation results show that the proposed CMNC algorithm can effectively improve the quality of data transmission and has better performance than the compared methods. PMID:27376302
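    The network-coding ingredient can be illustrated with the classic two-packet XOR example, in which a relay broadcasts one coded packet instead of forwarding two native ones. This sketches only the generic coding idea, not the CMNC algorithm or its Dempster-Shafer fusion step:

```python
def xor_encode(pkt_a: bytes, pkt_b: bytes) -> bytes:
    """Coded packet: a relay broadcasts A XOR B once instead of
    transmitting A and B separately, saving one transmission."""
    assert len(pkt_a) == len(pkt_b), "equal-length packets assumed"
    return bytes(a ^ b for a, b in zip(pkt_a, pkt_b))

def xor_decode(coded: bytes, known: bytes) -> bytes:
    """A receiver that already holds one native packet XORs it back out
    of the coded packet to recover the other native packet."""
    return xor_encode(coded, known)
```

    Each receiver overhears one native packet and recovers the other from the single coded broadcast, which is the throughput gain network coding offers.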

  7. SLHAplus: A library for implementing extensions of the standard model

    NASA Astrophysics Data System (ADS)

    Bélanger, G.; Christensen, Neil D.; Pukhov, A.; Semenov, A.

    2011-03-01

    We provide a library to facilitate the implementation of new models in codes such as matrix element and event generators or codes for computing dark matter observables. The library contains an SLHA reader routine as well as diagonalisation routines. This library is available in CalcHEP and micrOMEGAs. The implementation of models based on this library is supported by LanHEP and FeynRules. Program summary: Program title: SLHAplus_1.3 Catalogue identifier: AEHX_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 6283 No. of bytes in distributed program, including test data, etc.: 52 119 Distribution format: tar.gz Programming language: C Computer: IBM PC, MAC Operating system: UNIX (Linux, Darwin, Cygwin) RAM: 2000 MB Classification: 11.1 Nature of problem: Implementation of extensions of the standard model in matrix element and event generators and codes for dark matter observables. Solution method: For generic extensions of the standard model we provide routines for reading files that adopt the standard format of the SUSY Les Houches Accord (SLHA) file. The procedure has been generalized to take into account an arbitrary number of blocks so that the reader can be used in generic models including non-supersymmetric ones. The library also contains routines to diagonalize real and complex mass matrices with either unitary or bi-unitary transformations as well as routines for evaluating the running strong coupling constant, running quark masses and effective quark masses. Running time: 0.001 sec
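    SLHAplus itself is a C library; as an illustration of what an SLHA reader does, the sketch below parses numeric BLOCK entries and '#' comments from SLHA-style text in Python. The function is our own minimal stand-in, not the library's API, and it ignores DECAY tables and other record types:

```python
def read_slha(text):
    """Minimal SLHA-style reader: maps block name -> {index tuple: value}.
    Handles only numeric BLOCK entries; '#' starts a comment."""
    blocks, current = {}, None
    for raw in text.splitlines():
        line = raw.split('#', 1)[0].strip()  # strip comments and whitespace
        if not line:
            continue
        tokens = line.split()
        if tokens[0].upper() == 'BLOCK':
            current = tokens[1].upper()       # e.g. MASS, ALPHA
            blocks[current] = {}
        elif current is not None:
            *idx, val = tokens                # leading integer indices, then value
            blocks[current][tuple(int(i) for i in idx)] = float(val)
    return blocks
```

    The dict-of-dicts layout mirrors the accord's structure: an arbitrary number of named blocks, each mapping one or more integer indices to a numeric parameter.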

  8. Framing access to medicines in developing countries: an analysis of media coverage of Canada's Access to Medicines Regime

    PubMed Central

    2010-01-01

    Background In September 2003, the Canadian government committed to developing legislation that would facilitate greater access to affordable medicines for developing countries. Over the course of eight months, the legislation, now known as Canada's Access to Medicines Regime (CAMR), went through a controversial policy development process and the newspaper media was one of the major venues in which the policy debates took place. The purpose of this study was to examine how the media framed CAMR to determine how policy goals were conceptualized, which stakeholder interests controlled the public debate and how these variables related to the public policy process. Methods We conducted a qualitative content analysis of newspaper coverage of the CAMR policy and implementation process from 2003-2008. The primary theoretical framework for this study was framing theory. A total of 90 articles from 11 Canadian newspapers were selected for inclusion in our analysis. A team of four researchers coded the articles for themes relating to access to medicines and which stakeholders' voice figured more prominently on each issue. Stakeholders examined included: the research-based industry, the generic industry, civil society, the Canadian government, and developing country representatives. Results The most frequently mentioned themes across all documents were the issues of drug affordability, intellectual property, trade agreements and obligations, and development. Issues such as human rights, pharmaceutical innovation, and economic competitiveness got little media representation. Civil society dominated the media contents, followed far behind by the Canadian government, the research-based and generic pharmaceutical industries. Developing country representatives were hardly represented in the media. 
Conclusions Media framing obscured the discussion of some of the underlying policy goals in this case and failed to highlight issues which are now significant barriers to the use of the legislation. Using the media to engage the public in more in-depth exploration of the policy issues at stake may contribute to a more informed policy development process. The media can be an effective channel for those stakeholders with a weaker voice in policy deliberations to raise public attention to particular issues; however, the political and institutional context must be taken into account as it may outweigh media framing effects. PMID:20044940

  9. Proposal for a new content model for the Austrian Procedure Catalogue.

    PubMed

    Neururer, Sabrina B; Pfeiffer, Karl P

    2013-01-01

    The Austrian Procedure Catalogue is used for procedure coding in Austria. Its architecture and content have some major weaknesses. The aim of this study is the presentation of a new potential content model for this classification system consisting of main characteristics of health interventions. It is visualized using a UML class diagram. Based on this proposition, an implementation of an ontology for procedure coding is planned.

  10. Method for the prediction of the installation aerodynamics of a propfan at subsonic speeds: User manual

    NASA Technical Reports Server (NTRS)

    Chandrasekaran, B.

    1986-01-01

    This document is the user's guide for the method developed earlier for predicting the slipstream wing interaction at subsonic speeds. The analysis involves a subsonic panel code (HESS code) modified to handle the propeller onset flow. The propfan slipstream effects are superimposed on the normal flow boundary condition and are applied over the surface washed by the slipstream. The effects of the propeller slipstream are to increase the axial induced velocity and tangential velocity and to produce a total pressure rise in the wake of the propeller. Principles based on blade performance theory, momentum theory, and vortex theory were used to evaluate the slipstream effects. The code can be applied to any arbitrary three dimensional geometry, expressed in the form of HESS input format. The code can handle a propeller alone configuration or a propeller/nacelle/airframe configuration, operating up to high subcritical Mach numbers over a range of angles of attack. Inclusion of viscous modelling is briefly outlined. Wind tunnel results/theory comparisons are included as examples for the application of the code to a generic supercritical wing/overwing nacelle with a powered propfan. A sample input/output listing is provided.

  11. Toward Developing a Universal Code of Ethics for Adult Educators.

    ERIC Educational Resources Information Center

    Siegel, Irwin H.

    2000-01-01

    Presents conflicting viewpoints on a universal code of ethics for adult educators. Suggests objectives of a code (guidance for practice, policymaking direction, common reference point, shared values). Outlines content and methods for implementing a code. (SK)

  12. Edge-diffraction effects in RCS predictions and their importance in systems analysis

    NASA Astrophysics Data System (ADS)

    Friess, W. F.; Klement, D.; Ruppel, M.; Stein, Volker

    1996-06-01

    In developing RCS prediction codes, a variety of physical effects, such as edge diffraction, have to be considered, with the consequence that the computational effort increases considerably. This fact limits the field of application of such codes, especially if the RCS data serve as input parameters for system simulators, which very often need these data for a large number of observation angles and/or frequencies. Conversely, the results of a system analysis can be used to estimate the relevance of physical effects from a systems viewpoint and to rank them according to their magnitude. This paper evaluates the importance for systems analysis of RCS predictions that include the edge-diffracted field. A double dihedral with strongly depolarizing behavior and a generic airplane design containing many arbitrarily oriented edges are used as test structures. Data of the scattered field are generated by the RCS computer code SIGMA with and without edge diffraction effects. These data are submitted to the code DORA to determine radar range and radar detectability, and to a SAR simulator code to generate SAR imagery. In both cases special scenarios are assumed. The essential features of the computer codes in their current state are described, and the results are presented and discussed from a systems viewpoint.

  13. A deep learning and novelty detection framework for rapid phenotyping in high-content screening

    PubMed Central

    Sommer, Christoph; Hoefler, Rudolf; Samwer, Matthias; Gerlich, Daniel W.

    2017-01-01

    Supervised machine learning is a powerful and widely used method for analyzing high-content screening data. Despite its accuracy, efficiency, and versatility, supervised machine learning has drawbacks, most notably its dependence on a priori knowledge of expected phenotypes and time-consuming classifier training. We provide a solution to these limitations with CellCognition Explorer, a generic novelty detection and deep learning framework. Application to several large-scale screening data sets on nuclear and mitotic cell morphologies demonstrates that CellCognition Explorer enables discovery of rare phenotypes without user training, which has broad implications for improved assay development in high-content screening. PMID:28954863
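    CellCognition Explorer pairs novelty detection with deep learning; the underlying idea, flagging cells whose features deviate from a negative-control distribution without any user-trained classifier, can be sketched with a simple per-feature z-score. This is an illustrative stand-in, not the paper's method:

```python
from statistics import mean, stdev

def novelty_flags(control_features, sample_features, threshold=3.0):
    """Flag samples that deviate from the negative-control distribution.
    Each sample is 'novel' if any feature lies more than `threshold`
    standard deviations from the control mean (no labeled phenotypes needed)."""
    mu = [mean(col) for col in zip(*control_features)]
    sd = [stdev(col) or 1.0 for col in zip(*control_features)]  # guard sd == 0
    flags = []
    for feat in sample_features:
        z = max(abs((f - m) / s) for f, m, s in zip(feat, mu, sd))
        flags.append(z > threshold)
    return flags
```

    In a screening setting, the flagged outliers are the candidate rare phenotypes that would then be inspected or clustered, which mirrors why novelty detection avoids the a priori phenotype knowledge supervised classifiers require.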

  14. Context-dependent effects of noise on echolocation pulse characteristics in free-tailed bats

    PubMed Central

    Smotherman, Michael S.

    2010-01-01

    Background noise evokes a similar suite of adaptations in the acoustic structure of communication calls across a diverse range of vertebrates. Echolocating bats may have evolved specialized vocal strategies for echolocating in noise, but also seem to exhibit generic vertebrate responses such as the ubiquitous Lombard response. We wondered how bats balance generic and echolocation-specific vocal responses to noise. To address this question, we first characterized the vocal responses of flying free-tailed bats (Tadarida brasiliensis) to broadband noises varying in amplitude. Secondly, we measured the bats’ responses to band-limited noises that varied in the extent of overlap with their echolocation pulse bandwidth. We hypothesized that the bats’ generic responses to noise would be graded proportionally with noise amplitude, total bandwidth and frequency content, and consequently that more selective responses to band-limited noise such as the jamming avoidance response could be explained by a linear decomposition of the response to broadband noise. Instead, the results showed that both the nature and the magnitude of the vocal responses varied with the acoustic structure of the outgoing pulse as well as non-linearly with noise parameters. We conclude that free-tailed bats utilize separate generic and specialized vocal responses to noise in a context-dependent fashion. PMID:19672604

  15. Sequence-based heuristics for faster annotation of non-coding RNA families.

    PubMed

    Weinberg, Zasha; Ruzzo, Walter L

    2006-01-01

    Non-coding RNAs (ncRNAs) are functional RNA molecules that do not code for proteins. Covariance Models (CMs) are a useful statistical tool to find new members of an ncRNA gene family in a large genome database, using both sequence and, importantly, RNA secondary structure information. Unfortunately, CM searches are extremely slow. Previously, we created rigorous filters, which provably sacrifice none of a CM's accuracy, while making searches significantly faster for virtually all ncRNA families. However, these rigorous filters make searches slower than heuristics could be. In this paper we introduce profile HMM-based heuristic filters. We show that their accuracy is usually superior to heuristics based on BLAST. Moreover, we compared our heuristics with those used in tRNAscan-SE, whose heuristics incorporate a significant amount of work specific to tRNAs, whereas our heuristics are generic to any ncRNA. Performance was roughly comparable, so we expect that our heuristics provide a high-quality solution that--unlike family-specific solutions--can scale to hundreds of ncRNA families. The source code is available under the GNU Public License at the supplementary web site.
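    The role of a fast sequence-based filter ahead of a slow CM search can be sketched with a simple position-weight-matrix scan; this illustrates the generic filtering idea only and is not the paper's profile-HMM construction:

```python
from math import log

def pwm_filter(genome, pwm, threshold):
    """Heuristic prefilter sketch: score every window against a position
    weight matrix (log-odds vs. a uniform 0.25 background) and keep only
    windows clearing the threshold, so that only those candidates reach
    the slow downstream (e.g. CM) search."""
    w = len(pwm)  # one dict of base probabilities per motif position
    hits = []
    for i in range(len(genome) - w + 1):
        window = genome[i:i + w]
        score = sum(log(pwm[j].get(base, 1e-9) / 0.25)
                    for j, base in enumerate(window))
        if score >= threshold:
            hits.append((i, window))
    return hits
```

    The trade-off mirrors the abstract: a lower threshold keeps more candidates (slower but safer), while a higher threshold is faster but, unlike the earlier rigorous filters, may discard true family members.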

  16. Euler Technology Assessment for Preliminary Aircraft Design: Compressibility Predictions by Employing the Cartesian Unstructured Grid SPLITFLOW Code

    NASA Technical Reports Server (NTRS)

    Finley, Dennis B.; Karman, Steve L., Jr.

    1996-01-01

    The objective of the second phase of the Euler Technology Assessment program was to evaluate the ability of Euler computational fluid dynamics codes to predict compressible flow effects over a generic fighter wind tunnel model. This portion of the study was conducted by Lockheed Martin Tactical Aircraft Systems, using an in-house Cartesian-grid code called SPLITFLOW. The Cartesian grid technique offers several advantages, including ease of volume grid generation and reduced number of cells compared to other grid schemes. SPLITFLOW also includes grid adaption of the volume grid during the solution to resolve high-gradient regions. The SPLITFLOW code predictions of configuration forces and moments are shown to be adequate for preliminary design, including predictions of sideslip effects and the effects of geometry variations at low and high angles-of-attack. The transonic pressure prediction capabilities of SPLITFLOW are shown to be improved over subsonic comparisons. The time required to generate the results from initial surface data is on the order of several hours, including grid generation, which is compatible with the needs of the design environment.

  17. Seeing the Invisible: Embedding Tests in Code That Cannot be Modified

    NASA Technical Reports Server (NTRS)

    O'Malley, Owen; Mansouri-Samani, Masoud; Mehlitz, Peter; Penix, John

    2005-01-01

    Characterizing and observing valid software behavior during testing can be very difficult in flight systems. To address this issue, we evaluated several approaches to increasing test observability on the Shuttle Abort Flight Management (SAFM) system. To increase test observability, we added probes into the running system to evaluate the internal state and analyze test data. To minimize the impact of the instrumentation and reduce manual effort, we used Aspect-Oriented Programming (AOP) tools to instrument the source code. We developed and elicited a spectrum of properties, from generic to application specific properties, to be monitored via the instrumentation. To evaluate additional approaches, SAFM was ported to Linux, enabling the use of gcov for measuring test coverage, Valgrind for looking for memory usage errors, and libraries for finding non-normal floating point values. An in-house C++ source code scanning tool was also used to identify violations of SAFM coding standards, and other potentially problematic C++ constructs. Using these approaches with the existing test data sets, we were able to verify several important properties, confirm several problems and identify some previously unidentified issues.

  18. Program ratings do not predict negative content in commercials on children's channels.

    PubMed

    Dale, Lourdes P; Klein, Jordana; DiLoreto, James; Pidano, Anne E; Borto, Jolanta W; McDonald, Kathleen; Olson, Heather; Neace, William P

    2011-01-01

    The aim of this study was to determine the presence of negative content in commercials airing on 3 children's channels (Disney Channel, Nickelodeon, and Cartoon Network). The 1681 commercials were coded with a reliable coding system and content comparisons were made. Although the majority of the commercials were coded as neutral, negative content was present in 13.5% of commercials. This rate was significantly more than the predicted value of zero and more similar to the rates cited in previous research examining content during sporting events. The rate of negative content was less than, but not significantly different from, the rate of positive content. Thus, our findings did not support our hypothesis that there would be more commercials with positive content than with negative content. Logistic regression analysis indicated that channel, and not rating, was a better predictor of the presence of overall negative content and the presence of violent behaviors. Commercials airing on the Cartoon Network had significantly more negative content, and those airing on Disney Channel had significantly less negative content than the other channels. Within the individual channels, program ratings did not relate to the presence of negative content. Parents cannot assume the content of commercials will be consistent with the program rating or label. Pediatricians and psychologists should educate parents about the potential for negative content in commercials and advocate for a commercials rating system to ensure that there is greater parity between children's programs and the corresponding commercials.

  19. The "Motherese" of Mr. Rogers: A Description of the Dialogue of Educational Television Programs.

    ERIC Educational Resources Information Center

    Rice, Mabel L.; Haight, Patti L.

    Dialogue from 30-minute samples from "Sesame Street" and "Mr. Rogers' Neighborhood" was coded for grammar, content, and discourse. Grammatical analysis used the LINGQUEST computer-assisted language assessment program (Mordecai, Palen, and Palmer 1982). Content coding was based on categories developed by Rice (1984) and…

  20. Prescription Drug Abuse Information in D.A.R.E.

    ERIC Educational Resources Information Center

    Morris, Melissa C.; Cline, Rebecca J. Welch; Weiler, Robert M.; Broadway, S. Camille

    2006-01-01

    This investigation was designed to examine prescription drug-related content and learning objectives in Drug Abuse Resistance Education (D.A.R.E.) for upper elementary and middle schools. Specific prescription-drug topics and context associated with content and objectives were coded. The coding system for topics included 126 topics organized…

  1. Analyzing Prosocial Content on T.V.

    ERIC Educational Resources Information Center

    Davidson, Emily S.; Neale, John M.

    To enhance knowledge of television content, a prosocial code was developed by watching a large number of potentially prosocial television programs and making notes on all the positive acts. The behaviors were classified into a workable number of categories. The prosocial code is largely verbal and contains seven categories which fall into two…

  2. Generic Detection of Register Realignment

    NASA Astrophysics Data System (ADS)

    Ďurfina, Lukáš; Kolář, Dušan

    2011-09-01

    Register realignment is a method of binary obfuscation used by malware writers. The paper introduces a method for recognizing register realignment through analysis based on scattered context grammars. Such an analysis includes exploring the bytes affected by realignment, finding new valid values for them, building the scattered context grammar, and parsing the obfuscated code with this grammar. The created grammar has the LL property, so it can be parsed by an LL parser.

  4. Design and implementation of a compliant robot with force feedback and strategy planning software

    NASA Technical Reports Server (NTRS)

    Premack, T.; Strempek, F. M.; Solis, L. A.; Brodd, S. S.; Cutler, E. P.; Purves, L. R.

    1984-01-01

    Force-feedback robotics techniques are being developed for automated precision assembly and servicing of NASA space flight equipment. Design and implementation of a prototype robot that provides compliance and monitors forces is in progress. Computer software to specify assembly steps and make force-feedback adjustments during assembly has been coded and tested for three generically different precision mating problems. A model program demonstrates that a suitably autonomous robot can plan its own strategy.

  5. An assessment of viscous effects in computational simulation of benign and burst vortex flows on generic fighter wind-tunnel models using TEAM code

    NASA Technical Reports Server (NTRS)

    Kinard, Tim A.; Harris, Brenda W.; Raj, Pradeep

    1995-01-01

    Vortex flows on a twin-tail and a single-tail modular transonic vortex interaction (MTVI) model, representative of a generic fighter configuration, are computationally simulated in this study using the Three-dimensional Euler/Navier-Stokes Aerodynamic Method (TEAM). The primary objective is to provide an assessment of viscous effects on benign (10 deg angle of attack) and burst (35 deg angle of attack) vortex flow solutions. This study was conducted in support of a NASA project aimed at assessing the viability of using Euler technology to predict aerodynamic characteristics of aircraft configurations at moderate-to-high angles of attack in a preliminary design environment. The TEAM code solves the Euler and Reynolds-averaged Navier-Stokes equations on patched multiblock structured grids. Its algorithm is based on a cell-centered finite-volume formulation with a multistage time-stepping scheme. Viscous effects are assessed by comparing the computed inviscid and viscous solutions with each other and with experimental data. Also, results of Euler solution sensitivity to grid density and numerical dissipation are presented for the twin-tail model. The results show that proper accounting of viscous effects is necessary for detailed design and optimization, but Euler solutions can provide meaningful guidelines for preliminary design of flight vehicles which exhibit vortex flows in parts of their flight envelope.

  6. Preparing Content Area Teachers for Disciplinary Literacy Instruction: The Role of Literacy Teacher Educators

    ERIC Educational Resources Information Center

    Fang, Zhihui

    2014-01-01

    The recent call for secondary reading instruction to move away from a focus on generic literacy strategies to discipline-specific language and literacy practices presents new challenges for secondary teacher preparation. This column identifies some of the roles literacy teacher educators can play in helping address these challenges.

  7. 40 CFR Appendix B to Part 72 - Methodology for Conversion of Emissions Limits

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Methodology for Conversion of... Conversion of Emissions Limits For the purposes of the Acid Rain Program, all emissions limits must be... conditions. Generic conversions for these limits are based on the assumed average energy contents listed in...

  8. 40 CFR Appendix B to Part 72 - Methodology for Conversion of Emissions Limits

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false Methodology for Conversion of... Conversion of Emissions Limits For the purposes of the Acid Rain Program, all emissions limits must be... conditions. Generic conversions for these limits are based on the assumed average energy contents listed in...

  9. Transfer of the Pedagogical Transformation Competence across Chemistry Topics

    ERIC Educational Resources Information Center

    Mavhunga, Elizabeth

    2016-01-01

    Pedagogical Content Knowledge (PCK) observed in one topic is commonly understood not to be transferable to another topic. This study asked, what can then be transferred in the context of learning and acquiring PCK? The study firstly posits the existence of a generic pedagogical competence that is developed in pre-service teachers to pedagogically…

  10. Manufacturing Math Classes: An Instructional Program Guide for Manufacturing Workers.

    ERIC Educational Resources Information Center

    McBride, Pamela G.; And Others

    This program guide documents a manufacturing job family curriculum that develops competence in generic work force education skills through three courses: Reading Rulers, Charts, and Gauges and Math for Manufacturing Workers I and II. An annotated table of contents lists a brief description of the questions answered in each section. An introduction…

  11. Information Technology in University-Level Mathematics Teaching and Learning: A Mathematician's Point of View

    ERIC Educational Resources Information Center

    Borovik, Alexandre

    2011-01-01

    Although mathematicians frequently use specialist software in direct teaching of mathematics, as a means of delivery e-learning technologies have so far been less widely used. We (mathematicians) insist that teaching methods should be subject-specific and content-driven, not delivery-driven. We oppose generic approaches to teaching, including…

  12. Education, Globalisation and the "Voice of Knowledge"

    ERIC Educational Resources Information Center

    Young, Michael

    2009-01-01

    This paper argues that underlying the links being made between the need for educational change in responding to the knowledge economy is an evacuation of the content of curricula and a misplaced emphasis on "genericism" and experience. As an alternative the paper draws on ideas from Durkheim, Vygotsky and Bernstein to make the case for…

  13. Conceptual Underpinnings of the Quality of Life in Neurological Disorders (Neuro-QoL): Comparisons of Core Sets for Stroke, Multiple Sclerosis, Spinal Cord Injury, and Traumatic Brain Injury.

    PubMed

    Wong, Alex W K; Lau, Stephen C L; Fong, Mandy W M; Cella, David; Lai, Jin-Shei; Heinemann, Allen W

    2018-04-03

    To determine the extent to which the content of the Quality of Life in Neurological Disorders (Neuro-QoL) covers the International Classification of Functioning, Disability and Health (ICF) Core Sets for multiple sclerosis (MS), stroke, spinal cord injury (SCI), and traumatic brain injury (TBI) using summary linkage indicators. Content analysis by linking content of the Neuro-QoL to corresponding ICF codes of each Core Set for MS, stroke, SCI, and TBI. Three academic centers. None. None. Four summary linkage indicators proposed by MacDermid et al. were estimated to compare the content coverage between Neuro-QoL and the ICF codes of Core Sets for MS, stroke, SCI, and TBI. Neuro-QoL represented 20% to 30% of Core Set codes for different conditions, in which more codes in Core Sets for MS (29%), stroke (28%), and TBI (28%) were covered than those for SCI in the long-term (20%) and early postacute (19%) contexts. Neuro-QoL represented nearly half of the unique Activity and Participation codes (43%-49%) and less than one third of the unique Body Function codes (12%-32%). It represented fewer Environmental Factors codes (2%-6%) and no Body Structures codes. Absolute linkage indicators found that at least 60% of Neuro-QoL items were linked to Core Set codes (63%-95%), but many items covered the same codes as revealed by unique linkage indicators (7%-13%), suggesting high concept redundancy among items. The Neuro-QoL links more closely to ICF Core Sets for stroke, MS, and TBI than to those for SCI, and primarily covers activity and participation ICF domains. Other instruments are needed to address concepts not measured by the Neuro-QoL when a comprehensive health assessment is needed. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  14. Generic absence of strong singularities in loop quantum Bianchi-IX spacetimes

    NASA Astrophysics Data System (ADS)

    Saini, Sahil; Singh, Parampreet

    2018-03-01

    We study the generic resolution of strong singularities in loop quantized effective Bianchi-IX spacetime in two different quantizations—the connection operator based ‘A’ quantization and the extrinsic curvature based ‘K’ quantization. We show that in the effective spacetime description with arbitrary matter content, it is necessary to include inverse triad corrections to resolve all the strong singularities in the ‘A’ quantization, whereas in the ‘K’ quantization these results can be obtained without including inverse triad corrections. Under these conditions, the energy density, expansion and shear scalars for both quantization prescriptions are bounded. Notably, both quantizations can result in potentially curvature divergent events if the matter content allows divergences in the partial derivatives of the energy density with respect to the triad variables at a finite energy density. Such events are found to be weak curvature singularities beyond which geodesics can be extended in the effective spacetime. Our results show that all potential strong curvature singularities of the classical theory are forbidden in Bianchi-IX spacetime in loop quantum cosmology and geodesic evolution never breaks down for such events.

  15. Scene Context Dependency of Pattern Constancy of Time Series Imagery

    NASA Technical Reports Server (NTRS)

    Woodell, Glenn A.; Jobson, Daniel J.; Rahman, Zia-ur

    2008-01-01

    A fundamental element of future generic pattern recognition technology is the ability to extract similar patterns for the same scene despite wide-ranging extraneous variables, including lighting, turbidity, sensor exposure variations, and signal noise. In the process of demonstrating pattern constancy of this kind for retinex/visual servo (RVS) image enhancement processing, we found that the pattern constancy performance depended somewhat on scene content. Most notably, the scene topography and, in particular, the scale and extent of the topography in an image, affects the pattern constancy the most. This paper will explore these effects in more depth and present experimental data from several time series tests. These results further quantify the impact of topography on pattern constancy. Despite this residual inconstancy, the results of overall pattern constancy testing support the idea that RVS image processing can be a universal front-end for generic visual pattern recognition. While the effects on pattern constancy were significant, the RVS processing still achieves a high degree of pattern constancy over a wide spectrum of scene content diversity and wide-ranging extraneous variations in lighting, turbidity, and sensor exposure.

  16. Does the Genetic Code Have A Eukaryotic Origin?

    PubMed Central

    Zhang, Zhang; Yu, Jun

    2013-01-01

    In the RNA world, RNA is assumed to be the dominant macromolecule performing most, if not all, core “house-keeping” functions. The ribo-cell hypothesis suggests that the genetic code and the translation machinery may both be born of the RNA world, and the introduction of DNA to ribo-cells may take over the informational role of RNA gradually, such as a mature set of genetic code and mechanism enabling stable inheritance of sequence and its variation. In this context, we modeled the genetic code in two content variables—GC and purine contents—of protein-coding sequences and measured the purine content sensitivities for each codon when the sensitivity (% usage) is plotted as a function of GC content variation. The analysis leads to a new pattern—the symmetric pattern—where the sensitivity of purine content variation shows diagonal symmetry in the codon table more significantly in the two GC content invariable quarters, in addition to the two existing patterns where the table is divided into either four GC content sensitivity quarters or two amino acid diversity halves. The most insensitive codon sets are GUN (valine) and CAN (CAR for glutamine and CAY for histidine), and the most biased amino acid is valine (always over-estimated) followed by alanine (always under-estimated). The unique position of valine and its codons suggests its key roles in the final recruitment of the complete codon set of the canonical table. The distinct choice may only be attributable to sequence signatures or signals of splice sites for spliceosomal introns shared by all extant eukaryotes. PMID:23402863
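
The two content variables used in the modeling, GC and purine content, are simple fractions of a coding sequence. The helper below is a generic illustration of that computation, not code from the paper, and the example sequence is made up:

```python
def gc_and_purine_content(seq: str):
    """Return (GC fraction, purine fraction) of an RNA/DNA sequence.

    GC content counts G and C; purine content counts A and G.
    T is mapped to U so DNA and RNA input behave identically.
    """
    seq = seq.upper().replace("T", "U")
    n = len(seq)
    gc = sum(base in "GC" for base in seq) / n
    purine = sum(base in "AG" for base in seq) / n
    return gc, purine

# Example: the valine codon family GUN varies GC content only at the
# third position while the first two bases stay fixed.
gc, pur = gc_and_purine_content("GUAGUCGUGGUU")
```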

  17. [Preliminarily application of content analysis to qualitative nursing data].

    PubMed

    Liang, Shu-Yuan; Chuang, Yeu-Hui; Wu, Shu-Fang

    2012-10-01

    Content analysis is a methodology for objectively and systematically studying the content of communication in various formats. Content analysis in nursing research and nursing education is called qualitative content analysis. Qualitative content analysis is frequently applied to nursing research, as it allows researchers to determine categories inductively and deductively. This article examines qualitative content analysis in nursing research from theoretical and practical perspectives. We first describe how content analysis concepts such as unit of analysis, meaning unit, code, category, and theme are used. Next, we describe the basic steps involved in using content analysis, including data preparation, data familiarization, analysis unit identification, creating tentative coding categories, category refinement, and establishing category integrity. Finally, this paper introduces the concept of content analysis rigor, including dependability, confirmability, credibility, and transferability. This article elucidates the content analysis method in order to help professionals conduct systematic research that generates data that are informative and useful in practical application.

  18. A multi-site cognitive task analysis for biomedical query mediation.

    PubMed

    Hruby, Gregory W; Rasmussen, Luke V; Hanauer, David; Patel, Vimla L; Cimino, James J; Weng, Chunhua

    2016-09-01

    To apply cognitive task analyses of the Biomedical query mediation (BQM) processes for EHR data retrieval at multiple sites towards the development of a generic BQM process model. We conducted semi-structured interviews with eleven data analysts from five academic institutions and one government agency, and performed cognitive task analyses on their BQM processes. A coding schema was developed through iterative refinement and used to annotate the interview transcripts. The annotated dataset was used to reconstruct and verify each BQM process and to develop a harmonized BQM process model. A survey was conducted to evaluate the face and content validity of this harmonized model. The harmonized process model is hierarchical, encompassing tasks, activities, and steps. The face validity evaluation concluded the model to be representative of the BQM process. In the content validity evaluation, out of the 27 tasks for BQM, 19 meet the threshold for semi-valid, including 3 fully valid: "Identify potential index phenotype," "If needed, request EHR database access rights," and "Perform query and present output to medical researcher", and 8 are invalid. We aligned the goals of the tasks within the BQM model with the five components of the reference interview. The similarity between the process of BQM and the reference interview is promising and suggests the BQM tasks are powerful for eliciting implicit information needs. We contribute a BQM process model based on a multi-site study. This model promises to inform the standardization of the BQM process towards improved communication efficiency and accuracy. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Generic trending and analysis system

    NASA Technical Reports Server (NTRS)

    Keehan, Lori; Reese, Jay

    1994-01-01

    The Generic Trending and Analysis System (GTAS) is a generic spacecraft performance monitoring tool developed by NASA Code 511 and Loral Aerosys. It is designed to facilitate quick anomaly resolution and trend analysis. Traditionally, the job of off-line analysis has been performed using hardware and software systems developed for real-time spacecraft contacts; then, the systems were supplemented with a collection of tools developed by Flight Operations Team (FOT) members. Since the number of upcoming missions is increasing, NASA can no longer afford to operate in this manner. GTAS improves control center productivity and effectiveness because it provides a generic solution across multiple missions. Thus, GTAS eliminates the need for each individual mission to develop duplicate capabilities. It also allows for more sophisticated tools to be developed because it draws resources from several projects. In addition, the GTAS software system incorporates commercial off-the-shelf (COTS) software packages and reuses components of other NASA-developed systems wherever possible. GTAS has incorporated lessons learned from previous missions by involving the users early in the development process. GTAS users took a proactive role in requirements analysis, design, development, and testing. Because of user involvement, several special tools were designed and are now being developed. GTAS users expressed considerable interest in facilitating data collection for long term trending and analysis. As a result, GTAS provides easy access to large volumes of processed telemetry data directly in the control center. The GTAS archival and retrieval capabilities are supported by the integration of optical disk technology and a COTS relational database management system.

  1. Transitioning to a national health system in Cyprus: a stakeholder analysis of pharmaceutical policy reform

    PubMed Central

    Kanavos, Panos G

    2015-01-01

    Objective To review the pharmaceutical sector in Cyprus in terms of the availability and affordability of medicines and to explore pharmaceutical policy options for the national health system finance reform expected to be introduced in 2016. Methods We conducted semi-structured interviews in April 2014 with senior representatives from seven key national organizations involved in pharmaceutical care. The captured data were coded and analysed using the predetermined themes of pricing, reimbursement, prescribing, dispensing and cost sharing. We also examined secondary data provided by the Cypriot Ministry of Health; these data included the prices and volumes of prescription medicines in 2013. Findings We identified several key issues, including high medicine prices, underuse of generic medicines and high out-of-pocket drug spending. Most stakeholders recommended that the national government review existing pricing policies to ensure medicines within the forthcoming national health system are affordable and available, introduce a national reimbursement system and incentivize the prescribing and dispensing of generic medicines. There were disagreements over how to (i) allocate responsibilities to governmental agencies in the national health system, (ii) reconcile differences in opinion between stakeholders and (iii) raise awareness among patients, physicians and pharmacists about the benefits of greater generic drug use. Conclusion In Cyprus, if the national health system is going to provide universal health coverage in a sustainable fashion, then the national government must address the current issues in the pharmaceutical sector. Importantly, the country will need to increase the market share of generic medicines to contain drug spending. PMID:26478624

  2. [Generics: essentially similar, bioequivalent but not identical].

    PubMed

    Even-Adin, D; De Muylder, J A; Sternon, J

    2001-12-01

    The use of generic forms (GF) is presented as a potential source of budgetary savings in pharmaceutical expenses. Infrequently prescribed in Belgium, they have gained new interest thanks to the recent introduction of "reference reimbursement". Marketing authorization of GF is governed by European rules, but some questions about their equivalence to the original medications remain. Do similarities based only upon qualitative and quantitative composition in active molecules, pharmaceutical form, and bioavailability give us all the requested guarantees? Several kinds of discordance can appear: the major elements of non-conformity are the nature of the excipients, the contents of the package insert, and the quality of the bioavailability studies. However, in economic terms, the development of GF in the drug market appears to constitute an unavoidable phenomenon.

  3. Dream content: Individual and generic aspects.

    PubMed

    Hobson, Allan; Kahn, David

    2007-12-01

    Dream reports were collected from normal subjects in an effort to determine the degree to which dream reports can be used to identify individual dreamers. Judges were asked to group the reports by their authors. The judges scored the reports correctly at chance levels. This finding indicated that dreams may be at least as much like each other as they are the signature of individual dreamers. Our results suggest that dream reports cannot be used to identify the individuals who produced them when identifiers like names and gender of friends and family members are removed from the dream report. In addition to using dreams to learn about an individual, we must look at dreams as telling us about important common or generic aspects of human consciousness.

  4. A visual interface for generic message translation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blattner, M.M.; Kou, L.T.; Carlson, J.W.

    1988-06-21

    This paper is concerned with the translation of data structures we call messages. Messages are an example of a type of data structure encountered in generic data translation. Our objective is to provide a system that the nonprogrammer can use to specify the nature of translations from one type to another. For this reason we selected a visual interface that uses interaction techniques that do not require a knowledge of programming or command languages. The translator must accomplish two tasks: create a mapping between fields in different message types that specifies which fields have similar semantic content, and reformat or translate data specifications within those fields. The translations are accomplished with appropriate, but different, visual metaphors. 14 refs., 4 figs.

  5. Diabetes Mellitus Coding Training for Family Practice Residents.

    PubMed

    Urse, Geraldine N

    2015-07-01

    Although physicians regularly use numeric coding systems such as the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) to describe patient encounters, coding errors are common. One of the most complicated diagnoses to code is diabetes mellitus. The ICD-9-CM currently has 39 separate codes for diabetes mellitus; this number will be expanded to more than 50 with the introduction of ICD-10-CM in October 2015. To assess the effect of a 1-hour focused presentation on ICD-9-CM codes on diabetes mellitus coding. A 1-hour focused lecture on the correct use of diabetes mellitus codes for patient visits was presented to family practice residents at Doctors Hospital Family Practice in Columbus, Ohio. To assess resident knowledge of the topic, a pretest and posttest were given to residents before and after the lecture, respectively. Medical records of all patients with diabetes mellitus who were cared for at the hospital 6 weeks before and 6 weeks after the lecture were reviewed and compared for the use of diabetes mellitus ICD-9 codes. Eighteen residents attended the lecture and completed the pretest and posttest. The mean (SD) percentage of correct answers was 72.8% (17.1%) for the pretest and 84.4% (14.6%) for the posttest, for an improvement of 11.6 percentage points (P≤.035). The percentage of total available codes used did not substantially change from before to after the lecture, but the use of the generic ICD-9-CM code for diabetes mellitus type II controlled (250.00) declined (58 of 176 [33%] to 102 of 393 [26%]) and the use of other codes increased, indicating a greater variety in codes used after the focused lecture. After a focused lecture on diabetes mellitus coding, resident coding knowledge improved. Review of medical record data did not reveal an overall change in the number of diabetic codes used after the lecture but did reveal a greater variety in the codes used.

  6. Reliability of a rating procedure to monitor industry self-regulation codes governing alcohol advertising content.

    PubMed

    Babor, Thomas F; Xuan, Ziming; Proctor, Dwayne

    2008-03-01

    The purposes of this study were to develop reliable procedures to monitor the content of alcohol advertisements broadcast on television and in other media, and to detect violations of the content guidelines of the alcohol industry's self-regulation codes. A set of rating-scale items was developed to measure the content guidelines of the 1997 version of the U.S. Beer Institute Code. Six focus groups were conducted with 60 college students to evaluate the face validity of the items and the feasibility of the procedure. A test-retest reliability study was then conducted with 74 participants, who rated five alcohol advertisements on two occasions separated by 1 week. Average correlations across all advertisements using three reliability statistics (r, rho, and kappa) were almost all statistically significant and the kappas were good for most items, which indicated high test-retest agreement. We also found high interrater reliabilities (intraclass correlations) among raters for item-level and guideline-level violations, indicating that regardless of the specific item, raters were consistent in their general evaluations of the advertisements. Naïve (untrained) raters can provide consistent (reliable) ratings of the main content guidelines proposed in the U.S. Beer Institute Code. The rating procedure may have future applications for monitoring compliance with industry self-regulation codes and for conducting research on the ways in which alcohol advertisements are perceived by young adults and other vulnerable populations.
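
The test-retest agreement statistics the study reports (notably kappa) can be reproduced with a short function. This is a generic Cohen's-kappa sketch for nominal rating codes, not the study's actual analysis script, and the guard for perfect chance agreement is a convention chosen here:

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two sets of nominal codes (e.g. test vs. retest)."""
    assert len(rater1) == len(rater2) and rater1
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    # Chance agreement: sum over categories of the product of the two
    # raters' marginal proportions.
    expected = sum(c1[k] * c2[k] for k in c1.keys() | c2.keys()) / n ** 2
    if expected == 1.0:  # both raters used one identical code throughout
        return 1.0
    return (observed - expected) / (1.0 - expected)
```

Kappa corrects raw percent agreement for the agreement expected by chance, which is why it is preferred over a simple match rate for coding-scheme reliability.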

  7. MC3, Version 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cawkwell, Marc Jon

    2016-09-09

    The MC3 code is used to perform Monte Carlo simulations in the isothermal-isobaric ensemble (constant number of particles, temperature, and pressure) on molecular crystals. The molecules within the periodic simulation cell are treated as rigid bodies, alleviating the requirement for a complex interatomic potential. Intermolecular interactions are described using generic, atom-centered pair potentials whose parameterization is taken from the literature [D. E. Williams, J. Comput. Chem., 22, 1154 (2001)] and electrostatic interactions arising from atom-centered, fixed, point partial charges. The primary uses of the MC3 code are the computation of i) the temperature and pressure dependence of lattice parameters and thermal expansion coefficients, ii) tensors of elastic constants and compliances via Parrinello and Rahman's fluctuation formula [M. Parrinello and A. Rahman, J. Chem. Phys., 76, 2662 (1982)], and iii) the investigation of polymorphic phase transformations. The MC3 code is written in Fortran90 and requires the LAPACK and BLAS linear algebra libraries to be linked during compilation. Computationally expensive loops are accelerated using OpenMP.
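
Sampling the isothermal-isobaric ensemble hinges on a Metropolis acceptance rule for volume moves. The function below is a textbook-style sketch of that rule (in the form found in standard molecular-simulation references), not code from MC3; the function name and parameters are illustrative:

```python
import math
import random

def accept_volume_move(dU, V_old, V_new, N, P, kT):
    """Metropolis acceptance test for a volume move in the NPT ensemble.

    The effective enthalpy change adds the P*dV work term and an
    N*kT*ln(V_new/V_old) Jacobian term to the potential-energy change dU.
    Moves that lower dH are always accepted; others are accepted with
    probability exp(-dH / kT).
    """
    dH = dU + P * (V_new - V_old) - N * kT * math.log(V_new / V_old)
    return dH <= 0.0 or random.random() < math.exp(-dH / kT)
```

For rigid molecules, as in MC3, dU would come from re-evaluating the atom-centered pair potentials and point-charge electrostatics after rescaling the cell.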

  8. The decoding of Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.

    1988-01-01

    Reed-Solomon (RS) codes form an important part of the high-rate downlink telemetry system for the Magellan mission, and the RS decoding function for this project will be performed by the DSN. Although the basic idea behind all Reed-Solomon decoding algorithms was developed by Berlekamp in 1968, there are dozens of variants of Berlekamp's algorithm in current use. An attempt to restore order is made by presenting a mathematical theory which explains the workings of almost all known RS decoding algorithms. The key innovation that makes this possible is a unified approach to the solution of the key equation, which simultaneously describes the Berlekamp, Berlekamp-Massey, Euclid, and continued-fractions approaches. Additionally, a detailed analysis is made of what can happen to a generic RS decoding algorithm when the number of errors and erasures exceeds the code's designed correction capability, and it is shown that, although most published algorithms do not detect as many of these error-erasure patterns as possible, a small change to the algorithms overcomes this problem.
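    The starting point shared by all the decoding variants above is the syndrome computation that feeds the key equation: a valid RS codeword, being a multiple of the generator polynomial, evaluates to zero at consecutive powers of a primitive element, while any correctable error pattern leaves nonzero syndromes. A minimal sketch over the small prime field GF(101) (chosen for readability; not the actual field or code parameters used by the mission):

```python
# Reed-Solomon syndrome sketch over GF(101); 2 is a primitive element.
P, ALPHA = 101, 2

def poly_mul(a, b):
    """Multiply polynomials (coefficients low-to-high) over GF(P)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] = (out[i + j] + ai * bj) % P
    return out

def poly_eval(c, x):
    """Evaluate a polynomial at x over GF(P) via Horner's rule."""
    r = 0
    for coeff in reversed(c):
        r = (r * x + coeff) % P
    return r

# Generator g(x) = (x - a)(x - a^2)(x - a^3)(x - a^4): every multiple of g
# vanishes at a^1..a^4, so those four evaluations serve as syndromes.
g = [1]
for i in range(1, 5):
    g = poly_mul(g, [-pow(ALPHA, i, P) % P, 1])

message = [7, 3, 0, 1, 9]            # arbitrary message coefficients
codeword = poly_mul(message, g)      # non-systematic encoding, for brevity

syndromes = [poly_eval(codeword, pow(ALPHA, i, P)) for i in range(1, 5)]
corrupted = list(codeword)
corrupted[2] = (corrupted[2] + 5) % P       # single-symbol error
bad = [poly_eval(corrupted, pow(ALPHA, i, P)) for i in range(1, 5)]
```

    The clean codeword yields all-zero syndromes; the corrupted one yields nonzero syndromes, which a Berlekamp-Massey- or Euclid-style solver would then turn into error locations and values.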

  9. Assessment of Self-Regulatory Code Violations in Brazilian Television Beer Advertisements*

    PubMed Central

    Vendrame, Alan; Pinsky, Ilana; Souza E Silva, Rebeca; Babor, Thomas

    2010-01-01

    Objective: Research suggests that alcoholic beverage advertisements may have an adverse effect on teenagers and young adults, owing to their vulnerability to suggestive message content. This study was designed to evaluate perceived violations of the content guidelines of the Brazilian alcohol marketing self-regulation code, based on ratings of the five most popular beer advertisements broadcast on television in the summer of 2005–2006 and during the 2006 FIFA (Fédération Internationale de Football Association) World Cup games. Method: Five beer advertisements were selected from a previous study showing that they were perceived to be highly appealing to a sample of Brazilian teenagers. These advertisements were evaluated by a sample of Brazilian high school students using a rating procedure designed to measure the content of alcohol advertisements covered in industry self-regulation codes. Results: All five advertisements were found to violate multiple guidelines of the Brazilian code of marketing self-regulation. The advertisement with the greatest number of violations was Antarctica's “Male Repellent,” which was perceived to violate 11 of the 16 guidelines in the code. Two advertisements had nine violations, and one had eight. The guidelines most likely to be violated by these advertisements were Guideline 1, which is aimed at protecting children and teenagers, and Guideline 2, which prohibits content encouraging excessive and irresponsible alcoholic beverage consumption. Conclusions: The five beer advertisements rated as most appealing to Brazilian teenagers were perceived by a sample of the same population to have violated numerous principles of the Brazilian self-regulation code governing the marketing of alcoholic beverages. 
Because of these numerous perceived code violations, it now seems important for regulatory authorities to submit industry marketing content to more systematic evaluation by young people and public health experts and for researchers to focus more on the ways in which alcohol advertising influences early onset of drinking and excessive alcohol consumption. PMID:20409439

  10. Assessment of self-regulatory code violations in Brazilian television beer advertisements.

    PubMed

    Vendrame, Alan; Pinsky, Ilana; e Silva, Rebeca Souza; Babor, Thomas

    2010-05-01

    Research suggests that alcoholic beverage advertisements may have an adverse effect on teenagers and young adults, owing to their vulnerability to suggestive message content. This study was designed to evaluate perceived violations of the content guidelines of the Brazilian alcohol marketing self-regulation code, based on ratings of the five most popular beer advertisements broadcast on television in the summer of 2005-2006 and during the 2006 FIFA (Federation Internationale de Football Association) World Cup games. Five beer advertisements were selected from a previous study showing that they were perceived to be highly appealing to a sample of Brazilian teenagers. These advertisements were evaluated by a sample of Brazilian high school students using a rating procedure designed to measure the content of alcohol advertisements covered in industry self-regulation codes. All five advertisements were found to violate multiple guidelines of the Brazilian code of marketing self-regulation. The advertisement with the greatest number of violations was Antarctica's "Male Repellent," which was perceived to violate 11 of the 16 guidelines in the code. Two advertisements had nine violations, and one had eight. The guidelines most likely to be violated by these advertisements were Guideline 1, which is aimed at protecting children and teenagers, and Guideline 2, which prohibits content encouraging excessive and irresponsible alcoholic beverage consumption. The five beer advertisements rated as most appealing to Brazilian teenagers were perceived by a sample of the same population to have violated numerous principles of the Brazilian self-regulation code governing the marketing of alcoholic beverages. 
Because of these numerous perceived code violations, it now seems important for regulatory authorities to submit industry marketing content to more systematic evaluation by young people and public health experts and for researchers to focus more on the ways in which alcohol advertising influences early onset of drinking and excessive alcohol consumption.

  11. MadDM: Computation of dark matter relic abundance

    NASA Astrophysics Data System (ADS)

    Backović, Mihailo; Kong, Kyoungchul; McCaskey, Mathew

    2017-12-01

    MadDM computes dark matter relic abundance and dark matter-nucleus scattering rates in a generic model. The code is based on the existing MadGraph 5 architecture and as such is easily integrable into any MadGraph collider study. A simple Python interface offers a level of user-friendliness characteristic of MadGraph 5 without sacrificing functionality. MadDM is able to calculate the dark matter relic abundance in models which include a multi-component dark sector, resonance annihilation channels, and co-annihilations. The direct detection module of MadDM calculates spin-independent/spin-dependent dark matter-nucleon cross sections and differential recoil rates as a function of recoil energy, angle, and time. The code provides a simplified simulation of detector effects for a wide range of target materials and volumes.
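    The quantity MadDM solves for numerically can be previewed with the textbook freeze-out rule of thumb, Omega h^2 ≈ 3×10⁻²⁷ cm³ s⁻¹ / ⟨σv⟩. This is only a back-of-envelope approximation, not MadDM's actual Boltzmann-equation solver.

```python
# Crude thermal-relic estimate; a sketch under the standard freeze-out
# approximation, not the numerical treatment MadDM performs.
def relic_abundance(sigma_v_cm3_per_s):
    """Approximate relic density Omega h^2 for a thermal relic."""
    return 3e-27 / sigma_v_cm3_per_s

# The canonical thermal cross section ~3e-26 cm^3/s gives Omega h^2 ~ 0.1,
# in the neighborhood of the observed dark matter density.
omega_h2 = relic_abundance(3e-26)
```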

  12. The investigation of tethered satellite system dynamics

    NASA Technical Reports Server (NTRS)

    Lorenzini, E.

    1985-01-01

    Progress in tethered satellite system dynamics research is reported. A retrieval rate control law with no angular feedback was studied to investigate the system's dynamic response. The initial conditions for the computer code which simulates the satellite's rotational dynamics were extended to a generic orbit. The model of the satellite thrusters was modified to simulate a pulsed thrust, by making the SKYHOOK integrator suitable for dealing with delta functions without losing computational efficiency. Tether breaks were simulated with the high-resolution computer code SLACK3. The Shuttle's maneuvers were tested. The electric potential around a severed conductive tether with insulator, in the case of a tether breakage at 20 km from the Shuttle, was computed. The electrodynamic hazards due to the breakage of the TSS electrodynamic tether in a plasma are evaluated.

  13. An assessment of the adaptive unstructured tetrahedral grid, Euler Flow Solver Code FELISA

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Erickson, Larry L.

    1994-01-01

    A three-dimensional solution-adaptive Euler flow solver for unstructured tetrahedral meshes is assessed, and the accuracy and efficiency of the method for predicting sonic boom pressure signatures about simple generic models are demonstrated. Comparison of computational and wind tunnel data and enhancement of numerical solutions by means of grid adaptivity are discussed. The mesh generation is based on the advancing front technique. The FELISA code consists of two solvers, the Taylor-Galerkin and the Runge-Kutta-Galerkin schemes, both of which are spatially discretized by the usual Galerkin weighted-residual finite-element methods but with different explicit time-marching schemes to steady state. The solution-adaptive grid procedure is based on either remeshing or mesh refinement techniques. An alternative geometry-adaptive procedure is also incorporated.

  14. Tool for Rapid Analysis of Monte Carlo Simulations

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; McCall, Kurt E.; Hurtado, John E.

    2013-01-01

    Designing a spacecraft, or any other complex engineering system, requires extensive simulation and analysis work. Oftentimes, the large amounts of simulation data generated are very difficult and time-consuming to analyze, with the added risk of overlooking potentially critical problems in the design. The authors have developed a generic data analysis tool that can quickly sort through large data sets and point an analyst to the areas in the data set that cause specific types of failures. The first version of this tool was a serial code; the current version is a parallel code, which has greatly increased the analysis capabilities. This paper describes the new implementation of this analysis tool on a graphics processing unit, and presents analysis results for NASA's Orion Monte Carlo data to demonstrate its capabilities.
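    A toy sketch of the core idea, scanning a large Monte Carlo result table for runs that match a specific failure signature, is shown below. The column names and failure criteria are invented for illustration; they are not Orion's actual metrics.

```python
# Sketch: filter a large Monte Carlo result set down to the runs matching
# a chosen failure signature, so an analyst can inspect them first.
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical results: columns = [touchdown_speed, fuel_remaining].
runs = rng.uniform([0.0, 0.0], [12.0, 100.0], size=(10000, 2))

hard_landing = runs[:, 0] > 10.0      # hypothetical failure criterion 1
fuel_depleted = runs[:, 1] < 5.0      # hypothetical failure criterion 2
failures = np.flatnonzero(hard_landing & fuel_depleted)
# 'failures' holds the indices of the simulations worth examining first.
```

    A vectorized boolean mask like this is exactly the kind of operation that parallelizes well on a GPU, which motivates the implementation described in the paper.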

  15. PyCorrFit-generic data evaluation for fluorescence correlation spectroscopy.

    PubMed

    Müller, Paul; Schwille, Petra; Weidemann, Thomas

    2014-09-01

    We present a graphical user interface (PyCorrFit) for the fitting of theoretical model functions to experimental data obtained by fluorescence correlation spectroscopy (FCS). The program supports many data file formats and features a set of tools specialized in FCS data evaluation. The Python source code is freely available for download from the PyCorrFit web page at http://pycorrfit.craban.de. We offer binaries for Ubuntu Linux, Mac OS X and Microsoft Windows. © The Author 2014. Published by Oxford University Press.
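    What PyCorrFit automates can be sketched in a few lines: fitting a standard confocal 3D free-diffusion autocorrelation model to an FCS curve. The parameter values and synthetic data below are invented for the example; PyCorrFit's actual model library and fitting tools are far richer.

```python
# Sketch: least-squares fit of a 3D diffusion FCS model to a synthetic,
# noise-free autocorrelation curve.
import numpy as np
from scipy.optimize import curve_fit

def g_3d(tau, n, tau_d, kappa=5.0):
    """Confocal 3D free-diffusion autocorrelation; kappa = aspect ratio."""
    return (1.0 / n) / ((1.0 + tau / tau_d)
                        * np.sqrt(1.0 + tau / (kappa ** 2 * tau_d)))

tau = np.logspace(-6, 0, 200)              # lag times in seconds
data = g_3d(tau, n=4.0, tau_d=1e-3)        # synthetic "measured" curve

popt, _ = curve_fit(g_3d, tau, data, p0=[2.0, 5e-4])
n_fit, tau_d_fit = popt                    # recovered N and diffusion time
```

    On noise-free data the fit recovers the particle number N and the diffusion time tau_d used to generate the curve.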

  16. Comments on the status of revived old names for some North American birds

    USGS Publications Warehouse

    Banks, R.C.; Browning, M.R.

    1995-01-01

    We discuss 44 instances of the use of generic, specific, or subspecific names that differ from those generally in use for North American (sensu AOU 1957) birds. These names are generally older than the names presently used and have been revived on the basis of priority. We examine the basis for the proposed changes and make recommendations as to which names should properly be used in an effort to promote nomenclatural stability in accordance with the International Code of Zoological Nomenclature.

  17. SC'11 Poster: A Highly Efficient MGPT Implementation for LAMMPS; with Strong Scaling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oppelstrup, T; Stukowski, A; Marian, J

    2011-12-07

    The MGPT potential has been implemented as a drop-in package for the general molecular dynamics code LAMMPS. We implement an improved communication scheme that shrinks the communication-layer thickness and improves load balancing. This results in unprecedented strong scaling, with speedup continuing beyond 1/8 atom/core. In addition, we have optimized the small-matrix linear algebra with generic blocking (for all processors) and specific SIMD intrinsics for vectorization on Intel, AMD, and BlueGene CPUs.

  18. Effects of electronic prescribing on formulary compliance and generic drug utilization in the ambulatory care setting: a retrospective analysis of administrative claims data.

    PubMed

    Ross, S Michael; Papshev, Diana; Murphy, Erin L; Sternberg, David J; Taylor, Jeffrey; Barg, Ronald

    2005-06-01

    Electronic prescribing (e-prescribing) provides formulary information at the point of care. The objective of this study was to assess the effects of e-prescribing on formulary compliance and generic utilization. This was a retrospective analysis of pharmacy claims data from a large national managed care organization. A sample of 95 providers using predominantly e-prescribing was randomly selected (e-prescriber group). A comparison sample of 95 traditional prescribers (traditional prescriber group) was selected, matched to the e-prescriber group by zip code and medical specialty. A total of 110,975 paid pharmacy claims, for the 12 months from August 1, 2001, through July 31, 2002, were analyzed to assess the effect of e-prescribing on formulary compliance and generic utilization. All paid pharmacy claims were examined for each group; for the e-prescriber group, this included all claims, not just those prescribed using an e-prescribing device. A written qualitative survey was distributed to physicians and office managers to assess e-prescribing usage, sources of formulary information, and effects of e-prescribing on office resources. Both predominantly e-prescribers and traditional prescribers demonstrated high levels of formulary compliance, 83.2% versus 82.8%, respectively (P=0.32). Formulary compliance for these groups did not differ from that of the overall prescriber population (82.0%). There was no difference in generic drug utilization rates between e-prescribers and traditional prescribers (absolute rates 37.3% versus 36.9%, P=0.18). Qualitative survey responses supported previously reported research indicating reductions in calls both to and from pharmacies for prescription orders. An examination of paid pharmacy claims from a large, national managed care organization demonstrated no differences between predominantly e-prescribers and traditional prescribers in measures of formulary compliance or generic drug utilization. 
Future studies should examine keystroke data at the point of care to observe more detail about drug selection methods.
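    A comparison of two compliance rates like 83.2% versus 82.8% is typically tested with a two-proportion z-test, sketched below. The per-group claim counts are hypothetical (the abstract reports only the 110,975-claim total and the group rates), so the resulting statistic is illustrative rather than a reproduction of the study's P=0.32.

```python
# Sketch: two-sided two-proportion z-test for equality of compliance rates.
from math import erf, sqrt

def two_proportion_z(x1, n1, x2, n2):
    """z statistic and two-sided p-value for H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical split of the claims between the two prescriber groups.
z, p_value = two_proportion_z(46163, 55487, 45941, 55488)
```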

  19. Early Talk About the Past Revisited: Affect in Working-Class and Middle-Class Children's Co-Narrations.

    ERIC Educational Resources Information Center

    Burger, Lisa K.; Miller, Peggy J.

    1999-01-01

    Investigated personal storytelling among young working-class and middle-class children, observing them at home at ages 2;6 and 3;0 (years;months). Analysis of generic properties, narrative content, and emotion talk revealed a complex configuration of similarities and differences. Differentiation between working-class and middle-class…

  20. The Administrator Training Program. A Model of Educational Leadership.

    ERIC Educational Resources Information Center

    Funderburg, Jean; And Others

    This paper describes the Administrator Training Program (ATP), a joint venture between San Jose Unified School District and Stanford University. A discussion of the ATP's theoretical framework is followed by an outline of the structure and content of the program and a review of the ATP outcomes. Then the generic elements of the ATP model are…

  1. Developing Professionalism in the Child Care Industry. An Instructional Program Guide for Child Care Workers.

    ERIC Educational Resources Information Center

    Johnson, Ann; And Others

    This program guide documents a child care job family curriculum that develops competence in generic work force education skills through two minicourses: Basic Issues in Child Care and Child Development Associate. An annotated table of contents lists a brief description of the questions answered in each section. An introduction presents a program…

  2. What Is Heat? Inquiry regarding the Science of Heat

    ERIC Educational Resources Information Center

    Rascoe, Barbara

    2010-01-01

    This lab activity uses inquiry to help students define heat. It is generic in that it can be used to introduce a plethora of science content across middle and high school grade levels and across science disciplines that include biology, Earth and space science, and physical science. Even though heat is a universal science phenomenon that is…

  3. Synergy and Students' Explanations: Exploring the Role of Generic and Content-Specific Scaffolds

    ERIC Educational Resources Information Center

    Delen, Ibrahim; Krajcik, Joseph

    2018-01-01

    In this study, we explored how a teacher used a new mobile application that enables students to collect data inside and outside the classroom, and then use the data to create scientific explanations by using claim-evidence-reasoning framework. Previous technologies designed to support scientific explanations focused on how these programs improve…

  4. Experimental localization of an acoustic sound source in a wind-tunnel flow by using a numerical time-reversal technique.

    PubMed

    Padois, Thomas; Prax, Christian; Valeau, Vincent; Marx, David

    2012-10-01

    The possibility of using the time-reversal technique to localize acoustic sources in a wind-tunnel flow is investigated. While the technique is widespread, it has scarcely been used in aeroacoustics up to now. The proposed method consists of two steps: in a first, experimental step, the acoustic pressure fluctuations are recorded over a linear array of microphones; in a second, numerical step, the experimental data are time-reversed and used as input data for a numerical code solving the linearized Euler equations. The simulation achieves the back-propagation of the waves from the array to the source and takes into account the effect of the mean flow on sound propagation. The ability of the method to localize a sound source in a typical wind-tunnel flow is first demonstrated using simulated data. A generic experiment is then set up in an anechoic wind tunnel to validate the proposed method with a flow at Mach number 0.11. Monopolar sources are considered first, either monochromatic or with a narrow- or wide-band frequency content. The source position is estimated with an error smaller than the wavelength. An application to a dipolar sound source shows that this type of source is also very satisfactorily characterized.

  5. PACER -- A fast running computer code for the calculation of short-term containment/confinement loads following coolant boundary failure. Volume 2: User information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sienicki, J.J.

    A fast running and simple computer code has been developed to calculate pressure loadings inside light water reactor containments/confinements under loss-of-coolant accident conditions. PACER was originally developed to calculate containment/confinement pressure and temperature time histories for loss-of-coolant accidents in Soviet-designed VVER reactors and is relevant to the activities of the US International Nuclear Safety Center. The code employs a multicompartment representation of the containment volume and is focused upon application to early time containment phenomena during and immediately following blowdown. PACER has been developed for FORTRAN 77 and earlier versions of FORTRAN. The code has been successfully compiled and executed on SUN SPARC and Hewlett-Packard HP-735 workstations provided that appropriate compiler options are specified. The code incorporates both capabilities built around a hardwired default generic VVER-440 Model V230 design as well as fairly general user-defined input. However, array dimensions are hardwired and must be changed by modifying the source code if the number of compartments/cells differs from the default number of nine. Detailed input instructions are provided as well as a description of outputs. Input files and selected output are presented for two sample problems run on both HP-735 and SUN SPARC workstations.

  6. Health claims in the labelling and marketing of food products:

    PubMed Central

    Asp, Nils-Georg; Bryngelsson, Susanne

    2007-01-01

    Since 1990 certain health claims in the labelling and marketing of food products have been allowed in Sweden within the food sector's Code of Practice. The rules were developed in close dialogue with the authorities. The legal basis was a decision by the authorities not to apply the medicinal products’ legislation to “foods normally found on the dinner table” provided the rules defined in the Code were followed. The Code of Practice lists nine well-established diet–health relationships eligible for generic disease risk reduction claims in two steps and general rules regarding nutrient function claims. Since 2001, there has also been the possibility for using “product-specific physiological claims (PFP)”, subject to premarketing evaluation of the scientific dossier supporting the claim. The scientific documentation has been approved for 10 products with PFP, and another 15 products have been found to fulfil the Code's criteria for “low glycaemic index”. In the third edition of the Code, active since 2004, conditions in terms of nutritional composition were set, i.e. “nutrient profiles”, with a general reference to the Swedish National Food Administration's regulation on the use of a particular symbol, i.e. the keyhole symbol. Applying the Swedish Code of practice has provided experience useful in the implementation of the European Regulation on nutrition and health claims made on foods, effective from 2007.

  7. Measuring Belief in Conspiracy Theories: The Generic Conspiracist Beliefs Scale

    PubMed Central

    Brotherton, Robert; French, Christopher C.; Pickering, Alan D.

    2013-01-01

    The psychology of conspiracy theory beliefs is not yet well understood, although research indicates that there are stable individual differences in conspiracist ideation – individuals’ general tendency to engage with conspiracy theories. Researchers have created several short self-report measures of conspiracist ideation. These measures largely consist of items referring to an assortment of prominent conspiracy theories regarding specific real-world events. However, these instruments have not been psychometrically validated, and this assessment approach suffers from practical and theoretical limitations. Therefore, we present the Generic Conspiracist Beliefs (GCB) scale: a novel measure of individual differences in generic conspiracist ideation. The scale was developed and validated across four studies. In Study 1, exploratory factor analysis of a novel 75-item measure of non-event-based conspiracist beliefs identified five conspiracist facets. The 15-item GCB scale was developed to sample from each of these themes. Studies 2, 3, and 4 examined the structure and validity of the GCB, demonstrating internal reliability, content, criterion-related, convergent and discriminant validity, and good test-retest reliability. In sum, this research indicates that the GCB is a psychometrically sound and practically useful measure of conspiracist ideation, and the findings add to our theoretical understanding of conspiracist ideation as a monological belief system underpinned by a relatively small number of generic assumptions about the typicality of conspiratorial activity in the world. PMID:23734136
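    The internal-reliability statistic usually reported for a scale like the 15-item GCB is Cronbach's alpha, sketched below on a small made-up response matrix (three items rather than fifteen, purely for brevity).

```python
# Sketch: Cronbach's alpha for a respondents-by-items score matrix.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Five hypothetical respondents answering three Likert-type items.
responses = [[5, 4, 5],
             [2, 2, 3],
             [4, 4, 4],
             [1, 2, 1],
             [3, 3, 4]]
alpha = cronbach_alpha(responses)
```

    When items track each other closely, as in this toy matrix, alpha approaches 1, the pattern a "psychometrically sound" scale is expected to show.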

  8. Chroma sampling and modulation techniques in high dynamic range video coding

    NASA Astrophysics Data System (ADS)

    Dai, Wei; Krishnan, Madhu; Topiwala, Pankaj

    2015-09-01

    High Dynamic Range and Wide Color Gamut (HDR/WCG) video coding is an area of intense research interest in the engineering community, with potential near-term deployment in the marketplace. HDR greatly enhances the dynamic range of video content (up to 10,000 nits), as well as broadening the chroma representation (BT.2020). The resulting content offers new challenges in its coding and transmission. The Moving Picture Experts Group (MPEG) of the International Organization for Standardization (ISO) is currently exploring coding efficiency and/or functionality enhancements of the recently developed HEVC video standard for HDR and WCG content. FastVDO has developed an advanced approach to coding HDR video, based on splitting the HDR signal into a smoothed luminance (SL) signal and an associated base signal (B). Both signals are then chroma-downsampled to YFbFr 4:2:0 signals, using advanced resampling filters, and coded using the Main10 High Efficiency Video Coding (HEVC) standard, which has been developed jointly by ISO/IEC MPEG and ITU-T WP3/16 (VCEG). Our proposal offers both efficient coding and backwards compatibility with the existing HEVC Main10 profile. That is, an existing Main10 decoder can produce a viewable standard dynamic range video, suitable for existing screens. Subjective tests show visible improvement over the anchors. Objective tests show a sizable gain of over 25% in PSNR (RGB domain) on average, for a key set of test clips selected by the ISO/MPEG committee.
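    The 4:2:0 step mentioned above means each chroma plane is reduced 2x in both dimensions before coding. A minimal sketch using plain 2x2 block averaging follows; the paper's pipeline uses more advanced resampling filters than this.

```python
# Sketch: naive 4:2:0 chroma subsampling by averaging 2x2 pixel blocks.
import numpy as np

def downsample_420(chroma):
    """Average non-overlapping 2x2 blocks of a full-resolution chroma plane."""
    h, w = chroma.shape
    blocks = chroma.reshape(h // 2, 2, w // 2, 2)
    return blocks.mean(axis=(1, 3))

plane = np.array([[10, 20, 30, 40],
                  [10, 20, 30, 40],
                  [50, 50, 70, 70],
                  [50, 50, 70, 70]], dtype=float)
sub = downsample_420(plane)      # half the resolution in each dimension
```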

  9. Coding For Compression Of Low-Entropy Data

    NASA Technical Reports Server (NTRS)

    Yeh, Pen-Shu

    1994-01-01

    Improved method of encoding digital data provides for efficient lossless compression of partially or even mostly redundant data from low-information-content source. Method of coding implemented in relatively simple, high-speed arithmetic and logic circuits. Also increases coding efficiency beyond that of established Huffman coding method in that average number of bits per code symbol can be less than 1, which is the lower bound for Huffman code.
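    The sub-1-bit claim above follows from Shannon entropy: a heavily biased binary source carries far less than 1 bit per symbol, so symbol-by-symbol Huffman coding (floor of 1 bit/symbol) is badly suboptimal. A quick numeric check:

```python
# Sketch: entropy of a low-information-content binary source.
from math import log2

def binary_entropy(p):
    """Shannon entropy in bits/symbol of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

h = binary_entropy(0.01)   # a source emitting 1s only 1% of the time
# Huffman coding of single symbols cannot go below 1 bit/symbol, but
# h is roughly 0.08, which run-length or arithmetic-style codes approach.
```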

  10. Simulations of recoiling black holes: adaptive mesh refinement and radiative transfer

    NASA Astrophysics Data System (ADS)

    Meliani, Zakaria; Mizuno, Yosuke; Olivares, Hector; Porth, Oliver; Rezzolla, Luciano; Younsi, Ziri

    2017-02-01

    Context. In many astrophysical phenomena, and especially in those that involve the high-energy regimes that always accompany the astronomical phenomenology of black holes and neutron stars, physical conditions that are achieved are extreme in terms of speeds, temperatures, and gravitational fields. In such relativistic regimes, numerical calculations are the only tool to accurately model the dynamics of the flows and the transport of radiation in the accreting matter. Aims: We here continue our effort of modelling the behaviour of matter when it orbits or is accreted onto a generic black hole by developing a new numerical code that employs advanced techniques geared towards solving the equations of general-relativistic hydrodynamics. Methods: More specifically, the new code employs a number of high-resolution shock-capturing Riemann solvers and reconstruction algorithms, exploiting the enhanced accuracy and the reduced computational cost of adaptive mesh-refinement (AMR) techniques. In addition, the code makes use of sophisticated ray-tracing libraries that, coupled with general-relativistic radiation-transfer calculations, allow us to accurately compute the electromagnetic emissions from such accretion flows. Results: We validate the new code by presenting an extensive series of stationary accretion flows either in spherical or axial symmetry that are performed either in two or three spatial dimensions. In addition, we consider the highly nonlinear scenario of a recoiling black hole produced in the merger of a supermassive black-hole binary interacting with the surrounding circumbinary disc. In this way, we can present for the first time ray-traced images of the shocked fluid and the light curve resulting from consistent general-relativistic radiation-transport calculations from this process. 
Conclusions: The work presented here lays the ground for the development of a generic computational infrastructure employing AMR techniques to accurately and self-consistently calculate general-relativistic accretion flows onto compact objects. In addition to the accurate handling of the matter, we provide a self-consistent electromagnetic emission from these scenarios by solving the associated radiative-transfer problem. While magnetic fields are currently excluded from our analysis, the tools presented here can have a number of applications to study accretion flows onto black holes or neutron stars.

  11. Bioequivalence between innovator and generic tacrolimus in liver and kidney transplant recipients: A randomized, crossover clinical trial

    PubMed Central

    Vinks, Alexander A.; Fukuda, Tsuyoshi; King, Eileen C.; Zou, Yuanshu; Jiang, Wenlei; Klawitter, Jelena; Christians, Uwe

    2017-01-01


  12. Bioequivalence between innovator and generic tacrolimus in liver and kidney transplant recipients: A randomized, crossover clinical trial.

    PubMed

    Alloway, Rita R; Vinks, Alexander A; Fukuda, Tsuyoshi; Mizuno, Tomoyuki; King, Eileen C; Zou, Yuanshu; Jiang, Wenlei; Woodle, E Steve; Tremblay, Simon; Klawitter, Jelena; Klawitter, Jost; Christians, Uwe

    2017-11-01

Although the generic drug approval process has a long-term successful track record, concerns remain about the approval of narrow therapeutic index generic immunosuppressants, such as tacrolimus, in transplant recipients. Several professional transplant societies and publications have expressed skepticism about the generic approval process. Three major areas of concern are that the pharmacokinetic properties of generic products and the innovator (that is, "brand") product in healthy volunteers may not reflect those in transplant recipients, that bioequivalence between a generic and the innovator may not ensure bioequivalence between generics, and that high-risk patients may have specific bioequivalence concerns. Such concerns have been fueled by anecdotal observations and by retrospective and uncontrolled published studies, while well-designed, controlled prospective studies testing the validity of the regulatory bioequivalence testing approach for narrow therapeutic index immunosuppressants in transplant recipients have been lacking. Thus, the present study prospectively assesses bioequivalence between innovator tacrolimus and 2 generics in individuals with a kidney or liver transplant. From December 2013 through October 2014, a prospective, replicate dosing, partially blinded, randomized, 3-treatment, 6-period crossover bioequivalence study was conducted at the University of Cincinnati in individuals with a kidney (n = 35) or liver transplant (n = 36). Abbreviated New Drug Application (ANDA) data that included manufacturing and healthy-individual pharmacokinetic data for all generics were evaluated to select the 2 generics most disparate from the innovator, and these were named Generic Hi and Generic Lo. During the 8-week study period, pharmacokinetic studies assessed the bioequivalence of Generic Hi and Generic Lo with the innovator tacrolimus and with each other. Bioequivalence of the major tacrolimus metabolite was also assessed. 
All products fell within the US Food and Drug Administration (FDA) average bioequivalence (ABE) acceptance criteria of a 90% confidence interval contained within the limits of 80.00% and 125.00%. Within-subject variability was similar for the area under the curve (AUC) (range 12.11-15.81) and the maximum concentration (Cmax) (range 17.96-24.72) for all products. The within-subject variability was used to calculate the scaled average bioequivalence (SCABE) 90% confidence interval. The calculated SCABE 90% confidence interval was 84.65%-118.13% for AUC and 80.00%-125.00% for Cmax. The more stringent SCABE acceptance criteria were met for all product comparisons for AUC and Cmax in both individuals with a kidney transplant and those with a liver transplant. European Medicines Agency (EMA) acceptance criteria for narrow therapeutic index drugs were also met, the only exception being Brand versus Generic Lo, for which the upper limits of the 90% confidence intervals were 111.30% (kidney) and 112.12% (liver), only slightly above the upper EMA acceptance limit for AUC of 111.11%. SCABE criteria were also met for AUC of the major tacrolimus metabolite 13-O-desmethyl tacrolimus, but it failed the EMA criterion. No acute rejections were observed; there were no differences in renal function across all individuals and no differences in liver function among individuals with a liver transplant (Tukey honest significant difference [HSD] test for multiple comparisons). Fifty-two percent and 65% of individuals with a kidney or liver transplant, respectively, reported an adverse event. The exact McNemar test for paired categorical data, with adjustments for multiple comparisons, was used to compare adverse event rates among the products. No statistically significant differences were found between any pair of products for any adverse event code or for adverse events overall. 
Limitations of this study include that the observations were made under strictly controlled conditions that did not allow for the impact of nonadherence or feeding on the possible pharmacokinetic differences. Generic Hi and Lo were selected based upon bioequivalence data in healthy volunteers because no pharmacokinetic data in recipients were available for all products. The safety data should be interpreted in light of the small number of participants and the short observation periods. Lastly, only the 1 mg tacrolimus strength was utilized in this study. Using an innovative, controlled bioequivalence study design, we observed equivalence between tacrolimus innovator and 2 generic products as well as between 2 generic products in individuals after kidney or liver transplantation following current FDA bioequivalence metrics. These results support the position that bioequivalence for the narrow therapeutic index drug tacrolimus translates from healthy volunteers to individuals receiving a kidney or liver transplant and provides evidence that generic products that are bioequivalent with the innovator product are also bioequivalent to each other. ClinicalTrials.gov NCT01889758.
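
The scaled limits quoted above can be reproduced approximately from the within-subject variability alone. Below is a minimal sketch, assuming the FDA reference-scaled convention for narrow therapeutic index drugs (regulatory constant ln(1.11111)/0.10 applied to the reference product's within-subject standard deviation, intersected with the standard 80.00%-125.00% band); the function name is illustrative and the rounded CV% inputs are taken from the ranges reported above, not the study's exact estimates:

```python
import math

def scabe_limits(cv_percent):
    """Reference-scaled (SCABE) acceptance limits, in percent, from a
    within-subject coefficient of variation (CV%). Assumes the FDA
    narrow-therapeutic-index convention: theta = ln(1.11111)/0.10,
    limits = exp(+/- theta * s_WR), with the unscaled 80.00-125.00
    criterion also applied (tighter bound wins on each side)."""
    s_wr = math.sqrt(math.log(1.0 + (cv_percent / 100.0) ** 2))
    theta = math.log(1.11111) / 0.10
    lower = 100.0 * math.exp(-theta * s_wr)
    upper = 100.0 * math.exp(theta * s_wr)
    return max(lower, 80.0), min(upper, 125.0)

# AUC: CV ~15.81% gives limits close to the reported 84.65%-118.13%
auc_limits = scabe_limits(15.81)
# Cmax: CV ~24.72% scales beyond 80-125, so the standard band applies
cmax_limits = scabe_limits(24.72)
```

With CV 15.81% this yields roughly 84.7%-118.0%, close to the reported AUC interval (the small gap comes from rounding of the published variability), while for CV 24.72% the scaled band exceeds 80-125, so the standard limits apply, matching the reported Cmax interval.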

  13. Treatment provider's knowledge of the Health and Disability Commissioner's Code of Consumer Rights.

    PubMed

    Townshend, Philip L; Sellman, J Douglas

    2002-06-01

The Health and Disability Commissioner's (HDC) Code of Health and Disability Consumers' Rights (the Code) defines in law the rights of consumers of health and disability services in New Zealand. In the first few years after its publication, health educators, service providers, and the HDC extensively promoted the Code. Providers of health and disability services would be expected to be knowledgeable about the areas covered by the Code if it were routinely used in the development and monitoring of treatment plans. In this study, knowledge of the Code was tested in a random sample of 217 clinical staff, including medical staff, psychologists, and counsellors, working in Alcohol and Drug Treatment (A&D) centres in New Zealand. Any response showing awareness of a right, regardless of wording, was taken as a positive response, as it was the areas covered by the rights, rather than their actual wording, that were considered the important knowledge for providers. The main finding of this research was that 23% of staff surveyed were aware of none of the ten rights in the Code and only 6% were aware of more than five of the ten rights. Relating these data to results from a wider sample of treatment providers raises the possibility that A&D treatment providers are slightly more aware of the content of the Code than a general sample of health and disability service providers; however, overall awareness of the content of the Code by health providers is very low. These results imply that consumer rights issues are not prominent in the minds of providers, perhaps indicating an ethical blind spot on their part. Ignorance of the content of the Code may indicate that the treatment community does not find it a useful working document, or alternatively that clinicians are content to rely on their own good intentions to preserve the rights of their patients. 
Further research will be required to explain this lack of knowledge; for now, however, consumers cannot rely on clinicians being aware of consumers' rights in health and disability services.

  14. Canada's Patented Medicine Notice of Compliance regulations: balancing the scales or tipping them?

    PubMed Central

    2011-01-01

Background In order to comply with the provisions of the North American Free Trade Agreement, in 1993 the Canadian federal government introduced the Patented Medicine Notice of Compliance Linkage Regulations. These regulations were meant to strike a balance between the timely entry of generic medicines and the rights of patent holders. The regulations tied the regulatory approval of generic medicines to the patent status of the original brand-name product. Discussion Since their introduction, the regulations have been a source of contention between the generic and the brand-name industry. While the regulations have generated a considerable amount of work for the Federal Court of Canada, both sides dispute the interpretation of the "win rate" in the court cases. Similarly, there is no agreement on whether multiple patents on single drugs represent a legitimate activity by the brand-name industry or an "evergreening" tactic. The generic industry's position is that the regulations are being abused, delaying the introduction of lower-cost generic products by as much as 8 years. The brand-name companies counter that the regulations are necessary because injunctions against the introduction of generic products are frequently unavailable to them. The regulations were amended in 2006 and again in 2008, but both sides continue to claim that the regulations favour the other party. The battle around the regulations also has an international dimension, with interventions by PhRMA, the trade association representing the United States-based multinational companies, arguing that the regulations are not stringent enough and that Canada needs to be placed on the U.S. Priority Watch List of countries. Finally, there are multiple costs to Canadian society as a result of the NOC regulations. Summary Despite the rhetoric, there has been almost no empirical academic research into the effect of the regulations. 
In order to develop rational policy in this area, a number of key research questions have been formulated. PMID:21435247

  15. Validation of Persian Version of PedsQL™ 4.0™ Generic Core Scales in Toddlers and Children

    PubMed Central

    Gheissari, Alaleh; Farajzadegan, Ziba; Heidary, Maryam; Salehi, Fatemeh; Masaeli, Ali; Mazrooei, Amin; Varni, James W; Fallah, Zahra; Zandieh, Fariborz

    2012-01-01

Introduction: To evaluate the reliability, validity, and feasibility of the Persian version of the Pediatric Quality of Life Inventory (PedsQL™ 4.0) Generic Core Scales in Iranian healthy students ages 7-15 and chronically ill children ages 2-18. Methods: We followed the translation methodology proposed by the developer to validate the Persian version of the PedsQL™ 4.0 Generic Core Scales for children. Six hundred and sixty children and adolescents and their parents were enrolled. A sample of 160 healthy students was chosen by a random cluster method from 4 regions of the Isfahan education offices, and 60 chronically ill children were recruited from the St. Alzahra hospital private clinics. The questionnaires were completed by the participants. Results: The Persian version of the PedsQL™ 4.0 Generic Core Scales discriminated between healthy and chronically ill children (the healthy students' mean score was 12.3 points higher than that of chronically ill children, P<0.001). Cronbach's alpha internal consistency values exceeded 0.7 for children's self-reports and for proxy reports of children 5-7 and 13-18 years old. Reliability of proxy reports for 2-4 year olds was much lower than 0.7. Although proxy reports for chronically ill children 8-12 years old exceeded 0.7, those for healthy children in the same age group were slightly lower than 0.7. Construct, criterion, face, and content validity were acceptable. In addition, the Persian version of the PedsQL™ 4.0 Generic Core Scales was feasible and easy to complete. Conclusion: Results showed that the Persian version of the PedsQL™ 4.0 Generic Core Scales is valid and acceptable for pediatric health research. It is necessary to adapt the scoring of the 2-4-year-old questionnaire and to find a way to increase reliability for healthy children aged 8-12 years, particularly in view of Iranian culture. PMID:22701775

  16. Overview of free-piston Stirling technology at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Slaby, J. G.

    1985-01-01

An overview of the National Aeronautics and Space Administration (NASA) Lewis Research Center (Lewis) free-piston Stirling engine activities is presented. These activities include: (1) a generic free-piston Stirling technology project being conducted to develop technologies synergistic to both space power and terrestrial heat pump applications in a cooperative, cost-shared effort with the Department of Energy (DOE)/Oak Ridge National Laboratory (ORNL), and (2) a free-piston Stirling space-power technology demonstration project, part of the SP-100 program, being conducted in support of the Department of Defense (DOD), DOE, and NASA/Lewis. The generic technology effort includes extensive parametric testing of a 1 kW free-piston Stirling engine (RE-1000), development and validation of a free-piston Stirling performance computer code, and fabrication and initial testing of a hydraulic output modification for the RE-1000 engine. The space power technology effort, under SP-100, addresses the status of the 25 kWe Space Power Demonstrator Engine (SPDE), including early test results.

  17. From Pixels to Response Maps: Discriminative Image Filtering for Face Alignment in the Wild.

    PubMed

    Asthana, Akshay; Zafeiriou, Stefanos; Tzimiropoulos, Georgios; Cheng, Shiyang; Pantic, Maja

    2015-06-01

    We propose a face alignment framework that relies on the texture model generated by the responses of discriminatively trained part-based filters. Unlike standard texture models built from pixel intensities or responses generated by generic filters (e.g. Gabor), our framework has two important advantages. First, by virtue of discriminative training, invariance to external variations (like identity, pose, illumination and expression) is achieved. Second, we show that the responses generated by discriminatively trained filters (or patch-experts) are sparse and can be modeled using a very small number of parameters. As a result, the optimization methods based on the proposed texture model can better cope with unseen variations. We illustrate this point by formulating both part-based and holistic approaches for generic face alignment and show that our framework outperforms the state-of-the-art on multiple "wild" databases. The code and dataset annotations are available for research purposes from http://ibug.doc.ic.ac.uk/resources.

  18. A Simulation and Modeling Framework for Space Situational Awareness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S S

This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.
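
The orbital propagation component listed above can be illustrated with a minimal two-body propagator. This is an illustrative sketch, not the framework's actual code; the fixed-step RK4 integrator and the Earth gravitational parameter value are assumptions:

```python
import math

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2 (assumed value)

def accel(state):
    """Two-body derivatives: state = (x, y, z, vx, vy, vz) in km, km/s."""
    x, y, z, vx, vy, vz = state
    r = math.sqrt(x*x + y*y + z*z)
    k = -MU / r**3
    return (vx, vy, vz, k*x, k*y, k*z)

def rk4_step(state, dt):
    """One classical fourth-order Runge-Kutta step."""
    def shift(s, d, h):
        return tuple(si + h * di for si, di in zip(s, d))
    k1 = accel(state)
    k2 = accel(shift(state, k1, dt / 2))
    k3 = accel(shift(state, k2, dt / 2))
    k4 = accel(shift(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2*b + 2*c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def propagate(state, t_end, dt=1.0):
    """Fixed-step propagation to t_end seconds."""
    t = 0.0
    while t < t_end:
        h = min(dt, t_end - t)
        state = rk4_step(state, h)
        t += h
    return state

# Circular 7000 km orbit: after one period the satellite returns to its start.
r0 = 7000.0
v0 = math.sqrt(MU / r0)
period = 2 * math.pi * math.sqrt(r0**3 / MU)
final = propagate((r0, 0.0, 0.0, 0.0, v0, 0.0), period)
```

A production SSA code would use higher-fidelity force models (geopotential harmonics, drag, third bodies) and adaptive integration; the closed-orbit round trip here is just a sanity check on the integrator.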

  19. Experimental Stage Separation Tool Development in NASA Langley's Aerothermodynamics Laboratory

    NASA Technical Reports Server (NTRS)

    Murphy, Kelly J.; Scallion, William I.

    2005-01-01

    As part of the research effort at NASA in support of the stage separation and ascent aerothermodynamics research program, proximity testing of a generic bimese wing-body configuration was conducted in NASA Langley's Aerothermodynamics Laboratory in the 20-Inch Mach 6 Air Tunnel. The objective of this work is the development of experimental tools and testing methodologies to apply to hypersonic stage separation problems for future multi-stage launch vehicle systems. Aerodynamic force and moment proximity data were generated at a nominal Mach number of 6 over a small range of angles of attack. The generic bimese configuration was tested in a belly-to-belly and back-to-belly orientation at 86 relative proximity locations. Over 800 aerodynamic proximity data points were taken to serve as a database for code validation. Longitudinal aerodynamic data generated in this test program show very good agreement with viscous computational predictions. Thus a framework has been established to study separation problems in the hypersonic regime using coordinated experimental and computational tools.

  20. Cephalothrix gen. nov. (Cyanobacteria): towards an intraspecific phylogenetic evaluation by multilocus analyses.

    PubMed

    da Silva Malone, Camila Francieli; Rigonato, Janaína; Laughinghouse, Haywood Dail; Schmidt, Éder Carlos; Bouzon, Zenilda Laurita; Wilmotte, Annick; Fiore, Marli Fátima; Sant'Anna, Célia Leite

    2015-09-01

    For more than a decade, the taxonomy of the Phormidiaceae has been problematic, since morphologically similar organisms represent phylogenetically distinct entities. Based on 16S rRNA gene sequence analyses, the polyphyletic genus Phormidium and other gas-vacuolated oscillatorioids appear scattered throughout the cyanobacterial tree of life. Recently, several studies have focused on understanding the oscillatorioid taxa at the generic level. At the specific level, few studies have characterized cyanobacterial strains using combined datasets (morphology, ultrastructure and molecular multilocus analyses). Using a multifaceted approach, we propose a new, well-defined genus, Cephalothrix gen. nov., by analysing seven filamentous strains that are morphologically 'intermediate' between gas-vacuolated taxa and Phormidium. Furthermore, we characterize two novel species: Cephalothrix komarekiana sp. nov. (strains CCIBt 3277, CCIBt 3279, CCIBt 3523, CCALA 155, SAG 75.79 and UTEX 1580) and Cephalothrix lacustris sp. nov. (strain CCIBt 3261). The generic name and specific epithets are proposed under the provisions of the International Code of Nomenclature for Algae, Fungi, and Plants.

  1. Using a Mixed Methods Content Analysis to Analyze Mission Statements from Colleges of Engineering

    ERIC Educational Resources Information Center

    Creamer, Elizabeth G.; Ghoston, Michelle

    2013-01-01

A mixed method design was used to conduct a content analysis of the mission statements of colleges of engineering to map inductively derived codes with the EC 2000 outcomes and to test if any of the codes were significantly associated with institutions with reasonably strong representation of women. Most institutions' (25 of 48) mission statements…

  2. Embedding QR codes in tumor board presentations, enhancing educational content for oncology information management.

    PubMed

    Siderits, Richard; Yates, Stacy; Rodriguez, Arelis; Lee, Tina; Rimmer, Cheryl; Roche, Mark

    2011-01-01

Quick Response (QR) codes are standard in supply management and appear with increasing frequency in advertisements. They are now regularly present in healthcare informatics and education. These 2-dimensional square bar codes, originally designed by the Toyota car company, are free of license and have a published international standard. The codes can be generated by free online software and the resulting images incorporated into presentations. The images can be scanned by "smart" phones and tablets using either the iOS or Android platforms, which link the device with the information represented by the QR code (uniform resource locator or URL, online video, text, v-calendar entries, short message service [SMS], and formatted text). Once linked to the device, the information can be viewed at any time after the original presentation, saved in the device or to a Web-based "cloud" repository, printed, or shared with others via email or Bluetooth file transfer. This paper describes how we use QR codes in our tumor board presentations, discusses the benefits, explains how QR codes differ from plain Web links, and shows how QR codes facilitate the distribution of educational content.

  3. Visual Tracking via Sparse and Local Linear Coding.

    PubMed

    Wang, Guofeng; Qin, Xueying; Zhong, Fan; Liu, Yue; Li, Hongbo; Peng, Qunsheng; Yang, Ming-Hsuan

    2015-11-01

    The state search is an important component of any object tracking algorithm. Numerous algorithms have been proposed, but stochastic sampling methods (e.g., particle filters) are arguably one of the most effective approaches. However, the discretization of the state space complicates the search for the precise object location. In this paper, we propose a novel tracking algorithm that extends the state space of particle observations from discrete to continuous. The solution is determined accurately via iterative linear coding between two convex hulls. The algorithm is modeled by an optimal function, which can be efficiently solved by either convex sparse coding or locality constrained linear coding. The algorithm is also very flexible and can be combined with many generic object representations. Thus, we first use sparse representation to achieve an efficient searching mechanism of the algorithm and demonstrate its accuracy. Next, two other object representation models, i.e., least soft-threshold squares and adaptive structural local sparse appearance, are implemented with improved accuracy to demonstrate the flexibility of our algorithm. Qualitative and quantitative experimental results demonstrate that the proposed tracking algorithm performs favorably against the state-of-the-art methods in dynamic scenes.
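
The locality-constrained coding step mentioned above has a closed-form core: a sample is reconstructed from its k nearest codebook atoms under a sum-to-one constraint, which for small k reduces to a tiny regularized linear system. The sketch below is a toy illustration of that idea, not the paper's tracker; the 2-D codebook, k = 2, and the ridge value are invented for the example:

```python
def llc_code(x, codebook, lam=1e-4):
    """Toy locality-constrained linear coding for a 2-D sample using its
    2 nearest atoms: minimize ||x - c0*b0 - c1*b1||^2 + lam*||c||^2
    subject to c0 + c1 = 1, via the analytic LLC-style solution."""
    # locality: pick the two nearest codebook atoms
    order = sorted(range(len(codebook)),
                   key=lambda i: sum((a - b)**2 for a, b in zip(codebook[i], x)))
    i0, i1 = order[0], order[1]
    # shifted atoms z_j = b_j - x; Gram matrix C = Z Z^T plus a ridge term
    z = [[codebook[i][d] - x[d] for d in range(2)] for i in (i0, i1)]
    C = [[z[i][0]*z[j][0] + z[i][1]*z[j][1] for j in range(2)] for i in range(2)]
    C[0][0] += lam
    C[1][1] += lam
    # solve C c = 1 by Cramer's rule, then normalize so coefficients sum to 1
    det = C[0][0]*C[1][1] - C[0][1]*C[1][0]
    c0 = (C[1][1] - C[0][1]) / det
    c1 = (C[0][0] - C[1][0]) / det
    s = c0 + c1
    return {i0: c0 / s, i1: c1 / s}

# A point 30% of the way between two atoms gets weights ~0.7 / ~0.3;
# the distant third atom is excluded by the locality constraint.
atoms = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]
codes = llc_code((0.3, 0.0), atoms)
```

The tracker in the paper operates on high-dimensional appearance templates and couples this coding with particle sampling between convex hulls; the toy above only shows why the local code is cheap to compute and why it reconstructs samples lying near the span of their neighbors.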

  4. Brand name to generic substitution of antiepileptic drugs does not lead to seizure-related hospitalization: a population-based case-crossover study.

    PubMed

    Polard, Elisabeth; Nowak, Emmanuel; Happe, André; Biraben, Arnaud; Oger, Emmanuel

    2015-11-01

There is still controversy over brand-to-generic (B-G) substitution of antiepileptic drugs (AEDs). To assess the association between B-G AED substitution and seizure-related hospitalization, we designed a case-crossover study using the French National Health Insurance Database. We identified a cohort of adult patients who filled a prescription in 2009-2011 for AEDs with at least one brand name and one generic form. The outcome date was defined as the date of hospitalization, coded G40.x or G41.x, with a G40/G41 hospitalization-free period of at least 1 year. Patients with a medical history of cancer and women who gave birth in 2009-2011 were excluded. We required individuals to have regular dispensations of AEDs within the year preceding the outcome date. Free patients were defined as patients who had only brand name dispensations before the control period. Eight thousand three hundred seventy-nine patients (mean age ± standard deviation, 52.7 ± 18.8 years; sex ratio male/female, 1.27) were analyzed. Discordant pairs comprised 491 with B-G substitution in the control period only and 478 with B-G substitution in the case period only; odds ratio (95% confidence interval), 0.97 (0.86-1.10). No statistically significant interaction was detected among the four prespecified subgroup analyses (gender, age strata, free or non-free, and strict AED monotherapy or not). Controlling for non-seizure-related hospitalizations made no material difference. Sensitivity analyses yielded similar results. Brand-to-generic AED substitution was not associated with an elevated risk of seizure-related hospitalization. Copyright © 2015 John Wiley & Sons, Ltd.
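
The reported odds ratio follows directly from the discordant pairs: in a case-crossover analysis, OR = b/c, where b and c count exposure in the case period only and the control period only, with a standard Wald interval on the log scale. A short sketch using the counts above (the 95% level and the Wald interval are the conventional choices; the paper does not state its exact interval method):

```python
import math

def case_crossover_or(b, c, z=1.96):
    """Odds ratio from discordant pairs (b: exposed in the case period only,
    c: exposed in the control period only) with a log-scale Wald CI."""
    or_ = b / c
    se = math.sqrt(1 / b + 1 / c)          # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# 478 substitutions in the case period only vs 491 in the control period only
or_, lo, hi = case_crossover_or(478, 491)
print(f"{or_:.2f} ({lo:.2f}-{hi:.2f})")  # 0.97 (0.86-1.10)
```

This reproduces the published estimate of 0.97 (0.86-1.10) exactly.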

  5. The Generic Short Patient Experiences Questionnaire (GS-PEQ): identification of core items from a survey in Norway

    PubMed Central

    2011-01-01

    Background Questionnaires are commonly used to collect patient, or user, experiences with health care encounters; however, their adaption to specific target groups limits comparison between groups. We present the construction of a generic questionnaire (maximum of ten questions) for user evaluation across a range of health care services. Methods Based on previous testing of six group-specific questionnaires, we first constructed a generic questionnaire with 23 items related to user experiences. All questions included a "not applicable" response option, as well as a follow-up question about the item's importance. Nine user groups from one health trust were surveyed. Seven groups received questionnaires by mail and two by personal distribution. Selection of core questions was based on three criteria: applicability (proportion "not applicable"), importance (mean scores on follow-up questions), and comprehensiveness (content coverage, maximum two items per dimension). Results 1324 questionnaires were returned providing subsample sizes ranging from 52 to 323. Ten questions were excluded because the proportion of "not applicable" responses exceeded 20% in at least one user group. The number of remaining items was reduced to ten by applying the two other criteria. The final short questionnaire included items on outcome (2), clinician services (2), user involvement (2), incorrect treatment (1), information (1), organisation (1), and accessibility (1). Conclusion The Generic Short Patient Experiences Questionnaire (GS-PEQ) is a short, generic set of questions on user experiences with specialist health care that covers important topics for a range of groups. It can be used alone or with other instruments in quality assessment or in research. The psychometric properties and the relevance of the GS-PEQ in other health care settings and countries need further evaluation. PMID:21510871
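
The three selection criteria described above (applicability, importance, comprehensiveness) amount to a simple filtering pipeline. A minimal sketch with hypothetical item data; the item names, "not applicable" rates, and importance scores below are invented for illustration and are not the study's values:

```python
# Each candidate item: (name, dimension, worst-group "not applicable" rate,
# mean importance score). All values are hypothetical.
items = [
    ("outcome_overall", "outcome",      0.04, 4.6),
    ("outcome_change",  "outcome",      0.06, 4.4),
    ("outcome_side",    "outcome",      0.08, 4.1),
    ("clinician_talk",  "clinician",    0.03, 4.5),
    ("clinician_trust", "clinician",    0.05, 4.3),
    ("ward_food",       "organisation", 0.31, 3.0),  # fails applicability
    ("org_cooperation", "organisation", 0.10, 3.9),
]

def select_core_items(items, max_na=0.20, per_dimension=2):
    """Apply GS-PEQ-style criteria: drop items whose "not applicable" share
    exceeds 20% in any group, then keep at most two items per dimension,
    preferring higher importance scores."""
    applicable = [it for it in items if it[2] <= max_na]   # applicability
    applicable.sort(key=lambda it: it[3], reverse=True)    # importance
    kept, per_dim = [], {}
    for name, dim, na, score in applicable:                # comprehensiveness
        if per_dim.get(dim, 0) < per_dimension:
            kept.append(name)
            per_dim[dim] = per_dim.get(dim, 0) + 1
    return kept

core = select_core_items(items)
```

With these toy inputs, the over-20% "not applicable" item is dropped first, and the third-ranked outcome item is excluded by the two-per-dimension cap, mirroring how the authors reduced 23 candidate questions to ten.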

  6. The effect of repeated irrigation with varying total organic carbon content on the persistence of E. coli O157:H7 on baby spinach

    USDA-ARS?s Scientific Manuscript database

    In response to U.S. foodborne illnesses caused by contaminated spinach, growers have adopted regulations stated in the California Leafy Greens Marketing Agreement (LGMA). The LGMA permits a maximum population mean of 126 Most Probable Number (MPN) generic E. coli per 100 ml irrigation water. These...

  7. Synergy between Teacher Practices and Curricular Scaffolds to Support Students in Using Domain-Specific and Domain-General Knowledge in Writing Arguments to Explain Phenomena

    ERIC Educational Resources Information Center

    McNeill, Katherine L.; Krajcik, Joseph

    2009-01-01

    We investigated how 2 different curricular scaffolds (context-specific vs. generic), teacher instructional practices, and the interaction between these 2 types of support influenced students' learning of science content and their ability to write scientific arguments to explain phenomena. The context-specific scaffolds provided students with hints…

  8. A joint signal processing and cryptographic approach to multimedia encryption.

    PubMed

    Mao, Yinian; Wu, Min

    2006-07-01

    In recent years, there has been an increasing trend for multimedia applications to use delegate service providers for content distribution, archiving, search, and retrieval. These delegate services have brought new challenges to the protection of multimedia content confidentiality. This paper discusses the importance and feasibility of applying a joint signal processing and cryptographic approach to multimedia encryption, in order to address the access control issues unique to multimedia applications. We propose two atomic encryption operations that can preserve standard compliance and are friendly to delegate processing. Quantitative analysis for these operations is presented to demonstrate that a good tradeoff can be made between security and bitrate overhead. In assisting the design and evaluation of media security systems, we also propose a set of multimedia-oriented security scores to quantify the security against approximation attacks and to complement the existing notion of generic data security. Using video as an example, we present a systematic study on how to strategically integrate different atomic operations to build a video encryption system. The resulting system can provide superior performance over both generic encryption and its simple adaptation to video in terms of a joint consideration of security, bitrate overhead, and friendliness to delegate processing.

  9. Embed dynamic content in your poster.

    PubMed

    Hutchins, B Ian

    2013-01-29

    A new technology has emerged that will facilitate the presentation of dynamic or otherwise inaccessible data on posters at scientific meetings. Video, audio, or other digital files hosted on mobile-friendly sites can be linked to through a quick response (QR) code, a two-dimensional barcode that can be scanned by smartphones, which then display the content. This approach is more affordable than acquiring tablet computers for playing dynamic content and can reach many users at large conferences. This resource details how to host videos, generate QR codes, and view the associated files on mobile devices.

  10. Scenario tree model for animal disease freedom framed in the OIE context using the example of a generic swine model for Aujeszky's disease in commercial swine in Canada.

    PubMed

    Christensen, Jette; Vallières, André

    2016-01-01

    "Freedom from animal disease" is an ambiguous concept that may have a different meaning in trade and science. For trade alone, there are different levels of freedom from OIE listed diseases. A country can: be recognized by OIE to be "officially free"; self-declare freedom, with no official recognition by the OIE; or report animal disease as absent (no occurrence) in six-monthly reports. In science, we apply scenario tree models to calculate the probability of a population being free from disease at a given prevalence to provide evidence of freedom from animal disease. Here, we link science with application by describing how a scenario tree model may contribute to a country's claim of freedom from animal disease. We combine the idea of a standardized presentation of scenario tree models for disease freedom and having a similar model for two different animal diseases to suggest that a simple generic model may help veterinary authorities to build and evaluate scenario tree models for disease freedom. Here, we aim to develop a generic scenario tree model for disease freedom that is: animal species specific, population specific, and has a simple structure. The specific objectives were: to explore the levels of freedom described in the OIE Terrestrial Animal Health Code; to describe how scenario tree models may contribute to a country's claim of freedom from animal disease; and to present a generic swine scenario tree model for disease freedom in Canada's domestic (commercial) swine applied to Aujeszky's disease (AD). In particular, to explore how historical survey data, and data mining may affect the probability of freedom and to explore different sampling strategies. Finally, to frame the generic scenario tree model in the context of Canada's claim of freedom from AD. 
We found that scenario tree models are useful to support a country's claim of freedom either as "recognized officially free" or as part of a self-declaration, but the models should not stand alone in a claim. The generic AD scenario tree model demonstrated the benefit of combining three sources of surveillance data and helped to design the surveillance for the next year. The generic AD scenario model is one piece in Canada's self-declaration of freedom from AD. The model is strongly supported by the fact that AD has never been detected in Canada. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.
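The probability calculation such scenario tree models feed into can be sketched with the standard surveillance formulas (unit sensitivity, design prevalence, posterior probability of freedom). This is a minimal illustration only; all numbers are invented, not Canadian AD surveillance data, and the paper's actual tree is richer.

```python
# Minimal sketch of scenario-tree surveillance arithmetic (illustrative only).

def component_sensitivity(n_units, unit_se, design_prevalence):
    """P(component detects disease | population infected at design prevalence)."""
    return 1.0 - (1.0 - unit_se * design_prevalence) ** n_units

def combined_sensitivity(component_ses):
    """Combine independent surveillance components (e.g. abattoir, serosurvey)."""
    p_miss = 1.0
    for se in component_ses:
        p_miss *= (1.0 - se)
    return 1.0 - p_miss

def posterior_freedom(prior_freedom, surveillance_se):
    """Update the probability of freedom after all-negative surveillance."""
    p_infected = 1.0 - prior_freedom
    return prior_freedom / (prior_freedom + p_infected * (1.0 - surveillance_se))

# Three hypothetical data sources, echoing the paper's point that combining
# surveillance streams strengthens a claim of freedom.
sse = combined_sensitivity([
    component_sensitivity(n_units=500, unit_se=0.95, design_prevalence=0.01),
    component_sensitivity(n_units=200, unit_se=0.90, design_prevalence=0.01),
    component_sensitivity(n_units=100, unit_se=0.80, design_prevalence=0.01),
])
print(round(posterior_freedom(prior_freedom=0.5, surveillance_se=sse), 4))
```

With several negative surveillance streams the posterior probability of freedom approaches 1, which is the quantitative backbone of a self-declaration.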

  11. A hybrid design methodology for structuring an Integrated Environmental Management System (IEMS) for shipping business.

    PubMed

    Celik, Metin

    2009-03-01

    The International Safety Management (ISM) Code defines a broad framework for the safe management and operation of merchant ships, maintaining high standards of safety and environmental protection. On the other hand, ISO 14001:2004 provides a generic, worldwide environmental management standard that has been utilized by several industries. Both the ISM Code and ISO 14001:2004 have the practical goal of establishing a sustainable Integrated Environmental Management System (IEMS) for shipping businesses. This paper presents a hybrid design methodology that shows how requirements from both standards can be combined into a single execution scheme. Specifically, the Analytic Hierarchy Process (AHP) and Fuzzy Axiomatic Design (FAD) are used to structure an IEMS for ship management companies. This research provides decision aid to maritime executives in order to enhance the environmental performance in the shipping industry.
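The AHP step of the methodology (deriving criteria weights from pairwise comparisons) can be sketched in a few lines; the comparison matrix and criteria names below are hypothetical, and the paper's full method also involves Fuzzy Axiomatic Design, which is not reproduced here.

```python
# Illustrative AHP weighting step (row geometric-mean approximation of the
# principal eigenvector). Matrix values are made-up judgments.
import math

def ahp_weights(pairwise):
    """Approximate AHP priority vector via row geometric means."""
    n = len(pairwise)
    gmeans = [math.prod(row) ** (1.0 / n) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical judgments comparing three IEMS criteria, e.g.
# [safety management, pollution prevention, documentation].
matrix = [
    [1.0, 3.0, 5.0],
    [1 / 3.0, 1.0, 3.0],
    [1 / 5.0, 1 / 3.0, 1.0],
]
weights = ahp_weights(matrix)
print([round(w, 3) for w in weights])
```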

  12. A generic interface between COSMIC/NASTRAN and PATRAN (R)

    NASA Technical Reports Server (NTRS)

    Roschke, Paul N.; Premthamkorn, Prakit; Maxwell, James C.

    1990-01-01

    Despite its powerful analytical capabilities, COSMIC/NASTRAN lacks adequate post-processing adroitness. PATRAN, on the other hand, is widely accepted for its graphical capabilities. A nonproprietary, public domain code mnemonically titled CPI (for COSMIC/NASTRAN-PATRAN Interface) is designed to manipulate a large number of files rapidly and efficiently between the two parent codes. In addition to PATRAN's results file preparation, CPI also prepares PATRAN's P/PLOT data files for xy plotting. The user is prompted for necessary information during an interactive session. The current implementation supports NASTRAN's displacement approach, including the following rigid formats: (1) static analysis, (2) normal modal analysis, (3) direct transient response, and (4) modal transient response. A wide variety of data blocks are also supported. Error trapping is given special consideration. A sample session with CPI illustrates its simplicity and ease of use.

  13. Impact of Reactor Operating Parameters on Cask Reactivity in BWR Burnup Credit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilas, Germina; Betzler, Benjamin R; Ade, Brian J

    This paper discusses the effect of reactor operating parameters used in fuel depletion calculations on spent fuel cask reactivity, with relevance for boiling-water reactor (BWR) burnup credit (BUC) applications. Assessments that used generic BWR fuel assembly and spent fuel cask configurations are presented. The considered operating parameters, which were independently varied in the depletion simulations for the assembly, included fuel temperature, bypass water density, specific power, and operating history. Different operating history scenarios were considered for the assembly depletion to determine the effect of relative power distribution during the irradiation cycles, as well as the downtime between cycles. Depletion, decay, and criticality simulations were performed using computer codes and associated nuclear data within the SCALE code system. Results quantifying the dependence of cask reactivity on the assembly depletion parameters are presented herein.

  14. MIMO-OFDM System's Performance Using LDPC Codes for a Mobile Robot

    NASA Astrophysics Data System (ADS)

    Daoud, Omar; Alani, Omar

    This work deals with the performance of a Sniffer Mobile Robot (SNFRbot)-based spatial multiplexed wireless Orthogonal Frequency Division Multiplexing (OFDM) transmission technology. The use of Multi-Input Multi-Output (MIMO)-OFDM technology increases the wireless transmission rate without increasing transmission power or bandwidth. A generic multilayer architecture of the SNFRbot is proposed with low power and low cost. Some experimental results are presented and show the efficiency of sniffing deadly gases, sensing high temperatures and sending live videos of the monitored situation. Moreover, simulation results show the performance achieved by tackling the Peak-to-Average Power Ratio (PAPR) problem of the used technology with Low Density Parity Check (LDPC) codes, and the effect of combating the PAPR on the bit error rate (BER) and the signal-to-noise ratio (SNR) over a Doppler spread channel.
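The PAPR metric targeted above is simple to compute for a toy OFDM symbol; the modulation choice and symbol length here are arbitrary, not the paper's setup, and no LDPC coding is involved in this sketch.

```python
# PAPR of a toy OFDM symbol: random QPSK subcarriers -> time domain via IFFT.
import numpy as np

def papr_db(time_signal):
    """Peak-to-Average Power Ratio of a complex baseband signal, in dB."""
    power = np.abs(time_signal) ** 2
    return 10.0 * np.log10(power.max() / power.mean())

rng = np.random.default_rng(0)
qpsk = (rng.choice([-1, 1], 64) + 1j * rng.choice([-1, 1], 64)) / np.sqrt(2)
ofdm_symbol = np.fft.ifft(qpsk)  # basic OFDM modulation step
print(round(papr_db(ofdm_symbol), 2))
```

For N subcarriers the worst case is 10·log10(N) dB, which is why PAPR-reduction schemes such as the LDPC-based one above matter for amplifier efficiency.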

  15. A game-theoretical approach to multimedia social networks security.

    PubMed

    Liu, Enqiang; Liu, Zengliang; Shao, Fei; Zhang, Zhiyong

    2014-01-01

    Content access and sharing in multimedia social networks (MSNs) mainly rely on access control models and mechanisms. Simple adoptions of security policies in the traditional access control model cannot effectively establish a trust relationship among parties. This paper proposes a novel two-party trust architecture (TPTA) to apply in a generic MSN scenario. According to the architecture, security policies are adopted through game-theoretic analyses and decisions. Based on formalized utilities of security policies and security rules, the choice of security policies in content access is described as a game between the content provider and the content requester. By applying the game method to the combined utility of security policies and its influence on each party's benefits, a Nash equilibrium is reached, that is, an optimal and stable combination of security policies that establishes and enhances trust among stakeholders.
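The equilibrium concept invoked above can be illustrated with a generic pure-strategy Nash search over a small provider/requester game; the strategies and payoff values below are made up for illustration and are not the paper's formalized utilities.

```python
# Pure-strategy Nash equilibria of a 2-player bimatrix game (toy example).

def pure_nash(payoff_a, payoff_b):
    """Return (row, col) cells where neither player gains by deviating."""
    n_rows, n_cols = len(payoff_a), len(payoff_a[0])
    equilibria = []
    for i in range(n_rows):
        for j in range(n_cols):
            best_row = all(payoff_a[i][j] >= payoff_a[k][j] for k in range(n_rows))
            best_col = all(payoff_b[i][j] >= payoff_b[i][k] for k in range(n_cols))
            if best_row and best_col:
                equilibria.append((i, j))
    return equilibria

# Rows: provider picks {strict, lenient}; cols: requester picks {comply, cheat}.
provider = [[3, 1], [2, 0]]
requester = [[2, 0], [1, 1]]
print(pure_nash(provider, requester))
```

Here (strict, comply) is the stable policy pair: the equilibrium plays the role of the "optimal and stable combination of security policies" in the abstract.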

  16. A Game-Theoretical Approach to Multimedia Social Networks Security

    PubMed Central

    Liu, Enqiang; Liu, Zengliang; Shao, Fei; Zhang, Zhiyong

    2014-01-01

    Content access and sharing in multimedia social networks (MSNs) mainly rely on access control models and mechanisms. Simple adoptions of security policies in the traditional access control model cannot effectively establish a trust relationship among parties. This paper proposes a novel two-party trust architecture (TPTA) to apply in a generic MSN scenario. According to the architecture, security policies are adopted through game-theoretic analyses and decisions. Based on formalized utilities of security policies and security rules, the choice of security policies in content access is described as a game between the content provider and the content requester. By applying the game method to the combined utility of security policies and its influence on each party's benefits, a Nash equilibrium is reached, that is, an optimal and stable combination of security policies that establishes and enhances trust among stakeholders. PMID:24977226

  17. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  18. Transient performance of fan engine with water ingestion

    NASA Technical Reports Server (NTRS)

    Murthy, S. N. B.; Mullican, A.

    1993-01-01

    In a continuing investigation of developing and applying codes for predicting the performance of a turbine jet engine and its components with water ingestion during flight operation, including power settings and changes in flight altitude and speed, an attempt was made to establish the effects of water ingestion through simulation of a generic high bypass ratio engine with a generic control. In view of the large effects arising in the air compression system and the prediffuser-combustor unit during water ingestion, attention was focused on those effects and the resulting changes in engine performance. Under all conditions of operation, whether ingestion is steady or not, it became evident that water ingestion causes a fan-compressor unit to operate in a time-dependent fashion with periodic features, particularly with respect to the state of water in the span and the film in the casing clearance space at the exit of the machine. On the other hand, the aerodynamic performance of the unit may be considered quasi-steady once the distribution of water has attained an equilibrium state with respect to its distribution and motion. For purposes of engine simulation, the performance maps for the generic fan-compressor unit were generated based on the attainment of a quasi-steady state (meaning steady except for long-period variations in performance) during ingestion and operation over a wide enough range of rotational speeds.

  19. A generic implementation of replica exchange with solute tempering (REST2) algorithm in NAMD for complex biophysical simulations

    NASA Astrophysics Data System (ADS)

    Jo, Sunhwan; Jiang, Wei

    2015-12-01

    Replica Exchange with Solute Tempering (REST2) is a powerful sampling enhancement algorithm for molecular dynamics (MD) in that it needs a significantly smaller number of replicas yet achieves higher sampling efficiency than the standard temperature exchange algorithm. In this paper, we extend the applicability of REST2 to quantitative biophysical simulations through a robust and generic implementation in the highly scalable MD software NAMD. The rescaling procedure for the force field parameters controlling the REST2 "hot region" is implemented in NAMD at the source code level. A user can conveniently select the hot region through VMD and write the selection information into a PDB file. The rescaling keyword/parameter is exposed through the NAMD Tcl script interface, which enables on-the-fly simulation parameter changes. Our implementation of REST2 lies within communication-enabled Tcl script built on top of Charm++, so the communication overhead of an exchange attempt is vanishingly small. Such a generic implementation facilitates seamless cooperation between REST2 and other modules of NAMD to provide enhanced sampling for complex biomolecular simulations. Three challenging applications, including native REST2 simulation of a peptide folding-unfolding transition, free energy perturbation/REST2 for the absolute binding affinity of a protein-ligand complex, and umbrella sampling/REST2 Hamiltonian exchange for free energy landscape calculation, were carried out on an IBM Blue Gene/Q supercomputer to demonstrate the efficacy of REST2 in the present implementation.
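The rescaling idea behind REST2 can be sketched as follows. This is an illustrative reconstruction of the published scaling rules (solute-solute energies scaled by λ = T0/Tm, solute-solvent by √λ, obtained by scaling LJ well depths by λ and partial charges by √λ), not NAMD's actual source code; the parameter values are made up.

```python
# Illustrative REST2 "hot region" parameter rescaling for one replica.
import math

def rest2_factors(t0, tm):
    """Energy scale factors for replica with effective temperature tm."""
    lam = t0 / tm
    return {"solute_solute": lam, "solute_solvent": math.sqrt(lam)}

def scale_hot_region(epsilons, charges, t0, tm):
    """Scale LJ well depths by lambda and partial charges by sqrt(lambda),
    which yields the REST2 energy scaling for the hot region."""
    lam = t0 / tm
    return ([e * lam for e in epsilons],
            [q * math.sqrt(lam) for q in charges])

eps, chg = scale_hot_region([0.1094, 0.0860], [-0.51, 0.31], t0=300.0, tm=450.0)
print(rest2_factors(300.0, 450.0)["solute_solute"])
```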

  20. Single-trabecula building block for large-scale finite element models of cancellous bone.

    PubMed

    Dagan, D; Be'ery, M; Gefen, A

    2004-07-01

    Recent development of high-resolution imaging of cancellous bone allows finite element (FE) analysis of bone tissue stresses and strains in individual trabeculae. However, specimen-specific stress/strain analyses can include effects of anatomical variations and local damage that can bias the interpretation of the results from individual specimens with respect to large populations. This study developed a standard (generic) 'building-block' of a trabecula for large-scale FE models. Being parametric and based on statistics of dimensions of ovine trabeculae, this building block can be scaled for trabecular thickness and length and be used in commercial or custom-made FE codes to construct generic, large-scale FE models of bone, using less computer power than that currently required to reproduce the accurate micro-architecture of trabecular bone. Orthogonal lattices constructed with this building block, after it was scaled to trabeculae of the human proximal femur, provided apparent elastic moduli of approximately 150 MPa, in good agreement with experimental data for the stiffness of cancellous bone from this site. Likewise, lattices with thinner, osteoporotic-like trabeculae could predict a reduction of approximately 30% in the apparent elastic modulus, as reported in experimental studies of osteoporotic femora. Based on these comparisons, it is concluded that the single-trabecula element developed in the present study is well-suited for representing cancellous bone in large-scale generic FE simulations.
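As a back-of-envelope illustration (emphatically not the paper's FE model) of why a scalable trabecular thickness parameter controls apparent stiffness: in an idealized orthogonal lattice loaded along one strut family, only the aligned struts carry axial load, so the apparent modulus scales with their area fraction (t/L)². Under that idealization, a modest thinning produces a modulus drop of the ~30% magnitude quoted. All numbers below are assumed.

```python
# Axial-strut idealization of an orthogonal lattice (illustrative only).

def apparent_modulus(tissue_modulus, thickness, spacing):
    """Apparent lattice modulus: E_app = E_tissue * (t/L)^2 in this idealization."""
    return tissue_modulus * (thickness / spacing) ** 2

healthy = apparent_modulus(tissue_modulus=10_000.0, thickness=0.15, spacing=1.0)  # MPa
thinned = apparent_modulus(tissue_modulus=10_000.0, thickness=0.125, spacing=1.0)
print(round(healthy, 1), round(1.0 - thinned / healthy, 2))
```

With an assumed 10 GPa tissue modulus and t/L = 0.15, the apparent modulus lands in the low hundreds of MPa, the same order as the ~150 MPa reported, and thinning the struts by ~17% cuts it by roughly 30%.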

  1. View-Independent Working Memory Representations of Artificial Shapes in Prefrontal and Posterior Regions of the Human Brain.

    PubMed

    Christophel, Thomas B; Allefeld, Carsten; Endisch, Christian; Haynes, John-Dylan

    2018-06-01

    Traditional views of visual working memory postulate that memorized contents are stored in dorsolateral prefrontal cortex using an adaptive and flexible code. In contrast, recent studies proposed that contents are maintained by posterior brain areas using codes akin to perceptual representations. An important question is whether this reflects a difference in the level of abstraction between posterior and prefrontal representations. Here, we investigated whether neural representations of visual working memory contents are view-independent, as indicated by rotation-invariance. Using functional magnetic resonance imaging and multivariate pattern analyses, we show that when subjects memorize complex shapes, both posterior and frontal brain regions maintain the memorized contents using a rotation-invariant code. Importantly, we found the representations in frontal cortex to be localized to the frontal eye fields rather than dorsolateral prefrontal cortices. Thus, our results give evidence for the view-independent storage of complex shapes in distributed representations across posterior and frontal brain regions.

  2. MacSyFinder: A Program to Mine Genomes for Molecular Systems with an Application to CRISPR-Cas Systems

    PubMed Central

    Abby, Sophie S.; Néron, Bertrand; Ménager, Hervé; Touchon, Marie; Rocha, Eduardo P. C.

    2014-01-01

    Motivation Biologists often wish to use their knowledge of a few experimental models of a given molecular system to identify homologs in genomic data. We developed a generic tool for this purpose. Results Macromolecular System Finder (MacSyFinder) provides a flexible framework to model the properties of molecular systems (cellular machinery or pathway) including their components, evolutionary associations with other systems and genetic architecture. Modelled features also include functional analogs, and the multiple uses of the same component by different systems. Models are used to search for molecular systems in complete genomes or in unstructured data like metagenomes. The components of the systems are searched by sequence similarity using Hidden Markov model (HMM) protein profiles. The assignment of hits to a given system is decided based on compliance with the content and organization of the system model. A graphical interface, MacSyView, facilitates the analysis of the results by showing overviews of component content and genomic context. To exemplify the use of MacSyFinder we built models to detect and class CRISPR-Cas systems following a previously established classification. We show that MacSyFinder allows one to easily define an accurate “Cas-finder” using publicly available protein profiles. Availability and Implementation MacSyFinder is a standalone application implemented in Python. It requires Python 2.7, Hmmer and makeblastdb (version 2.2.28 or higher). It is freely available with its source code under a GPLv3 license at https://github.com/gem-pasteur/macsyfinder. It is compatible with all platforms supporting Python and Hmmer/makeblastdb. The “Cas-finder” (models and HMM profiles) is distributed as a compressed tarball archive as Supporting Information. PMID:25330359
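The hit-assignment logic described (a quorum of mandatory components plus genomic co-localization) can be caricatured in a few lines. Component names, positions and thresholds below are invented for illustration; real MacSyFinder models are far richer than this sketch.

```python
# Toy sketch of quorum + co-localization filtering of HMM profile hits.

def assign_system(hits, mandatory, min_mandatory, max_gap):
    """hits: list of (gene_position, component_name), sorted by position.
    Accept the candidate system only if consecutive hits are within max_gap
    genes of each other and enough mandatory components are present."""
    positions = [p for p, _ in hits]
    colocalized = all(b - a <= max_gap for a, b in zip(positions, positions[1:]))
    found = {name for _, name in hits}
    quorum = len(found & set(mandatory)) >= min_mandatory
    return colocalized and quorum

hits = [(101, "cas1"), (102, "cas2"), (104, "cas9")]
print(assign_system(hits, mandatory={"cas1", "cas2", "cas9"},
                    min_mandatory=2, max_gap=5))
```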

  3. Assessment of the pharmaceutical quality of marketed enteric coated pantoprazole sodium sesquihydrate products.

    PubMed

    Mostafa, Haitham F; Ibrahim, Mohamed A; Mahrous, Gamal M; Sakr, Adel

    2011-04-01

    Pantoprazole sodium sesquihydrate (PSS) is a proton pump inhibitor used in acid-related disorders such as peptic ulcer and gastroesophageal reflux. The increasing number of pantoprazole-containing products on the market raises questions about their efficacy and generic substitution. The pharmaceutical quality of 6 generic PSS enteric-coated tablet products in 2 local markets was assessed relative to the innovator product (Pantozol®). Uniformity of dosage units, disintegration and in vitro drug release were determined using the United States Pharmacopeia methods for delayed-release tablets. Similarity of dissolution profiles was assessed using the FDA-recommended f2 similarity factor. The content uniformity of the innovator product was 98.39% of the labeled claim with an RSD value of 1.08%, while the content of the generic products ranged from 96.98% to 98.80% with RSD values of 1.24-2.19%. All the products showed no disintegration, cracks or swelling in 0.1 N HCl, except product 1, which showed complete disintegration after 20 min. However, the disintegration of all the products in phosphate buffer met USP requirements. Dissolution of the tablets in 0.1 N HCl showed no drug release after 2 h, except product 1, in which one tablet showed drug release of more than 10% at acid stage level A1. In addition, three tablets of this product showed dissolution of 45%, 48% and 69% at acid stage level A2. The similarity factor f2 of the products was between 71 and 74, indicating similarity of the dissolution profiles of all the products in accordance with FDA requirements, except product 1, whose f2 value was 18.67.
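The f2 statistic used in this study follows the standard FDA formula, f2 = 50·log10(100/√(1 + mean squared difference between profiles)); values of 50-100 indicate similar profiles. The dissolution profiles below are made-up numbers for illustration, not the study's data.

```python
# FDA f2 similarity factor for two dissolution profiles.
import math

def f2_similarity(reference, test):
    """f2 = 50*log10(100 / sqrt(1 + mean squared difference)); >=50 => similar."""
    n = len(reference)
    msd = sum((r - t) ** 2 for r, t in zip(reference, test)) / n
    return 50.0 * math.log10(100.0 / math.sqrt(1.0 + msd))

reference = [35.0, 60.0, 82.0, 95.0]   # % dissolved at successive time points
similar   = [33.0, 58.0, 80.0, 94.0]
different = [10.0, 25.0, 50.0, 70.0]
print(round(f2_similarity(reference, similar), 1))
print(round(f2_similarity(reference, different), 1))
```

Identical profiles give f2 = 100; large divergences drive f2 well below 50, which is how product 1's value of 18.67 flags dissimilarity.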

  4. Specialty-specific multi-source feedback: assuring validity, informing training.

    PubMed

    Davies, Helena; Archer, Julian; Bateman, Adrian; Dewar, Sandra; Crossley, Jim; Grant, Janet; Southgate, Lesley

    2008-10-01

    The white paper 'Trust, Assurance and Safety: the Regulation of Health Professionals in the 21st Century' proposes a single, generic multi-source feedback (MSF) instrument in the UK. Multi-source feedback was proposed as part of the assessment programme for Year 1 specialty training in histopathology. An existing instrument was modified following blueprinting against the histopathology curriculum to establish content validity. Trainees were also assessed using an objective structured practical examination (OSPE). Factor analysis and correlation between trainees' OSPE performance and the MSF were used to explore validity. All 92 trainees participated and the assessor response rate was 93%. Reliability was acceptable with eight assessors (95% confidence interval 0.38). Factor analysis revealed two factors: 'generic' and 'histopathology'. Pearson correlation of MSF scores with OSPE performances was 0.48 (P = 0.001) and the histopathology factor correlated more highly (histopathology r = 0.54, generic r = 0.42; t = - 2.76, d.f. = 89, P < 0.01). Trainees scored least highly in relation to ability to use histopathology to solve clinical problems (mean = 4.39) and provision of good reports (mean = 4.39). Three of six doctors whose means were < 4.0 received free text comments about report writing. There were 83 forms with aggregate scores of < 4. Of these, 19.2% included comments about report writing. Specialty-specific MSF is feasible and achieves satisfactory reliability. The higher correlation of the 'histopathology' factor with the OSPE supports validity. This paper highlights the importance of validating an MSF instrument within the specialty-specific context as, in addition to assuring content validity, the PATH-SPRAT (Histopathology-Sheffield Peer Review Assessment Tool) also demonstrates the potential to inform training as part of a quality improvement model.

  5. Brand name and generic proton pump inhibitor prescriptions in the United States: insights from the national ambulatory medical care survey (2006-2010).

    PubMed

    Gawron, Andrew J; Feinglass, Joseph; Pandolfino, John E; Tan, Bruce K; Bove, Michiel J; Shintani-Smith, Stephanie

    2015-01-01

    Introduction. Proton pump inhibitors (PPI) are one of the most commonly prescribed medication classes, with similar efficacy between brand name and generic PPI formulations. Aims. We determined demographic, clinical, and practice characteristics associated with brand name PPI prescriptions at ambulatory care visits in the United States. Methods. Observational cross-sectional analysis using the National Ambulatory Medical Care Survey (NAMCS) of all adult (≥18 yrs of age) ambulatory care visits from 2006 to 2010. PPI prescriptions were identified by using the drug entry code as brand name only or generic available formulations. Descriptive statistics were reported in terms of unweighted patient visits and proportions of encounters with brand name PPI prescriptions. Global chi-square tests were used to compare visits with brand name PPI prescriptions versus generic PPI prescriptions for each measure. Poisson regression was used to determine the incidence rate ratio (IRR) for generic versus brand PPI prescribing. Results. A PPI was prescribed at 269.7 million adult ambulatory visits, based on 9,677 unweighted visits, of which 53% were brand name only prescriptions. In 2006, 76.0% of all PPI prescriptions had a brand name only formulation compared to 31.6% of PPI prescriptions in 2010. Visits by patients aged 25-44 years had the greatest proportion of brand name PPI formulations (57.9%). Academic medical centers and physician-owned practices had the greatest proportion of visits with brand name PPI prescriptions (58.9% and 55.6% of visits with a PPI prescription, respectively). There were no significant differences in terms of median income, patient insurance type, or metropolitan status when comparing the proportion of visits with brand name versus generic PPI prescriptions. Poisson regression results showed that practice ownership type was most strongly associated with the likelihood of receiving a brand name PPI over the entire study period.
Compared to HMO visits, patient visits at academic medical centers (IRR 4.2, 95% CI 2.2-8.0), physician-owned practices (IRR 3.9, 95% CI 2.1-7.1), and community health centers (IRR 3.6, 95% CI 1.9-6.6) were all more likely to have brand name PPIs. Conclusion. PPI prescriptions with brand name only formulations are most strongly associated with physician practice type.
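The IRR statistic reported above can be illustrated with a minimal two-group rate-ratio calculation and a Wald 95% CI on the log scale. The visit counts below are hypothetical, not NAMCS data, and this sketch does not reproduce the study's survey-weighted Poisson regression with covariates.

```python
# Incidence rate ratio (IRR) with a log-scale Wald confidence interval.
import math

def irr_with_ci(events_a, total_a, events_b, total_b, z=1.96):
    """IRR of group A versus reference group B, with 95% CI."""
    irr = (events_a / total_a) / (events_b / total_b)
    se_log = math.sqrt(1.0 / events_a + 1.0 / events_b)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Hypothetical counts: brand-name PPI visits out of all PPI visits,
# academic centers (A) versus an HMO reference group (B).
irr, lo, hi = irr_with_ci(events_a=120, total_a=400, events_b=30, total_b=400)
print(round(irr, 2), round(lo, 2), round(hi, 2))
```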

  6. Beam Instrument Development System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DOOLITTLE, LAWRENCE; HUANG, GANG; DU, QIANG

    Beam Instrumentation Development System (BIDS) is a collection of common support libraries and modules developed during a series of Low-Level Radio Frequency (LLRF) control and timing/synchronization projects. BIDS includes a collection of Hardware Description Language (HDL) libraries and software libraries. The BIDS can be used for the development of any FPGA-based system, such as LLRF controllers. HDL code in this library is generic and supports common Digital Signal Processing (DSP) functions, FPGA-specific drivers (high-speed serial link wrappers, clock generation, etc.), ADC/DAC drivers, Ethernet MAC implementation, etc.

  7. Experimental investigation of hypersonic aerodynamics

    NASA Technical Reports Server (NTRS)

    Heinemann, K.; Intrieri, Peter F.

    1987-01-01

    An extensive series of ballistic range tests is currently being conducted at the Ames Research Center. These tests are intended to investigate the hypersonic aerodynamic characteristics of two basic configurations: the blunt-cone Galileo probe, which is scheduled to be launched in late 1989 and will enter the atmosphere of Jupiter in 1994, and a generic slender-cone configuration intended to provide experimental aerodynamic data, including good flow-field definition, that computational aerodynamicists could use to validate their computer codes. Some of the results obtained thus far are presented and work for the near future is discussed.

  8. tf_unet: Generic convolutional neural network U-Net implementation in Tensorflow

    NASA Astrophysics Data System (ADS)

    Akeret, Joel; Chang, Chihway; Lucchi, Aurelien; Refregier, Alexandre

    2016-11-01

    tf_unet mitigates radio frequency interference (RFI) signals in radio data using a special type of Convolutional Neural Network, the U-Net, that enables the classification of clean signal and RFI signatures in 2D time-ordered data acquired from a radio telescope. The code is not tied to a specific segmentation task and can be used, for example, to detect RFI in radio astronomy or galaxies and stars in wide-field imaging data. This U-Net implementation can outperform classical RFI mitigation algorithms.

  9. QuantumOptics.jl: A Julia framework for simulating open quantum systems

    NASA Astrophysics Data System (ADS)

    Krämer, Sebastian; Plankensteiner, David; Ostermann, Laurin; Ritsch, Helmut

    2018-06-01

    We present an open source computational framework geared towards the efficient numerical investigation of open quantum systems written in the Julia programming language. Built exclusively in Julia and based on standard quantum optics notation, the toolbox offers speed comparable to low-level statically typed languages, without compromising on the accessibility and code readability found in dynamic languages. After introducing the framework, we highlight its features and showcase implementations of generic quantum models. Finally, we compare its usability and performance to two well-established and widely used numerical quantum libraries.

  10. Retrieval of canopy water content of different crop types with two new hyperspectral indices: Water Absorption Area Index and Depth Water Index

    NASA Astrophysics Data System (ADS)

    Pasqualotto, Nieves; Delegido, Jesús; Van Wittenberghe, Shari; Verrelst, Jochem; Rivera, Juan Pablo; Moreno, José

    2018-05-01

    Crop canopy water content (CWC) is an essential indicator of the crop's physiological state. While a diverse range of vegetation indices have been developed for the remote estimation of CWC, most of them are defined for specific crop types and areas, making them less universally applicable. We propose two new water content indices applicable to a wide variety of crop types, allowing CWC maps to be derived at a large spatial scale. These indices were developed based on PROSAIL simulations and then optimized with an experimental dataset (SPARC03; Barrax, Spain). This dataset consists of water content and other biophysical variables for five common crop types (lucerne, corn, potato, sugar beet and onion) and corresponding top-of-canopy (TOC) reflectance spectra acquired by the hyperspectral HyMap airborne sensor. First, commonly used water content index formulations were analysed and validated for the variety of crops, overall resulting in an R2 lower than 0.6. In an attempt to move towards more generically applicable indices, the two new CWC indices exploit the principal water absorption features in the near-infrared by using multiple bands sensitive to water content. We propose the Water Absorption Area Index (WAAI) as the difference between the area under the null water content TOC reflectance (reference line) simulated with PROSAIL and the area under the measured TOC reflectance between 911 and 1271 nm. We also propose the Depth Water Index (DWI), a simplified four-band index based on the spectral depths produced by water absorption at 970 and 1200 nm and two reference bands. Both the WAAI and the DWI outperform established indices in predicting CWC when applied to heterogeneous croplands, with an R2 of 0.8 and 0.7, respectively, using an exponential fit. However, these indices did not perform well for species with a low fractional vegetation cover (<30%). HyMap CWC maps calculated with both indices are shown for the Barrax region.
The results confirmed the potential of using generically applicable indices for calculating CWC over a great variety of crops.
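The two index ideas can be sketched numerically. This is a hedged reconstruction: the WAAI integrates the gap between a zero-water reference reflectance and the measured reflectance over 911-1271 nm as described, but the reference bands chosen for the DWI here (860 and 1090 nm) and the synthetic spectra are assumptions for illustration, not the paper's exact formulation.

```python
# Illustrative area-based and depth-based water indices on a synthetic spectrum.
import numpy as np

def waai(wavelengths, measured, reference, lo=911.0, hi=1271.0):
    """Trapezoidal area between reference (zero-water) and measured curves."""
    mask = (wavelengths >= lo) & (wavelengths <= hi)
    gap = reference[mask] - measured[mask]
    w = wavelengths[mask]
    return float(np.sum((gap[:-1] + gap[1:]) / 2.0 * np.diff(w)))

def dwi(r860, r970, r1090, r1200):
    """Four-band depth index from the 970 and 1200 nm absorption features
    (reference-band choice is an assumption here)."""
    return (r860 - r970) + (r1090 - r1200)

wl = np.linspace(900.0, 1300.0, 81)
ref = np.full_like(wl, 0.45)                           # flat no-water reference
meas = 0.45 - 0.12 * np.exp(-((wl - 1180.0) / 60.0) ** 2)  # synthetic water dip
print(round(waai(wl, meas, ref), 2))
print(round(dwi(0.45, 0.40, 0.44, 0.34), 2))
```

A deeper water absorption dip enlarges both the integrated area and the band depths, which is the mechanism both indices exploit.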

  11. Predicting subsurface uranium transport: Mechanistic modeling constrained by experimental data

    NASA Astrophysics Data System (ADS)

    Ottman, Michael; Schenkeveld, Walter D. C.; Kraemer, Stephan

    2017-04-01

    Depleted uranium (DU) munitions and their widespread use throughout conflict zones around the world pose a persistent health threat to the inhabitants of those areas long after the conclusion of active combat. However, little emphasis has been put on developing a comprehensive, quantitative tool for use in remediation and hazard avoidance planning in a wide range of environments. In this context, we report experimental data on U interaction with soils and sediments. Here, we strive to improve existing risk assessment modeling paradigms by incorporating a variety of experimental data into a mechanistic U transport model for subsurface environments. 20 different soils and sediments from a variety of environments were chosen to represent a range of geochemical parameters that are relevant to U transport. The parameters included pH, organic matter content, CaCO3, Fe content and speciation, and clay content. pH ranged from 3 to 10, organic matter content from 6 to 120 g kg-1, CaCO3 from 0 to 700 g kg-1, amorphous Fe content from 0.3 to 6 g kg-1 and clay content from 4 to 580 g kg-1. Sorption experiments were then performed, and linear isotherms were constructed. The sorption experiment results show that, among separate sets of sediments and soils, both soil pH and CaCO3 concentration are inversely correlated with U sorptive affinity. The geological materials with the highest and lowest sorptive affinities for U differed in CaCO3 and organic matter concentrations, as well as clay content and pH. In a further step, we are testing whether transport behavior in saturated porous media can be predicted based on adsorption isotherms and generic geochemical parameters, and comparing these modeling predictions with the results from column experiments. The comparison of these two data sets will examine whether U transport can be effectively predicted from reactive transport modeling that incorporates the generic geochemical parameters.
This work will serve to show whether a more mechanistic approach offers an improvement over statistical regression-based risk assessment models.
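
The linear isotherms described above connect to transport prediction through the retardation factor. A minimal sketch of that standard relationship, with hypothetical Kd, bulk density, and porosity values (none taken from the study):

```python
# Sketch (not the authors' model): a linear sorption isotherm S = Kd * C
# implies a retardation factor R = 1 + (rho_b / theta) * Kd, which scales
# how much slower a sorbing solute moves than the groundwater itself.

def retardation_factor(kd_l_per_kg, bulk_density_kg_per_l, porosity):
    """R = 1 + (rho_b / theta) * Kd for a linear isotherm in saturated media."""
    return 1.0 + (bulk_density_kg_per_l / porosity) * kd_l_per_kg

# Illustrative (hypothetical) values: Kd = 5 L/kg, rho_b = 1.6 kg/L, theta = 0.4
R = retardation_factor(5.0, 1.6, 0.4)
print(R)  # the solute front moves roughly 21x slower than the water
```

A non-sorbing tracer (Kd = 0) gives R = 1, i.e., transport at the water velocity.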

  12. The semantic pathfinder: using an authoring metaphor for generic multimedia indexing.

    PubMed

    Snoek, Cees G M; Worring, Marcel; Geusebroek, Jan-Mark; Koelma, Dennis C; Seinstra, Frank J; Smeulders, Arnold W M

    2006-10-01

    This paper presents the semantic pathfinder architecture for generic indexing of multimedia archives. The semantic pathfinder extracts semantic concepts from video by exploring different paths through three consecutive analysis steps, which we derive from the observation that produced video is the result of an authoring-driven process. We exploit this authoring metaphor for machine-driven understanding. The pathfinder starts with the content analysis step. In this analysis step, we follow a data-driven approach of indexing semantics. The style analysis step is the second analysis step. Here, we tackle the indexing problem by viewing a video from the perspective of production. Finally, in the context analysis step, we view semantics in context. The virtue of the semantic pathfinder is its ability to learn the best path of analysis steps on a per-concept basis. To show the generality of this novel indexing approach, we develop detectors for a lexicon of 32 concepts and we evaluate the semantic pathfinder against the 2004 NIST TRECVID video retrieval benchmark, using a news archive of 64 hours. Top ranking performance in the semantic concept detection task indicates the merit of the semantic pathfinder for generic indexing of multimedia archives.

  13. Development of a Coding Instrument to Assess the Quality and Content of Anti-Tobacco Video Games.

    PubMed

    Alber, Julia M; Watson, Anna M; Barnett, Tracey E; Mercado, Rebeccah; Bernhardt, Jay M

    2015-07-01

    Previous research has shown the use of electronic video games as an effective method for increasing content knowledge about the risks of drugs and alcohol use for adolescents. Although best practice suggests that theory, health communication strategies, and game appeal are important characteristics for developing games, no instruments are currently available to examine the quality and content of tobacco prevention and cessation electronic games. This study presents the systematic development of a coding instrument to measure the quality, use of theory, and health communication strategies of tobacco cessation and prevention electronic games. Using previous research and expert review, a content analysis coding instrument measuring 67 characteristics was developed with three overarching categories: type and quality of games, theory and approach, and type and format of messages. Two trained coders applied the instrument to 88 games on four platforms (personal computer, Nintendo DS, iPhone, and Android phone) to field test the instrument. Cohen's kappa for each item ranged from 0.66 to 1.00, with an average kappa value of 0.97. Future research can adapt this coding instrument to games addressing other health issues. In addition, the instrument questions can serve as a useful guide for evidence-based game development.
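
The per-item agreement statistic reported above can be computed with a generic Cohen's kappa implementation; this is a sketch of the standard formula, not the authors' analysis scripts:

```python
# Cohen's kappa: chance-corrected agreement between two coders rating
# the same items (binary or categorical codes).
from collections import Counter

def cohens_kappa(coder1, coder2):
    assert len(coder1) == len(coder2)
    n = len(coder1)
    p_obs = sum(a == b for a, b in zip(coder1, coder2)) / n
    c1, c2 = Counter(coder1), Counter(coder2)
    # Expected chance agreement from each coder's marginal frequencies.
    p_exp = sum(c1[k] * c2[k] for k in c1) / (n * n)
    if p_exp == 1.0:
        return 1.0  # degenerate case: both coders constant and identical
    return (p_obs - p_exp) / (1 - p_exp)

# Example: 10 games coded present/absent by two coders (made-up data).
a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
b = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1]
print(round(cohens_kappa(a, b), 2))
```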

  14. What does music express? Basic emotions and beyond.

    PubMed

    Juslin, Patrik N

    2013-01-01

    Numerous studies have investigated whether music can reliably convey emotions to listeners, and-if so-what musical parameters might carry this information. Far less attention has been devoted to the actual contents of the communicative process. The goal of this article is thus to consider what types of emotional content are possible to convey in music. I will argue that the content is mainly constrained by the type of coding involved, and that distinct types of content are related to different types of coding. Based on these premises, I suggest a conceptualization in terms of "multiple layers" of musical expression of emotions. The "core" layer is constituted by iconically-coded basic emotions. I attempt to clarify the meaning of this concept, dispel the myths that surround it, and provide examples of how it can be heuristic in explaining findings in this domain. However, I also propose that this "core" layer may be extended, qualified, and even modified by additional layers of expression that involve intrinsic and associative coding. These layers enable listeners to perceive more complex emotions-though the expressions are less cross-culturally invariant and more dependent on the social context and/or the individual listener. This multiple-layer conceptualization of expression in music can help to explain both similarities and differences between vocal and musical expression of emotions.

  15. Development of a Coding Instrument to Assess the Quality and Content of Anti-Tobacco Video Games

    PubMed Central

    Alber, Julia M.; Watson, Anna M.; Barnett, Tracey E.; Mercado, Rebeccah

    2015-01-01

Previous research has shown the use of electronic video games as an effective method for increasing content knowledge about the risks of drugs and alcohol use for adolescents. Although best practice suggests that theory, health communication strategies, and game appeal are important characteristics for developing games, no instruments are currently available to examine the quality and content of tobacco prevention and cessation electronic games. This study presents the systematic development of a coding instrument to measure the quality, use of theory, and health communication strategies of tobacco cessation and prevention electronic games. Using previous research and expert review, a content analysis coding instrument measuring 67 characteristics was developed with three overarching categories: type and quality of games, theory and approach, and type and format of messages. Two trained coders applied the instrument to 88 games on four platforms (personal computer, Nintendo DS, iPhone, and Android phone) to field test the instrument. Cohen's kappa for each item ranged from 0.66 to 1.00, with an average kappa value of 0.97. Future research can adapt this coding instrument to games addressing other health issues. In addition, the instrument questions can serve as a useful guide for evidence-based game development. PMID:26167842

  16. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    PubMed

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance videos presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards that were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., relative static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are firstly classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely, the background reference prediction (BRP) that uses the background modeled from the original input frames as the long-term reference and the background difference prediction (BDP) that predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve the prediction efficiency using the higher quality background as the reference; whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting its background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio on surveillance videos as AVC (MPEG-4 Advanced Video Coding) high profile, yet with a slightly additional encoding complexity. Moreover, for the foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.
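
The block classification the method rests on can be illustrated with a generic background-subtraction sketch; the thresholds, averaging weight, and classification rule below are assumptions for illustration, not the paper's BMAP implementation:

```python
# Maintain a running-average background model and classify each block as
# background, hybrid, or foreground from its mean absolute difference (MAD)
# against the model, mirroring the BRP/BDP/foreground split described above.

def update_background(bg, frame, alpha=0.05):
    """Running-average background model, updated pixel by pixel."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

def classify_block(bg, frame, t_low=5.0, t_high=30.0):
    """MAD against the background -> 'background' / 'hybrid' / 'foreground'."""
    n = sum(len(row) for row in frame)
    mad = sum(abs(f - b) for frow, brow in zip(frame, bg)
              for f, b in zip(frow, brow)) / n
    if mad < t_low:
        return "background"   # predict from the modeled background (BRP-like)
    if mad < t_high:
        return "hybrid"       # predict in the background-difference domain (BDP-like)
    return "foreground"

static = [[10.0] * 4] * 4
moving = [[60.0] * 4] * 4
print(classify_block(static, static), classify_block(static, moving))
```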

  17. Topological quantum error correction in the Kitaev honeycomb model

    NASA Astrophysics Data System (ADS)

    Lee, Yi-Chan; Brell, Courtney G.; Flammia, Steven T.

    2017-08-01

    The Kitaev honeycomb model is an approximate topological quantum error correcting code in the same phase as the toric code, but requiring only a 2-body Hamiltonian. As a frustrated spin model, it is well outside the commuting models of topological quantum codes that are typically studied, but its exact solubility makes it more amenable to analysis of effects arising in this noncommutative setting than a generic topologically ordered Hamiltonian. Here we study quantum error correction in the honeycomb model using both analytic and numerical techniques. We first prove explicit exponential bounds on the approximate degeneracy, local indistinguishability, and correctability of the code space. These bounds are tighter than can be achieved using known general properties of topological phases. Our proofs are specialized to the honeycomb model, but some of the methods may nonetheless be of broader interest. Following this, we numerically study noise caused by thermalization processes in the perturbative regime close to the toric code renormalization group fixed point. The appearance of non-topological excitations in this setting has no significant effect on the error correction properties of the honeycomb model in the regimes we study. Although the behavior of this model is found to be qualitatively similar to that of the standard toric code in most regimes, we find numerical evidence of an interesting effect in the low-temperature, finite-size regime where a preferred lattice direction emerges and anyon diffusion is geometrically constrained. We expect this effect to yield an improvement in the scaling of the lifetime with system size as compared to the standard toric code.

  18. Using Podcasts to Support Communication Skills Development: A Case Study for Content Format Preferences among Postgraduate Research Students

    ERIC Educational Resources Information Center

    Lawlor, Bob; Donnelly, Roisin

    2010-01-01

    The need for the integration of generic skills training into structured PhD programmes is widely accepted. However, effective integration of such training requires flexible delivery mechanisms which facilitate self-paced and independent learning. A video recording was made of an eminent speaker delivering a 1-h live presentation to a group of 15…

  19. At the Crossroads of Learning and Culture: Identifying a Construct for Effective Computer-Assisted Language Learning for English Language Learners

    ERIC Educational Resources Information Center

    Shaw, Yun

    2010-01-01

    Many of the commercial Computer-Assisted Language Learning (CALL) programs available today typically take a generic approach. This approach standardizes the program so that it can be used to teach any language merely by translating the content from one language to another. These CALL programs rarely consider the cultural background or preferred…

  20. A Generic Framework for Extraction of Knowledge from Social Web Sources (Social Networking Websites) for an Online Recommendation System

    ERIC Educational Resources Information Center

    Sathick, Javubar; Venkat, Jaya

    2015-01-01

    Mining social web data is a challenging task and finding user interest for personalized and non-personalized recommendation systems is another important task. Knowledge sharing among web users has become crucial in determining usage of web data and personalizing content in various social websites as per the user's wish. This paper aims to design a…

  1. Determination of Fe Content of Some Food Items by Flame Atomic Absorption Spectroscopy (FAAS): A Guided-Inquiry Learning Experience in Instrumental Analysis Laboratory

    ERIC Educational Resources Information Center

    Fakayode, Sayo O.; King, Angela G.; Yakubu, Mamudu; Mohammed, Abdul K.; Pollard, David A.

    2012-01-01

    This article presents a guided-inquiry (GI) hands-on determination of Fe in food samples including plantains, spinach, lima beans, oatmeal, Frosted Flakes cereal (generic), tilapia fish, and chicken using flame atomic absorption spectroscopy (FAAS). The utility of the GI experiment, which is part of an instrumental analysis laboratory course,…

  2. Learning in the Age of Global Information Technology: Development of a Generic Architecture for an Advanced Learning Management System

    ERIC Educational Resources Information Center

    Watson, Jason; Ahmed, Pervaiz K.

    2004-01-01

    This paper briefly introduces the trends towards e-learning and amplifies some examples of state of the art systems, pointing out that all of these are, to date, limited by adaptability and shareability of content and that it is necessary for industry to develop and use an inter-operability standard. Uses SCORM specifications to specify the…

  3. Towards a generic procedure for the detection of relevant contaminants from waste electric and electronic equipment (WEEE) in plastic food-contact materials: a review and selection of key parameters.

    PubMed

    Puype, Franky; Samsonek, Jiří; Vilímková, Věra; Kopečková, Šárka; Ratiborská, Andrea; Knoop, Jan; Egelkraut-Holtus, Marion; Ortlieb, Markus; Oppermann, Uwe

    2017-10-01

Recently, traces of brominated flame retardants (BFRs) have been detected in black plastic food-contact materials (FCMs), indicating the presence of recycled plastics, mainly coming from waste electric and electronic equipment (WEEE), as BFRs are among the main additives in electric applications. In order to evaluate the presence of WEEE in plastic FCMs efficiently and preliminarily in situ, a generic procedure has been proposed based on defined parameters, each with an associated importance level. This can be achieved by combining parameters such as overall bromine (Br) and antimony (Sb) content; additive and reactive BFR, rare earth element (REE) and WEEE-relevant elemental content; and additionally polymer purity. In most cases, the WEEE contamination could be confirmed by combining X-ray fluorescence (XRF) spectrometry and thermal desorption/pyrolysis gas chromatography-mass spectrometry (GC-MS) at first. The REE content did not give full confirmation as to the source of contamination; for Sb, however, the opposite held: elevated Sb was accompanied by elevated Br signals. Therefore, Br first, followed by Sb, were used as WEEE indicators, as both elements are used as synergistic flame-retardant systems. WEEE-specific REEs could be used for small WEEE (sWEEE) confirmation; however, this parameter should be interpreted with care. The polymer purity by Fourier-transform infrared spectrometry (FTIR) and pyrolysis GC-MS in many cases could not confirm WEEE-specific contamination; however, it can be used for purity measurements and for suspicion of the usage of recycled fractions (WEEE and non-WEEE) as a third-line confirmation.
To the best of our knowledge, the addition of WEEE waste to plastic FCMs is illegal; however, due to a lack of screening mechanisms, such articles still reach the market, and our generic procedure therefore enables quick and effective screening of suspicious samples.

  4. The use of a panel code on high lift configurations of a swept forward wing

    NASA Technical Reports Server (NTRS)

    Scheib, J. S.; Sandlin, D. R.

    1985-01-01

A study was done on high lift configurations of a generic swept forward wing using a panel code prediction method. A survey was done of existing codes available at Ames, from which the program VSAERO was chosen. The results of VSAERO were compared with data obtained from the Ames 7- by 10-foot wind tunnel. The results of the comparison in lift were good (within 3.5%). The comparison of the pressure coefficients was also good. The pitching moment coefficients obtained by VSAERO were not in good agreement with experiment. VSAERO's ability to predict drag is questionable and cannot be counted on for accurate trends. Further studies were done on the effects of a leading edge glove, canards, leading edge sweeps and various wing twists on spanwise loading and trim lift with encouraging results. An unsuccessful attempt was made to model spanwise blowing and boundary layer control on the trailing edge flap. The potential results of VSAERO were compared with experimental data of flap deflections with boundary layer control to check the first order effects.

  5. A generic framework for individual-based modelling and physical-biological interaction

    PubMed Central

    2018-01-01

The increased availability of high-resolution ocean data globally has enabled more detailed analyses of physical-biological interactions and their consequences to the ecosystem. We present IBMlib, which is a versatile, portable and computationally effective framework for conducting Lagrangian simulations in the marine environment. The purpose of the framework is to handle complex individual-level biological models of organisms, combined with a realistic 3D oceanographic model of physics and biogeochemistry describing the environment of the organisms, without assumptions about spatial or temporal scales. The open-source framework features a minimal robust interface to facilitate the coupling between individual-level biological models and oceanographic models, and we provide application examples including forward/backward simulations, habitat connectivity calculations, assessing ocean conditions, comparison of physical circulation models, model ensemble runs and recently posterior Eulerian simulations using the IBMlib framework. We present the code design ideas behind the longevity of the code, our implementation experiences, as well as code performance benchmarking. The framework may contribute substantially to progress in representing, understanding, predicting and eventually managing marine ecosystems. PMID:29351280
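
The Lagrangian kernel such a framework wraps can be sketched minimally: advect particles through a prescribed velocity field with forward-Euler steps. The rotating 2-D flow field here is an assumption for illustration, not IBMlib code:

```python
# Forward-Euler advection of Lagrangian particles through a velocity field.

def velocity(x, y):
    """Solid-body rotation about the origin (illustrative field)."""
    return -y, x

def advect(particles, dt, steps):
    for _ in range(steps):
        particles = [(x + dt * velocity(x, y)[0], y + dt * velocity(x, y)[1])
                     for x, y in particles]
    return particles

# A particle started at (1, 0) sweeps counter-clockwise; dt * steps ~ pi/2,
# so it should end up near (0, 1), with slight Euler drift outward.
path_end = advect([(1.0, 0.0)], dt=0.001, steps=1571)
print(path_end)
```

A production framework would substitute interpolated velocities from an oceanographic model and add the individual-level biology at each step.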

  6. Initial Kernel Timing Using a Simple PIM Performance Model

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Block, Gary L.; Springer, Paul L.; Sterling, Thomas; Brockman, Jay B.; Callahan, David

    2005-01-01

This presentation will describe some initial results of paper-and-pencil studies of 4 or 5 application kernels applied to a processor-in-memory (PIM) system roughly similar to the Cascade Lightweight Processor (LWP). The application kernels are: * Linked list traversal * Sum of leaf nodes on a tree * Bitonic sort * Vector sum * Gaussian elimination The intent of this work is to guide and validate work on the Cascade project in the areas of compilers, simulators, and languages. We will first discuss the generic PIM structure. Then, we will explain the concepts needed to program a parallel PIM system (locality, threads, parcels). Next, we will present a simple PIM performance model that will be used in the remainder of the presentation. For each kernel, we will then present a set of codes, including codes for a single PIM node, and codes for multiple PIM nodes that move data to threads and move threads to data. These codes are written at a fairly low level, between assembly and C, but much closer to C than to assembly. For each code, we will present some hand-drafted timing forecasts, based on the simple PIM performance model. Finally, we will conclude by discussing what we have learned from this work, including what programming styles seem to work best, from the point-of-view of both expressiveness and performance.

  7. Development of a 3-D upwind PNS code for chemically reacting hypersonic flowfields

    NASA Technical Reports Server (NTRS)

    Tannehill, J. C.; Wadawadigi, G.

    1992-01-01

    Two new parabolized Navier-Stokes (PNS) codes were developed to compute the three-dimensional, viscous, chemically reacting flow of air around hypersonic vehicles such as the National Aero-Space Plane (NASP). The first code (TONIC) solves the gas dynamic and species conservation equations in a fully coupled manner using an implicit, approximately-factored, central-difference algorithm. This code was upgraded to include shock fitting and the capability of computing the flow around complex body shapes. The revised TONIC code was validated by computing the chemically-reacting (M(sub infinity) = 25.3) flow around a 10 deg half-angle cone at various angles of attack and the Ames All-Body model at 0 deg angle of attack. The results of these calculations were in good agreement with the results from the UPS code. One of the major drawbacks of the TONIC code is that the central-differencing of fluxes across interior flowfield discontinuities tends to introduce errors into the solution in the form of local flow property oscillations. The second code (UPS), originally developed for a perfect gas, has been extended to permit either perfect gas, equilibrium air, or nonequilibrium air computations. The code solves the PNS equations using a finite-volume, upwind TVD method based on Roe's approximate Riemann solver that was modified to account for real gas effects. The dissipation term associated with this algorithm is sufficiently adaptive to flow conditions that, even when attempting to capture very strong shock waves, no additional smoothing is required. For nonequilibrium calculations, the code solves the fluid dynamic and species continuity equations in a loosely-coupled manner. This code was used to calculate the hypersonic, laminar flow of chemically reacting air over cones at various angles of attack. 
In addition, the flow around the McDonnell Douglas generic option blended-wing-body was computed and comparisons were made between the perfect gas, equilibrium air, and the nonequilibrium air results.

  8. TOUGH+ v1.5 Core Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moridis, George J.

TOUGH+ v1.5 is a numerical code for the simulation of multi-phase, multi-component flow and transport of mass and heat through porous and fractured media, and represents the third update of the code since its first release [Moridis et al., 2008]. TOUGH+ is a successor to the TOUGH2 [Pruess et al., 1991; 2012] family of codes for multi-component, multiphase fluid and heat flow developed at the Lawrence Berkeley National Laboratory. It is written in standard FORTRAN 95/2003, and can be run on any computational platform (workstations, PC, Macintosh). TOUGH+ v1.5 employs dynamic memory allocation, thus minimizing storage requirements. It has a completely modular structure, follows the tenets of Object-Oriented Programming (OOP), and involves the advanced features of FORTRAN 95/2003, i.e., modules, derived data types, the use of pointers, lists and trees, data encapsulation, defined operators and assignments, operator extension and overloading, use of generic procedures, and maximum use of the powerful intrinsic vector and matrix processing operations. TOUGH+ v1.5 is the core code for its family of applications, i.e., the part of the code that is common to all its applications. It provides a description of the underlying physics and thermodynamics of non-isothermal flow, of the mathematical and numerical approaches, as well as a detailed explanation of the general (common to all applications) input requirements, options, capabilities and output specifications. The core code cannot run by itself: it needs to be coupled with the code for the specific TOUGH+ application option that describes a particular type of problem. The additional input requirements specific to particular TOUGH+ application options and related illustrative examples can be found in the corresponding User's Manual.

  9. "Hour of Code": A Case Study

    ERIC Educational Resources Information Center

    Du, Jie; Wimmer, Hayden; Rada, Roy

    2018-01-01

    This study investigates the delivery of the "Hour of Code" tutorials to college students. The college students who participated in this study were surveyed about their opinion of the Hour of Code. First, the students' comments were discussed. Next, a content analysis of the offered tutorials highlights their reliance on visual…

  10. News media coverage of medication research: reporting pharmaceutical company funding and use of generic medication names.

    PubMed

    Hochman, Michael; Hochman, Steven; Bor, David; McCormick, Danny

    2008-10-01

The news media are an important source of information about medical research for patients and even some physicians. Little is known about how frequently news articles report when medication research has received funding from pharmaceutical companies or how frequently news articles use generic vs brand medication names. To assess the reporting of pharmaceutical company funding and generic medication name use in news articles about medication studies and to determine the views of newspaper editors about these issues. We reviewed US news articles from newspaper and online sources about all pharmaceutical company-funded medication studies published in the 5 most prominent general medical journals between April 1, 2004, and April 30, 2008. We also surveyed editors at the 100 most widely circulated newspapers in the United States. The percentage of news articles indicating when studies have been pharmaceutical company-funded and the percentage that refer to medications by their generic vs brand names. Also the percentage of newspaper editors who indicate that their articles report pharmaceutical company funding; the percentage of editors who indicate that their articles refer to medications by generic names; and the percentage of newspapers with policies about these issues. Of the 306 news articles about medication research identified, 130 (42%; 95% confidence interval [CI], 37%-48%) did not report that the research had received company funding. Of the 277 of these articles reporting on medications with both generic and brand names, 186 (67%; 95% CI, 61%-73%) referred to the study medications by their brand names in at least half of the medication references.
Eighty-two of the 93 (88%) newspaper editors who responded to our survey reported that articles from their publications always or often indicated when studies had received company funding (95% CI, 80%-94%), and 71 of 92 (77%) responding editors also reported that articles from their publications always or often referred to medications by the generic names (95% CI, 67%-85%). However, only 3 of 92 newspapers (3%) had written policies stating that company funding sources of medical studies be reported (95% CI, 1%-9%), and 2 of 93 (2%) newspapers had written policies stating that medications should be referred to by their generic names (95% CI, 1%-8%). News articles reporting on medication studies often fail to report pharmaceutical company funding and frequently refer to medications by their brand names despite newspaper editors' contention that this is not the case.
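
Intervals like "42%; 95% CI, 37%-48%" can be reproduced with a normal-approximation (Wald) confidence interval for a proportion; the Wald form is an assumption here, as the article does not state which method was used:

```python
# 95% confidence interval for a proportion, normal approximation:
# p +/- z * sqrt(p * (1 - p) / n), with z = 1.96 for 95% coverage.
import math

def proportion_ci(successes, n, z=1.96):
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

# 130 of 306 articles omitted funding: reproduces the reported 42% (37%-48%).
lo, hi = proportion_ci(130, 306)
print(f"{130/306:.0%} (95% CI, {lo:.0%}-{hi:.0%})")
```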

  11. Applying a rateless code in content delivery networks

    NASA Astrophysics Data System (ADS)

    Suherman; Zarlis, Muhammad; Parulian Sitorus, Sahat; Al-Akaidi, Marwan

    2017-09-01

Content delivery networks (CDNs) allow internet providers to locate their services and to map their coverage into networks without necessarily owning them. CDNs are part of the current internet infrastructure, supporting multi-server applications, especially social media. Various works have been proposed to improve CDN performance. Since accesses on social media servers tend to be short but frequent, adding redundancy to the transmitted packets, so that lost packets do not degrade information integrity, may improve service performance. This paper examines the implementation of a rateless code in the CDN infrastructure. The NS-2 evaluations show that the rateless code is able to reduce packet loss by up to 50%.
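
The rateless (fountain-code) idea can be sketched in miniature: encoded symbols are XORs of subsets of source blocks, and a peeling decoder recovers the source from any sufficiently informative set of symbols. The hand-picked symbol set below is illustrative only, not the paper's scheme:

```python
# Toy fountain-code decoder: repeatedly find a symbol whose coverage set
# reduces to a single unknown block, recover it, and substitute it back.

def peel_decode(symbols, k):
    """symbols: list of (set_of_source_indices, xor_value). Returns k blocks."""
    symbols = [(set(idx), val) for idx, val in symbols]
    known = {}
    progress = True
    while len(known) < k and progress:
        progress = False
        for idx, val in symbols:
            remaining = idx - known.keys()
            reduced = val
            for i in idx & known.keys():
                reduced ^= known[i]          # subtract already-recovered blocks
            if len(remaining) == 1:          # degree-1 symbol: recover a block
                (i,) = remaining
                if i not in known:
                    known[i] = reduced
                    progress = True
    return [known[i] for i in range(k)]

blocks = [0x41, 0x42, 0x43]                  # three source blocks
encoded = [({0}, 0x41), ({0, 1}, 0x41 ^ 0x42), ({1, 2}, 0x42 ^ 0x43)]
print(peel_decode(encoded, 3))
```

A real rateless code (e.g., LT or Raptor) draws the coverage sets from a degree distribution so that a receiver can decode from any slightly-more-than-k symbols, whichever ones survive the lossy path.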

  12. [On-site quality control of acupuncture randomized controlled trial: design of content and checklist of quality control based on PICOST].

    PubMed

    Li, Hong-Jiao; He, Li-Yun; Liu, Zhi-Shun; Sun, Ya-Nan; Yan, Shi-Yan; Liu, Jia; Zhao, Ye; Liu, Bao-Yan

    2014-02-01

To effectively guarantee the quality of randomized controlled trials (RCTs) of acupuncture and to develop a reasonable content and checklist for on-site quality control, factors influencing the quality of acupuncture RCTs are analyzed, with overall consideration given to the scientific soundness of the quality control content and the feasibility of on-site implementation. Based on the content and checklist of on-site quality control in the National 11th Five-Year Plan Project "Optimization of Comprehensive Treatment Plan for TCM in Prevention and Treatment of Serious Disease and Clinical Assessment on Generic Technology and Quality Control Research", it is proposed that on-site quality control of acupuncture RCTs should be conducted with PICOST (patient, intervention, comparison, outcome, site and time) at its core, with particular attention to quality control of interveners' skills and blinded outcome assessment, and a checklist of on-site quality control is developed to provide references for the groups undertaking the project.

  13. Detecting well-being via computerized content analysis of brief diary entries.

    PubMed

    Tov, William; Ng, Kok Leong; Lin, Han; Qiu, Lin

    2013-12-01

    Two studies evaluated the correspondence between self-reported well-being and codings of emotion and life content by the Linguistic Inquiry and Word Count (LIWC; Pennebaker, Booth, & Francis, 2011). Open-ended diary responses were collected from 206 participants daily for 3 weeks (Study 1) and from 139 participants twice a week for 8 weeks (Study 2). LIWC negative emotion consistently correlated with self-reported negative emotion. LIWC positive emotion correlated with self-reported positive emotion in Study 1 but not in Study 2. No correlations were observed with global life satisfaction. Using a co-occurrence coding method to combine LIWC emotion codings with life-content codings, we estimated the frequency of positive and negative events in 6 life domains (family, friends, academics, health, leisure, and money). Domain-specific event frequencies predicted self-reported satisfaction in all domains in Study 1 but not consistently in Study 2. We suggest that the correspondence between LIWC codings and self-reported well-being is affected by the number of writing samples collected per day as well as the target period (e.g., past day vs. past week) assessed by the self-report measure. Extensions and possible implications for the analyses of similar types of open-ended data (e.g., social media messages) are discussed. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  14. Mobile Code: The Future of the Internet

    DTIC Science & Technology

    1999-01-01

…code (mobile agents) to multiple proxies or servers … "Customization" (e.g., re-formatting, filtering, metasearch) … information overload … diversified … Mobile code is necessary, rather than client-side code, since many customization features (such as information monitoring) do not work if the … economic foundation for Web sites; many Web sites earn money solely from advertisements. If these sites allow mobile agents to easily access the content …

  15. 3-D inelastic analysis methods for hot section components (base program). [turbine blades, turbine vanes, and combustor liners

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Bak, M. J.; Nakazawa, S.; Banerjee, P. K.

    1984-01-01

    A 3-D inelastic analysis methods program consists of a series of computer codes embodying a progression of mathematical models (mechanics of materials, special finite element, boundary element) for streamlined analysis of combustor liners, turbine blades, and turbine vanes. These models address the effects of high temperatures and thermal/mechanical loadings on the local (stress/strain) and global (dynamics, buckling) structural behavior of the three selected components. These models are used to solve 3-D inelastic problems using linear approximations in the sense that stresses/strains and temperatures in generic modeling regions are linear functions of the spatial coordinates, and solution increments for load, temperature and/or time are extrapolated linearly from previous information. Three linear formulation computer codes, referred to as MOMM (Mechanics of Materials Model), MHOST (MARC-Hot Section Technology), and BEST (Boundary Element Stress Technology), were developed and are described.

  16. Vectors a Fortran 90 module for 3-dimensional vector and dyadic arithmetic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brock, B.C.

    1998-02-01

    A major advance contained in the new Fortran 90 language standard is the ability to define new data types and the operators associated with them. Writing computer code to implement computations with real and complex three-dimensional vectors and dyadics is greatly simplified if the equations can be implemented directly, without the need to code the vector arithmetic explicitly. The Fortran 90 module described here defines new data types for real and complex 3-dimensional vectors and dyadics, along with the common operations needed to work with these objects. Routines to allow convenient initialization and output of the new types are also included. In keeping with the philosophy of data abstraction, the details of the implementation of the data types are maintained private, and the functions and operators are made generic to simplify the combining of real, complex, single- and double-precision vectors and dyadics.
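    The operator-defining idea the abstract describes can be sketched in Python rather than Fortran 90; the `Vector3` class below is an illustrative analogue invented here, not the module's actual interface:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Vector3:
    """A 3-D vector with overloaded arithmetic, analogous to a derived type
    with user-defined operators in Fortran 90."""
    x: float
    y: float
    z: float

    def __add__(self, other):
        return Vector3(self.x + other.x, self.y + other.y, self.z + other.z)

    def __sub__(self, other):
        return Vector3(self.x - other.x, self.y - other.y, self.z - other.z)

    def dot(self, other):
        return self.x * other.x + self.y * other.y + self.z * other.z

    def cross(self, other):
        return Vector3(self.y * other.z - self.z * other.y,
                       self.z * other.x - self.x * other.z,
                       self.x * other.y - self.y * other.x)

# Equations can now be written directly in vector form:
a = Vector3(1.0, 0.0, 0.0)
b = Vector3(0.0, 1.0, 0.0)
print(a.cross(b))  # Vector3(x=0.0, y=0.0, z=1.0)
```

    The module's private implementation details correspond to hiding these fields behind the derived type; the point that carries over is that equations are coded as written, without element-by-element arithmetic.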

  17. The application of simulation modeling to the cost and performance ranking of solar thermal power plants

    NASA Technical Reports Server (NTRS)

    Rosenberg, L. S.; Revere, W. R.; Selcuk, M. K.

    1981-01-01

    A computer simulation code was employed to evaluate several generic types of solar power systems (up to 10 MWe). Details of the simulation methodology and the solar plant concepts are given along with cost and performance results. The Solar Energy Simulation computer code (SESII) was used, which optimizes the size of the collector field and energy storage subsystem for given engine-generator and energy-transport characteristics. Nine plant types were examined which employed combinations of different technology options, such as: distributed or central receivers with one- or two-axis tracking or no tracking; point- or line-focusing concentrators; central or distributed power conversion; Rankine, Brayton, or Stirling thermodynamic cycles; and thermal or electrical storage. Optimal cost curves were plotted as a function of levelized busbar energy cost and annualized plant capacity. Point-focusing distributed receiver systems were found to be most efficient (17-26 percent).

  18. A Design for Composing and Extending Vehicle Models

    NASA Technical Reports Server (NTRS)

    Madden, Michael M.; Neuhaus, Jason R.

    2003-01-01

    The Systems Development Branch (SDB) at NASA Langley Research Center (LaRC) creates simulation software products for research. Each product consists of an aircraft model with experiment extensions. SDB treats its aircraft models as reusable components, upon which experiments can be built. SDB has evolved aircraft model design with the following goals: 1. Avoid polluting the aircraft model with experiment code. 2. Discourage the copy and tailor method of reuse. The current evolution of that architecture accomplishes these goals by reducing experiment creation to extend and compose. The architecture mechanizes the operational concerns of the model's subsystems and encapsulates them in an interface inherited by all subsystems. Generic operational code exercises the subsystems through the shared interface. An experiment is thus defined by the collection of subsystems that it creates ("compose"). Teams can modify the aircraft subsystems for the experiment using inheritance and polymorphism to create variants ("extend").
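    The "extend and compose" pattern the abstract describes can be sketched in Python; the subsystem and experiment names below are invented for illustration and are not SDB's actual code:

```python
class Subsystem:
    """Shared operational interface inherited by all model subsystems."""
    def update(self, dt):
        raise NotImplementedError

class Engine(Subsystem):
    """A baseline aircraft-model subsystem."""
    def __init__(self):
        self.thrust = 0.0

    def update(self, dt):
        self.thrust = 100.0  # placeholder baseline dynamics

class BoostedEngine(Engine):
    """An experiment variant created through inheritance ("extend")."""
    def update(self, dt):
        super().update(dt)
        self.thrust *= 1.5  # experiment-specific modification

class Experiment:
    """An experiment is defined by the subsystems it creates ("compose")."""
    def __init__(self, subsystems):
        self.subsystems = subsystems

    def step(self, dt):
        # Generic operational code exercises every subsystem through the
        # shared interface only; the aircraft model stays experiment-free.
        for s in self.subsystems:
            s.update(dt)

baseline = Experiment([Engine()])
variant = Experiment([BoostedEngine()])
for exp in (baseline, variant):
    exp.step(0.01)
print(baseline.subsystems[0].thrust, variant.subsystems[0].thrust)  # 100.0 150.0
```

    Because the experiment only composes and extends, no experiment code is copied into, or pollutes, the reusable aircraft model.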

  19. Interactive three-dimensional visualization and creation of geometries for Monte Carlo calculations

    NASA Astrophysics Data System (ADS)

    Theis, C.; Buchegger, K. H.; Brugger, M.; Forkel-Wirth, D.; Roesler, S.; Vincke, H.

    2006-06-01

    The implementation of three-dimensional geometries for the simulation of radiation transport problems is a very time-consuming task. Each particle transport code supplies its own scripting language and syntax for creating the geometries. All of them are based on the Constructive Solid Geometry scheme requiring textual description. This makes the creation a tedious and error-prone task, which is especially hard to master for novice users. The Monte Carlo code FLUKA comes with built-in support for creating two-dimensional cross-sections through the geometry, and FLUKACAD, a custom-built converter to the commercial Computer Aided Design package AutoCAD, exists for 3D visualization. For other codes, like MCNPX, a couple of different tools are available, but they are often specifically tailored to the particle transport code and its approach to implementing geometries. Complex constructive solid modeling usually requires very fast and expensive special purpose hardware, which is not widely available. In this paper SimpleGeo is presented, an implementation of a generic, versatile, interactive geometry modeler using off-the-shelf hardware. It runs on Windows, with a Linux version under preparation. This paper describes its functionality, which allows for rapid interactive visualization as well as generation of three-dimensional geometries, and also discusses critical issues regarding common CAD systems.
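    The point-membership classification at the heart of Constructive Solid Geometry can be sketched in a few lines. This toy Python version (sphere primitives plus Boolean combinators, all invented here for illustration) is not SimpleGeo's implementation; a real modeler evaluates such trees interactively over whole meshes:

```python
# A CSG solid is modeled as a predicate answering "is this point inside?"
def sphere(cx, cy, cz, r):
    return lambda x, y, z: (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r

def union(a, b):
    return lambda x, y, z: a(x, y, z) or b(x, y, z)

def intersection(a, b):
    return lambda x, y, z: a(x, y, z) and b(x, y, z)

def difference(a, b):
    return lambda x, y, z: a(x, y, z) and not b(x, y, z)

# A hollow shell: outer sphere minus inner sphere.
shell = difference(sphere(0, 0, 0, 2), sphere(0, 0, 0, 1))
print(shell(1.5, 0, 0), shell(0.5, 0, 0), shell(3.0, 0, 0))  # True False False
```

    Transport codes express the same idea textually (regions as Boolean combinations of primitive bodies), which is exactly the description an interactive modeler can build and visualize instead.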

  20. Analysis of unmitigated large break loss of coolant accidents using MELCOR code

    NASA Astrophysics Data System (ADS)

    Pescarini, M.; Mascari, F.; Mostacci, D.; De Rosa, F.; Lombardo, C.; Giannetti, F.

    2017-11-01

    In the framework of severe accident research activity developed by ENEA, a MELCOR nodalization of a generic 900 MWe Pressurized Water Reactor has been developed. The aim of this paper is to present the analysis of MELCOR code calculations concerning two independent unmitigated large break loss of coolant accident transients occurring in this type of reactor. In particular, the analysis and comparison of the transients initiated by an unmitigated double-ended cold leg rupture and an unmitigated double-ended hot leg rupture in loop 1 of the primary cooling system is presented herein. This activity has focused specifically on the in-vessel phenomenology that characterizes this kind of accident. The analysis of the thermal-hydraulic transient phenomena and the core degradation phenomena is therefore presented here. The analysis of the calculated data shows the capability of the code to reproduce the phenomena typical of these transients and permits their phenomenological study. A first sequence of main events is presented and shows that the cold leg break transient proceeds faster than the hot leg break transient because of the position of the break. Further analyses are in progress to quantitatively assess the results of the code nodalization for accident management strategy definition and fission product source term evaluation.

  1. Catalog to families, genera, and species of orders Actiniaria and Corallimorpharia (Cnidaria: Anthozoa).

    PubMed

    Fautin, Daphne Gail

    2016-08-01

    This book inventories all available (and some unavailable) names in the family, genus, and species groups of extant members of orders Actiniaria and Corallimorpharia [cnidarian subclass Hexacorallia (Zoantharia) of class Anthozoa], providing a benchmark of names, their status, and taxon membership. I have attempted to make the compilation complete as of 2010; some names created after 2010 are included. The book is derived from a database I compiled that was available through a website. Most of the book is from the literature that defines taxa and documents their geographic distribution-primarily publications on nomenclature, taxonomy, and biogeography, but also some on ecology, pharmacology, reproductive biology, physiology, etc. of anemones (the common name for these groups); the reference section comprises 845 entries. As for previous anemone catalogs, this contains taxonomic as well as nomenclatural information,  the former based on subjective opinion of working biologists, the latter objectively verifiable and unchanging (except by action of the International Commission on Zoological Nomenclature).        Each family-group name, genus-group name, and original combination for species-group names has an entry. The entry contains the bibliographic reference to the publication in which each name was made available. This book contains for Corallimorpharia seven family names (four considered valid [57%]), 20 generic names (10 considered valid [50%] and one unavailable), and 65 species names (46 considered valid [70%]). It contains for Actiniaria 86 family names (50 considered valid [58%] and three unavailable), 447 generic names (264 considered valid [59%] and two unavailable), and 1427 species names (1101 considered valid [77%] and nine unavailable). 
Type specimens are inventoried from more than 50 natural history museums in Africa, Australia, Europe, New Zealand, and North America, including those with the largest collections of anemones; the geographic sources of specimens that were the bases of new names are identified. I resolve some nomenclatural issues, acting as First Reviser. A few taxonomic opinions are published for the first time. I have been unable to resolve a small number of problematic names having both nomenclatural and taxonomic problems. Molecular phylogenetic analyses are changing assignment of genera to families and species to genera. Systematics may change, but the basics of nomenclature remain unchanged in face of such alterations.        All actions are in accord with the principles of nomenclature enunciated in the International Code of Zoological Nomenclature. These include the type concept, the Principle of Coordination, and the Principle of Priority. Nomenclatural acts include the creation of new replacement names; seven actiniarian generic names and one species name that are junior homonyms but have been treated as valid are replaced and an eighth new genus name is created. I designate type species for two genera. Except for published misspellings, names are rendered correctly according to the International Code of Zoological Nomenclature; I have altered spelling of some species names to conform to orthographic regulations. I place several species that had been assigned to genera now considered junior synonyms in the genus to which the type species was moved; experts on these anemones should determine whether those generic placements, which follow the nomenclatural rules, are taxonomically appropriate.        This inventory can be a useful starting point in assembling the literature and trying to understand the rationale for the creation and use of names for the taxonomic matters yet to be resolved.  Some nomenclatural conundra will not be resolved until taxonomic uncertainties are. 
A taxonomist familiar with the animals needs to ascertain whether the published synonymies are justified. If so, the senior synonym should be used, which, in many instances, will involve determining the proper generic assignment of the species and the correct rendering of the name; if changing the name would be disruptive, retaining the junior name would require an appeal to the Commission (Code Article 23.11).

  2. Content Analysis Coding Schemes for Online Asynchronous Discussion

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa

    2011-01-01

    Purpose: Researchers commonly utilize coding-based analysis of classroom asynchronous discussion contributions as part of studies of online learning and instruction. However, this analysis is inconsistent from study to study with over 50 coding schemes and procedures applied in the last eight years. The aim of this article is to provide a basis…

  3. Doclet To Synthesize UML

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Osborne, Richard N.

    2005-01-01

    The RoseDoclet computer program extends the capability of Java doclet software to automatically synthesize Unified Modeling Language (UML) content from Java language source code. [Doclets are Java-language programs that use the doclet application programming interface (API) to specify the content and format of the output of Javadoc. Javadoc is a program, originally designed to generate API documentation from Java source code, now also useful as an extensible engine for processing Java source code.] RoseDoclet takes advantage of Javadoc comments and tags already in the source code to produce a UML model of that code. RoseDoclet applies the doclet API to create a doclet passed to Javadoc. The Javadoc engine applies the doclet to the source code, emitting the output format specified by the doclet. RoseDoclet emits a Rose model file and populates it with fully documented packages, classes, methods, variables, and class diagrams identified in the source code. The way in which UML models are generated can be controlled by use of new Javadoc comment tags that RoseDoclet provides. The advantage of using RoseDoclet is that Javadoc documentation becomes leveraged for two purposes: documenting the as-built API and keeping the design documentation up to date.

  4. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  5. Intercultural communication through the eyes of patients: experiences and preferences.

    PubMed

    Paternotte, Emma; van Dulmen, Sandra; Bank, Lindsay; Seeleman, Conny; Scherpbier, Albert; Scheele, Fedde

    2017-05-16

    To explore patients' preferences and experiences regarding intercultural communication which could influence the development of intercultural patient-centred communication training. This qualitative study is based on interviews with non-native patients. Thirty non-native patients were interviewed between September and December 2015 about their preferences and experiences regarding communication with a native Dutch doctor. Fourteen of the interviews were conducted with an interpreter. The semi-structured interviews took place in Amsterdam. They focused on the generic and intercultural communication skills of doctors. Relevant fragments were coded by two researchers and analysed by the research team by means of thematic network analysis. Informed consent and ethical approval were obtained beforehand. All patients preferred a doctor with a professional patient-centred attitude regardless of the doctor's background. Patients mentioned mainly generic communication aspects, such as listening, as important skills and seemed to be aware of their own responsibility in participating in a consultation. Being treated as a unique person and not as a disease was also frequently mentioned. Unfamiliarity with the Dutch healthcare system influenced the experienced communication negatively. However, a language barrier was considered the most important problem, which would become less pressing once a doctor-patient relation was established. Remarkably, patients in this study had no preference regarding the ethnic background of the doctor. Generic communication was experienced as being as important as specific intercultural communication, which underlines the marginal distinction between the two. A close link between intercultural communication and patient-centred communication was reflected in the expressed preference 'to be treated as a person'.

  6. An Automated Tool for Supporting FMEAs of Digital Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue, M.; Chu, T.-L.; Martinez-Guridi, G.

    2008-09-07

    Although designs of digital systems can be very different from each other, they typically use many of the same types of generic digital components. Determining the impacts of the failure modes of these generic components on a digital system can be used to support development of a reliability model of the system. A novel approach was proposed for such a purpose by decomposing the system into a level of the generic digital components and propagating failure modes to the system level, which generally is time-consuming and difficult to implement. To overcome the associated issues of implementing the proposed FMEA approach, an automated tool for a digital feedwater control system (DFWCS) has been developed in this study. The automated FMEA tool is essentially a simulation platform developed by using or recreating the original source code of the different software modules, interfaced by input and output variables that represent physical signals exchanged between modules, the system, and the controlled process. For any given failure mode, its impacts on associated signals are determined first, and the variables that correspond to these signals are modified accordingly by the simulation. Criteria are also developed, as part of the simulation platform, to determine whether the system has lost its automatic control function, which is defined as a system failure in this study. The conceptual development of the automated FMEA support tool can be generalized and applied to support FMEAs for reliability assessment of complex digital systems.
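    The fault-injection loop described above can be caricatured in a few lines of Python. The controller, the failure modes, and the system-failure criterion below are all invented for illustration and are far simpler than the DFWCS platform:

```python
def controller(sensor_level):
    """Toy feedwater controller: command the valve open when level is low."""
    if sensor_level is None:          # lost signal
        return None
    return 1.0 if sensor_level < 50.0 else 0.0

def inject_failure(signal, mode):
    """Modify a signal variable to reflect a component failure mode."""
    if mode == "loss":
        return None
    if mode == "stuck_high":
        return 100.0
    return signal

def system_failed(valve_cmd, true_level):
    """Failure criterion: the automatic control function is lost."""
    if valve_cmd is None:
        return True
    # Under-feeding: the level is actually low but the valve stays shut.
    return true_level < 50.0 and valve_cmd == 0.0

true_level = 30.0  # the process is actually low
for mode in (None, "loss", "stuck_high"):
    sensed = inject_failure(true_level, mode)
    print(mode, system_failed(controller(sensed), true_level))
```

    The real tool does this over the modules' recreated source code and every generic-component failure mode, but the shape of the analysis, inject, simulate, check the failure criterion, is the same.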

  7. What does music express? Basic emotions and beyond

    PubMed Central

    Juslin, Patrik N.

    2013-01-01

    Numerous studies have investigated whether music can reliably convey emotions to listeners, and—if so—what musical parameters might carry this information. Far less attention has been devoted to the actual contents of the communicative process. The goal of this article is thus to consider what types of emotional content are possible to convey in music. I will argue that the content is mainly constrained by the type of coding involved, and that distinct types of content are related to different types of coding. Based on these premises, I suggest a conceptualization in terms of “multiple layers” of musical expression of emotions. The “core” layer is constituted by iconically-coded basic emotions. I attempt to clarify the meaning of this concept, dispel the myths that surround it, and provide examples of how it can be heuristic in explaining findings in this domain. However, I also propose that this “core” layer may be extended, qualified, and even modified by additional layers of expression that involve intrinsic and associative coding. These layers enable listeners to perceive more complex emotions—though the expressions are less cross-culturally invariant and more dependent on the social context and/or the individual listener. This multiple-layer conceptualization of expression in music can help to explain both similarities and differences between vocal and musical expression of emotions. PMID:24046758

  8. Assessing Teachers' Science Content Knowledge: A Strategy for Assessing Depth of Understanding

    NASA Astrophysics Data System (ADS)

    McConnell, Tom J.; Parker, Joyce M.; Eberhardt, Jan

    2013-06-01

    One of the characteristics of effective science teachers is a deep understanding of science concepts. The ability to identify, explain and apply concepts is critical in designing, delivering and assessing instruction. Because some teachers have not completed extensive courses in some areas of science, especially in middle and elementary grades, many professional development programs attempt to strengthen teachers' content knowledge. Assessing this content knowledge is challenging. Concept inventories are reliable and efficient, but do not reveal depth of knowledge. Interviews and observations are time-consuming. The Problem Based Learning Project for Teachers implemented a strategy that includes pre-post instruments in eight content strands that permits blind coding of responses and comparison across teachers and groups of teachers. The instruments include two types of open-ended questions that assess both general knowledge and the ability to apply Big Ideas related to specific science topics. The coding scheme is useful in revealing patterns in prior knowledge and learning, and identifying ideas that are challenging or not addressed by learning activities. The strengths and limitations of the scoring scheme are identified through comparison of the findings to case studies of four participating teachers from middle and elementary schools. The cases include examples of coded pre- and post-test responses to illustrate some of the themes seen in teacher learning. The findings raise questions for future investigation that can be conducted using analyses of the coded responses.

  9. Can mutational GC-pressure create new linear B-cell epitopes in herpes simplex virus type 1 glycoprotein B?

    PubMed

    Khrustalev, Vladislav Victorovich

    2009-01-01

    We showed that the GC-content of nucleotide sequences coding for linear B-cell epitopes of herpes simplex virus type 1 (HSV1) glycoprotein B (gB) is higher than the GC-content of sequences coding for epitope-free regions of this glycoprotein (G + C = 73 and 64%, respectively). Linear B-cell epitopes have been predicted in HSV1 gB by the BepiPred algorithm (www.cbs.dtu.dk/services/BepiPred). Proline is an acrophilic amino acid residue (it is usually situated on the surface of protein globules, and so included in linear B-cell epitopes). Indeed, the level of proline is much higher in predicted epitopes of gB than in epitope-free regions (17.8% versus 1.8%). This amino acid is coded by GC-rich codons (CCX) that can be produced due to nucleotide substitutions caused by mutational GC-pressure. GC-pressure will also lead to disappearance of acrophobic phenylalanine, isoleucine, methionine and tyrosine coded by GC-poor codons. Results of our "in-silico directed mutagenesis" showed that single nonsynonymous substitutions in the AT to GC direction in two long epitope-free regions of gB will cause formation of new linear epitopes or elongation of previously existing epitopes flanking these regions in 25% of 539 possible cases. The calculations of GC-content and amino acid content were performed by the CodonChanges algorithm (www.barkovsky.hotmail.ru).
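    The GC-content computation underlying these comparisons is straightforward; the following sketch (not the authors' CodonChanges code) shows it for the proline codon family:

```python
def gc_content(seq):
    """Percent G+C in a nucleotide sequence."""
    seq = seq.upper()
    return 100.0 * sum(seq.count(base) for base in "GC") / len(seq)

# Proline is encoded by the GC-rich codon family CCU/CCC/CCA/CCG ("CCX";
# T stands in for U in the DNA sequence).
proline_codons = ["CCT", "CCC", "CCA", "CCG"]
print({codon: round(gc_content(codon), 1) for codon in proline_codons})
```

    Every proline codon is at least two-thirds G+C, which is why mutational GC-pressure enriches proline and, with it, the predicted linear epitopes.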

  10. Malt Beverage Brand Popularity Among Youth and Youth-Appealing Advertising Content.

    PubMed

    Xuan, Ziming; DeJong, William; Siegel, Michael; Babor, Thomas F

    2017-11-01

    This study examined whether alcohol brands more popular among youth are more likely to have aired television advertisements that violated the alcohol industry's voluntary code by including youth-appealing content. We obtained a complete list of 288 brand-specific beer advertisements broadcast during the National Collegiate Athletic Association (NCAA) men's and women's basketball tournaments from 1999 to 2008. All ads were rated by a panel of health professionals using a modified Delphi method to assess the presence of youth-appealing content in violation of the alcohol industry's voluntary code. The ads represented 23 alcohol brands. The popularity of these brands was operationalized as the brand-specific popularity of youth alcohol consumption in the past 30 days, as determined by a 2011 to 2012 national survey of underage drinkers. Brand-level popularity was used as the exposure variable to predict the odds of having advertisements with youth-appealing content violations. Accounting for other covariates and the clustering of advertisements within brands, increased brand popularity among underage youth was associated with significantly increased odds of having youth-appeal content violations in ads televised during the NCAA basketball tournament games (adjusted odds ratio = 1.70, 95% CI: 1.38, 2.09). Alcohol brands popular among underage drinkers are more likely to air television advertising that violates the industry's voluntary code which proscribes youth-appealing content. Copyright © 2017 by the Research Society on Alcoholism.

  11. Space Shuttle Operations and Infrastructure: A Systems Analysis of Design Root Causes and Effects

    NASA Technical Reports Server (NTRS)

    McCleskey, Carey M.

    2005-01-01

    This NASA Technical Publication explores and documents the nature of Space Shuttle operations and its supporting infrastructure, and addresses fundamental questions often asked of the Space Shuttle program: why does it take so long to turn the Space Shuttle around for flight, and why does it cost so much? Further, the report provides an overview of the cause-and-effect relationships between generic flight and ground system design characteristics and the resulting operations by using actual cumulative maintenance task times as a relative measure of direct work content. In addition, this NASA TP provides an overview of how the Space Shuttle program's operational infrastructure extends and accumulates from these design characteristics. Finally, and most important, the report derives a set of generic needs from which designers can revolutionize space travel from the inside out by developing and maturing more operable and supportable systems.

  12. DEVA: An extensible ontology-based annotation model for visual document collections

    NASA Astrophysics Data System (ADS)

    Jelmini, Carlo; Marchand-Maillet, Stephane

    2003-01-01

    The description of visual documents is a fundamental aspect of any efficient information management system, but the process of manually annotating large collections of documents is tedious and far from perfect. The need for a generic and extensible annotation model therefore arises. In this paper, we present DEVA, an open, generic and expressive multimedia annotation framework. DEVA is an extension of the Dublin Core specification. The model can represent the semantic content of any visual document. It is described in the ontology language DAML+OIL and can easily be extended with external specialized ontologies, adapting the vocabulary to the given application domain. In parallel, we present the Magritte annotation tool, an early prototype that validates the DEVA features. Magritte allows manual annotation of image collections. It is designed with a modular and extensible architecture, which enables the user to dynamically adapt the user interface to specialized ontologies merged into DEVA.

  13. Spatial transform coding of color images.

    NASA Technical Reports Server (NTRS)

    Pratt, W. K.

    1971-01-01

    The application of the transform-coding concept to the coding of color images represented by three primary color planes of data is discussed. The principles of spatial transform coding are reviewed and the merits of various methods of color-image representation are examined. A performance analysis is presented for the color-image transform-coding system. Results of a computer simulation of the coding system are also given. It is shown that, by transform coding, the chrominance content of a color image can be coded with an average of 1.0 bits per element or less without serious degradation. If luminance coding is also employed, the average rate reduces to about 2.0 bits per element or less.
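    The bit-rate asymmetry the abstract reports can be illustrated with a minimal sketch: separate a pixel into luminance and chrominance, then quantize the chrominance with far fewer levels. The BT.601-style conversion coefficients are an assumption chosen for illustration, and the paper's actual system applies a spatial transform to each color plane before quantization, which is omitted here:

```python
def rgb_to_ycbcr(r, g, b):
    """Split an RGB pixel into luminance (Y) and chrominance (Cb, Cr)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128.0
    cr = 0.5 * r - 0.418688 * g - 0.081312 * b + 128.0
    return y, cb, cr

def quantize(value, bits):
    """Uniformly quantize an 8-bit value to the given bit depth."""
    step = 256 // (1 << bits)
    return (int(value) // step) * step + step // 2

y, cb, cr = rgb_to_ycbcr(200, 30, 60)
# Spend 5 bits on luminance but only 2 on each chrominance plane; the
# coarse chrominance reconstruction is often visually acceptable.
print(quantize(y, 5), quantize(cb, 2), quantize(cr, 2))
```

    In the transform domain the same coarse quantization is applied to transform coefficients rather than pixels, which is what brings the chrominance rate down to about 1.0 bit per element.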

  14. Should anti-tobacco media messages be culturally targeted for Indigenous populations? A systematic review and narrative synthesis.

    PubMed

    Gould, Gillian Sandra; McEwen, Andy; Watters, Tracey; Clough, Alan R; van der Zwan, Rick

    2013-07-01

    To summarise published empirical research on culturally targeted anti-tobacco media messages for Indigenous or First Nations people and examine the evidence for the effectiveness of targeted and non-targeted campaigns. Studies were sought describing mass media and new media interventions for tobacco control or smoking cessation in Indigenous or First Nations populations. Studies of any design were included if they reported outcomes of media-based interventions, including cognitions, awareness, recall, intention to quit and quit rates. Two reviewers then independently applied the inclusion criteria, which were met by 21 (5.8%) of the studies found. One author extracted data, with cross-checking by a second. Both independently assessed papers using the Scottish Intercollegiate Guidelines Network (SIGN; quantitative studies) and Daly et al. (qualitative studies). A total of 21 studies were found (4 level 1 randomised controlled trials (RCTs), 11 level 2 studies and 6 qualitative studies) and were combined using narrative synthesis. Eight evaluated anti-tobacco TV or radio campaigns; two assessed US websites; three New Zealand studies examined mobile phone interventions; five evaluated print media; and three evaluated a CD-ROM, a video and an edutainment intervention. Although Indigenous people had good recall of generic anti-tobacco messages, culturally targeted messages were preferred. New Zealand Maori may be less responsive to holistic targeted campaigns, despite their additional benefits, compared to generic fear campaigns. Culturally targeted internet or mobile phone messages appear to be as effective in American Indians and Maori as generic messages in the general population. There is little research comparing the effect of culturally targeted versus generic messages with similar message content in Indigenous people.

  15. Harm reduction in name, but not substance: a comparative analysis of current Canadian provincial and territorial policy frameworks.

    PubMed

    Hyshka, Elaine; Anderson-Baron, Jalene; Karekezi, Kamagaju; Belle-Isle, Lynne; Elliott, Richard; Pauly, Bernie; Strike, Carol; Asbridge, Mark; Dell, Colleen; McBride, Keely; Hathaway, Andrew; Wild, T Cameron

    2017-07-26

    In Canada, funding, administration, and delivery of health services-including those targeting people who use drugs-are primarily the responsibility of the provinces and territories. Access to harm reduction services varies across jurisdictions, possibly reflecting differences in provincial and territorial policy commitments. We examined the quality of current provincial and territorial harm reduction policies in Canada, relative to how well official documents reflect internationally recognized principles and attributes of a harm reduction approach. We employed an iterative search and screening process to generate a corpus of 54 provincial and territorial harm reduction policy documents that were current to the end of 2015. Documents were content-analyzed using a deductive coding framework comprised of 17 indicators that assessed the quality of policies relative to how well they described key population and program aspects of a harm reduction approach. Only two jurisdictions had current provincial-level, stand-alone harm reduction policies; all other documents were focused on either substance use, addiction and/or mental health, or sexually transmitted and/or blood-borne infections. Policies rarely named specific harm reduction interventions and more frequently referred to generic harm reduction programs or services. Only one document met all 17 indicators. Very few documents acknowledged that stigma and discrimination are issues faced by people who use drugs, that not all substance use is problematic, or that people who use drugs are legitimate participants in policymaking. A minority of documents recognized that abstaining from substance use is not required to receive services. Just over a quarter addressed the risk of drug overdose, and even fewer acknowledged the need to apply harm reduction approaches to an array of drugs and modes of use. 
    Current provincial and territorial policies offer few robust characterizations of harm reduction and rarely go beyond rhetorical or generic support for the approach. By endorsing harm reduction in name, but not in substance, provincial and territorial policies may communicate to diverse stakeholders a general lack of support for key aspects of the approach, potentially challenging efforts to expand harm reduction services.

  16. Content Representation in the Human Medial Temporal Lobe

    PubMed Central

    Liang, Jackson C.; Wagner, Anthony D.

    2013-01-01

    Current theories of medial temporal lobe (MTL) function focus on event content as an important organizational principle that differentiates MTL subregions. Perirhinal and parahippocampal cortices may play content-specific roles in memory, whereas hippocampal processing is alternately hypothesized to be content specific or content general. Despite anatomical evidence for content-specific MTL pathways, empirical data for content-based MTL subregional dissociations are mixed. Here, we combined functional magnetic resonance imaging with multiple statistical approaches to characterize MTL subregional responses to different classes of novel event content (faces, scenes, spoken words, sounds, visual words). Univariate analyses revealed that responses to novel faces and scenes were distributed across the anterior–posterior axis of MTL cortex, with face responses distributed more anteriorly than scene responses. Moreover, multivariate pattern analyses of perirhinal and parahippocampal data revealed spatially organized representational codes for multiple content classes, including nonpreferred visual and auditory stimuli. In contrast, anterior hippocampal responses were content general, with less accurate overall pattern classification relative to MTL cortex. Finally, posterior hippocampal activation patterns consistently discriminated scenes more accurately than other forms of content. Collectively, our findings indicate differential contributions of MTL subregions to event representation via a distributed code along the anterior–posterior axis of MTL that depends on the nature of event content. PMID:22275474
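The multivariate pattern analyses mentioned in this abstract can be illustrated with a toy nearest-centroid classifier over simulated "voxel" patterns. This is a minimal sketch of the general MVPA idea only; the feature layout, class labels, and function name are invented for illustration and are not taken from the study.

```python
import numpy as np

def nearest_centroid_classify(train_patterns, train_labels, test_pattern):
    """Assign test_pattern to the class whose mean training pattern
    (centroid) is closest in Euclidean distance."""
    labels = sorted(set(train_labels))
    centroids = {
        c: np.mean([p for p, l in zip(train_patterns, train_labels) if l == c], axis=0)
        for c in labels
    }
    return min(labels, key=lambda c: np.linalg.norm(test_pattern - centroids[c]))

# Toy "voxel" patterns: faces load on the first two features, scenes on the last two.
rng = np.random.default_rng(0)
faces = [np.array([1.0, 1.0, 0.0, 0.0]) + 0.1 * rng.standard_normal(4) for _ in range(20)]
scenes = [np.array([0.0, 0.0, 1.0, 1.0]) + 0.1 * rng.standard_normal(4) for _ in range(20)]
patterns = faces + scenes
labels = ["face"] * 20 + ["scene"] * 20

probe = np.array([0.9, 1.1, 0.1, 0.0])  # face-like pattern
```

A content-general region, in this framing, would be one where the two classes' centroids are too similar for such a classifier to separate reliably.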

  17. Generic Environmental Impact Statement. Air Force Low Altitude Flying Operations

    DTIC Science & Technology

    1990-01-01

    [Abstract not reproducible: the source scan is OCR-garbled. Legible fragments are limited to front matter and an acronym list, e.g., SEL (Sound Exposure Level), SR (Slow Route), STRC (Strategic Training Range Complex), T&E (Threatened and Endangered), TAC (Tactical Air Command).]

  18. Design fabrication and test of graphite/polyimide composite joints and attachments for advanced aerospace vehicles

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Graphite/polyimide (Gr/PI) bolted and bonded joints were investigated. Possible failure modes and the design loads for the four generic joint types are discussed. Preliminary sizing of a type 1 joint, bonded and bolted configuration is described, including assumptions regarding material properties and sizing methodology. A general purpose finite element computer code is described that was formulated to analyze single and double lap joints, with and without tapered adherends, and with user-controlled variable element size arrangements. An initial order of Celion 6000/PMR-15 prepreg was received and characterized.

  19. Real-time operating system for selected Intel processors

    NASA Technical Reports Server (NTRS)

    Pool, W. R.

    1980-01-01

    The rationale for system development is given along with reasons for not using vendor supplied operating systems. Although many system design and performance goals were dictated by problems with vendor supplied systems, other goals surfaced as a result of a design for a custom system able to span multiple projects. System development and management problems and areas that required redesign or major code changes for system implementation are examined as well as the relative successes of the initial projects. A generic description of the actual project is provided and the ongoing support requirements and future plans are discussed.

  20. Lightning Protection Guidelines for Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Goodloe, C. C.

    1999-01-01

    This technical memorandum provides lightning protection engineering guidelines and technical procedures used by the George C. Marshall Space Flight Center (MSFC) Electromagnetics and Aerospace Environments Branch for aerospace vehicles. The overviews illustrate the technical support available to project managers, chief engineers, and design engineers to ensure that aerospace vehicles managed by MSFC are adequately protected from direct and indirect effects of lightning. Generic descriptions of the lightning environment and vehicle protection technical processes are presented. More specific aerospace vehicle requirements for lightning protection design, performance, and interface characteristics are available upon request to the MSFC Electromagnetics and Aerospace Environments Branch, mail code EL23.

  1. RAVE—a Detector-independent vertex reconstruction toolkit

    NASA Astrophysics Data System (ADS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-10-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent the state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available. VERTIGO = "vertex reconstruction toolkit and interface to generic objects".
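The geometric vertex-fitting problem the toolkit addresses can be sketched in its simplest linear form: find the point minimising the summed squared distances to a set of straight-line tracks. This is a least-squares simplification of the Kalman-filter and robust fitters the abstract describes, with invented function names, not RAVE's actual API.

```python
import numpy as np

def fit_vertex(points, directions):
    """Least-squares vertex: the point minimising the summed squared
    distances to straight-line tracks, each given by a point on the
    line and a direction vector (normalised internally)."""
    dim = len(points[0])
    A = np.zeros((dim, dim))
    b = np.zeros(dim)
    for p, d in zip(points, directions):
        d = np.asarray(d, float) / np.linalg.norm(d)
        P = np.eye(dim) - np.outer(d, d)  # projector orthogonal to the track
        A += P
        b += P @ np.asarray(p, float)
    return np.linalg.solve(A, b)

# Two tracks crossing at (1, 2): one horizontal, one vertical.
vtx = fit_vertex([(0.0, 2.0), (1.0, 5.0)], [(1.0, 0.0), (0.0, 1.0)])
```

Robust estimators replace the plain squared distance with a down-weighting of outlier tracks, but the linear core is the same.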

  2. Application of process mapping to understand integration of high risk medicine care bundles within community pharmacy practice.

    PubMed

    Weir, Natalie M; Newham, Rosemary; Corcoran, Emma D; Ali Atallah Al-Gethami, Ashwag; Mohammed Abd Alridha, Ali; Bowie, Paul; Watson, Anne; Bennie, Marion

    2017-11-21

    The Scottish Patient Safety Programme - Pharmacy in Primary Care collaborative is a quality improvement initiative adopting the Institute of Healthcare Improvement Breakthrough Series collaborative approach. The programme developed and piloted High Risk Medicine (HRM) Care Bundles (CB), focused on warfarin and non-steroidal anti-inflammatories (NSAIDs), within 27 community pharmacies over 4 NHS Regions. Each CB involves clinical assessment and patient education, although the CB content varies between regions. To support national implementation, this study aims to understand how the pilot pharmacies integrated the HRM CBs into routine practice to inform the development of a generic HRM CB process map. Regional process maps were developed in 4 pharmacies through simulation of the CB process, staff interviews and documentation of resources. Commonalities were collated to develop a process map for each HRM, which were used to explore variation at a national event. A single, generic process map was developed which underwent validation by case study testing. The findings allowed development of a generic process map applicable to warfarin and NSAID CB implementation. Five steps were identified as required for successful CB delivery: patient identification; clinical assessment; pharmacy CB prompt; CB delivery; and documentation. The generic HRM CB process map encompasses the staff and patients' journey and the CB's integration into routine community pharmacy practice. Pharmacist involvement was required only for clinical assessment, indicating suitability for whole-team involvement. Understanding CB integration into routine practice has positive implications for successful implementation. The generic process map can be used to develop targeted resources, and/or be disseminated to facilitate CB delivery and foster whole team involvement. 
Similar methods could be utilised in other settings to allow those developing novel services to distil the key processes and consider their integration within routine workflows, enabling efficient implementation and maximal benefit to patient care. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. NAEYC Code of Ethical Conduct. Revised = Codigo de Conducta Etica. Revisada

    ERIC Educational Resources Information Center

    National Association of Elementary School Principals (NAESP), 2005

    2005-01-01

    This document presents a code of ethics for early childhood educators that offers guidelines for responsible behavior and sets forth a common basis for resolving ethical dilemmas encountered in early education. It represents the English and Spanish versions of the revised code. Its contents were approved by the NAEYC Governing Board in April 2005…

  4. 14 CFR 201.4 - General provisions concerning contents.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false General provisions concerning contents. 201.4 Section 201.4 Aeronautics and Space OFFICE OF THE SECRETARY, DEPARTMENT OF TRANSPORTATION... UNITED STATES CODE-[AMENDED] Application Procedures § 201.4 General provisions concerning contents. (a...

  5. Scene-aware joint global and local homographic video coding

    NASA Astrophysics Data System (ADS)

    Peng, Xiulian; Xu, Jizheng; Sullivan, Gary J.

    2016-09-01

    Perspective motion is commonly represented in video content that is captured and compressed for various applications including cloud gaming, vehicle and aerial monitoring, etc. Existing approaches based on an eight-parameter homography motion model cannot deal with this efficiently, either due to low prediction accuracy or excessive bit rate overhead. In this paper, we consider the camera motion model and scene structure in such video content and propose a joint global and local homography motion coding approach for video with perspective motion. The camera motion is estimated by a computer vision approach, and camera intrinsic and extrinsic parameters are globally coded at the frame level. The scene is modeled as piece-wise planes, and three plane parameters are coded at the block level. Fast gradient-based approaches are employed to search for the plane parameters for each block region. In this way, improved prediction accuracy and low bit costs are achieved. Experimental results based on the HEVC test model show that up to 9.1% bit rate savings can be achieved (with equal PSNR quality) on test video content with perspective motion. Test sequences for the example applications showed a bit rate savings ranging from 3.7 to 9.1%.
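The eight-parameter homography model referenced above maps image points through a 3x3 matrix defined up to scale (hence eight free parameters), with a perspective divide. A minimal sketch of applying such a motion model, using an invented helper name:

```python
import numpy as np

def apply_homography(H, pts):
    """Map 2-D points through a 3x3 homography (8 free parameters,
    since H is defined only up to scale) via homogeneous coordinates."""
    pts = np.asarray(pts, float)
    ones = np.ones((len(pts), 1))
    homog = np.hstack([pts, ones]) @ H.T
    return homog[:, :2] / homog[:, 2:3]  # perspective divide

# A pure-translation homography: shift every point by (2, 3).
H = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 1.0]])
out = apply_homography(H, [[0.0, 0.0], [1.0, 1.0]])
```

The paper's contribution, per the abstract, is to factor this per-block warp into globally coded camera parameters plus three per-block plane parameters, rather than coding all eight parameters per region.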

  6. Coding visual features extracted from video sequences.

    PubMed

    Baroffio, Luca; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano

    2014-05-01

    Visual features are successfully exploited in several applications (e.g., visual search, object recognition and tracking, etc.) due to their ability to efficiently represent image content. Several visual analysis tasks require features to be transmitted over a bandwidth-limited network, thus calling for coding techniques to reduce the required bit budget, while attaining a target level of efficiency. In this paper, we propose, for the first time, a coding architecture designed for local features (e.g., SIFT, SURF) extracted from video sequences. To achieve high coding efficiency, we exploit both spatial and temporal redundancy by means of intraframe and interframe coding modes. In addition, we propose a coding mode decision based on rate-distortion optimization. The proposed coding scheme can be conveniently adopted to implement the analyze-then-compress (ATC) paradigm in the context of visual sensor networks. That is, sets of visual features are extracted from video frames, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast to the traditional compress-then-analyze (CTA) paradigm, in which video sequences acquired at a node are compressed and then sent to a central unit for further processing. In this paper, we compare these coding paradigms using metrics that are routinely adopted to evaluate the suitability of visual features in the context of content-based retrieval, object recognition, and tracking. Experimental results demonstrate that, thanks to the significant coding gains achieved by the proposed coding scheme, ATC outperforms CTA with respect to all evaluation metrics.
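The rate-distortion-optimized mode decision mentioned above follows the standard Lagrangian form: for each coding unit, pick the mode minimising J = D + lambda*R. A minimal sketch with invented mode names and placeholder numbers:

```python
def rd_mode_decision(candidates, lam):
    """Pick the coding mode minimising the Lagrangian cost
    J = D + lambda * R, where `candidates` maps a mode name to a
    (distortion, rate_in_bits) pair."""
    return min(candidates, key=lambda m: candidates[m][0] + lam * candidates[m][1])

# Toy numbers: intraframe coding of a feature costs fewer bits on its own;
# interframe (predictive) coding yields lower distortion but spends bits on
# referencing the previous frame. Which wins depends on lambda.
modes = {"intra": (10.0, 100.0), "inter": (2.0, 140.0)}
```

At a small lambda (quality-dominated) the inter mode wins; at a large lambda (rate-dominated) the intra mode does, which is exactly the trade-off an RD-optimized encoder navigates per block.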

  7. Towards high dynamic range extensions of HEVC: subjective evaluation of potential coding technologies

    NASA Astrophysics Data System (ADS)

    Hanhart, Philippe; Řeřábek, Martin; Ebrahimi, Touradj

    2015-09-01

    This paper reports the details and results of the subjective evaluations conducted at EPFL to evaluate the responses to the Call for Evidence (CfE) for High Dynamic Range (HDR) and Wide Color Gamut (WCG) Video Coding issued by the Moving Picture Experts Group (MPEG). The CfE on HDR/WCG Video Coding aims to explore whether the coding efficiency and/or the functionality of the current version of the HEVC standard can be significantly improved for HDR and WCG content. In total, nine submissions, five for Category 1 and four for Category 3a, were compared to the HEVC Main 10 Profile based Anchor. More particularly, five HDR video contents, compressed at four bit rates by each proponent responding to the CfE, were used in the subjective evaluations. Further, the side-by-side presentation methodology was used for the subjective experiment to discriminate small differences between the Anchor and proponents. Subjective results show that the proposals provide evidence that the coding efficiency can be improved in a statistically noticeable way over the MPEG CfE Anchors in terms of perceived quality within the investigated content. The paper further benchmarks the selected objective metrics based on their correlations with the subjective ratings. It is shown that PSNR-DE1000, HDR-VDP-2, and PSNR-Lx can reliably detect visible differences between the proposed encoding solutions and the current HEVC standard.

  8. Generic Schemes for Single-Molecule Kinetics. 2: Information Content of the Poisson Indicator.

    PubMed

    Avila, Thomas R; Piephoff, D Evan; Cao, Jianshu

    2017-08-24

    Recently, we described a pathway analysis technique (paper 1) for analyzing generic schemes for single-molecule kinetics based upon the first-passage time distribution. Here, we employ this method to derive expressions for the Poisson indicator, a normalized measure of stochastic variation (essentially equivalent to the Fano factor and Mandel's Q parameter), for various renewal (i.e., memoryless) enzymatic reactions. We examine its dependence on substrate concentration, without assuming all steps follow Poissonian kinetics. Based upon fitting to the functional forms of the first two waiting time moments, we show that, to second order, the non-Poissonian kinetics are generally underdetermined but can be specified in certain scenarios. For an enzymatic reaction with an arbitrary intermediate topology, we identify a generic minimum of the Poisson indicator as a function of substrate concentration, which can be used to tune substrate concentration to the stochastic fluctuations and to estimate the largest number of underlying consecutive links in a turnover cycle. We identify a local maximum of the Poisson indicator (with respect to substrate concentration) for a renewal process as a signature of competitive binding, either between a substrate and an inhibitor or between multiple substrates. Our analysis explores the rich connections between Poisson indicator measurements and microscopic kinetic mechanisms.
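The Poisson indicator discussed above can be computed from the first two waiting-time moments; one common renewal-process definition is P = (⟨t²⟩ − 2⟨t⟩²)/⟨t⟩², which vanishes for exponential (Poissonian) waiting times and goes negative for more regular, multi-step kinetics. A sketch under that definition (the exact normalization in the paper may differ):

```python
def poisson_indicator(m1, m2):
    """Poisson indicator from the first two waiting-time moments:
    P = (<t^2> - 2<t>^2) / <t>^2.
    P = 0 for exponential (Poissonian) waiting times; P < 0 signals
    more regular kinetics, e.g. several consecutive rate-limiting steps."""
    return (m2 - 2.0 * m1 ** 2) / m1 ** 2

# Exponential waiting times with rate k=3: <t> = 1/3, <t^2> = 2/9  ->  P = 0.
p_exp = poisson_indicator(1.0 / 3.0, 2.0 / 9.0)

# Two consecutive exponential steps (gamma, shape 2, rate 3):
# <t> = 2/3, <t^2> = 6/9  ->  P = -1/2, the signature of two hidden links.
p_gamma = poisson_indicator(2.0 / 3.0, 6.0 / 9.0)
```

The gamma example illustrates the abstract's point that the depth of the minimum bounds the number of underlying consecutive links in the turnover cycle.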

  9. Multiprocessing on supercomputers for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Yarrow, Maurice; Mehta, Unmeel B.

    1990-01-01

    Very little use is made of multiple processors available on current supercomputers (computers with a theoretical peak performance capability equal to 100 MFLOPS or more) in computational aerodynamics to significantly improve turnaround time. The productivity of a computer user is directly related to this turnaround time. In a time-sharing environment, the improvement in this speed is achieved when multiple processors are used efficiently to execute an algorithm. The concept of multiple instructions and multiple data (MIMD) through multi-tasking is applied via a strategy which requires relatively minor modifications to an existing code for a single processor. Essentially, this approach maps the available memory to multiple processors, exploiting the C-FORTRAN-Unix interface. The existing single-processor code is mapped without the need for developing a new algorithm. The procedure for building a code utilizing this approach is automated with the Unix stream editor. As a demonstration of this approach, a Multiple Processor Multiple Grid (MPMG) code is developed. It is capable of using nine processors, and can be easily extended to a larger number of processors. This code solves the three-dimensional, Reynolds-averaged, thin-layer and slender-layer Navier-Stokes equations with an implicit, approximately factored and diagonalized method. The solver is applied to a generic oblique-wing aircraft problem on a four-processor Cray-2 computer. A tricubic interpolation scheme is developed to increase the accuracy of coupling of overlapped grids. For the oblique-wing aircraft problem, a speedup of two in elapsed (turnaround) time is observed in a saturated time-sharing environment.

  10. The HR factor: codes of conduct and gender issues as levers of innovation in geosciences

    NASA Astrophysics Data System (ADS)

    Rubbia, Giuliana

    2014-05-01

    Professional geosciences organizations which support governments, industry, and academic institutions in setting standards for communication, responsible use of geosciences information, and continuing professional development have codes of professional conduct binding their members. "The geologist is responsible for the impression he gives of his profession in the opinion of those around him and of the public at large" reads one principle of the Code of Professional Conduct of the European Federation of Geologists. Several higher education institutions and public research bodies base their regulations on the European Charter of Researchers. In strengthening the relationships of professional organizations with industry, society, and academia, it becomes interesting to highlight similarities and fruitful points of contact between codes of professional ethics and the Charter of Researchers. Ethical principles, professional responsibility and attitude, accountability, dissemination and exploitation of results, public engagement, and continuing professional development are some of the notable shared principles. Gender issues are also vital as a starting point for rethinking processes in the knowledge society. Structural changes in institutions to improve excellence in research require more women in decision-making bodies, practices of work-family balance, and codes of conduct which prevent hidden discrimination. In communication of natural hazards that have societal impact, managing the diversity of both the target audience and the communicators can make the difference between a generic communication and an effective one, more tailored to the information needs of women and men acting in society.

  11. Function of university reactors in operator licensing training for nuclear utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wicks, F.

    1985-11-01

    The director of the Division of the US Nuclear Regulatory Commission, in Generic Letter 84-10 dated April 26, 1984, stated the requirement that applicants for senior reactor operator licenses for power reactors shall have performed ten reactor startups. Simulator startups were not accepted. Startups performed on a university reactor are acceptable. The content and results of a five-day program combining instruction and experiments with the Rensselaer reactor are summarized.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simonen, F.A.; Khaleel, M.A.

    This paper describes a statistical evaluation of the through-thickness copper variation for welds in reactor pressure vessels, and reviews the historical basis for the static and arrest fracture toughness (K{sub Ic} and K{sub Ia}) equations used in the VISA-II code. Copper variability in welds is due to fabrication procedures, with copper contents being randomly distributed and variable from one location to another through the thickness of the vessel. The VISA-II procedure of sampling the copper content from a statistical distribution for every 6.35- to 12.7-mm (1/4- to 1/2-in.) layer through the thickness was found to be consistent with the statistical observations. However, the parameters of the VISA-II distribution and statistical limits required further investigation. Copper contents at a few locations through the thickness were found to exceed the 0.4% upper limit of the VISA-II code. The data also suggest that the mean copper content varies systematically through the thickness. While the assumption of normality is not clearly supported by the available data, a statistical evaluation based on all the available data results in mean and standard deviations within the VISA-II code limits.
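The per-layer sampling procedure described above can be sketched as drawing one copper content per through-thickness layer from a normal distribution, rejecting draws outside the code's bounds. The mean, standard deviation, and layer count below are illustrative placeholders, not VISA-II's actual parameters; only the 0.4% upper limit is taken from the abstract.

```python
import numpy as np

def sample_copper_profile(n_layers, mean, std, cap=0.4, seed=0):
    """Draw one copper content (weight %) per through-thickness layer
    from a normal distribution, rejecting draws above the cap or below
    zero. mean/std are illustrative, not VISA-II's fitted values."""
    rng = np.random.default_rng(seed)
    out = []
    while len(out) < n_layers:
        x = rng.normal(mean, std)
        if 0.0 <= x <= cap:
            out.append(x)
    return np.array(out)

# One simulated vessel wall: eight layers of 1/4- to 1/2-in. each.
profile = sample_copper_profile(n_layers=8, mean=0.25, std=0.05)
```

Truncating (rejecting) rather than clipping keeps the sampled distribution within the code limits without piling probability mass at the 0.4% cap.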

  13. A biological inspired fuzzy adaptive window median filter (FAWMF) for enhancing DNA signal processing.

    PubMed

    Ahmad, Muneer; Jung, Low Tan; Bhuiyan, Al-Amin

    2017-10-01

    Digital signal processing techniques commonly employ fixed-length window filters to process signal content. DNA signals differ from common digital signals in that their content is a sequence of nucleotides, which carry genetic-code context and exhibit fuzzy behavior owing to their special structure and ordering in the DNA strand. Employing conventional fixed-length window filters for DNA signal processing produces spectral leakage and hence signal noise; a biologically context-aware adaptive window filter is required. This paper introduces a biologically inspired fuzzy adaptive window median filter (FAWMF) that computes the fuzzy membership strength of nucleotides in each window slide and filters nucleotides by median filtering with a combination of s-shaped and z-shaped filters. Since coding regions exhibit 3-base periodicity caused by an unbalanced nucleotide distribution (a relatively high bias in nucleotide usage), FAWMF exploits this fundamental characteristic to suppress signal noise. Along with the adaptive response of FAWMF, a strong correlation between median nucleotides and the Π-shaped filter was observed, which produced enhanced discrimination between coding and non-coding regions compared with fixed-length conventional window filters. The proposed FAWMF attains a significant enhancement in coding-region identification, i.e., 40% to 125% relative to other conventional window filters, tested over more than 250 benchmarked and randomly selected DNA datasets from different organisms. This study shows that conventional fixed-length window filters applied to DNA signals do not achieve significant results because nucleotides carry genetic-code context. The proposed FAWMF algorithm is adaptive and significantly outperforms them in processing DNA signal content. Applied to a variety of DNA datasets, the algorithm produced noteworthy discrimination between coding and non-coding regions, in contrast to fixed-length conventional filters. Copyright © 2017 Elsevier B.V. All rights reserved.
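The 3-base periodicity that the filter exploits is conventionally measured as the DFT power of the four binary base-indicator sequences at frequency k = N/3. The sketch below shows that standard measure (not the FAWMF algorithm itself); a perfect codon repeat gives a strong period-3 peak while a uniform sequence gives none.

```python
import cmath

def period3_power(seq):
    """Sum, over the four bases, of the DFT power of that base's binary
    indicator sequence at frequency k = N/3 -- the standard measure of
    the 3-base periodicity characteristic of coding regions."""
    n = len(seq)
    k = n / 3.0
    total = 0.0
    for base in "ACGT":
        x = sum(cmath.exp(-2j * cmath.pi * k * i / n)
                for i, c in enumerate(seq) if c == base)
        total += abs(x) ** 2
    return total

coding_like = "ATG" * 10  # perfect codon repeat: maximal period-3 signal
uniform = "A" * 30        # no structure at period 3
```

Each of the three bases in the repeat contributes power 10² = 100, so the coding-like sequence scores 300 while the uniform one scores ~0; window filters in this setting aim to sharpen exactly this contrast.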

  14. On the Information Content of Program Traces

    NASA Technical Reports Server (NTRS)

    Frumkin, Michael; Hood, Robert; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Program traces are used for analysis of program performance, memory utilization, and communications, as well as for program debugging. The trace contains records of execution events generated by monitoring units inserted into the program. The trace size limits the resolution of execution events and restricts the user's ability to analyze the program execution. We present a study of the information content of program traces and develop a coding scheme which reduces the trace size to the limit given by the trace entropy. We apply the coding to the traces of AIMS-instrumented programs executed on the IBM SP2 and the SGI Power Challenge and compare it with other coding methods. Our technique shows that the size of the trace can be reduced by more than a factor of 5.
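The entropy limit referenced above is the Shannon entropy of the trace's event distribution: no lossless coding can average fewer bits per record than the distribution's entropy. A minimal sketch of computing that bound (function name invented):

```python
import math
from collections import Counter

def trace_entropy_bits(events):
    """Shannon entropy, in bits per record, of a trace's event-type
    distribution -- the lower bound on average code length for any
    lossless encoding of the event stream (ignoring inter-event
    correlations, which a real trace coder can also exploit)."""
    counts = Counter(events)
    n = len(events)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Four event types, uniformly distributed: the bound is exactly 2 bits/event,
# versus e.g. a fixed 32-bit event tag in a naive trace format.
h = trace_entropy_bits(["send", "recv", "enter", "exit"] * 100)
```

Skewed event distributions (a few hot events, many rare ones) push the bound well below log2 of the alphabet size, which is where most of the factor-of-5 style savings come from.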

  15. Patient recall of specific cognitive therapy contents predicts adherence and outcome in adults with major depressive disorder.

    PubMed

    Dong, Lu; Zhao, Xin; Ong, Stacie L; Harvey, Allison G

    2017-10-01

    The current study examined whether and which specific contents of patients' memory for cognitive therapy (CT) were associated with treatment adherence and outcome. Data were drawn from a pilot RCT of forty-eight depressed adults, who received either CT plus Memory Support Intervention (CT + Memory Support) or CT-as-usual. Patients' memory for treatment was measured using the Patient Recall Task and responses were coded into cognitive behavioral therapy (CBT) codes, such as CBT Model and Cognitive Restructuring, and non-CBT codes, such as individual coping strategies and no code. Treatment adherence was measured using therapist and patient ratings during treatment. Depression outcomes included treatment response, remission, and recurrence. Total number of CBT codes recalled was not significantly different comparing CT + Memory Support to CT-as-usual. Total CBT codes recalled were positively associated with adherence, while non-CBT codes recalled were negatively associated with adherence. Treatment responders (vs. non-responders) exhibited a significant increase in their recall of Cognitive Restructuring from session 7 to posttreatment. Greater recall of Cognitive Restructuring was marginally significantly associated with remission. Greater total number of CBT codes recalled (particularly CBT Model) was associated with non-recurrence of depression. Results highlight the important relationships between patients' memory for treatment and treatment adherence and outcome. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Do Librarians Have a Shared Set of Values? A Comparative Study of 36 Codes of Ethics Based on Gorman's "Enduring Values"

    ERIC Educational Resources Information Center

    Foster, Catherine; McMenemy, David

    2012-01-01

    Thirty-six ethical codes from national professional associations were studied, the aim to test whether librarians have global shared values or if political and cultural contexts have significantly influenced the codes' content. Gorman's eight core values of stewardship, service, intellectual freedom, rationalism, literacy and learning, equity of…

  17. Participation as an outcome measure in psychosocial oncology: content of cancer-specific health-related quality of life instruments.

    PubMed

    van der Mei, Sijrike F; Dijkers, Marcel P J M; Heerkens, Yvonne F

    2011-12-01

    To examine to what extent the concept and the domains of participation as defined in the International Classification of Functioning, Disability and Health (ICF) are represented in general cancer-specific health-related quality of life (HRQOL) instruments. Using the ICF linking rules, two coders independently extracted the meaningful concepts of ten instruments and linked these to ICF codes. The proportion of concepts that could be linked to ICF codes ranged from 68 to 95%. Although all instruments contained concepts linked to Participation (Chapters d7-d9 of the classification of 'Activities and Participation'), the instruments covered only a small part of all available ICF codes. The proportion of ICF codes in the instruments that were participation related ranged from 3 to 35%. 'Major life areas' (d8) was the most frequently used Participation Chapter, with d850 'remunerative employment' as the most used ICF code. The number of participation-related ICF codes covered in the instruments is limited. General cancer-specific HRQOL instruments only assess social life of cancer patients to a limited degree. This study's information on the content of these instruments may guide researchers in selecting the appropriate instrument for a specific research purpose.

  18. Real-time range acquisition by adaptive structured light.

    PubMed

    Koninckx, Thomas P; Van Gool, Luc

    2006-03-01

    The goal of this paper is to provide a "self-adaptive" system for real-time range acquisition. Reconstructions are based on a single frame structured light illumination. Instead of using generic, static coding that is supposed to work under all circumstances, system adaptation is proposed. This occurs on-the-fly and renders the system more robust against instant scene variability and creates suitable patterns at startup. A continuous trade-off between speed and quality is made. A weighted combination of different coding cues--based upon pattern color, geometry, and tracking--yields a robust way to solve the correspondence problem. The individual coding cues are automatically adapted within a considered family of patterns. The weights to combine them are based on the average consistency with the result within a small time-window. The integration itself is done by reformulating the problem as a graph cut. Also, the camera-projector configuration is taken into account for generating the projection patterns. The correctness of the range maps is not guaranteed, but an estimation of the uncertainty is provided for each part of the reconstruction. Our prototype is implemented using unmodified consumer hardware only and, therefore, is cheap. Frame rates vary between 10 and 25 fps, dependent on scene complexity.

  19. PEGASUS 5: An Automated Pre-Processor for Overset-Grid CFD

    NASA Technical Reports Server (NTRS)

    Suhs, Norman E.; Rogers, Stuart E.; Dietz, William E.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    An all new, automated version of the PEGASUS software has been developed and tested. PEGASUS provides the hole-cutting and connectivity information between overlapping grids, and is used as the final part of the grid generation process for overset-grid computational fluid dynamics approaches. The new PEGASUS code (Version 5) has many new features: automated hole cutting; a projection scheme for fixing gaps in overset surfaces; more efficient interpolation search methods using an alternating digital tree; hole-size optimization based on adding additional layers of fringe points; and an automatic restart capability. The new code has also been parallelized using the Message Passing Interface standard. The parallelization performance provides efficient speed-up of the execution time by an order of magnitude, and up to a factor of 30 for very large problems. The results of three example cases are presented: a three-element high-lift airfoil, a generic business jet configuration, and a complete Boeing 777-200 aircraft in a high-lift landing configuration. Comparisons of the computed flow fields for the airfoil and 777 test cases between the old and new versions of the PEGASUS codes show excellent agreement with each other and with experimental results.

  20. Infrastructure for Rapid Development of Java GUI Programs

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Hostetter, Carl F.; Wheeler, Philip

    2006-01-01

    The Java Application Shell (JAS) is a software framework that accelerates the development of Java graphical-user-interface (GUI) application programs by enabling the reuse of common, proven GUI elements, as distinguished from writing custom code for GUI elements. JAS is a software infrastructure upon which Java interactive application programs and graphical user interfaces (GUIs) for those programs can be built as sets of plug-ins. JAS provides an application-programming interface that is extensible by application-specific plug-ins that describe and encapsulate both specifications of a GUI and application-specific functionality tied to the specified GUI elements. The desired GUI elements are specified in Extensible Markup Language (XML) descriptions instead of in compiled code. JAS reads and interprets these descriptions, then creates and configures a corresponding GUI from a standard set of generic, reusable GUI elements. These elements are then attached (again, according to the XML descriptions) to application-specific compiled code and scripts. An application program constructed by use of JAS as its core can be extended by writing new plug-ins and replacing existing plug-ins. Thus, JAS solves many problems that Java programmers generally solve anew for each project, thereby reducing development and testing time.
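The interpret-an-XML-description idea can be sketched in a few lines: parse a GUI specification and instantiate generic widget objects from a registry, instead of hand-writing per-element code. The element names, the registry, and the `Widget` class below are invented for illustration; they are not JAS's actual schema or API.

```python
import xml.etree.ElementTree as ET

class Widget:
    """Generic, reusable GUI element (placeholder for a real toolkit class)."""
    def __init__(self, **attrs):
        self.attrs = attrs
        self.children = []

# Registry mapping XML element names to generic widget classes.
REGISTRY = {"window": Widget, "button": Widget, "label": Widget}

def build_gui(xml_text):
    """Interpret an XML GUI description: look up each element in the
    registry and build the corresponding widget tree."""
    def build(node):
        widget = REGISTRY[node.tag](**node.attrib)
        widget.children = [build(child) for child in node]
        return widget
    return build(ET.fromstring(xml_text))

gui = build_gui('<window title="Demo"><label text="hi"/><button text="OK"/></window>')
```

Swapping or extending the GUI then means editing the XML (or registering a new widget class), not recompiling per-element construction code, which is the reuse JAS is after.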

  1. SOMM: A New Service Oriented Middleware for Generic Wireless Multimedia Sensor Networks Based on Code Mobility

    PubMed Central

    Faghih, Mohammad Mehdi; Moghaddam, Mohsen Ebrahimi

    2011-01-01

    Although much research in the area of Wireless Multimedia Sensor Networks (WMSNs) has been done in recent years, the programming of sensor nodes is still time-consuming and tedious. It requires expertise in low-level programming, mainly because of the use of resource-constrained hardware and also the low-level API provided by current operating systems. The code of the resulting systems typically has no clear separation between application and system logic. This minimizes the possibility of reusing code and often leads to the necessity of major changes when the underlying platform is changed. In this paper, we present a service-oriented middleware named SOMM to support application development for WMSNs. The main goal of SOMM is to enable the development of modifiable and scalable WMSN applications. A network that uses SOMM is capable of providing multiple services to multiple clients at the same time with the specified Quality of Service (QoS). SOMM uses a virtual machine with the ability to support mobile agents. Services in SOMM are provided by mobile agents, and SOMM also provides a t space on each node which agents can use to communicate with each other. PMID:22346646

  2. SOMM: A new service oriented middleware for generic wireless multimedia sensor networks based on code mobility.

    PubMed

    Faghih, Mohammad Mehdi; Moghaddam, Mohsen Ebrahimi

    2011-01-01

    Although much research in the area of Wireless Multimedia Sensor Networks (WMSNs) has been done in recent years, the programming of sensor nodes is still time-consuming and tedious. It requires expertise in low-level programming, mainly because of the use of resource-constrained hardware and also the low-level API provided by current operating systems. The code of the resulting systems typically has no clear separation between application and system logic. This minimizes the possibility of reusing code and often leads to the necessity of major changes when the underlying platform is changed. In this paper, we present a service-oriented middleware named SOMM to support application development for WMSNs. The main goal of SOMM is to enable the development of modifiable and scalable WMSN applications. A network that uses SOMM is capable of providing multiple services to multiple clients at the same time with the specified Quality of Service (QoS). SOMM uses a virtual machine with the ability to support mobile agents. Services in SOMM are provided by mobile agents, and SOMM also provides a t space on each node which agents can use to communicate with each other.
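    The "t space" described above is a tuple-space-style shared store through which mobile agents coordinate. A minimal Linda-style sketch in Python (the API names and example tuples are invented for illustration; a real WMSN node would back this with the virtual machine's storage and block on missing tuples):

```python
class TupleSpace:
    """Minimal tuple-space sketch: agents coordinate by exchanging tuples.
    `None` in a template acts as a wildcard, as in Linda-style matching."""

    def __init__(self):
        self._tuples = []

    def put(self, tup):
        self._tuples.append(tup)

    def _match(self, tup, template):
        return len(tup) == len(template) and all(
            t is None or t == v for t, v in zip(template, tup))

    def read(self, template):
        """Return a matching tuple without removing it (None if absent)."""
        return next((t for t in self._tuples if self._match(t, template)), None)

    def take(self, template):
        """Remove and return a matching tuple (None if absent)."""
        t = self.read(template)
        if t is not None:
            self._tuples.remove(t)
        return t

space = TupleSpace()
space.put(("frame", 42, b"...jpeg bytes..."))  # a sensing agent publishes
print(space.take(("frame", None, None)))       # a processing agent consumes
```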

  3. Retrofitting the AutoBayes Program Synthesis System with Concrete Syntax

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd; Visser, Eelco

    2004-01-01

    AutoBayes is a fully automatic, schema-based program synthesis system for statistical data analysis applications. Its core component is a schema library, i.e., a collection of generic code templates with associated applicability constraints which are instantiated in a problem-specific way during synthesis. Currently, AutoBayes is implemented in Prolog; the schemas thus use abstract syntax (i.e., Prolog terms) to formulate the templates. However, the conceptual distance between this abstract representation and the concrete syntax of the generated programs makes the schemas hard to create and maintain. In this paper we describe how AutoBayes is retrofitted with concrete syntax. We show how it is integrated into Prolog and describe how the seamless interaction of concrete syntax fragments with AutoBayes's remaining legacy meta-programming kernel based on abstract syntax is achieved. We apply the approach to gradually migrate individual schemas without forcing a disruptive migration of the entire system. First experiences show that a smooth migration can be achieved. Moreover, it can result in a considerable reduction of the code size and improved readability of the code. In particular, abstracting out fresh-variable generation and second-order term construction allows the formulation of larger continuous fragments.
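    The schema idea, a generic code template guarded by an applicability constraint and instantiated per problem, can be sketched in Python (AutoBayes itself works on Prolog terms; the schemas, guards, and generated code below are invented miniatures, not AutoBayes output):

```python
from string import Template

# A schema pairs a generic code template with an applicability constraint;
# synthesis tries each schema and instantiates the first one whose guard holds.
SCHEMAS = [
    {"applies": lambda p: p["distribution"] == "gaussian",
     "template": Template("mu_$var = sum($var) / len($var)")},
    {"applies": lambda p: p["distribution"] == "bernoulli",
     "template": Template("p_$var = sum($var) / len($var)")},
]

def synthesize(problem):
    """Instantiate the first applicable schema for a problem description."""
    for schema in SCHEMAS:
        if schema["applies"](problem):
            return schema["template"].substitute(var=problem["variable"])
    raise ValueError("no applicable schema")

print(synthesize({"distribution": "gaussian", "variable": "x"}))
# → mu_x = sum(x) / len(x)
```

    The paper's point is about the template notation itself: writing templates in the concrete syntax of the target language (as the strings above do) keeps them far closer to the generated code than an abstract-syntax encoding.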

  4. Support for User Interfaces for Distributed Systems

    NASA Technical Reports Server (NTRS)

    Eychaner, Glenn; Niessner, Albert

    2005-01-01

    An extensible Java(TradeMark) software framework supports the construction and operation of graphical user interfaces (GUIs) for distributed computing systems typified by ground control systems that send commands to, and receive telemetric data from, spacecraft. Heretofore, such GUIs have been custom built for each new system at considerable expense. In contrast, the present framework affords generic capabilities that can be shared by different distributed systems. Dynamic class loading, reflection, and other run-time capabilities of the Java language and JavaBeans component architecture enable the creation of a GUI for each new distributed computing system with a minimum of custom effort. By use of this framework, GUI components in control panels and menus can send commands to a particular distributed system with a minimum of system-specific code. The framework receives, decodes, processes, and displays telemetry data; custom telemetry data handling can be added for a particular system. The framework supports saving and later restoration of users' configurations of control panels and telemetry displays with a minimum of effort in writing system-specific code. GUIs constructed within this framework can be deployed in any operating system with a Java run-time environment, without recompilation or code changes.
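    The framework's reliance on dynamic class loading and reflection, binding GUI elements to handlers named in configuration rather than in compiled references, can be sketched in Python with importlib (the "module:attribute" convention is a hypothetical stand-in; json.loads is used only as a convenient stdlib target):

```python
import importlib

def load_handler(spec):
    """Resolve a 'module:attribute' string to a callable at run time,
    the way a plug-in framework binds GUI elements to handlers by name."""
    module_name, _, attr = spec.partition(":")
    module = importlib.import_module(module_name)
    return getattr(module, attr)

# A control-panel description could name its telemetry decoder as a string;
# here we bind to a stdlib function purely for illustration.
decode = load_handler("json:loads")
print(decode('{"channel": "temp", "value": 21.5}'))
```

    Because the binding happens by name at run time, adding a new telemetry decoder requires editing configuration, not recompiling the framework, which is the property the abstract attributes to dynamic class loading.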

  5. Nonlinear dynamic simulation of single- and multi-spool core engines

    NASA Technical Reports Server (NTRS)

    Schobeiri, T.; Lippke, C.; Abouelkheir, M.

    1993-01-01

    In this paper a new computational method for accurate simulation of the nonlinear dynamic behavior of single- and multi-spool core engines, turbofan engines, and power generation gas turbine engines is presented. In order to perform the simulation, a modularly structured computer code has been developed which includes individual mathematical modules representing various engine components. The generic structure of the code enables the dynamic simulation of arbitrary engine configurations ranging from single-spool thrust generation to multi-spool thrust/power generation engines under adverse dynamic operating conditions. For precise simulation of turbine and compressor components, row-by-row calculation procedures were implemented that account for the specific turbine and compressor cascade and blade geometry and characteristics. The dynamic behavior of the subject engine is calculated by solving a number of systems of partial differential equations, which describe the unsteady behavior of the individual components. In order to ensure the capability, accuracy, robustness, and reliability of the code, comprehensive critical performance assessment and validation tests were performed. As representatives, three different transient cases with single- and multi-spool thrust and power generation engines were simulated. The transient cases range from operating with a prescribed fuel schedule, to extreme load changes, to generator and turbine shut down.
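    The modular, component-wise formulation can be illustrated with the simplest such module: a spool whose shaft speed obeys I dω/dt = Q_turbine − Q_compressor(ω), integrated here by explicit Euler (a toy sketch with invented numbers, not the row-by-row component models of the paper):

```python
def spool_speed(omega0, inertia, torque_turbine, torque_compressor,
                dt=0.01, steps=5000):
    """Explicit-Euler integration of I*domega/dt = Q_turbine - Q_compressor(omega).
    The compressor torque grows with speed, so the spool relaxes toward the
    speed at which the two torques balance."""
    omega = omega0
    for _ in range(steps):
        domega = (torque_turbine - torque_compressor(omega)) / inertia
        omega += dt * domega
    return omega

# Hypothetical numbers: the turbine delivers a constant 100 N·m and the
# compressor absorbs 0.5*omega, so the balance point is omega = 200 rad/s.
final = spool_speed(omega0=150.0, inertia=2.0,
                    torque_turbine=100.0,
                    torque_compressor=lambda w: 0.5 * w)
print(round(final, 1))  # → 200.0
```

    A full engine model couples many such modules (compressor rows, burner, turbine rows, volumes) into a system of differential equations, but each module contributes its own balance law in the same way.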

  6. Describing the content of primary care: limitations of Canadian billing data.

    PubMed

    Katz, Alan; Halas, Gayle; Dillon, Michael; Sloshower, Jordan

    2012-02-15

    Primary health care systems are designed to provide comprehensive patient care. However, the ICD 9 coding system used for billing purposes in Canada neither characterizes nor captures the scope of clinical practice or complexity of physician-patient interactions. This study aims to describe the content of primary care clinical encounters and examine the limitations of using administrative data to capture the content of these visits. Although a number of U.S. studies have described the content of primary care encounters, this is the first Canadian study to do so. Study-specific data collection forms were completed by 16 primary care physicians in community health and family practice clinics in Winnipeg, Manitoba, Canada. The data collection forms were completed immediately following the patient encounter and included patient and visit characteristics, such as primary reason for visit, topics discussed, actions taken, degree of complexity as well as diagnosis and ICD-9 codes. Data were collected for 760 patient encounters. The diagnostic codes often did not reflect the dominant topic of the visit or the topic requiring the most amount of time. Physicians often address multiple problems and provide numerous services thus increasing the complexity of care. This is one of the first Canadian studies to critically analyze the content of primary care clinical encounters. The data allowed a greater understanding of primary care clinical encounters and attests to the deficiencies of singular ICD-9 coding which fails to capture the comprehensiveness and complexity of the primary care encounter. As primary care reform initiatives in the U.S. and Canada attempt to transform the way family physicians deliver care, it becomes increasingly important that other tools for structuring primary care data are considered in order to help physicians, researchers and policy makers understand the breadth and complexity of primary care.

  7. Content Coding of Psychotherapy Transcripts Using Labeled Topic Models.

    PubMed

    Gaut, Garren; Steyvers, Mark; Imel, Zac E; Atkins, David C; Smyth, Padhraic

    2017-03-01

    Psychotherapy represents a broad class of medical interventions received by millions of patients each year. Unlike most medical treatments, its primary mechanisms are linguistic; i.e., the treatment relies directly on a conversation between a patient and provider. However, the evaluation of patient-provider conversation suffers from critical shortcomings, including intensive labor requirements, coder error, nonstandardized coding systems, and inability to scale up to larger data sets. To overcome these shortcomings, psychotherapy analysis needs a reliable and scalable method for summarizing the content of treatment encounters. We used a publicly available psychotherapy corpus from Alexander Street press comprising a large collection of transcripts of patient-provider conversations to compare coding performance for two machine learning methods. We used the labeled latent Dirichlet allocation (L-LDA) model to learn associations between text and codes, to predict codes in psychotherapy sessions, and to localize specific passages of within-session text representative of a session code. We compared the L-LDA model to a baseline lasso regression model using predictive accuracy and model generalizability (measured by calculating the area under the curve (AUC) from the receiver operating characteristic curve). The L-LDA model outperforms the lasso logistic regression model at predicting session-level codes with average AUC scores of 0.79 and 0.70, respectively. For fine-grained level coding, L-LDA and logistic regression are able to identify specific talk-turns representative of symptom codes. However, model performance for talk-turn identification is not yet as reliable as human coders. We conclude that the L-LDA model has the potential to be an objective, scalable method for accurate automated coding of psychotherapy sessions that performs better than comparable discriminative methods at session-level coding and can also predict fine-grained codes.
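    The AUC comparison reported above can be reproduced in miniature: the area under the ROC curve equals the probability that a randomly chosen positive example outscores a randomly chosen negative one. A self-contained sketch (the labels and model scores below are invented, not the paper's data):

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank statistic: the probability
    that a randomly chosen positive outscores a randomly chosen negative
    (ties count one half)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy session-level codes (1 = code present) scored by two hypothetical models.
labels = [1, 1, 1, 0, 0, 0]
model_a = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2]   # one inversion
model_b = [0.9, 0.6, 0.5, 0.7, 0.6, 0.2]   # weaker separation
print(auc(labels, model_a), auc(labels, model_b))
```

    On real data one would use a library ROC implementation, but the rank formulation makes clear why AUC is a threshold-free summary of how well a model separates sessions with a code from sessions without it.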

  8. Content Coding of Psychotherapy Transcripts Using Labeled Topic Models

    PubMed Central

    Gaut, Garren; Steyvers, Mark; Imel, Zac E; Atkins, David C; Smyth, Padhraic

    2016-01-01

    Psychotherapy represents a broad class of medical interventions received by millions of patients each year. Unlike most medical treatments, its primary mechanisms are linguistic; i.e., the treatment relies directly on a conversation between a patient and provider. However, the evaluation of patient-provider conversation suffers from critical shortcomings, including intensive labor requirements, coder error, non-standardized coding systems, and inability to scale up to larger data sets. To overcome these shortcomings, psychotherapy analysis needs a reliable and scalable method for summarizing the content of treatment encounters. We used a publicly available psychotherapy corpus from Alexander Street press comprising a large collection of transcripts of patient-provider conversations to compare coding performance for two machine learning methods. We used the Labeled Latent Dirichlet Allocation (L-LDA) model to learn associations between text and codes, to predict codes in psychotherapy sessions, and to localize specific passages of within-session text representative of a session code. We compared the L-LDA model to a baseline lasso regression model using predictive accuracy and model generalizability (measured by calculating the area under the curve (AUC) from the receiver operating characteristic (ROC) curve). The L-LDA model outperforms the lasso logistic regression model at predicting session-level codes with average AUC scores of .79 and .70, respectively. For fine-grained level coding, L-LDA and logistic regression are able to identify specific talk-turns representative of symptom codes. However, model performance for talk-turn identification is not yet as reliable as human coders. We conclude that the L-LDA model has the potential to be an objective, scalable method for accurate automated coding of psychotherapy sessions that performs better than comparable discriminative methods at session-level coding and can also predict fine-grained codes.
PMID:26625437

  9. Changing the Latitudes and Attitudes about Content Analysis Research

    ERIC Educational Resources Information Center

    Brank, Eve M.; Fox, Kathleen A.; Youstin, Tasha J.; Boeppler, Lee C.

    2008-01-01

    The current research employs the use of content analysis to teach research methods concepts among students enrolled in an upper division research methods course. Students coded and analyzed Jimmy Buffett song lyrics rather than using a downloadable database or collecting survey data. Students' knowledge of content analysis concepts increased after…

  10. A study of thematic content in hospital mission statements: a question of values.

    PubMed

    Williams, Jaime; Smythe, William; Hadjistavropoulos, Thomas; Malloy, David C; Martin, Ronald

    2005-01-01

    We examined the content of Canadian hospital mission statements using thematic content analysis. The mission statements that we studied varied in terms of both content and length. Although there was some content related to goals designed to ensure organizational visibility, survival, and competitiveness, the domain of values predominated over our entire coding structure. The primary value-related theme that emerged concerned the importance of patient care.

  11. On the use of tower-flux measurements to assess the performance of global ecosystem models

    NASA Astrophysics Data System (ADS)

    El Maayar, M.; Kucharik, C.

    2003-04-01

    Global ecosystem models are important tools for the study of biospheric processes and their responses to environmental changes. Such models typically translate knowledge, gained from local observations, into estimates of regional or even global outcomes of ecosystem processes. A typical test of ecosystem models consists of comparing their output against tower-flux measurements of land surface-atmosphere exchange of heat and mass. To perform such tests, models are typically run using detailed information on soil properties (texture, carbon content,...) and vegetation structure observed at the experimental site (e.g., vegetation height, vegetation phenology, leaf photosynthetic characteristics,...). In global simulations, however, Earth's vegetation is typically represented by a limited number of plant functional types (PFT; group of plant species that have similar physiological and ecological characteristics). For each PFT (e.g., temperate broadleaf trees, boreal conifer evergreen trees,...), which can cover a very large area, a set of typical physiological and physical parameters is assigned. Thus, a legitimate question arises: How does the performance of a global ecosystem model run using detailed site-specific parameters compare with the performance of a less detailed global version where generic parameters are attributed to a group of vegetation species forming a PFT? To answer this question, we used a multiyear dataset, measured at two forest sites with contrasting environments, to compare seasonal and interannual variability of surface-atmosphere exchange of water and carbon predicted by the Integrated BIosphere Simulator-Dynamic Global Vegetation Model. Two types of simulations were thus performed: a) Detailed runs, in which observed vegetation characteristics (leaf area index, vegetation height,...) and soil carbon content, in addition to climate and soil type, are specified for the model run; and b) Generic runs, in which only observed climates and soil types at the measurement sites are used to run the model. The generic runs were performed for a number of years equal to the current age of the forests, initialized with no vegetation and a soil carbon density equal to zero.
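    The difference between the two kinds of runs comes down to whether site observations override the generic PFT parameter table. A minimal sketch in Python (the PFT names come from the abstract; the parameter names and values are invented placeholders, not IBIS-DGVM's actual tables):

```python
# Generic PFT parameter tables, as a global model would use them; the values
# here are placeholders for illustration only.
PFT_DEFAULTS = {
    "temperate_broadleaf": {"max_lai": 5.0, "canopy_height_m": 20.0},
    "boreal_conifer_evergreen": {"max_lai": 4.0, "canopy_height_m": 12.0},
}

def run_parameters(pft, site_observations=None):
    """Generic run: PFT defaults only. Detailed run: site-specific
    observations (measured LAI, canopy height, ...) override the defaults."""
    params = dict(PFT_DEFAULTS[pft])
    params.update(site_observations or {})
    return params

generic = run_parameters("boreal_conifer_evergreen")
detailed = run_parameters("boreal_conifer_evergreen",
                          {"max_lai": 3.1, "canopy_height_m": 14.5})
print(generic["max_lai"], detailed["max_lai"])  # → 4.0 3.1
```

    The study's question is then how much predictive skill is lost when the override dictionary is empty, i.e., when only the generic row of the table drives the simulation.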

  12. Comparison of Outcomes Following a Switch From a Brand to an Authorized Versus Independent Generic Drug.

    PubMed

    Hansen, R A; Qian, J; Berg, R L; Linneman, J G; Seoane-Vazquez, E; Dutcher, S; Raofi, S; Page, C D; Peissig, P L

    2018-02-01

    Authorized generics are identical in formulation to brand drugs, manufactured by the brand company but marketed as a generic. Generics, marketed by generic manufacturers, are required to demonstrate pharmaceutical and bioequivalence to the brand drug, but repetition of clinical trials is not required. This retrospective cohort study compared outcomes for generics and authorized generics, which serves as a generic vs. brand proxy that minimizes bias against generics. For the seven drugs studied between 1999 and 2014, 5,234 unique patients were on brand drugs prior to generic entry and 4,900 (93.6%) switched to a generic. During the 12 months following the brand-to-generic switch, patients using generics vs. authorized generics were similar in terms of outpatient visits, urgent care visits, hospitalizations, and medication discontinuation. The likelihood of emergency department (ED) visits was slightly higher for authorized generics compared with generics. These data suggest that generics were clinically no worse than their proxy brand comparators. © 2017 American Society for Clinical Pharmacology and Therapeutics.

  13. Determination of 15 isoflavone isomers in soy foods and supplements by high-performance liquid chromatography.

    PubMed

    Yanaka, Kaoru; Takebayashi, Jun; Matsumoto, Teruki; Ishimi, Yoshiko

    2012-04-25

    Soy isoflavone is the generic name for the isoflavones found in soy. We determined the concentrations of 15 soy isoflavone species, including 3 succinyl glucosides, in 22 soy foods and isoflavone supplements by high-performance liquid chromatography (HPLC). The total isoflavone contents in 14 soy foods and 8 supplements ranged from 45 to 735 μg/g and from 1,304 to 90,224 μg/g, respectively. Higher amounts of succinyl glucosides were detected in natto, a typical fermented soy product in Japan; these ranged from 30 to 80 μg/g and comprised 4.1-10.9% of the total isoflavone content. In soy powder, 59 μg/g of succinyl glucosides was detected, equivalent to 4.6% of the total isoflavone content. These data suggest that total isoflavone contents may have been underestimated in previous studies that did not include succinyl glucosides, especially for Bacillus subtilis-fermented soy food products.

  14. GARLIC - A general purpose atmospheric radiative transfer line-by-line infrared-microwave code: Implementation and evaluation

    NASA Astrophysics Data System (ADS)

    Schreier, Franz; Gimeno García, Sebastián; Hedelt, Pascal; Hess, Michael; Mendrok, Jana; Vasquez, Mayte; Xu, Jian

    2014-04-01

    A suite of programs for high resolution infrared-microwave atmospheric radiative transfer modeling has been developed with emphasis on efficient and reliable numerical algorithms and a modular approach appropriate for simulation and/or retrieval in a variety of applications. The Generic Atmospheric Radiation Line-by-line Infrared Code - GARLIC - is suitable for arbitrary observation geometry, instrumental field-of-view, and line shape. The core of GARLIC's subroutines constitutes the basis of forward models used to implement inversion codes to retrieve atmospheric state parameters from limb and nadir sounding instruments. This paper briefly introduces the physical and mathematical basics of GARLIC and its descendants and continues with an in-depth presentation of various implementation aspects: An optimized Voigt function algorithm combined with a two-grid approach is used to accelerate the line-by-line modeling of molecular cross sections; various quadrature methods are implemented to evaluate the Schwarzschild and Beer integrals; and Jacobians, i.e. derivatives with respect to the unknowns of the atmospheric inverse problem, are implemented by means of automatic differentiation. For an assessment of GARLIC's performance, a comparison of the quadrature methods for solution of the path integral is provided. Verification and validation are demonstrated using intercomparisons with other line-by-line codes and comparisons of synthetic spectra with spectra observed on Earth and from Venus.
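    The Beer integral mentioned above is the optical depth τ = ∫ k(s) ds along the line of sight, with transmission T = exp(−τ). A minimal quadrature sketch in Python (trapezoidal rule only; GARLIC implements several quadrature methods, and the absorption profile and scale height below are invented for illustration):

```python
import math

def transmission(absorption, path, n=1000):
    """Beer-Lambert transmission T = exp(-tau), where the optical depth
    tau is the integral of the absorption coefficient along the path,
    evaluated here by trapezoidal quadrature."""
    h = path / n
    s = [i * h for i in range(n + 1)]
    k = [absorption(x) for x in s]
    tau = h * (0.5 * k[0] + sum(k[1:-1]) + 0.5 * k[-1])
    return math.exp(-tau)

# Exponentially decaying absorption with an (illustrative) scale height H,
# integrated over a 50 km path.
H = 8.0
T = transmission(lambda z: 0.1 * math.exp(-z / H), path=50.0)
print(round(T, 4))
```

    For this profile the quadrature can be checked against the analytic optical depth 0.1·H·(1 − e^(−50/H)), which is the kind of verification-by-intercomparison the paper applies at much larger scale.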

  15. Clinical evaluation of BrainTree, a motor imagery hybrid BCI speller

    NASA Astrophysics Data System (ADS)

    Perdikis, S.; Leeb, R.; Williamson, J.; Ramsay, A.; Tavella, M.; Desideri, L.; Hoogerwerf, E.-J.; Al-Khodairy, A.; Murray-Smith, R.; Millán, J. d. R.

    2014-06-01

    Objective. While brain-computer interfaces (BCIs) for communication have reached considerable technical maturity, there is still a great need for state-of-the-art evaluation by the end-users outside laboratory environments. To achieve this primary objective, it is necessary to augment a BCI with a series of components that allow end-users to type text effectively. Approach. This work presents the clinical evaluation of a motor imagery (MI) BCI text-speller, called BrainTree, by six severely disabled end-users and ten able-bodied users. Additionally, we define a generic model of code-based BCI applications, which serves as an analytical tool for evaluation and design. Main results. We show that all users achieved remarkable usability and efficiency outcomes in spelling. Furthermore, our model-based analysis highlights the added value of human-computer interaction techniques and hybrid BCI error-handling mechanisms, and reveals the effects of BCI performances on usability and efficiency in code-based applications. Significance. This study demonstrates the usability potential of code-based MI spellers, with BrainTree being the first to be evaluated by a substantial number of end-users, establishing them as a viable, competitive alternative to other popular BCI spellers. Another major outcome of our model-based analysis is the derivation of an 80% minimum command accuracy requirement for successful code-based application control, revising upwards previous estimates attempted in the literature.
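    The sensitivity of code-based control to per-command accuracy can be illustrated with a toy model (an illustration of why a high minimum accuracy emerges, not the paper's actual usability analysis): if delivering one symbol requires n consecutive correct commands, the success probability compounds as p^n.

```python
def code_success(p, n):
    """Probability that a code of n consecutive BCI commands is delivered
    without error, assuming independent per-command accuracy p (a toy
    model, not the paper's derivation)."""
    return p ** n

for p in (0.7, 0.8, 0.9):
    # e.g. selecting one of 16 symbols in a binary code tree takes 4 commands
    print(f"p={p:.1f}  P(4-command code)={code_success(p, 4):.3f}")
```

    Even at 80% per-command accuracy fewer than half of 4-command codes would go through uncorrected under this model, which is one reason the error-handling mechanisms evaluated in the paper matter so much in practice.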

  16. AVC/H.264 patent portfolio license

    NASA Astrophysics Data System (ADS)

    Horn, Lawrence A.

    2004-11-01

    MPEG LA, LLC recently announced terms of a joint patent license for the AVC (a/k/a H.264) Standard (ISO/IEC IS 14496-10: Information technology -- Coding of audio-visual objects -- Part 10: Advanced Video Coding | ITU-T Rec. H.264: Series H: Audiovisual and Multimedia Systems: Infrastructure of audiovisual services -- Coding of moving video: Advanced video coding for generic audiovisual services). Like MPEG LA's other licenses, the AVC Patent Portfolio License is offered for the convenience of the marketplace as an alternative enabling users to access essential intellectual property owned by many patent holders under a single license rather than negotiating licenses with each of them individually. The AVC Patent Portfolio License includes essential patents owned by Columbia Innovation Enterprises; Electronics and Telecommunications Research Institute (ETRI); France Télécom, société anonyme; Fujitsu Limited; Koninklijke Philips Electronics N.V.; Matsushita Electric Industrial Co., Ltd.; Microsoft Corporation; Mitsubishi Electric Corporation; Robert Bosch GmbH; Samsung Electronics Co., Ltd.; Sharp Kabushiki Kaisha; Sony Corporation; Toshiba Corporation; and Victor Company of Japan, Limited. MPEG LA's objective is to provide worldwide access to as much AVC essential intellectual property as possible for the benefit of AVC users. Therefore, any party that believes it has essential patents is welcome to submit them for evaluation of their essentiality and inclusion in the License if found essential.

  17. Clinical evaluation of BrainTree, a motor imagery hybrid BCI speller.

    PubMed

    Perdikis, S; Leeb, R; Williamson, J; Ramsay, A; Tavella, M; Desideri, L; Hoogerwerf, E-J; Al-Khodairy, A; Murray-Smith, R; Millán, J D R

    2014-06-01

    While brain-computer interfaces (BCIs) for communication have reached considerable technical maturity, there is still a great need for state-of-the-art evaluation by the end-users outside laboratory environments. To achieve this primary objective, it is necessary to augment a BCI with a series of components that allow end-users to type text effectively. This work presents the clinical evaluation of a motor imagery (MI) BCI text-speller, called BrainTree, by six severely disabled end-users and ten able-bodied users. Additionally, we define a generic model of code-based BCI applications, which serves as an analytical tool for evaluation and design. We show that all users achieved remarkable usability and efficiency outcomes in spelling. Furthermore, our model-based analysis highlights the added value of human-computer interaction techniques and hybrid BCI error-handling mechanisms, and reveals the effects of BCI performances on usability and efficiency in code-based applications. This study demonstrates the usability potential of code-based MI spellers, with BrainTree being the first to be evaluated by a substantial number of end-users, establishing them as a viable, competitive alternative to other popular BCI spellers. Another major outcome of our model-based analysis is the derivation of an 80% minimum command accuracy requirement for successful code-based application control, revising upwards previous estimates attempted in the literature.

  18. Advanced techniques and technology for efficient data storage, access, and transfer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Miller, Warner

    1991-01-01

    Advanced techniques for efficiently representing most forms of data are being implemented in practical hardware and software form through the joint efforts of three NASA centers. These techniques adapt to local statistical variations to continually provide near optimum code efficiency when representing data without error. Demonstrated in several earlier space applications, these techniques are the basis of initial NASA data compression standards specifications. Since the techniques clearly apply to most NASA science data, NASA invested in the development of both hardware and software implementations for general use. This investment includes high-speed single-chip very large scale integration (VLSI) coding and decoding modules as well as machine-transferable software routines. The hardware chips were tested in the laboratory at data rates as high as 700 Mbits/s. A coding module's definition includes a predictive preprocessing stage and a powerful adaptive coding stage. The function of the preprocessor is to optimally process incoming data into a standard form data source that the second stage can handle. The built-in preprocessor of the VLSI coder chips is ideal for high-speed sampled data applications such as imaging and high-quality audio, but additionally, the second stage adaptive coder can be used separately with any source that can be externally preprocessed into the 'standard form'. This generic functionality assures that the applicability of these techniques and their recent high-speed implementations should be equally broad outside of NASA.
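    The adaptive coding stage described here belongs to the Rice/Golomb family of entropy codes. A minimal Golomb-Rice codeword sketch in Python for non-negative integers (real implementations pack bits and adapt the parameter k per data block; the string-of-bits representation below is purely illustrative):

```python
def rice_encode(value, k):
    """Codeword for a non-negative integer with Rice parameter k >= 1:
    the quotient value >> k in unary (q ones, then a terminating zero),
    followed by the low k bits of the value as the remainder."""
    q = value >> k
    return "1" * q + "0" + format(value & ((1 << k) - 1), f"0{k}b")

def rice_decode(bits, k):
    """Inverse: count leading ones for the quotient, read k remainder bits."""
    q = bits.index("0")
    return (q << k) | int(bits[q + 1:q + 1 + k], 2)

for v in (0, 3, 9, 20):
    cw = rice_encode(v, 2)
    print(v, cw)
```

    Small values get short codewords, so the scheme is efficient exactly when the predictive preprocessing stage has reduced the data to small residuals, which is the division of labor the abstract describes.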

  19. Code of Ethics for Electrical Engineers

    NASA Astrophysics Data System (ADS)

    Matsuki, Junya

    The Institute of Electrical Engineers of Japan (IEEJ) has recently established rules of practice for its members, based on its code of ethics enacted in 1998. In this paper, first, the characteristics of the 1998 IEEJ ethical code are explained in detail and compared with ethical codes for other fields of engineering. Secondly, the contents that should be included in a modern code of ethics for electrical engineers are discussed. Thirdly, the newly established rules of practice and the modified code of ethics are presented. Finally, the results of a questionnaire on the new ethical code and rules, answered on May 23, 2007, by 51 electrical and electronic students of the University of Fukui, are shown.

  20. Intercultural communication through the eyes of patients: experiences and preferences

    PubMed Central

    van Dulmen, Sandra; Bank, Lindsay; Seeleman, Conny; Scherpbier, Albert; Scheele, Fedde

    2017-01-01

    Objectives To explore patients’ preferences and experiences regarding intercultural communication which could influence the development of intercultural patient-centred communication training. Methods This qualitative study is based on interviews with non-native patients. Thirty non-native patients were interviewed between September and December 2015 about their preferences and experiences regarding communication with a native Dutch doctor. Fourteen interviews were conducted with an interpreter. The semi-structured interviews took place in Amsterdam and focused on generic and intercultural communication skills of doctors. Relevant fragments were coded by two researchers and analysed by the research team by means of thematic network analysis. Informed consent and ethical approval were obtained beforehand. Results All patients preferred a doctor with a professional patient-centred attitude regardless of the doctor’s background. Patients mentioned mainly generic communication aspects, such as listening, as important skills and seemed to be aware of their own responsibility to participate in a consultation. Being treated as a unique person and not as a disease was also frequently mentioned. Unfamiliarity with the Dutch healthcare system influenced the experienced communication negatively. However, a language barrier was considered the most important problem, which would become less pressing once a doctor-patient relation was established. Conclusions Remarkably, patients in this study had no preference regarding the ethnic background of the doctor. Generic communication was experienced to be as important as specific intercultural communication, which underlines the marginal distinction between the two. A close link between intercultural communication and patient-centred communication was reflected in the expressed preference ‘to be treated as a person’. PMID:28535143

  1. Brand-name drug, generic drug, orphan drug. Pharmacological therapy with biosimilar drugs – provision of due diligence in the treatment process

    PubMed Central

    Zajdel, Justyna

    2013-01-01

    Due diligence in the process of provision of healthcare services refers, among other elements, to the application of pharmacological therapy at a time which offers the greatest chance for a successful outcome of treatment, i.e. for achieving the optimum expected effect understood as an improvement in the patient's health, reduction of health risks or elimination of the disease. However, due diligence may also refer to actions aimed at ensuring that neither the patient nor the healthcare payer is required to incur unreasonable costs in the process of treatment. The validity of that statement stems not only from normative acts but also from ethical standards laid down in the Medical Code of Ethics (Article 57 section 2). It often happens that the provision of optimal treatment calls for deviations from the formal provisions included in Summary Product Characteristics (SPCs), and the application of drugs that are bioequivalent to reference drugs, which translates into a significant reduction of costs. The present study addresses the problem of acceptability of a specific form of drug substitution consisting in the replacement of a reference drug with a generic drug. Also explored are legal aspects associated with the possibility of therapy based on “off-label use”. The study reviews normative acts existing in the Polish and EU legislation. It also provides a clear definition of orphan drug, which has made it possible to make a distinction and investigate mutual relations between the concepts of brand-name (reference) drug, orphan drug and generic drug. PMID:24592133

  2. Brand-name drug, generic drug, orphan drug. Pharmacological therapy with biosimilar drugs - provision of due diligence in the treatment process.

    PubMed

    Zajdel, Justyna; Zajdel, Radosław

    2013-01-01

    Due diligence in the process of provision of healthcare services refers, among other elements, to the application of pharmacological therapy at a time which offers the greatest chance for a successful outcome of treatment, i.e. for achieving the optimum expected effect understood as an improvement in the patient's health, reduction of health risks or elimination of the disease. However, due diligence may also refer to actions aimed at ensuring that neither the patient nor the healthcare payer is required to incur unreasonable costs in the process of treatment. The validity of that statement stems not only from normative acts but also from ethical standards laid down in the Medical Code of Ethics (Article 57 section 2). It often happens that the provision of optimal treatment calls for deviations from the formal provisions included in Summary Product Characteristics (SPCs), and the application of drugs that are bioequivalent to reference drugs, which translates into a significant reduction of costs. The present study addresses the problem of acceptability of a specific form of drug substitution consisting in the replacement of a reference drug with a generic drug. Also explored are legal aspects associated with the possibility of therapy based on "off-label use". The study reviews normative acts existing in the Polish and EU legislation. It also provides a clear definition of orphan drug, which has made it possible to make a distinction and investigate mutual relations between the concepts of brand-name (reference) drug, orphan drug and generic drug.

  3. Comparing effectiveness of generic and disease-specific self-management interventions for people with diabetes in a practice context.

    PubMed

    Ghahari, Setareh; Packer, Tanya; Boldy, Duncan; Melling, Lauren; Parsons, Richard

    2015-10-01

    The effectiveness of self-management interventions has been demonstrated. However, the benefits of generic vs. disease-specific programs are unclear, and their efficacy within a practice setting has yet to be fully explored. To compare the outcomes of the diabetes-specific self-management program (Diabetes) and the generic chronic disease Self-management Program (Chronic Condition) and to explore whether program characteristics, evaluated using the Quality Self-Management Assessment Framework (Q-SAF), provide insight into the results of the outcome evaluation. A pragmatic pretest, post-test design with 12-week follow up was used to compare the 2 self-management interventions. Outcomes were quality of life, self-efficacy, loneliness, self-management skills, depression, and health behaviours. People with diabetes self-selected attendance at the Diabetes or Chronic Condition program offered as part of routine practice. Participants with diabetes in the 2 programs (Diabetes=200; Chronic Condition=90) differed significantly in almost all demographic and clinical characteristics. Both programs yielded positive outcomes. Controlling for baseline and demographic characteristics, random effects modelling showed an interaction between time and program for 1 outcome: self-efficacy (p=0.029). Participants in the Chronic Condition group experienced greater improvements over time than did those in the Diabetes group. The Q-SAF analysis showed differences in program content, delivery and workforce capacity. People with diabetes benefited from both programs, but participation in the generic program resulted in greater improvements in self-efficacy for participants who had self-selected that program. Both programs in routine care led to health-related improvements. The Q-SAF can be used to assess the quality of programs. Copyright © 2015 Canadian Diabetes Association. Published by Elsevier Inc. All rights reserved.

  4. Exploration and Production of Hydrocarbon Resources in Coastal Alabama and Mississippi. Executive Summary.

    DTIC Science & Technology

    1984-11-01

    FINAL GENERIC ENVIRONMENTAL IMPACT STATEMENT: EXPLORATION AND PRODUCTION OF HYDROCARBON RESOURCES IN COASTAL ALABAMA AND MISSISSIPPI. The ... a service could potentially affect cultural resources in the area of development. Prior to issuing any project permit, conflicts on potential impacts ... [Table-of-contents residue: Air Emissions 6-34, Noise 6-34, Solid and Hazardous Waste 6-34, Socioeconomic Characteristics 6-34, Navigation 6-34.]

  5. SMT-Aware Instantaneous Footprint Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roy, Probir; Liu, Xu; Song, Shuaiwen

    Modern architectures employ simultaneous multithreading (SMT) to increase thread-level parallelism. SMT threads share many functional units and the whole memory hierarchy of a physical core. Without a careful code design, SMT threads can easily contend with each other for these shared resources, causing severe performance degradation. Minimizing SMT thread contention for HPC applications running on dedicated platforms is very challenging, because they usually spawn threads within Single Program Multiple Data (SPMD) models. To address this important issue, we introduce a simple scheme for SMT-aware code optimization, which aims to reduce the memory contention across SMT threads.

  6. Back to basics: informing the public of co-morbid physical health problems in those with mental illness.

    PubMed

    Ahire, Mrinalini; Sheridan, Judith; Regbetz, Shane; Stacey, Phillip; Scott, James G

    2013-02-01

    Those with mental illness are at increased risk of physical health problems. The current study aimed to examine the information available online to the Australian public about the increased risk and consequences of physical illness in those with mental health problems and the services available to address these co-morbidities. A structured online search was conducted with the search engine Google Australia (www.google.com.au) using generic search terms 'mental health information Australia', 'mental illness information Australia', 'depression', 'anxiety', and 'psychosis'. The direct content of websites was examined for information on the physical co-morbidities of mental illness. All external links on high-profile websites [the first five websites retrieved under each search term (n = 25)] were examined for information pertaining to physical health. Only 4.2% of websites informing the public about mental health contained direct content information about the increased risk of physical co-morbidities. The Australian Government's Department of Health and Ageing site did not contain any information. Of the high-profile websites, 62% had external links to resources about physical health and 55% had recommendations or resources for physical health. Most recommendations were generic. Relative to the seriousness of this problem, there is a paucity of information available to the public about the increased physical health risks associated with mental illness. Improved public awareness is the starting point of addressing this health inequity.

  7. Reliability in Cross-National Content Analysis.

    ERIC Educational Resources Information Center

    Peter, Jochen; Lauf, Edmund

    2002-01-01

    Investigates how coder characteristics such as language skills, political knowledge, coding experience, and coding certainty affected inter-coder and coder-training reliability. Shows that language skills influenced both reliability types. Suggests that cross-national researchers should pay more attention to cross-national assessments of…
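    Inter-coder reliability of the kind examined above is commonly quantified with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. A minimal pure-Python sketch (the category labels and codings below are invented for illustration):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' nominal codes of the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of items coded identically.
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement: expected overlap of the two coders' marginal frequencies.
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (p_o - p_e) / (1 - p_e)

a = ["pos", "pos", "neg", "pos", "neg", "neu", "pos", "neg"]
b = ["pos", "neg", "neg", "pos", "neg", "neu", "pos", "pos"]
print(round(cohens_kappa(a, b), 3))  # 0.579
```

    Here 6 of 8 items agree (p_o = 0.75) against a chance baseline of 26/64, giving kappa of about 0.58, which would typically be read as moderate agreement.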

  8. Data standards for clinical research data collection forms: current status and challenges.

    PubMed

    Richesson, Rachel L; Nadkarni, Prakash

    2011-05-01

    Case report forms (CRFs) are used for structured-data collection in clinical research studies. Existing CRF-related standards encompass structural features of forms and data items, content standards, and specifications for using terminologies. This paper reviews existing standards and discusses their current limitations. Because clinical research is highly protocol-specific, forms-development processes are more easily standardized than is CRF content. Tools that support retrieval and reuse of existing items will enable standards adoption in clinical research applications. Such tools will depend upon formal relationships between items and terminological standards. Future standards adoption will depend upon standardized approaches for bridging generic structural standards and domain-specific content standards. Clinical research informatics can help define tools requirements in terms of workflow support for research activities, reconcile the perspectives of varied clinical research stakeholders, and coordinate standards efforts toward interoperability across healthcare and research data collection.

  9. Molecular Modeling for Calculation of Mechanical Properties of Epoxies with Moisture Ingress

    NASA Technical Reports Server (NTRS)

    Clancy, Thomas C.; Frankland, Sarah J.; Hinkley, J. A.; Gates, T. S.

    2009-01-01

    Atomistic models of epoxy structures were built in order to assess the effect of crosslink degree, moisture content and temperature on the calculated properties of a typical representative generic epoxy. Each atomistic model had approximately 7000 atoms and was contained within a periodic boundary condition cell with edge lengths of about 4 nm. Four atomistic models were built with a range of crosslink degree and moisture content. Each of these structures was simulated at three temperatures: 300 K, 350 K, and 400 K. Elastic constants were calculated for these structures by monitoring the stress tensor as a function of applied strain deformations to the periodic boundary conditions. The mechanical properties showed reasonably consistent behavior with respect to these parameters. The moduli decreased with decreasing crosslink degree with increasing temperature. The moduli generally decreased with increasing moisture content, although this effect was not as consistent as that seen for temperature and crosslink degree.
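    Extracting elastic constants by monitoring stress under applied strain, as described above, amounts in the simplest uniaxial case to a least-squares fit of the small-strain stress-strain response. A schematic sketch with invented data (the numbers are chosen only so the slope lands near a typical glassy-epoxy modulus):

```python
def youngs_modulus(strains, stresses):
    """Ordinary-least-squares slope of stress vs. strain:
    E = cov(eps, sigma) / var(eps)."""
    n = len(strains)
    mean_e = sum(strains) / n
    mean_s = sum(stresses) / n
    cov = sum((e - mean_e) * (s - mean_s) for e, s in zip(strains, stresses))
    var = sum((e - mean_e) ** 2 for e in strains)
    return cov / var

# Hypothetical small-strain data points (stress in GPa).
eps = [0.000, 0.002, 0.004, 0.006, 0.008]
sig = [0.000, 0.007, 0.013, 0.021, 0.027]
print(youngs_modulus(eps, sig))  # ~3.4 GPa
```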

  10. 49 U.S.C. 22705 - Content

    Code of Federal Regulations, 2010 CFR

    ... 49 U.S.C. United States Code, 2009 Edition Title 49 - TRANSPORTATION SUBTITLE V - RAIL PROGRAMS PART B - ASSISTANCE CHAPTER 227 - STATE RAIL PLANS Sec. 22705 - Content §22705. Content (a) In General .—Each State rail plan shall, at a minimum, contain the following: (1) An inventory of the existing overall rail transportation system an...

  11. 49 U.S.C. 22705 - Content

    Code of Federal Regulations, 2010 CFR

    ... 49 U.S.C. United States Code, 2011 Edition Title 49 - TRANSPORTATION SUBTITLE V - RAIL PROGRAMS PART B - ASSISTANCE CHAPTER 227 - STATE RAIL PLANS Sec. 22705 - Content §22705. Content (a) In General .—Each State rail plan shall, at a minimum, contain the following: (1) An inventory of the existing overall rail transportation system an...

  12. 49 U.S.C. 22705 - Content

    Code of Federal Regulations, 2010 CFR

    ... 49 U.S.C. United States Code, 2014 Edition Title 49 - TRANSPORTATION SUBTITLE V - RAIL PROGRAMS PART B - ASSISTANCE CHAPTER 227 - STATE RAIL PLANS Sec. 22705 - Content §22705. Content (a) In General .—Each State rail plan shall, at a minimum, contain the following: (1) An inventory of the existing overall rail transportation system an...

  13. Comparison of generic-to-brand switchback patterns for generic and authorized generic drugs

    PubMed Central

    Hansen, Richard A.; Qian, Jingjing; Berg, Richard; Linneman, James; Seoane-Vazquez, Enrique; Dutcher, Sarah K.; Raofi, Saeid; Page, C. David; Peissig, Peggy

    2018-01-01

    Background While generic drugs are therapeutically equivalent to brand drugs, some patients and healthcare providers remain uncertain about whether they produce identical outcomes. Authorized generics, which are identical in formulation to corresponding brand drugs but marketed as a generic, provide a unique post-marketing opportunity to study whether utilization patterns are influenced by perceptions of generic drugs. Objectives To compare generic-to-brand switchback rates between generics and authorized generics. Methods A retrospective cohort study was conducted using claims and electronic health records data from a regional U.S. healthcare system. Ten drugs with authorized generics and generics marketed between 1999 and 2014 were evaluated. Eligible adult patients received a brand drug during the 6 months preceding generic entry, and then switched to a generic or authorized generic. Patients in this cohort were followed for up to 30 months from the index switch date to evaluate occurrence of generic-to-brand switchbacks. Switchback rates were compared between patients on authorized generics versus generics using Kaplan-Meier curves and Cox proportional hazards models, controlling for individual drug effects, age, sex, Charlson comorbidity score, pre-index drug use characteristics, and pre-index healthcare utilization. Results Among 5,542 unique patients that switched from brand-to-generic or brand-to-authorized generic, 264 (4.8%) switched back to the brand drug. Overall switchback rates were similar for authorized generics compared with generics (HR=0.86; 95% CI 0.65-1.15). The likelihood of switchback was higher for alendronate (HR=1.64; 95% CI 1.20-2.23) and simvastatin (HR=1.81; 95% CI 1.30-2.54) and lower for amlodipine (HR=0.27; 95% CI 0.17-0.42) compared with other drugs in the cohort. 
Conclusions Overall switchback rates were similar between authorized generic and generic drug users, indirectly supporting similar efficacy and tolerability profiles for brand and generic drugs. Reasons for differences in switchback rates among specific products need to be further explored. PMID:28152215
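    The switchback comparison above rests on standard survival-analysis machinery. As a hedged illustration of the Kaplan-Meier estimator used there, a self-contained sketch (the follow-up times and event indicators are invented, not taken from the study):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times:  follow-up time for each patient
    events: 1 if the event (e.g. a switchback) occurred, 0 if censored.
    Returns (event_time, survival_probability) pairs at each event time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        n_t = at_risk          # number at risk just before time t
        deaths = 0
        # Consume all observations tied at time t (events and censorings).
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            at_risk -= 1
            i += 1
        if deaths:
            surv *= 1 - deaths / n_t
            curve.append((t, surv))
    return curve

times  = [2, 3, 3, 5, 8, 8, 12, 15]
events = [1, 1, 0, 1, 1, 0, 0,  1]
for t, s in kaplan_meier(times, events):
    print(t, round(s, 3))
```

    Comparing two such curves (generic vs. authorized-generic users) and then adjusting for covariates with a Cox proportional hazards model is the standard workflow the abstract describes.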

  14. Histochemical changes of occlusal surface enamel of permanent teeth, where dental caries is questionable vs sound enamel surfaces.

    PubMed

    Michalaki, M; Oulis, C J; Pandis, N; Eliades, G

    2016-12-01

    The aim of this in vitro study was to classify occlusal surfaces of permanent teeth that were questionable for caries (QCOS) according to ICDAS codes 1, 2, and 3, and to compare their enamel mineral composition with that of sound areas of the same tooth. Sixty partially impacted human molars with QCOS, extracted for therapeutic reasons, were used in the study; they were photographed under a polarised light microscope and classified according to ICDAS II (codes 1, 2, or 3). The crowns were embedded in clear self-cured acrylic resin, longitudinally sectioned at the levels of the characterised lesions, and studied by SEM/EDX to assess the enamel mineral composition of the QCOS. Univariate and multivariate random-effect regressions were used for Ca (wt%), P (wt%), and Ca/P (wt%). The EDX analysis indicated changes in the Ca and P contents that were more prominent in ICDAS-II code 3 lesions than in code 1 and 2 lesions. In these lesions, Ca (wt%) and P (wt%) concentrations were significantly decreased (p = 0.01) in comparison with sound areas. Ca and P (wt%) contents were significantly lower (p = 0.02 and p = 0.01, respectively) for code 3 areas than for code 1 and 2 areas. Significantly higher (p = 0.01) Ca (wt%) and P (wt%) contents were found in sound areas than in lesion areas. The enamel of occlusal surfaces of permanent teeth with ICDAS 1, 2, and 3 lesions was found to have different Ca/P compositions, necessitating further investigation into whether these altered surfaces might behave differently during etching preparation before fissure sealant placement, compared with sound surfaces.

  15. Louder than words: power and conflict in interprofessional education articles, 1954–2013

    PubMed Central

    Paradis, Elise; Whitehead, Cynthia R

    2015-01-01

    Context Interprofessional education (IPE) aspires to enable collaborative practice. Current IPE offerings, although rapidly proliferating, lack evidence of efficacy and theoretical grounding. Objectives Our research aimed to explore the historical emergence of the field of IPE and to analyse the positioning of this academic field of inquiry. In particular, we sought to investigate the extent to which power and conflict – elements central to interprofessional care – figure in the IPE literature. Methods We used a combination of deductive and inductive automated coding and manual coding to explore the contents of 2191 articles in the IPE literature published between 1954 and 2013. Inductive coding focused on the presence and use of the sociological (rather than statistical) version of power, which refers to hierarchies and asymmetries among the professions. Articles found to be centrally about power were then analysed using content analysis. Results Publications on IPE have grown exponentially in the past decade. Deductive coding of identified articles showed an emphasis on students, learning, programmes and practice. Automated inductive coding of titles and abstracts identified 129 articles potentially about power, but manual coding found that only six articles put power and conflict at the centre. Content analysis of these six articles revealed that two provided tentative explorations of power dynamics, one skirted around this issue, and three explicitly theorised and integrated power and conflict. Conclusions The lack of attention to power and conflict in the IPE literature suggests that many educators do not foreground these issues. Education programmes are expected to transform individuals into effective collaborators, without heed to structural, organisational and institutional factors. In so doing, current constructions of IPE veil the problems that IPE attempts to solve. PMID:25800300

  16. Coding Local and Global Binary Visual Features Extracted From Video Sequences.

    PubMed

    Baroffio, Luca; Canclini, Antonio; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano

    2015-11-01

    Binary local features represent an effective alternative to real-valued descriptors, leading to comparable results for many visual analysis tasks while being characterized by significantly lower computational complexity and memory requirements. When dealing with large collections, a more compact representation based on global features is often preferred, which can be obtained from local features by means of, e.g., the bag-of-visual word model. Several applications, including, for example, visual sensor networks and mobile augmented reality, require visual features to be transmitted over a bandwidth-limited network, thus calling for coding techniques that aim at reducing the required bit budget while attaining a target level of efficiency. In this paper, we investigate a coding scheme tailored to both local and global binary features, which aims at exploiting both spatial and temporal redundancy by means of intra- and inter-frame coding. In this respect, the proposed coding scheme can conveniently be adopted to support the analyze-then-compress (ATC) paradigm. That is, visual features are extracted from the acquired content, encoded at remote nodes, and finally transmitted to a central controller that performs the visual analysis. This is in contrast with the traditional approach, in which visual content is acquired at a node, compressed and then sent to a central unit for further processing, according to the compress-then-analyze (CTA) paradigm. In this paper, we experimentally compare the ATC and the CTA by means of rate-efficiency curves in the context of two different visual analysis tasks: 1) homography estimation and 2) content-based retrieval. Our results show that the novel ATC paradigm based on the proposed coding primitives can be competitive with the CTA, especially in bandwidth limited scenarios.
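    Matching the binary local features described above reduces to Hamming distances between bit strings, which is the main source of their low computational cost. A minimal sketch with 32-bit toy descriptors (real BRISK/ORB-style descriptors are typically 256 or 512 bits; names and thresholds here are illustrative):

```python
def hamming(a: int, b: int) -> int:
    """Hamming distance between two binary descriptors stored as integers."""
    return bin(a ^ b).count("1")

def match(query, database, max_dist=10):
    """Nearest database descriptor, accepted only if within max_dist bits."""
    best = min(database, key=lambda d: hamming(query, d))
    return best if hamming(query, best) <= max_dist else None

db = [0b10110100_11001010_01110001_00011101,
      0b00000000_11111111_00000000_11111111,
      0b10110100_11001010_01110001_00011111]
q  =  0b10110100_11001010_01110001_00011100

print(hamming(q, db[0]))          # 1 bit flipped
print(match(q, db) == db[0])      # True
```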

  17. Coding Local and Global Binary Visual Features Extracted From Video Sequences

    NASA Astrophysics Data System (ADS)

    Baroffio, Luca; Canclini, Antonio; Cesana, Matteo; Redondi, Alessandro; Tagliasacchi, Marco; Tubaro, Stefano

    2015-11-01

    Binary local features represent an effective alternative to real-valued descriptors, leading to comparable results for many visual analysis tasks, while being characterized by significantly lower computational complexity and memory requirements. When dealing with large collections, a more compact representation based on global features is often preferred, which can be obtained from local features by means of, e.g., the Bag-of-Visual-Word (BoVW) model. Several applications, including for example visual sensor networks and mobile augmented reality, require visual features to be transmitted over a bandwidth-limited network, thus calling for coding techniques that aim at reducing the required bit budget, while attaining a target level of efficiency. In this paper we investigate a coding scheme tailored to both local and global binary features, which aims at exploiting both spatial and temporal redundancy by means of intra- and inter-frame coding. In this respect, the proposed coding scheme can be conveniently adopted to support the Analyze-Then-Compress (ATC) paradigm. That is, visual features are extracted from the acquired content, encoded at remote nodes, and finally transmitted to a central controller that performs visual analysis. This is in contrast with the traditional approach, in which visual content is acquired at a node, compressed and then sent to a central unit for further processing, according to the Compress-Then-Analyze (CTA) paradigm. In this paper we experimentally compare ATC and CTA by means of rate-efficiency curves in the context of two different visual analysis tasks: homography estimation and content-based retrieval. Our results show that the novel ATC paradigm based on the proposed coding primitives can be competitive with CTA, especially in bandwidth limited scenarios.

  18. Louder than words: power and conflict in interprofessional education articles, 1954-2013.

    PubMed

    Paradis, Elise; Whitehead, Cynthia R

    2015-04-01

    Interprofessional education (IPE) aspires to enable collaborative practice. Current IPE offerings, although rapidly proliferating, lack evidence of efficacy and theoretical grounding. Our research aimed to explore the historical emergence of the field of IPE and to analyse the positioning of this academic field of inquiry. In particular, we sought to investigate the extent to which power and conflict - elements central to interprofessional care - figure in the IPE literature. We used a combination of deductive and inductive automated coding and manual coding to explore the contents of 2191 articles in the IPE literature published between 1954 and 2013. Inductive coding focused on the presence and use of the sociological (rather than statistical) version of power, which refers to hierarchies and asymmetries among the professions. Articles found to be centrally about power were then analysed using content analysis. Publications on IPE have grown exponentially in the past decade. Deductive coding of identified articles showed an emphasis on students, learning, programmes and practice. Automated inductive coding of titles and abstracts identified 129 articles potentially about power, but manual coding found that only six articles put power and conflict at the centre. Content analysis of these six articles revealed that two provided tentative explorations of power dynamics, one skirted around this issue, and three explicitly theorised and integrated power and conflict. The lack of attention to power and conflict in the IPE literature suggests that many educators do not foreground these issues. Education programmes are expected to transform individuals into effective collaborators, without heed to structural, organisational and institutional factors. In so doing, current constructions of IPE veil the problems that IPE attempts to solve. © 2015 The Authors Medical Education Published by John Wiley & Sons Ltd.

  19. Genomics dataset of unidentified disclosed isolates.

    PubMed

    Rekadwad, Bhagwan N

    2016-09-01

    Analysis of DNA sequences is necessary for higher hierarchical classification of organisms. It gives clues about the characteristics of organisms and their taxonomic position. This dataset was chosen to explore complexities in the unidentified DNA disclosed in patents. A total of 17 unidentified DNA sequences were thoroughly analyzed and quick response (QR) codes were generated for them. AT/GC content analysis of the DNA sequences was carried out. The QR codes are helpful for quick identification of isolates, and the AT/GC content is helpful for studying their stability at different temperatures. Additionally, a dataset on cleavage codes and enzyme codes from the restriction digestion study is reported, which is helpful for performing studies using short DNA sequences. The dataset disclosed here provides new data for the exploration, evaluation, identification, comparison and analysis of unique DNA sequences.
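    The AT/GC content and restriction-site analyses reported above are simple sequence statistics. A self-contained sketch (the sequence is invented; the EcoRI and BamHI recognition sites, GAATTC and GGATCC, are their standard sequences):

```python
def gc_content(seq: str) -> float:
    """Fraction of G and C bases; a proxy for duplex thermal stability."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def find_sites(seq: str, site: str):
    """0-based start positions of a restriction-enzyme recognition site."""
    seq, site = seq.upper(), site.upper()
    return [i for i in range(len(seq) - len(site) + 1) if seq.startswith(site, i)]

dna = "ATGCGAATTCGCGGATCCGCTA"
print(round(gc_content(dna), 3))     # 0.545
print(find_sites(dna, "GAATTC"))     # EcoRI: [4]
print(find_sites(dna, "GGATCC"))     # BamHI: [12]
```

    Higher GC fraction implies more triple-bonded base pairs and hence greater thermal stability, which is the rationale for the AT/GC analysis in the dataset.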

  20. Modelling and experimental verification of a water alleviation system for the NASP. [National Aerospace Plane

    NASA Technical Reports Server (NTRS)

    Vanfossen, G. James

    1992-01-01

    One possible low-speed propulsion system for the National Aerospace Plane is a liquid air cycle engine (LACE). The LACE system uses the heat sink in the liquid hydrogen propellant to liquefy air in a heat exchanger; the liquid air is then pumped up to high pressure and used as the oxidizer in a hydrogen/liquid-air rocket. The inlet airstream must be dehumidified, or moisture could freeze on the cryogenic heat exchangers and block them. The main objective of this research has been to develop a computer simulation of the cold-tube/antifreeze-spray water alleviation system and to verify the model with experimental data. An experimental facility was built and humid-air tests were conducted on a generic heat exchanger to obtain condensing data for code development. The paper describes the experimental setup, outlines the method of calculation used in the code, and presents comparisons of the calculations and measurements. Causes of discrepancies between the model and the data are explained.

  1. Anyonic self-induced disorder in a stabilizer code: Quasi many-body localization in a translational invariant model

    NASA Astrophysics Data System (ADS)

    Yarloo, H.; Langari, A.; Vaezi, A.

    2018-02-01

    We enquire into the quasi many-body localization in topologically ordered states of matter, revolving around the case of Kitaev toric code on the ladder geometry, where different types of anyonic defects carry different masses induced by environmental errors. Our study verifies that the presence of anyons generates a complex energy landscape solely through braiding statistics, which suffices to suppress the diffusion of defects in such clean, multicomponent anyonic liquid. This nonergodic dynamics suggests a promising scenario for investigation of quasi many-body localization. Computing standard diagnostics evidences that a typical initial inhomogeneity of anyons gives birth to a glassy dynamics with an exponentially diverging time scale of the full relaxation. Our results unveil how self-generated disorder ameliorates the vulnerability of topological order away from equilibrium. This setting provides a new platform which paves the way toward impeding logical errors by self-localization of anyons in a generic, high energy state, originated exclusively in their exotic statistics.

  2. Supersymmetric and non-supersymmetric models without catastrophic Goldstone bosons

    NASA Astrophysics Data System (ADS)

    Braathen, Johannes; Goodsell, Mark D.; Staub, Florian

    2017-11-01

    The calculation of the Higgs mass in general renormalisable field theories has been plagued by the so-called "Goldstone Boson Catastrophe," where light (would-be) Goldstone bosons give infra-red divergent loop integrals. In supersymmetric models, previous approaches included a workaround that ameliorated the problem for most, but not all, parameter space regions; while giving divergent results everywhere for non-supersymmetric models! We present an implementation of a general solution to the problem in the public code SARAH, along with new calculations of some necessary loop integrals and generic expressions. We discuss the validation of our code in the Standard Model, where we find remarkable agreement with the known results. We then show new applications in Split SUSY, the NMSSM, the Two-Higgs-Doublet Model, and the Georgi-Machacek model. In particular, we take some first steps to exploring where the habit of using tree-level mass relations in non-supersymmetric models breaks down, and show that the loop corrections usually become very large well before naive perturbativity bounds are reached.

  3. Pythran: enabling static optimization of scientific Python programs

    NASA Astrophysics Data System (ADS)

    Guelton, Serge; Brunet, Pierrick; Amini, Mehdi; Merlini, Adrien; Corbillon, Xavier; Raynaud, Alan

    2015-01-01

    Pythran is an open source static compiler that turns modules written in a subset of Python language into native ones. Assuming that scientific modules do not rely much on the dynamic features of the language, it trades them for powerful, possibly inter-procedural, optimizations. These optimizations include detection of pure functions, temporary allocation removal, constant folding, Numpy ufunc fusion and parallelization, explicit thread-level parallelism through OpenMP annotations, false variable polymorphism pruning, and automatic vector instruction generation such as AVX or SSE. In addition to these compilation steps, Pythran provides a C++ runtime library that leverages the C++ STL to provide generic containers, and the Numeric Template Toolbox for Numpy support. It takes advantage of modern C++11 features such as variadic templates, type inference, move semantics and perfect forwarding, as well as classical idioms such as expression templates. Unlike the Cython approach, Pythran input code remains compatible with the Python interpreter. Output code is generally as efficient as the annotated Cython equivalent, if not more, but without the backward compatibility loss.
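    As the abstract notes, a Pythran input module remains plain Python: the only addition is an export comment declaring argument types, and the file still runs unchanged under CPython. A minimal sketch (the function name and signature are illustrative; `#pythran export` is Pythran's documented annotation form):

```python
#pythran export dot(float list, float list)
def dot(xs, ys):
    """Dot product; Pythran can compile this module to a native extension,
    but the code below also runs as-is under the standard interpreter."""
    return sum(x * y for x, y in zip(xs, ys))

# Assumed workflow: running `pythran thismodule.py` would produce a native
# module exposing the same `dot` interface, with no source changes needed.
print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0
```

    This interpreter compatibility is the contrast the abstract draws with Cython, whose annotated sources are no longer valid Python.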

  4. Genetic Code Expansion: A Powerful Tool for Understanding the Physiological Consequences of Oxidative Stress Protein Modifications.

    PubMed

    Porter, Joseph J; Mehl, Ryan A

    2018-01-01

    Posttranslational modifications resulting from oxidation of proteins (Ox-PTMs) are present intracellularly under conditions of oxidative stress as well as basal conditions. In the past, these modifications were thought to be generic protein damage, but it has become increasingly clear that Ox-PTMs can have specific physiological effects. It is an arduous task to distinguish between the two cases, as multiple Ox-PTMs occur simultaneously on the same protein, convoluting analysis. Genetic code expansion (GCE) has emerged as a powerful tool to overcome this challenge as it allows for the site-specific incorporation of an Ox-PTM into translated protein. The resulting homogeneously modified protein products can then be rigorously characterized for the effects of individual Ox-PTMs. We outline the strengths and weaknesses of GCE as they relate to the field of oxidative stress and Ox-PTMs. An overview of the Ox-PTMs that have been genetically encoded and applications of GCE to the study of Ox-PTMs, including antibody validation and therapeutic development, is described.

  5. Automated Kinematics Equations Generation and Constrained Motion Planning Resolution for Modular and Reconfigurable Robots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pin, Francois G.; Love, Lonnie L.; Jung, David L.

    2004-03-29

    Contrary to the repetitive tasks performed by industrial robots, the tasks in most DOE missions, such as environmental restoration or Decontamination and Decommissioning (D&D), can be characterized as "batches-of-one," in which robots must be capable of adapting to changes in constraints, tools, environment, criteria and configuration. No commercially available robot control code is suitable for use with such widely varying conditions. In this talk we present our development of a "generic code" to allow real-time (at loop rate) adaptation of robot behavior to changes in task objectives, tools, number and type of constraints, modes of control, or kinematics configuration. We present the analytical framework underlying our approach and detail the design of its two major modules: one for the automatic generation of the kinematics equations when the robot configuration or tools change, and one for motion planning under time-varying constraints. Sample problems illustrating the capabilities of the developed system are presented.
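    The idea of regenerating kinematics from data when a robot is reconfigured can be sketched with an illustrative toy (not the OSTI system's code): for a planar serial arm, the forward-kinematics equations are driven entirely by a table of link parameters, so adding or removing a module changes only the data, never the code.

```python
import math

# Illustrative toy: forward kinematics for a planar serial arm of
# revolute joints, generated from a table of link lengths and joint
# angles. Reconfiguring the robot means editing the tables only.

def forward_kinematics(link_lengths, joint_angles):
    """Return the end-effector (x, y) position."""
    x = y = 0.0
    theta = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle                  # accumulate joint rotations
        x += length * math.cos(theta)   # walk along each link
        y += length * math.sin(theta)
    return x, y

# Reconfiguration is a data change: same code, one extra module.
two_link = forward_kinematics([1.0, 1.0], [0.0, math.pi / 2])
three_link = forward_kinematics([1.0, 1.0, 0.5], [0.0, math.pi / 2, 0.0])
```

    The real system described above goes much further (arbitrary spatial chains, time-varying constraints, loop-rate updates), but the design principle is the same: the equations follow from the configuration description.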

  6. An analysis of how The Irish Times portrayed Irish nursing during the 1999 strike.

    PubMed

    Clarke, J; O'Neill, C S

    2001-07-01

    The aim of this article is to explore the images of nursing that were presented in the media during the recent industrial action by nurses and midwives in the Republic of Ireland. Although both nurses and midwives took industrial strike action, the strike was referred to as 'the nurses' strike' and both nurses and midwives were generally referred to by the generic term 'nurses'. Data were gathered from the printed news media of The Irish Times over a period of one month--4 October to 4 November 1999--which included the nine days of the strike. Although we limited the source of our data to just one newspaper, the findings do provide an image of how nurses and nursing care are viewed by both health professionals and the public. This image appeared to give a higher value to masculine cultural codes and the performance of technical skills, whereas acts associated with feminine cultural codes of caring were considered of lower value.

  7. An Open Simulation System Model for Scientific Applications

    NASA Technical Reports Server (NTRS)

    Williams, Anthony D.

    1995-01-01

    A model for a generic and open environment for running multi-code or multi-application simulations, called the Open Simulation System Model (OSSM), is proposed and defined. This model attempts to meet the requirements of complex systems like the Numerical Propulsion Simulator System (NPSS). OSSM places no restrictions on the types of applications that can be integrated at any stage of its evolution; this includes applications of differing disciplines, fidelities, etc. An implementation strategy is proposed that starts with a basic prototype and evolves over time to accommodate an increasing number of applications. Potential (standard) software that may aid in the design and implementation of the system is also identified.

  8. Aerothermodynamics research at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Deiwert, George S.

    1987-01-01

    Research activity in the aerothermodynamics branch at the NASA Ames Research Center is reviewed. Advanced concepts and mission studies relating to the next generation aerospace transportation systems are summarized and directions for continued research identified. Theoretical and computational studies directed at determining flow fields and radiative and convective heating loads in real gases are described. Included are Navier-Stokes codes for equilibrium and thermochemical nonequilibrium air. Experimental studies in the 3.5-ft hypersonic wind tunnel, the ballistic ranges, and the electric arc driven shock tube are described. Tested configurations include generic hypersonic aerospace plane configurations, aeroassisted orbital transfer vehicle shapes and Galileo probe models.

  9. Table-driven image transformation engine algorithm

    NASA Astrophysics Data System (ADS)

    Shichman, Marc

    1993-04-01

    A high-speed image transformation engine (ITE) was designed, and a prototype built, for use in a generic electronic light table and in image perspective transformation application code. The ITE takes any linear transformation, breaks it into two passes, and resamples the image appropriately for each pass. System performance is achieved by driving the engine with a set of lookup tables, computed at startup, for calculating pixel output contributions. Anti-aliasing is performed automatically during image resampling. Operations such as multiplications and trigonometric functions are minimized. The algorithm can be used for texture mapping, image perspective transformation, electronic light tables, and virtual reality.
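    The table-driven, two-pass scheme can be sketched in miniature (an illustration only; the prototype's engine, interpolation, and anti-aliasing are more elaborate): here a separable 2x zoom is split into a horizontal and a vertical pass, each walking a lookup table precomputed once at startup, so the inner loops do no per-pixel arithmetic.

```python
# Toy two-pass, table-driven resampler (nearest neighbor, 2x zoom).
# Each output pixel's source index is precomputed into a lookup
# table, mirroring the start-up LUT computation described above.

def build_lut(out_len, scale):
    """Precompute the source index for every output position."""
    return [min(int(i / scale), int(out_len / scale) - 1)
            for i in range(out_len)]

def resample_rows(image, lut):
    """One pass: resample every row through the lookup table."""
    return [[row[j] for j in lut] for row in image]

def transpose(image):
    return [list(col) for col in zip(*image)]

def zoom2x(image):
    h, w = len(image), len(image[0])
    lut_x = build_lut(2 * w, 2.0)
    lut_y = build_lut(2 * h, 2.0)
    passed = resample_rows(image, lut_x)                       # pass 1: horizontal
    return transpose(resample_rows(transpose(passed), lut_y))  # pass 2: vertical

print(zoom2x([[1, 2], [3, 4]]))
```

    A general linear transformation would use two shear/scale passes with fractional-weight tables instead of a pure zoom, but the control flow (precompute tables, then two cheap resampling passes) is the same.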

  10. Generic control software connecting astronomical instruments to the reflective memory data recording system of VLTI - bossvlti

    NASA Astrophysics Data System (ADS)

    Pozna, E.; Ramirez, A.; Mérand, A.; Mueller, A.; Abuter, R.; Frahm, R.; Morel, S.; Schmid, C.; Duc, T. Phan; Delplancke-Ströbele, F.

    2014-07-01

    The quality of data obtained by VLTI instruments may be refined by analyzing the continuous data supplied by the Reflective Memory Network (RMN). Based on 5 years' experience providing VLTI instruments (PACMAN, AMBER, MIDI) with RMN data, the procedure has been generalized to make synchronization with observations trouble-free. The present software interface not only saves months of effort for each instrument but also provides the benefits of software frameworks. Recent applications (GRAVITY, MATISSE) supply feedback that helps the software evolve. The paper highlights how common features were identified so that reusable code could be offered in due course.

  11. Assurance Cases for Proofs as Evidence

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Gurfinkel, Arie; Wallnau, Kurt; Weinstock, Charles

    2009-01-01

    Proof-carrying code (PCC) provides a 'gold standard' for establishing formal and objective confidence in program behavior. However, in order to extend the benefits of PCC - and other formal certification techniques - to realistic systems, we must establish the correspondence between a mathematical proof of a program's semantics and its actual behavior. In this paper, we argue that assurance cases are an effective means of establishing such a correspondence. To this end, we present an assurance case pattern for arguing that a proof is free from various proof hazards. We also instantiate this pattern for a proof-based mechanism to provide evidence about generic medical device software.

  12. Towards a European code of medical ethics. Ethical and legal issues.

    PubMed

    Patuzzo, Sara; Pulice, Elisabetta

    2017-01-01

    The feasibility of a common European code of medical ethics is discussed, with consideration and evaluation of the difficulties such a project is going to face, from both the legal and ethical points of view. On the one hand, the analysis will underline the limits of a common European code of medical ethics as an instrument for harmonising national professional rules in the European context; on the other hand, we will highlight some of the potentials of this project, which could be increased and strengthened through a proper rulemaking process and through adequate and careful choice of content. We will also stress specific elements and devices that should be taken into consideration during the establishment of the code, from both procedural and content perspectives. Regarding methodological issues, the limits and potentialities of a common European code of medical ethics will be analysed from an ethical point of view and then from a legal perspective. The aim of this paper is to clarify the framework for the potential but controversial role of the code in the European context, showing the difficulties in enforcing and harmonising national ethical rules into a European code of medical ethics. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  13. Wavelet-based reversible watermarking for authentication

    NASA Astrophysics Data System (ADS)

    Tian, Jun

    2002-04-01

    In the digital information age, digital content (audio, image, and video) can be easily copied, manipulated, and distributed. Copyright protection and content authentication of digital content have become urgent problems for content owners and distributors. Digital watermarking has provided a valuable solution to this problem. Based on its application scenario, most digital watermarking methods can be divided into two categories: robust watermarking and fragile watermarking. As a special subset of fragile watermarking, reversible watermarking (also called lossless, invertible, or erasable watermarking) enables the recovery of the original, unwatermarked content after the watermarked content has been verified to be authentic. Such reversibility is highly desired in sensitive imagery, such as military and medical data. In this paper we present a reversible watermarking method based on an integer wavelet transform. We look into the binary representation of each wavelet coefficient and embed an extra bit into each expandable coefficient. The location map of all expanded coefficients is coded by JBIG2 compression, and the coefficient values are losslessly compressed by arithmetic coding. Besides these two compressed bit streams, an SHA-256 hash of the original image is also embedded for authentication purposes.
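    The bit-expansion step can be sketched with the classic difference expansion of a pixel pair through the integer Haar (average/difference) transform (a minimal illustration; the full method described above also manages the location map, JBIG2 and arithmetic coding, and the SHA-256 hash):

```python
# Toy reversible embedding: hide one bit in a pixel pair by expanding
# the integer Haar difference, then recover both the bit and the
# exact original pixels. Integer floor division makes the transform
# invertible, which is the heart of "lossless" watermarking.

def embed_bit(a, b, bit):
    """Embed one bit into the pair (a, b); returns the marked pair."""
    avg = (a + b) // 2
    diff = a - b
    diff = 2 * diff + bit          # expand the difference, append the bit
    return avg + (diff + 1) // 2, avg - diff // 2

def extract_bit(a, b):
    """Recover the hidden bit and the original pair from a marked pair."""
    avg = (a + b) // 2
    diff = a - b
    bit = diff & 1                 # the appended bit is the LSB
    diff //= 2                     # undo the expansion
    return bit, avg + (diff + 1) // 2, avg - diff // 2
```

    In the paper's setting the same expand-one-bit idea is applied to wavelet coefficients rather than raw pixel pairs, and only "expandable" coefficients (those that will not overflow) are used, which is why a location map must be embedded alongside the payload.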

  14. Sanctions Connected to Dress Code Violations in Secondary School Handbooks

    ERIC Educational Resources Information Center

    Workman, Jane E.; Freeburg, Elizabeth W.; Lentz-Hees, Elizabeth S.

    2004-01-01

    This study identifies and evaluates sanctions for dress code violations in secondary school handbooks. Sanctions, or consequences for breaking rules, vary along seven interrelated dimensions: source, formality, retribution, obtrusiveness, magnitude, severity, and pervasiveness. A content analysis of handbooks from 155 public secondary schools…

  15. National Geocoding Converter File 1 : Volume 1. Structure & Content.

    DOT National Transportation Integrated Search

    1974-01-01

    This file contains a record for each county, county equivalent (as defined by the Census Bureau), SMSA county segment and SPLC county segment in the U.S. A record identifies for an area all major county codes and the associated county aggregate codes

  16. 21 CFR 106.90 - Coding.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Coding. 106.90 Section 106.90 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES Quality Control Procedures for Assuring Nutrient Content...

  17. 21 CFR 106.90 - Coding.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Coding. 106.90 Section 106.90 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES Quality Control Procedures for Assuring Nutrient Content...

  18. 21 CFR 106.90 - Coding.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Coding. 106.90 Section 106.90 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION INFANT FORMULA QUALITY CONTROL PROCEDURES Quality Control Procedures for Assuring Nutrient Content...

  19. Generic effective source for scalar self-force calculations

    NASA Astrophysics Data System (ADS)

    Wardell, Barry; Vega, Ian; Thornburg, Jonathan; Diener, Peter

    2012-05-01

    A leading approach to the modeling of extreme mass ratio inspirals involves the treatment of the smaller mass as a point particle and the computation of a regularized self-force acting on that particle. In turn, this computation requires knowledge of the regularized retarded field generated by the particle. A direct calculation of this regularized field may be achieved by replacing the point particle with an effective source and directly solving a wave equation for the regularized field. This has the advantage that all quantities are finite and require no further regularization. In this work, we present a method for computing an effective source that is finite and continuous everywhere, and that is valid for a scalar point particle in arbitrary geodesic motion in an arbitrary background spacetime. We explain in detail various technical and practical considerations that underlie its use in several numerical self-force calculations. We consider as examples the cases of a particle in a circular orbit about Schwarzschild and Kerr black holes, and also the case of a particle following a generic timelike geodesic about a highly spinning Kerr black hole. We provide numerical C code for computing an effective source for various orbital configurations about Schwarzschild and Kerr black holes.

  20. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine whether the course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive) for providing training on software processes. The study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, the findings indicate changes are needed. The conclusions of this study follow. (1) The course is instructionally effective. (2) The simulation has a positive effect on a student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full-motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates that interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option that involves creating new source code and data files but reuses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes it will produce the most effective results.
