Tam, Cynthia; Schwellnus, Heidi; Eaton, Ceilidh; Hamdani, Yani; Lamont, Andrea; Chau, Tom
2007-01-01
Children with severe physical disabilities often lack the physical skills to explore their environment independently, and to play with toys or musical instruments. The movement-to-music (MTM) system is an affordable computer system that allows children with limited movements to play and create music. The present study explored parents' experiences of using the MTM system with their children. A qualitative methodology employing in-depth interview techniques was used with six mothers and their children. The themes extracted from the data were organized under two main concepts of the International Classification of Functioning, Disability, and Health (ICF) (WHO, 2001) framework. The results showed that the MTM expanded horizons for the child along the ICF health dimensions and the MTM had a positive impact on ICF environmental determinants of health. The small sample size should be noted as a limitation of this study. Further research should be carried out with a larger sample of children with restricted mobility to obtain a better understanding of the impact of MTM technology on children's psychosocial development.
Kang, Feiwu; Huang, Cheng; Sah, Manoj Kumar; Jiang, Beizhan
2016-04-01
To analyze the effect of the eruption status of the mandibular third molar (MTM) on distal caries in the mandibular second molar (MSM) by cone-beam computed tomography (CBCT). Five hundred CBCT images of MTMs from 469 patients were evaluated. Presence of distal caries in MSMs, impaction depths and angulations of MTMs, cementoenamel junction (CEJ) distances between distal MSMs and mesial MTMs, presence of pericoronitis in MTMs, and patient characteristics (age and gender) were assessed. Data were analyzed by χ² test, univariate and multivariate logistic regression analyses, and Spearman correlation analysis. Descriptive and bivariate statistics were computed and the P value was set at .05. The overall prevalence of distal caries in the MSM was 52.0%. According to the classification of Pell and Gregory, position A was the impaction depth at which most distal caries in MSMs were present (P = .036). For angulation of the MTM, when mesial angulations were 43° to 73°, MSMs developed more distal caries (P < .0001). For the CEJ distance between the distal MSM and the mesial MTM, when distances ranged from 6 to 15 mm, distal caries in MSMs occurred more frequently (6 to 8 mm, P < .0001; 8 to 15 mm, P = .037). Furthermore, there was a linear correlation between angulation of the MTM and the CEJ distance between the distal MSM and the mesial MTM (P < .0001). Impaction depth and angulation of the MTM are associated with distal caries in the MSM. Angulation of the MTM is more stable and reliable than the CEJ distance between the distal MSM and the mesial MTM for the estimation of risk factors related to the MTM. Copyright © 2016 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
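The Spearman step mentioned in the abstract above can be sketched in a few lines of plain Python: rank both variables, then take the Pearson correlation of the ranks. The angulation and distance values below are invented for illustration only, not the study's data.

```python
# Illustrative sketch (not the authors' code): Spearman rank correlation
# between third-molar angulation and CEJ distance. Data values are invented.

def _ranks(values):
    """Assign average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Pearson correlation computed on ranks = Spearman's rho."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical angulations (degrees) and CEJ distances (mm):
angulation = [10, 25, 43, 50, 62, 73, 80, 35]
cej_mm = [2, 4, 6, 8, 10, 14, 15, 5]
rho = spearman_rho(angulation, cej_mm)
print(round(rho, 3))
```

Because the toy data are perfectly monotone, the sketch returns rho = 1; real clinical data would of course give an intermediate value.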
Johnson, Robert E; Oldroyd, Megan E; Ahmed, Saleem S; Gieseler, Henning; Lewis, Lavinia M
2010-06-01
The freeze-drying behavior and cake morphology of a model protein in an amorphous formulation were studied at varying protein concentrations using conservative (-25 °C) and aggressive (+25 °C) shelf temperatures at constant chamber pressure during primary drying. The two cycles were characterized by manometric temperature measurement (MTM) in a SMART freeze dryer that estimates the sublimation rate (dm/dt), product temperature at the freeze-drying front (T(p-MTM)) and product resistance (R(p)) during a run. The calculated sublimation rates (dm/dt) were 3-4 times faster in the aggressive cycle compared to the conservative cycle. For conservatively dried cakes R(p) increased with both dry layer thickness and protein concentration. For aggressively dried cakes (where freeze-drying occurs at the edge of microcollapse), R(p) also increased with protein concentration but was independent of the dry layer thickness. The sublimation rate was influenced by R(p), dry layer thickness and T(p-MTM) in the conservative cycle, but was governed mainly by T(p-MTM) in the aggressive cycle, where R(p) is independent of the dry layer thickness. The aggressively dried cakes had a more open and porous structure compared to their conservatively dried counterparts. © 2009 Wiley-Liss, Inc. and the American Pharmacists Association
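The resistance behavior described above is commonly modeled with an empirical saturating function of dry-layer thickness, R_p(l) = R0 + A1·l/(1 + A2·l), feeding a vapor-pressure-driven sublimation rate. The sketch below uses that textbook form; all coefficients (R0, A1, A2, vial area) are hypothetical, not values from this study.

```python
# Illustrative sketch (assumed textbook relations, not this paper's data):
# dry-layer resistance grows with thickness, so the sublimation rate falls.
# All coefficient values below are hypothetical.

def product_resistance(l_cm, R0=1.0, A1=20.0, A2=1.0):
    """Dry-layer resistance (cm^2*Torr*h/g) vs dry-layer thickness l (cm)."""
    return R0 + A1 * l_cm / (1.0 + A2 * l_cm)

def sublimation_rate(l_cm, p_ice_torr, p_chamber_torr, area_cm2=3.8):
    """Sublimation rate (g/h per vial) from the vapor-pressure driving force."""
    rp = product_resistance(l_cm)
    return area_cm2 * (p_ice_torr - p_chamber_torr) / rp

# As the dry layer grows, resistance rises and the rate falls:
for l in (0.1, 0.5, 1.0):
    print(l, round(product_resistance(l), 2),
          round(sublimation_rate(l, 0.3, 0.1), 3))
```

The "independent of dry-layer thickness" regime reported for aggressively dried cakes corresponds, in this model, to the saturating term contributing little (small A1 or large A2).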
Developing a dashboard for benchmarking the productivity of a medication therapy management program.
Umbreit, Audrey; Holm, Emily; Gander, Kelsey; Davis, Kelsie; Dittrich, Kristina; Jandl, Vanda; Odell, Laura; Sweeten, Perry
To describe a method for internal benchmarking of medication therapy management (MTM) pharmacist activities. Multisite MTM pharmacist practices within an integrated health care system. MTM pharmacists are located within primary care clinics and provide medication management through collaborative practice. MTM pharmacist activity is grouped into 3 categories: direct patient care, nonvisit patient care, and professional activities. MTM pharmacist activities were tracked with the use of the computer-based application Pharmacist Ambulatory Resource Management System (PhARMS) over a 12-month period to measure growth during a time of expansion. A total of 81% of MTM pharmacist time was recorded. A total of 1655.1 hours (41%) was nonvisit patient care, 1185.2 hours (29%) was direct patient care, and 1190.4 hours (30%) was professional activities. The number of patient visits per month increased during the study period. There were 1496 direct patient care encounters documented. Of those, 1051 (70.2%) were face-to-face visits, 257 (17.2%) were by telephone, and 188 (12.6%) were chart reviews. Nonvisit patient care and professional activities also increased during the period. PhARMS reported MTM pharmacist activities and captured nonvisit patient care work not tracked elsewhere. Internal benchmarking data proved to be useful for justifying increases in MTM pharmacist personnel resources. Reviewing data helped to identify best practices from high-performing sites. Limitations include potential for self-reporting bias and lack of patient outcomes data. Implementing PhARMS facilitated internal benchmarking of patient care and nonpatient care activities in a regional MTM program. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Sandiego, Christine M.; Weinzimmer, David; Carson, Richard E.
2012-01-01
An important step in PET brain kinetic analysis is the registration of functional data to an anatomical MR image. Typically, PET-MR registrations in nonhuman primate neuroreceptor studies used PET images acquired early post-injection (e.g., 0–10 min) to closely resemble the subject’s MR image. However, a substantial fraction of these registrations (~25%) fail due to the differences in kinetics and distribution for various radiotracer studies and conditions (e.g., blocking studies). The Multi-Transform Method (MTM) was developed to improve the success of registrations between PET and MR images. Two algorithms were evaluated, MTM-I and MTM-II. The approach involves creating multiple transformations by registering PET images of different time intervals, from a dynamic study, to a single reference (i.e., MR image) (MTM-I) or to multiple reference images (i.e., MR and PET images pre-registered to the MR) (MTM-II). Normalized mutual information was used to compute similarity between the transformed PET images and the reference image(s) to choose the optimal transformation. This final transformation is used to map the dynamic dataset into the animal’s anatomical MR space, required for kinetic analysis. The chosen transformations from MTM-I and MTM-II were evaluated using visual rating scores to assess the quality of spatial alignment between the resliced PET and reference. One hundred twenty PET datasets involving eleven different tracers from 3 different scanners were used to evaluate the MTM algorithms. Studies were performed with baboons and rhesus monkeys on the HR+, HRRT, and Focus-220. Successful transformations increased from 77.5%, 85.8%, to 96.7% using the 0–10 min method, MTM-I, and MTM-II, respectively, based on visual rating scores. The Multi-Transform Methods proved to be a robust technique for PET-MR registrations for a wide range of PET studies. PMID:22926293
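As a rough illustration of the selection step, normalized mutual information (NMI) can be computed from a joint intensity histogram and used to rank candidate transforms. The sketch below uses tiny synthetic 1-D intensity lists and invented candidate names; the authors' pipeline operates on registered 3-D volumes.

```python
# Illustrative sketch (not the authors' implementation): rank candidate
# resliced images by normalized mutual information against a reference.
# NMI = (H(A) + H(B)) / H(A,B), maximal when intensities co-vary tightly.
import math
from collections import Counter

def nmi(a, b, bins=4, lo=0.0, hi=1.0):
    """Normalized mutual information over a joint intensity histogram."""
    def bin_of(v):
        k = int((v - lo) / (hi - lo) * bins)
        return min(max(k, 0), bins - 1)
    pairs = [(bin_of(x), bin_of(y)) for x, y in zip(a, b)]
    n = len(pairs)
    pj = Counter(pairs)                 # joint histogram
    pa = Counter(p[0] for p in pairs)   # marginal of a
    pb = Counter(p[1] for p in pairs)   # marginal of b
    H = lambda c: -sum((v / n) * math.log(v / n) for v in c.values())
    return (H(pa) + H(pb)) / H(pj)

reference = [0.1, 0.2, 0.8, 0.9, 0.5, 0.4, 0.7, 0.3]
candidates = {
    "transform_early": [0.1, 0.2, 0.8, 0.9, 0.5, 0.4, 0.7, 0.3],  # well aligned
    "transform_late":  [0.9, 0.1, 0.2, 0.5, 0.3, 0.8, 0.4, 0.7],  # scrambled
}
best = max(candidates, key=lambda k: nmi(reference, candidates[k]))
print(best)
```

The well-aligned candidate wins because its joint histogram is concentrated on the diagonal, which is the intuition behind choosing the transformation with the highest NMI.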
Freeze-drying process design by manometric temperature measurement: design of a smart freeze-dryer.
Tang, Xiaolin Charlie; Nail, Steven L; Pikal, Michael J
2005-04-01
To develop a procedure based on manometric temperature measurement (MTM) and an expert system for good practices in freeze drying that will allow development of an optimized freeze-drying process during a single laboratory freeze-drying experiment. Freeze drying was performed with a FTS Dura-Stop/Dura-Top freeze dryer with the manometric temperature measurement software installed. Five percent solutions of glycine, sucrose, or mannitol with 2 ml to 4 ml fill in 5 ml vials were used, with all vials loaded on one shelf. Details of freezing, optimization of chamber pressure, target product temperature, and some aspects of secondary drying are determined by the expert system algorithms. MTM measurements were used to select the optimum shelf temperature, to determine drying end points, and to evaluate residual moisture content in real-time. MTM measurements were made at 1 hour or half-hour intervals during primary drying and secondary drying, with a data collection frequency of 4 points per second. The improved MTM equations were fit to pressure-time data generated by the MTM procedure using Microcal Origin software to obtain product temperature and dry layer resistance. Using heat and mass transfer theory, the MTM results were used to evaluate mass and heat transfer rates and to estimate the shelf temperature required to maintain the target product temperature. MTM product dry layer resistance is accurate until about two-thirds of total primary drying time is over, and the MTM product temperature is normally accurate almost to the end of primary drying provided that effective thermal shielding is used in the freeze-drying process. The primary drying times can be accurately estimated from mass transfer rates calculated very early in the run, and we find the target product temperature can be achieved and maintained with only a few adjustments of shelf temperature. 
The freeze-dryer overload conditions can be estimated by calculation of heat/mass flow at the target product temperature. It was found that the MTM results serve as an excellent indicator of the end point of primary drying. Further, we find that the rate of water desorption during secondary drying may be accurately measured by a variation of the basic MTM procedure. Thus, both the end point of secondary drying and real-time residual moisture may be obtained during secondary drying. Manometric temperature measurement and the expert system for good practices in freeze drying does allow development of an optimized freeze-drying process during a single laboratory freeze-drying experiment.
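The shelf-temperature estimation step described above can be sketched from the standard steady-state heat/mass balance used in freeze-drying theory: the sublimation rate is set by the ice vapor-pressure driving force over the total resistance, and the shelf must supply the matching heat of sublimation. The vapor-pressure correlation is a widely used one, but the resistance, area, and heat-transfer coefficients below are hypothetical, not the SMART dryer's values.

```python
# Illustrative sketch (assumed textbook relations, not the SMART algorithm):
#   dm/dt = A_p * (P_ice(T_p) - P_c) / (R_p + R_s)
#   dHs * dm/dt = A_v * K_v * (T_s - T_p)  =>  solve for shelf temperature T_s.
import math

def p_ice_torr(t_c):
    """Vapor pressure of ice in Torr (standard correlation, T in kelvin)."""
    return math.exp(-6144.96 / (t_c + 273.15) + 24.01849)

def shelf_temp_for_target(tp_c, pc_torr=0.1, r_total=4.5,
                          ap_cm2=3.8,   # product area -- assumed
                          av_cm2=4.0,   # vial outer area -- assumed
                          kv=1.2,       # vial heat-transfer coeff, cal/(h*cm^2*K) -- assumed
                          dhs=670.0):   # heat of sublimation, cal/g
    """Shelf temperature (C) that holds the product at tp_c at steady state."""
    dmdt = ap_cm2 * (p_ice_torr(tp_c) - pc_torr) / r_total  # g/h per vial
    ts = tp_c + dhs * dmdt / (av_cm2 * kv)
    return ts, dmdt

ts, rate = shelf_temp_for_target(-25.0)
print(round(rate, 3), round(ts, 1))
```

With these assumed coefficients, holding the product at -25 °C requires a shelf in the neighborhood of +20 °C, which is the kind of "few adjustments of shelf temperature" calculation the expert system automates.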
Park, Wonse; Choi, Ji-Wook; Kim, Jae-Young; Kim, Bong-Chul; Kim, Hyung Jun; Lee, Sang-Hwy
2010-03-01
Paresthesia is a well-known complication of extraction of mandibular third molars (MTMs). The authors evaluated the relationship between paresthesia after MTM extraction and the cortical integrity of the inferior alveolar canal (IAC) by using computed tomography (CT). The authors designed a retrospective cohort study involving participants considered, on the basis of panoramic imaging, to be at high risk of experiencing injury of the inferior alveolar nerve who subsequently underwent CT imaging and extraction of the MTMs. The primary predictor variable was the contact relationship between the IAC and the MTM as viewed on a CT image, classified into three groups: group 1, no contact; group 2, contact between the MTM and the intact IAC cortex; group 3, contact between the MTM and the interrupted IAC cortex. The secondary predictor variable was the number of CT image slices showing the cortical interruption around the MTM. The outcome variable was the presence or absence of postoperative paresthesia after MTM extraction. The study sample comprised 179 participants who underwent MTM extraction (a total of 259 MTMs). Their mean age was 23.6 years, and 85 (47.5 percent) were male. The overall prevalence of paresthesia was 4.2 percent (11 of 259 teeth). The prevalence of paresthesia in group 3 (involving an interrupted IAC cortex) was 11.8 percent (10 of 85 cases), while for group 2 (involving an intact IAC cortex) and group 1 (involving no contact) it was 1.0 percent (1 of 98 cases) and 0.0 percent (no cases), respectively. The frequency of nerve damage increased with the number of CT image slices showing loss of cortical integrity (P=.043). The results of this study indicate that loss of IAC cortical integrity is associated with an increased risk of experiencing paresthesia after MTM extraction.
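The group prevalences and the implied risk ratio work out directly from the counts reported above; group 1's denominator (76 teeth) is inferred as 259 − 98 − 85.

```python
# Arithmetic sketch using the counts reported in the abstract: prevalence of
# paresthesia by IAC-contact group, and the risk ratio of an interrupted
# versus intact IAC cortex. Group 1 denominator inferred as 259 - 98 - 85.
groups = {
    "group1_no_contact": (0, 76),     # (cases, teeth)
    "group2_intact_cortex": (1, 98),
    "group3_interrupted": (10, 85),
}
for name, (cases, n) in groups.items():
    print(name, round(100 * cases / n, 1))  # prevalence, percent

rr = (10 / 85) / (1 / 98)  # group 3 vs group 2 risk ratio
print(round(rr, 1))
```

The roughly twelvefold higher risk when the IAC cortex is interrupted is what drives the study's conclusion about cortical integrity on CT.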
76 FR 20369 - Notice of Invitation-Coal Exploration License Applications MTM 101687 and MTM 101688
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-12
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT921000-11-L13200000-EL0000-P; MTM 101687-MTM 101688] Notice of Invitation-Coal Exploration License Applications MTM 101687 and MTM 101688..., Wyoming. Such written notice must refer to serial numbers MTM 101687 or MTM 101688. ADDRESSES: The...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-28
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT922200-10-L13100000-FI0000-P; MTM 97526 and MTM 97527] Notice of Proposed Reinstatement of Terminated Oil and Gas Leases MTM 97526 and MTM... MTM 97526 and MTM 97527, Richland County, Montana. The lessee paid the required rental accruing from...
Attracting Students to Space Science Fields: Mission to Mars
NASA Astrophysics Data System (ADS)
Congdon, Donald R.; Lovegrove, William P.; Samec, Ronald G.
Attracting high school students to space science is one of the main goals of Bob Jones University's annual Mission to Mars (MTM). MTM develops interest in space exploration through a highly realistic simulated trip to Mars. Students study and learn to appreciate the challenges of space travel, including propulsion, life support, medicine, planetary astronomy, psychology, robotics, and communication. Broken into teams (Management, Spacecraft Design, Communications, Life Support, Navigation, Robotics, and Science), they address the problems specific to each aspect of the mission. Teams also learn to interact and recognize that a successful mission requires cooperation. Coordinated by the Management Team, the students build a spacecraft and associated apparatus, connect computers and communications equipment, train astronauts on the mission simulator, and program a Pathfinder-type robot. On the big day, the astronauts enter the spacecraft as Mission Control gets ready to support them through the expected and unexpected of their mission. Aided by teamwork, the astronauts must land on Mars, perform their scientific mission on a simulated surface of Mars, and return home. We see the success of MTM not only in successful missions but in the students who come back year after year for another MTM.
Gilliam, Eric; Thompson, Megan; Vande Griend, Joseph
2017-01-01
Objective. To develop a community pharmacy-based medication therapy management (MTM) advanced pharmacy practice experience (APPE) that provides students with skills and knowledge to deliver entry-level pharmacy MTM services. Design. The University of Colorado Skaggs School of Pharmacy & Pharmaceutical Sciences (SSPPS) partnered with three community pharmacy chains to establish this three-week, required MTM APPE. Students completed the American Pharmacists Association MTM Certificate Course prior to entering the APPE. Students were expected to spend 90% or more of their time at this experience working on MTM interventions, using store MTM platforms. Assessment. All 151 students successfully completed this MTM APPE, and each received a passing evaluation from their preceptor. Preceptor evaluations of students averaged above four (entry-level practice) on a five-point Likert scale. The majority of students reported engagement in MTM services for more than 80% of the time on site. Students’ self-reporting of their ability to perform MTM interventions improved after participation in the APPE. Conclusion. The SSPPS successfully implemented a required MTM APPE, preparing students for entry-level delivery of MTM services. PMID:28381896
Thumar, Ricky; Zaiken, Kathy
2014-01-01
To compare the impact of clinical pharmacist (CP) recommendations through a live, primary care-based, medication therapy management (MTM) protocol on low-density lipoprotein (LDL) cholesterol in patients who have cardiovascular disease (CVD) with standard, chart-review MTM. Patients with established CVD who were not at their LDL goal were identified and analyzed by either a chart-review MTM service or a live, one-on-one pharmacist-physician MTM service over a 6-month timeframe. For the chart-review MTM service, recommendations were communicated through an electronic medical record (EMR) that the physician and pharmacist had access to. Primary outcomes included mean LDL reduction from baseline, number of patients achieving their LDL goal, and percent of implemented CP recommendations. Mean LDL reduction from baseline in the chart-review MTM group and the live MTM group was 36 mg/dL ± 23.2 mg/dL (P = 0.001) and 62 mg/dL ± 28.3 mg/dL (P = 0.001), respectively. The difference between these two groups was statistically significant (P = 0.001). The chart-review MTM group had 30% of patients reach their LDL goal with 66.3% of CP recommendations implemented compared to 51.3% and 86.3% for the same parameters in the live MTM group (P = 0.006 and P = 0.003, respectively). Although both MTM services provide a significant LDL reduction from baseline in patients with CVD, live MTM provides significantly greater LDL reductions, implemented CP recommendations, and goal attainment than chart-review MTM. Thus, live MTM services are more effective than chart-review MTM services, at least within the clinics in which these protocols were assessed.
Wang, Junling; Qiao, Yanru; Tina Shih, Ya-Chen; Jamison, JoEllen Jarrett; Spivey, Christina A.; Wan, Jim Y.; White-Means, Shelley I.; Dagogo-Jack, Samuel; Cushman, William C.; Chisholm-Burns, Marie
2015-01-01
Background The Medicare Prescription Drug, Improvement, and Modernization Act (MMA) requires Part D plans to establish programs to provide medication therapy management (MTM) services starting from 2006. MTM services have been found to improve patient outcomes from pharmacotherapy, reduce emergency room visits and hospitalizations, and reduce health care costs in a cost-effective fashion. However, previous research found that Non-Hispanic Blacks (Blacks) and Hispanics may be less likely to be eligible for MTM services than Non-Hispanic Whites (Whites) among the Medicare population according to current Medicare MTM eligibility criteria. This is because the Medicare MTM eligibility criteria are predominantly based on medication utilization and costs, while Blacks and Hispanics tend to use fewer prescription medications and incur lower prescription medication costs. The Patient Protection and Affordable Care Act (PPACA) laid out a set of MTM eligibility criteria for eligible entities to target patients for MTM services: “(1) take 4 or more prescribed medications …; (2) take any ‘high risk’ medications; (3) have 2 or more chronic diseases… or (4) have undergone a transition of care, or other factors… that are likely to create a high risk of medication-related problems.” Objectives This study aimed to examine (1) racial/ethnic disparities in meeting the eligibility criteria for MTM services in PPACA among the Medicare population; and (2) whether there would be greater disparities in health and economic outcomes among MTM-ineligible than MTM-eligible groups. (If so, the PPACA MTM eligibility criteria may aggravate existing disparities.) Methods This was a retrospective cross-sectional analysis of Medicare Current Beneficiaries Survey (MCBS; 2007–2008). To determine medication characteristics, the Food and Drug Administration’s Electronic Orange Book was also used. 
Proportions of population eligible for MTM services based on the PPACA MTM eligibility criteria were compared across racial and ethnic groups using a chi-square test; a logistic regression model was used to adjust for population socio-demographic and health characteristics. Health and economic outcomes examined included health status (self-perceived good health status, number of chronic diseases, activities of daily living or ADLs, and instrumental activities of daily living or IADLs), health services utilization and costs (physician visits, emergency room visits, and total health care costs), and medication utilization patterns (generic dispensing ratio). To determine difference in disparities across MTM eligibility categories, difference-in-differences regressions of various functional forms were employed depending on the nature of the dependent variables. Interaction terms between the dummy variables for minority groups (e.g., Blacks or Hispanics) and MTM eligibility were included to test whether disparity patterns varied between MTM-ineligible and MTM-eligible individuals. Results The sample consisted of 12,966 Medicare beneficiaries, of which 11,161 were White, 930 were Black and 875 were Hispanic. Of the study sample, 9,992 Whites (86.4%), 825 Blacks (86.3%) and 733 Hispanics (80.6%) were eligible for MTM. The difference between Whites and Hispanics was significant (P<0.05) and the difference between Whites and Blacks was not significant (P>0.05). In multivariate analyses, significant disparity in eligibility for MTM services was found only between Hispanics and Whites (OR = 0.59; 95% CI = 0.43–0.82) but not between Blacks and Whites (OR = 0.78; 95% CI = 0.55–1.09). Disparities were greater among the MTM-ineligible than the MTM-eligible populations in self-perceived health status, ADLs, and IADLs for both Blacks and Hispanics compared with Whites. 
When analyzing number of chronic conditions, number and costs of physician visits and total healthcare costs, this study found lower racial and ethnic disparities among the non-eligible population than the eligible population. Conclusion Hispanics would be significantly less likely than Whites to qualify for MTM eligibility among the Medicare population according to the MTM eligibility criteria stipulated in PPACA. The PPACA MTM eligibility criteria may aggravate existing racial and ethnic disparities in health status but may remediate racial and ethnic disparities in health services utilization. Alternative MTM eligibility criteria other than PPACA MTM eligibility criteria may be needed to improve the efficiency and equity of access to Medicare Part D MTM programs. PMID:26521111
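The unadjusted eligibility comparison can be reconstructed approximately from the counts in the abstract (eligible/total: Whites 9,992/11,161; Blacks 825/930; Hispanics 733/875) with a Pearson chi-square test of independence. This is an illustrative reconstruction, not the authors' code, and it omits the survey weighting and covariate adjustment they applied.

```python
# Illustrative sketch (not the authors' analysis): Pearson chi-square
# statistic of independence on the eligibility counts from the abstract.
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    grand = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / grand
            stat += (obs - exp) ** 2 / exp
    return stat

table = [
    [9992, 11161 - 9992],  # Whites: eligible, ineligible
    [825, 930 - 825],      # Blacks
    [733, 875 - 733],      # Hispanics
]
stat = chi_square(table)
print(round(stat, 1))  # compare to 5.99, the 0.05 critical value at df = 2
```

The statistic comfortably exceeds the critical value, consistent with the abstract's finding that the Hispanic-White eligibility gap is significant while the Black-White gap is not.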
Pierson, Christopher R.; Dulin-Smith, Ashley N.; Durban, Ashley N.; Marshall, Morgan L.; Marshall, Jordan T.; Snyder, Andrew D.; Naiyer, Nada; Gladman, Jordan T.; Chandler, Dawn S.; Lawlor, Michael W.; Buj-Bello, Anna; Dowling, James J.; Beggs, Alan H.
2012-01-01
X-linked myotubular myopathy (MTM) is a severe neuromuscular disease of infancy caused by mutations of MTM1, which encodes the phosphoinositide lipid phosphatase, myotubularin. The Mtm1 knockout (KO) mouse has a severe phenotype and its short lifespan (8 weeks) makes it a challenge to use as a model in the testing of certain preclinical therapeutics. Many MTM patients succumb early in life, but some have a more favorable prognosis. We used human genotype–phenotype correlation data to develop a myotubularin-deficient mouse model with a less severe phenotype than is seen in Mtm1 KO mice. We modeled the human c.205C>T point mutation in Mtm1 exon 4, which is predicted to introduce the p.R69C missense change in myotubularin. Hemizygous male Mtm1 p.R69C mice develop early muscle atrophy prior to the onset of weakness at 2 months. The median survival period is 66 weeks. Histopathology shows small myofibers with centrally placed nuclei. Myotubularin protein is undetectably low because the introduced c.205C>T base change induced exon 4 skipping in most mRNAs, leading to premature termination of myotubularin translation. Some full-length Mtm1 mRNA bearing the mutation is present, which provides enough myotubularin activity to account for the relatively mild phenotype, as Mtm1 KO and Mtm1 p.R69C mice have similar muscle phosphatidylinositol 3-phosphate levels. These data explain the basis for phenotypic variability among human patients with MTM1 p.R69C mutations and establish the Mtm1 p.R69C mouse as a valuable model for the disease, as its less severe phenotype will expand the scope of testable preclinical therapies. PMID:22068590
Lamm, Steven H; Li, Ji; Robbins, Shayhan A; Dissen, Elisabeth; Chen, Rusan; Feinleib, Manning
2015-02-01
Pooled 1996 to 2003 birth certificate data for four central states in Appalachia indicated higher rates of infants with birth defects born to residents of counties with mountain-top mining (MTM) than born to residents of non-mining-counties (Ahern 2011). However, those analyses did not consider sources of uncertainty such as unbalanced distributions or quality of data. Quality issues have been a continuing problem with birth certificate analyses. We used 1990 to 2009 live birth certificate data for West Virginia to reassess this hypothesis. Forty-four hospitals contributed 98% of the MTM-county births and 95% of the non-mining-county births, of which six had more than 1000 births from both MTM and nonmining counties. Adjusted and stratified prevalence rate ratios (PRRs) were computed both by using Poisson regression and Mantel-Haenszel analysis. Unbalanced distribution of hospital births was observed by mining groups. The prevalence rate of infants with reported birth defects, higher in MTM-counties (0.021) than in non-mining-counties (0.015), yielded a significant crude PRR (cPRR = 1.43; 95% confidence interval [CI] = 1.36-1.52) but a nonsignificant hospital-adjusted PRR (adjPRR = 1.08; 95% CI = 0.97-1.20; p = 0.16) for the 44 hospitals. The six-hospital analysis showed the same pattern (cPRR = 2.39; 95% CI = 2.15-2.65; adjPRR = 1.01; 95% CI = 0.89-1.14; p = 0.87). No increased risk of birth defects was observed for births from MTM-counties after adjustment for, or stratification by, hospital of birth. These results have consistently demonstrated that the reported association between birth defect rates and MTM coal mining was a consequence of data heterogeneity. The data do not demonstrate evidence of a "Mountain-top Mining" effect on the prevalence of infants with reported birth defects in WV. © 2014 Wiley Periodicals, Inc.
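The hospital-adjustment logic above can be illustrated with a Mantel-Haenszel pooled rate ratio. The birth counts below are invented to mimic the confounding pattern the study describes: MTM-county births cluster in a hospital that reports more defects overall, so the crude ratio exceeds 1 even though every within-hospital ratio is 1.

```python
# Illustrative sketch (not the authors' data or code): crude vs
# Mantel-Haenszel stratified prevalence rate ratio across hospital strata.
def crude_prr(strata):
    a = sum(s["mtm_cases"] for s in strata)
    n1 = sum(s["mtm_births"] for s in strata)
    b = sum(s["non_cases"] for s in strata)
    n0 = sum(s["non_births"] for s in strata)
    return (a / n1) / (b / n0)

def mh_prr(strata):
    """Mantel-Haenszel pooled rate ratio across strata."""
    num = sum(s["mtm_cases"] * s["non_births"] / (s["mtm_births"] + s["non_births"])
              for s in strata)
    den = sum(s["non_cases"] * s["mtm_births"] / (s["mtm_births"] + s["non_births"])
              for s in strata)
    return num / den

hospitals = [
    # high-reporting hospital, mostly MTM-county births (hypothetical counts)
    {"mtm_cases": 80, "mtm_births": 2000, "non_cases": 20, "non_births": 500},
    # low-reporting hospital, mostly non-mining-county births
    {"mtm_cases": 10, "mtm_births": 500, "non_cases": 40, "non_births": 2000},
]
print(round(crude_prr(hospitals), 2), round(mh_prr(hospitals), 2))
```

The crude ratio of 1.5 collapses to 1.0 after stratification, the same qualitative behavior as the study's cPRR = 1.43 versus adjPRR = 1.08.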
Chronic cardiovascular disease mortality in mountaintop mining areas of central Appalachian states.
Esch, Laura; Hendryx, Michael
2011-01-01
To determine if chronic cardiovascular disease (CVD) mortality rates are higher among residents of mountaintop mining (MTM) areas compared to mining and nonmining areas, and to examine the association between greater levels of MTM surface mining and CVD mortality. Age-adjusted chronic CVD mortality rates from 1999 to 2006 for counties in 4 Appalachian states where MTM occurs (N = 404) were linked with county coal mining data. Three groups of counties were compared: MTM, coal mining but not MTM, and nonmining. Covariates included smoking rate, rural-urban status, percent male population, primary care physician supply, obesity rate, diabetes rate, poverty rate, race/ethnicity rates, high school and college education rates, and Appalachian county. Linear regression analyses examined the association of mortality rates with mining in MTM areas and non-MTM areas and the association of mortality with quantity of surface coal mined in MTM areas. Prior to covariate adjustment, chronic CVD mortality rates were significantly higher in both mining areas compared to nonmining areas and significantly highest in MTM areas. After adjustment, mortality rates in MTM areas remained significantly higher and increased as a function of greater levels of surface mining. Higher obesity and poverty rates and lower college education rates also significantly predicted CVD mortality overall and in rural counties. MTM activity is significantly associated with elevated chronic CVD mortality rates. Future research is necessary to examine the socioeconomic and environmental impacts of MTM on health to reduce health disparities in rural coal mining areas. © 2011 National Rural Health Association.
Wang, Junling; Qiao, Yanru; Shih, Ya-Chen Tina; Jarrett-Jamison, JoEllen; Spivey, Christina A; Wan, Jim Y; White-Means, Shelley I; Dagogo-Jack, Samuel; Cushman, William C; Chisholm-Burns, Marie
2015-11-01
The Medicare Prescription Drug, Improvement, and Modernization Act requires Part D plans to establish programs to provide medication therapy management (MTM) services starting from 2006. MTM services have been found to improve patient outcomes from pharmacotherapy, reduce emergency room visits and hospitalizations, and reduce health care costs in a cost-effective fashion. However, previous research found that non-Hispanic blacks (blacks) and Hispanics may be less likely to be eligible for MTM services than non-Hispanic whites (whites) among the Medicare population, according to current Medicare MTM eligibility criteria. This finding is because Medicare MTM eligibility criteria are predominantly based on medication use and costs, and blacks and Hispanics tend to use fewer prescription medications and incur lower prescription medication costs. The Patient Protection and Affordable Care Act (PPACA) laid out a set of MTM eligibility criteria for eligible entities to target patients for MTM services: "(1) take 4 or more prescribed medications ...; (2) take any 'high risk' medications; (3) have 2 or more chronic diseases ... or (4) have undergone a transition of care, or other factors ... that are likely to create a high risk of medication-related problems." To (a) examine racial/ethnic disparities in meeting the eligibility criteria for MTM services in PPACA among the Medicare population and (b) determine whether there would be greater disparities in health and economic outcomes among MTM-ineligible than MTM-eligible groups. This was a retrospective cross-sectional analysis of the Medicare Current Beneficiaries Survey (2007-2008). To determine medication characteristics, the U.S. Food and Drug Administration's Electronic Orange Book was also used. 
Proportions of the population eligible for MTM services based on PPACA MTM eligibility criteria were compared across racial and ethnic groups using a chi-square test; a logistic regression model was used to adjust for population sociodemographic and health characteristics. Health and economic outcomes examined included health status (self-perceived good health status, number of chronic diseases, activities of daily living [ADLs], and instrumental activities of daily living [IADLs]), health services utilization and costs (physician visits, emergency room visits, and total health care costs), and medication use patterns (generic dispensing ratio). To determine difference in disparities across MTM eligibility categories, difference-in-differences regressions of various functional forms were employed, depending on the nature of the dependent variables. Interaction terms between the dummy variables for minority groups (e.g., blacks or Hispanics) and MTM eligibility were included to test whether disparity patterns varied between MTM-ineligible and MTM-eligible individuals. The sample consisted of 12,966 Medicare beneficiaries, of which 11,161 were white, 930 were black, and 875 were Hispanic. Of the study sample, 9,992 whites (86.4%), 825 blacks (86.3%), and 733 Hispanics (80.6%) were eligible for MTM. The difference between whites and Hispanics was significant (P less than 0.050), and the difference between whites and blacks was not significant (P greater than 0.050). In multivariate analyses, significant disparity in eligibility for MTM services was found only between Hispanics and whites (odds ratio [OR] = 0.59; 95% CI = 0.43-0.82) but not between blacks and whites (OR = 0.78; 95% CI = 0.55-1.09). Disparities were greater among the MTM-ineligible than the MTM-eligible populations in self-perceived health status, ADLs, and IADLs for both blacks and Hispanics compared with whites. 
When analyzing the number of chronic conditions, the number and costs of physician visits, and total health care costs, this study found smaller racial and ethnic disparities among the ineligible population than among the eligible population. Hispanics are significantly less likely than whites to qualify for MTM among the Medicare population under the MTM eligibility criteria stipulated in the PPACA. The PPACA MTM eligibility criteria may aggravate existing racial and ethnic disparities in health status but may remediate racial and ethnic disparities in health services utilization. Alternative eligibility criteria may be needed to improve the efficiency and equity of access to Medicare Part D MTM programs.
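The eligibility comparison in the study above can be sketched in code. The counts below are the reported unweighted sample counts; the snippet runs a plain chi-square test of independence, not the authors' survey-weighted analysis, so it only approximates the published result.

```python
# Chi-square test of independence for MTM eligibility by race/ethnicity.
# Counts are the reported sample counts; survey weights are ignored, so
# this is an illustrative approximation of the published analysis.

def chi_square_independence(table):
    """table: list of [eligible, ineligible] rows, one per group."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

counts = [
    [9992, 11161 - 9992],  # whites: eligible, ineligible
    [825, 930 - 825],      # blacks
    [733, 875 - 733],      # Hispanics
]
stat, df = chi_square_independence(counts)
# The statistic is well above the 5.99 critical value for df = 2 at alpha = 0.05,
# driven mostly by the Hispanic ineligible cell, consistent with the abstract.
```

The multivariate step in the study replaces this raw comparison with a logistic regression that adjusts for sociodemographic and health covariates.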
Pharmacy technician involvement in community pharmacy medication therapy management.
Lengel, Matthew; Kuhn, Catherine H; Worley, Marcia; Wehr, Allison M; McAuley, James W
To assess the impact of technician involvement on the completion of medication therapy management (MTM) services in a community pharmacy setting and to describe pharmacists' and technicians' perceptions of technician involvement in MTM-related tasks and their satisfaction with the technician's role in MTM. Prospective observational study. In the fall of 2015, pharmacists and selected technicians from 32 grocery store-based community pharmacies were trained to incorporate technicians into MTM services. Completed MTM claims were evaluated at all pharmacies for the 3 months before and the 3 months after training. An electronic survey, developed using the competencies taught in the training and relevant published literature, was distributed via e-mail to trained employees 3 months after training. The total number of completed MTM claims at the 32 pharmacy sites was higher during the posttraining period (2,687 claims) than during the pretraining period (1,735 claims). Of the 182 trained participants, 112 (61.5%) completed the survey. Overall, perceived technician involvement was lower than expected. However, identifying MTM opportunities was the most commonly reported technician MTM task, with 62.5% of technicians and 47.2% of pharmacists reporting technician involvement. Nearly one-half of technicians (42.5%) and pharmacists (44.0%) agreed or strongly agreed that they were satisfied with the technician's role in MTM services, and 40.0% of technicians agreed that they were more satisfied with their work in the pharmacy after involvement in MTM. Three months after initial training of technicians in MTM, participation of technicians was lower than expected. However, the technicians involved most often reported identifying MTM opportunities for pharmacists, which may be a focus for future technician trainings. In addition, technician involvement in MTM services may increase satisfaction with many aspects of work for actively involved technicians. 
Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Positioning and integrating medication therapy management.
Schommer, Jon C; Doucette, William R; Johnson, Kathleen A; Planas, Lourdes G
2012-01-01
To summarize findings from medication therapy management (MTM) "environmental scans" conducted from 2007 through 2010, interpret those findings using insights gained from the Future of MTM Roundtable convened in October 2010, and propose ideas for the future positioning and integration of MTM programs in the U.S. health care system. Data for the environmental scans were collected from purposive samples of MTM pharmacist providers and MTM payers throughout the United States using self-administered online surveys in 2007, 2008, 2009, and 2010. Based on the findings, it appears that MTM is becoming more developed and that some aspects of MTM have become established within the organizations that provide and pay for these programs. However, the findings also revealed a need to better integrate MTM between organizations and the patients they serve (business-to-consumer relationships), between partnering organizations (business-to-business relationships), and between collaborating practitioners (peer-to-peer relationships). The findings suggest that a "channel of distribution" is emerging in which organizational relationships and cost efficiencies will be important considerations in the near term. We propose that applying (1) customer portfolio management and (2) transaction cost economics would help improve the positioning and integration of MTM in the U.S. health care system.
Drosophila Mtm and class II PI3K coregulate a PI(3)P pool with cortical and endolysosomal functions.
Velichkova, Michaella; Juan, Joe; Kadandale, Pavan; Jean, Steve; Ribeiro, Inês; Raman, Vignesh; Stefan, Chris; Kiger, Amy A
2010-08-09
Reversible phosphoinositide phosphorylation provides a dynamic membrane code that balances opposing cell functions. However, in vivo regulatory relationships between specific kinases, phosphatases, and phosphoinositide subpools are not clear. We identified myotubularin (mtm), a Drosophila melanogaster MTM1/MTMR2 phosphoinositide phosphatase, as necessary and sufficient for immune cell protrusion formation and recruitment to wounds. Mtm-mediated turnover of endosomal phosphatidylinositol 3-phosphate (PI(3)P) pools generated by both class II and III phosphatidylinositol 3-kinases (Pi3K68D and Vps34, respectively) is needed to down-regulate membrane influx, promote efflux, and maintain endolysosomal homeostasis. Endocytosis, but not endolysosomal size, contributes to cortical remodeling by mtm function. We propose that Mtm-dependent regulation of an endosomal PI(3)P pool has separable consequences for endolysosomal homeostasis and cortical remodeling. Pi3K68D depletion (but not Vps34) rescues protrusion and distribution defects in mtm-deficient immune cells and restores functions in other tissues essential for viability. The broad interactions between mtm and class II Pi3K68D suggest a novel strategy for rebalancing PI(3)P-mediated cell functions in MTM-related human disease.
A design for living technology: experiments with the mind time machine.
Ikegami, Takashi
2013-01-01
Living technology aims to help people expand their experiences in everyday life. The environment offers people ways to interact with it, which we call affordances. Living technology is a design for new affordances. When we experience something new, we remember it by the way we perceive and interact with it. Recent studies in neuroscience have led to the idea of a default mode network, which is a baseline activity of a brain system. The autonomy of artificial life must be understood as a sort of default mode that self-organizes its baseline activity, preparing for its external inputs and its interaction with humans. I thus propose a method for creating a suitable default mode as a design principle for living technology. I built a machine called the mind time machine (MTM), which runs continuously for 10 h per day and receives visual data from its environment using 15 video cameras. The MTM receives and edits the video inputs while it self-organizes the momentary now. Its base program is a neural network that includes chaotic dynamics inside the system and a meta-network that consists of video feedback systems. Using this system as the hardware and a default mode network as a conceptual framework, I describe the system's autonomous behavior. Using the MTM as a testing ground, I propose a design principle for living technology.
Navigating complex patients using an innovative tool: the MTM Spider Web.
Morello, Candis M; Hirsch, Jan D; Lee, Kelly C
2013-01-01
To introduce a teaching tool that can be used to assess the complexity of medication therapy management (MTM) patients, prioritize appropriate interventions, and design patient-centered care plans for each encounter. MTM patients are complex as a result of multiple comorbidities, medications, and socioeconomic and behavioral issues. Pharmacists who provide MTM services are required to synthesize a plethora of information (medical and nonmedical), evaluate and prioritize the clinical problems, and design a comprehensive patient-centered care plan. The MTM Spider Web is a visual tool to facilitate this process. A description is provided regarding how to build the MTM Spider Web using case-based scenarios. This model can be used to teach pharmacists, health professional students, and patients. The MTM Spider Web is an innovative teaching tool that can be used to teach pharmacists and students how to assess complex patients and design a patient-centered care plan to deliver the most appropriate medication therapy.
75 FR 440 - Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 91625
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT922200-10-L13100000-FI0000-P;MTM 91625] Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 91625 AGENCY: Bureau of Land... for reinstatement of noncompetitive oil and gas lease MTM 91625, Musselshell County, Montana. The...
75 FR 76753 - Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 91627
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-09
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT922200-11-L13100000-FI0000-P;MTM 91627] Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 91627 AGENCY: Bureau of Land... for reinstatement of noncompetitive oil and gas lease MTM 91627, Musselshell County, Montana. The...
Tang, Xiaolin; Nail, Steven L; Pikal, Michael J
2006-02-10
This study examines the factors that may cause systematic errors in the manometric temperature measurement (MTM) procedure used to evaluate product temperature during primary drying. MTM was conducted during primary drying using different vial loads, and the MTM product temperatures were compared with temperatures measured directly by thermocouples. To clarify the impact of freeze-drying load on MTM product temperature, the MTM vapor pressure rise was simulated, and the results were compared with the experimental results. The effect of product temperature heterogeneity on MTM product temperature determination was investigated by comparing the MTM product temperatures with directly measured thermocouple product temperatures in systems differing in temperature heterogeneity. Both the simulated and experimental results showed that at least 50 vials (5 mL) were needed to give a sufficiently rapid pressure rise during the MTM data collection period (25 seconds) to allow accurate determination of the product temperature. The product temperature is location dependent, with higher temperatures for vials on the edge of the array and lower temperatures for vials in the center. The product temperature heterogeneity also depends on the freeze-drying conditions. In temperature-heterogeneous systems, MTM measures a temperature close to the coldest product temperature, even if only a small fraction of the samples are at that temperature. The MTM method is valid even at very low product temperatures (−45 °C).
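The central step in MTM is inverting the vapor pressure recovered from the pressure-rise fit into a sublimation-interface temperature via the vapor pressure-temperature relation for ice. A minimal sketch, assuming the correlation P_ice(Torr) = 2.698 × 10^10 exp(−6144.96/T) commonly used in the freeze-drying literature (an assumption here, not a formula quoted from this paper):

```python
import math

def p_ice(temp_k):
    """Equilibrium vapor pressure of ice in Torr (assumed correlation)."""
    return 2.698e10 * math.exp(-6144.96 / temp_k)

def t_from_p(p_torr):
    """Invert the correlation: temperature (K) at which ice has pressure p_torr."""
    return -6144.96 / math.log(p_torr / 2.698e10)

# Round trip at -45 C (228.15 K), the lowest temperature at which the paper
# reports MTM to remain valid: the equilibrium pressure is a few tens of
# mTorr, and inverting it recovers the original temperature.
p = p_ice(228.15)
t = t_from_p(p)
```

The small equilibrium pressure at low temperature is why a sufficient vial load is needed: the measurable pressure rise during the 25-second data collection window scales with the total sublimation rate.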
Garcia, Gladys M; Snyder, Margie E; McGrath, Stephanie Harriman; Smith, Randall B; McGivney, Melissa Somma
2009-01-01
To identify effective strategies for marketing pharmacist-provided medication therapy management (MTM) services to patients in a self-insured employer setting. Qualitative study. University of Pittsburgh during March through May 2008. 26 university employees taking at least one chronic medication. Three focus group sessions were conducted using a semistructured topic guide to facilitate the discussion. Employees' perceived medication-related needs, perceived benefits of pharmacist-provided MTM, potential barriers for employee participation in MTM, and effective strategies for marketing MTM. Participants reported concerns with timing of doses, medication costs, access, and ensuring adherence. Participants generally felt positively toward pharmacists; however, the level of reported patient contact with pharmacists varied among participants. Some participants questioned pharmacists' education and qualifications for this enhanced role in patient care. Perceived benefits of MTM noted by participants included the opportunity to obtain personalized information about their medications and the potential for improved communication among their health providers. Barriers to patient participation were out-of-pocket costs and lack of time for MTM visits. Participants suggested use of alternative words to describe MTM and marketing approaches that involve personal contact. Pharmacists should emphasize parts of MTM that patients feel are most beneficial (i.e., provision of a personal medication record) and use patient-friendly language to describe MTM when marketing their practice. Patients will need greater exposure to the concept of MTM and the pharmacists' role in order to correctly describe and assign value to this type of pharmacist patient care practice.
Trends in Medicare Part D Medication Therapy Management Eligibility Criteria
Wang, Junling; Shih, Ya-Chen Tina; Qin, Yolanda; Young, Theo; Thomas, Zachary; Spivey, Christina A.; Solomon, David K.; Chisholm-Burns, Marie
2015-01-01
Background To increase the enrollment rate of medication therapy management (MTM) programs in Medicare Part D plans, the US Centers for Medicare & Medicaid Services (CMS) lowered the allowable eligibility thresholds, based on the number of chronic diseases and Part D drugs, for Medicare Part D plans for 2010 and after. However, an increase in MTM enrollment rates has not been realized. Objectives To describe trends in MTM eligibility thresholds used by Medicare Part D plans and to identify patterns that may hinder enrollment in MTM programs. Methods This study analyzed data extracted from the Medicare Part D MTM Programs Fact Sheets (2008–2014). The annual percentages of plans using each threshold value for the number of chronic diseases and Part D drugs, as well as other aspects of MTM enrollment practices, were analyzed among Medicare MTM programs established by Medicare Part D plans. Results For 2010 and after, increased proportions of Medicare Part D plans set their eligibility thresholds at the maximum numbers allowable. For example, in 2008, 48.7% of Medicare Part D plans (347 of 712) opened MTM enrollment to Medicare beneficiaries with only 2 chronic disease states (specific diseases varied between plans), whereas the other half restricted enrollment to patients with a minimum of 3 to 5 chronic disease states. After 2010, only approximately 20% of plans opened their MTM enrollment to patients with 2 chronic disease states, with the remaining 80% restricting enrollment to patients with 3 or more chronic diseases. Conclusion The policy change by CMS for 2010 and after is associated with increased proportions of plans setting their MTM eligibility thresholds at the maximum numbers allowable. Changes to the eligibility thresholds by Medicare Part D plans might have acted as a barrier to increased MTM enrollment. Thus, CMS may need to identify alternative strategies to increase MTM enrollment in Medicare plans. PMID:26380030
76 FR 10389 - Notice of Proposed Reinstatement of Terminated Oil and Gas lease MTM 96122
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-24
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT922200-11-L13100000-FI0000-P;MTM 96122] Notice of Proposed Reinstatement of Terminated Oil and Gas lease MTM 96122 AGENCY: Bureau of Land... filed a petition for reinstatement of competitive oil and gas lease MTM 96122, Richland County, Montana...
76 FR 54484 - Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 98742
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-01
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT922200-11-L13100000-FI0000-P;MTM 98742] Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 98742 AGENCY: Bureau of Land... petition for reinstatement of competitive oil and gas lease MTM 98742, Fergus County, Montana. The lessee...
75 FR 63855 - Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 98742, Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-18
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT922200-10-L13100000-FI0000-P;MTM 98742] Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 98742, Montana AGENCY: Bureau of... lease MTM 98742, for land in Fergus County, Montana. The lessee paid the required rental accruing from...
75 FR 30059 - Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 98343
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-28
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT922200-10-L13100000-FI0000-P; MTM 98343] Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 98343 AGENCY: Bureau of Land... petition for reinstatement of competitive oil and gas lease MTM 98343, Fergus County, Montana. The lessee...
75 FR 43553 - Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 97827, Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-26
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT922200-10-L13100000-FI0000-P;MTM 97827] Notice of Proposed Reinstatement of Terminated Oil and Gas Lease MTM 97827, Montana AGENCY: Bureau of... and gas lease MTM 97827, Carbon County, Montana. The lessee paid the required rental accruing from the...
Tang, Xiaolin Charlie; Nail, Steven L; Pikal, Michael J
2006-01-01
The purpose of this work was to study the factors that may cause systematic errors in the manometric temperature measurement (MTM) procedure used to determine product dry-layer resistance to vapor flow. Product temperature and dry-layer resistance were obtained using MTM software installed on a laboratory freeze-dryer. The MTM resistance values were compared with the resistance values obtained using the "vial method." The product dry-layer resistances obtained by MTM, assuming a fixed temperature difference (ΔT; 2 °C), were lower than the actual values, especially when the product temperatures and sublimation rates were low; with ΔT determined from the pressure rise data, more accurate results were obtained. MTM resistance values were generally lower than the values obtained with the vial method, particularly whenever freeze-drying was conducted under conditions that produced large variations in product temperature (i.e., low shelf temperature, low chamber pressure, and no thermal shields). In an experiment designed to magnify temperature heterogeneity, MTM resistance values were much lower than the simple average of the product resistances. However, in experiments where product temperatures were homogeneous, good agreement between MTM and vial-method resistances was obtained. The cause of the low MTM resistance values is the fast vapor pressure rise from a few "warm" edge vials or vials with low resistance. With proper use of thermal shields and with ΔT evaluated from the data, MTM resistance data are accurate. Thus, the MTM method for determining dry-layer resistance is a useful tool for freeze-drying process analytical technology.
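The dry-layer resistance that MTM estimates follows from the steady-state sublimation relation Rp = Ap(P_ice − P_chamber)/(dm/dt). The sketch below uses this standard freeze-drying definition with invented illustrative numbers; the vapor-pressure correlation and all values are assumptions, not data from the paper.

```python
import math

def ice_vapor_pressure_torr(temp_k):
    # Vapor pressure of ice in Torr; correlation commonly used in
    # freeze-drying work, assumed here rather than taken from the paper.
    return 2.698e10 * math.exp(-6144.96 / temp_k)

def dry_layer_resistance(area_cm2, temp_k, chamber_torr, sub_rate_g_per_hr):
    """Rp = Ap * (P_ice - P_chamber) / (dm/dt), in cm^2 Torr hr/g."""
    driving_force = ice_vapor_pressure_torr(temp_k) - chamber_torr
    return area_cm2 * driving_force / sub_rate_g_per_hr

# Illustrative (hypothetical) numbers: 3.8 cm^2 vial area, product at -30 C,
# 100 mTorr chamber pressure, 0.2 g/hr sublimation rate per vial.
rp = dry_layer_resistance(3.8, 243.15, 0.100, 0.2)
```

With these numbers Rp comes out in the low single digits of cm²·Torr·hr/g, a typical magnitude for a partially dried cake. The relation also shows why a few warm, low-resistance vials bias the estimate: their vapor pressure rises fastest and dominates the fitted pressure-rise curve.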
Guthrie, Kendall D; Stoner, Steven C; Hartwig, D Matthew; May, Justin R; Nicolaus, Sara E; Schramm, Andrew M; DiDonato, Kristen L
2017-02-01
(1) To identify physicians' preferences regarding pharmacist-provided medication therapy management (MTM) communication in the community pharmacy setting; (2) to identify physicians' perceived barriers to communicating with a pharmacist regarding MTM; and (3) to determine whether Missouri physicians feel MTM is beneficial for their patients. A cross-sectional prospective survey study of 2,021 family and general practice physicians registered with MO HealthNet, Missouri's Medicaid program. The majority (52.8%) of physicians preferred MTM data to be communicated via fax. Most physicians who provided care to patients in long-term care (LTC) facilities (81.0%) preferred to be contacted at their practice location rather than the LTC facility. The greatest barriers to communication were lack of time and inefficient communication practices. Improved/enhanced communication was the most common suggestion for improving the MTM process. Approximately 67% of respondents reported MTM as beneficial or somewhat beneficial for their patients. Survey respondents saw value in the MTM services offered by pharmacists. However, pharmacists should use the identified preferences and barriers to improve their current communication practices in hopes of increasing acceptance of recommendations. Ultimately, this may assist MTM providers in working collaboratively with patients' physicians.
New results on the mid-latitude midnight temperature maximum
NASA Astrophysics Data System (ADS)
Mesquita, Rafael L. A.; Meriwether, John W.; Makela, Jonathan J.; Fisher, Daniel J.; Harding, Brian J.; Sanders, Samuel C.; Tesema, Fasil; Ridley, Aaron J.
2018-04-01
Fabry-Perot interferometer (FPI) measurements of thermospheric temperatures and winds show the detection and successful determination of the latitudinal distribution of the midnight temperature maximum (MTM) in the continental mid-eastern United States. These results were obtained through the operation of the five FPI observatories in the North American Thermosphere Ionosphere Observing Network (NATION) located at the Pisgah Astronomical Research Institute (PAR) (35.2° N, 82.8° W), Virginia Tech (VTI) (37.2° N, 80.4° W), Eastern Kentucky University (EKU) (37.8° N, 84.3° W), Urbana-Champaign (UAO) (40.2° N, 88.2° W), and Ann Arbor (ANN) (42.3° N, 83.8° W). A new approach for analyzing the MTM phenomenon is developed, which combines harmonic thermal background removal with a 2-D inversion algorithm to generate sequential 2-D temperature residual maps at 30 min intervals. The simultaneous study of the temperature data from these FPI stations represents a novel analysis of the MTM and its large-scale latitudinal and longitudinal structure. The major finding in examining these maps is the frequent detection of a secondary MTM peak occurring during the early evening hours, nearly 4.5 h prior to the timing of the primary MTM peak that generally appears after midnight. The analysis of these observations shows a strong night-to-night variability for this double-peaked MTM structure. A statistical study of the behavior of the MTM events was carried out to determine the extent of this variability with regard to seasonal and latitudinal dependence. The results show the presence of the MTM peak(s) in 106 of the 472 determinable nights (nights in which the MTM presence, or lack thereof, could be determined with certainty in the data set) selected for analysis (22%), out of the total of 846 nights available. The MTM feature appears slightly more often during the summer (27%), followed by fall (22%), winter (20%), and spring (18%). 
A northwestward propagation of the MTM signature with a latitude-dependent amplitude is also seen. This behavior suggests either a latitudinal dependence of thermospheric tidal dissipation or a night-to-night variation in the composition of the higher-order tidal modes that contribute to the production of the MTM peak at mid-latitudes. Also presented in this paper is the perturbation in the divergence of the wind fields associated with the passage of each MTM peak, analyzed with the 2-D interpolation.
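The harmonic background-removal step described above can be illustrated schematically: fit a low-order harmonic series to a night of temperatures by least squares and keep the residual, in which an MTM-like peak stands out. This is a sketch of the general technique, not the authors' code; the period, harmonic order, and synthetic data are all assumptions.

```python
import numpy as np

def harmonic_residual(t_hours, temp, period=24.0, n_harmonics=3):
    """Fit a mean plus n_harmonics sin/cos pairs by least squares;
    return the residual temp - fit (the thermal background is removed)."""
    cols = [np.ones_like(t_hours)]
    for k in range(1, n_harmonics + 1):
        w = 2 * np.pi * k / period
        cols.append(np.cos(w * t_hours))
        cols.append(np.sin(w * t_hours))
    design = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(design, temp, rcond=None)
    return temp - design @ coef

# Synthetic night: a slow tidal background plus a 40 K MTM-like Gaussian
# bump peaking at 01:00 local time (t = 25 h). Values are invented.
t = np.linspace(18, 30, 145)          # 18:00 to 06:00 LT, 5 min cadence
background = 900 + 60 * np.cos(2 * np.pi * (t - 15) / 24)
bump = 40 * np.exp(-0.5 * ((t - 25.0) / 0.75) ** 2)
residual = harmonic_residual(t, background + bump)
peak_time = t[np.argmax(residual)]    # recovers the bump near t = 25 h
```

The residual does not recover the full 40 K amplitude, since the smooth harmonic fit absorbs part of the bump; in practice this is the trade-off between background order and preserved MTM amplitude.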
Application of advanced structure to multi-tone mask for FPD process
NASA Astrophysics Data System (ADS)
Song, Jin-Han; Jeong, Jin-Woong; Kim, Kyu-Sik; Jeong, Woo-Gun; Yun, Sang-Pil; Lee, Dong-Heok; Choi, Sang-Soo
2017-07-01
In accordance with improvements in FPD technology, masks for particular purposes, such as the phase shift mask (PSM) and the multi-tone mask (MTM), have also been developed. Above all, an MTM with three or more transmittance tones has a substantial advantage: it reduces the number of masks demanded in the FPD fabrication process compared with a normal two-tone mask.[1,2] A chromium (Cr)-based MTM (typically the top type) is widely employed because its all-Cr structure, consisting of a Cr absorber layer and a Cr half-tone layer, makes the etch process convenient. However, the top type of Cr-based MTM demands two Cr sputtering processes, one after each layer's etching and writing process. For this reason, a material different from Cr is required to reduce mask fabrication time and cost. In this study, we evaluate an MTM with a structure that combines Cr with molybdenum silicide (MoSi) to resolve the issues mentioned above. MoSi, which has been proven in integrated circuit (IC) processes, is a suitable material for MTM evaluation. This structure can realize multiple transmittances in common with the Cr-based MTM while reducing the number of sputtering processes. We investigate an optimized structure in consideration of productivity along with performance metrics such as critical dimension (CD) variation and the transmittance range of each structure. Transmittance is targeted at the h-line wavelength (405 nm) in the evaluation. The performances of all Cr-/MoSi-based MTMs are compared with those of the Cr-based MTM.
Assessing Medicare beneficiaries' willingness-to-pay for medication therapy management services.
Woelfel, Joseph A; Carr-Lopez, Sian M; Delos Santos, Melanie; Bui, Ann; Patel, Rajul A; Walberg, Mark P; Galal, Suzanne M
2014-02-01
To assess Medicare beneficiaries' willingness-to-pay (WTP) for medication therapy management (MTM) services and determine sociodemographic and clinical characteristics influencing this payment amount. A cross-sectional, descriptive study design was adopted to elicit Medicare beneficiaries' WTP for MTM. Nine outreach events in cities across Central/Northern California during Medicare's 2011 open-enrollment period. A total of 277 Medicare beneficiaries participated in the study. Comprehensive MTM was offered to each beneficiary. Pharmacy students conducted the MTM session under the supervision of licensed pharmacists. At the end of each MTM session, beneficiaries were asked to indicate their WTP for the service. Medication, self-reported chronic conditions, and beneficiary demographic data were collected and recorded via a survey during the session. The mean WTP for MTM was $33.15 for the 277 beneficiaries receiving the service and answering the WTP question. WTP by low-income subsidy recipients (mean ± standard deviation; $12.80 ± $24.10) was significantly lower than for nonsubsidy recipients ($41.13 ± $88.79). WTP was significantly (positively) correlated with number of medications regularly taken and annual out-of-pocket drug costs. The mean WTP for MTM was $33.15. WTP for MTM significantly varied by race, subsidy status, and number of prescription medications taken. WTP was significantly higher for nonsubsidy recipients than subsidy recipients, and significantly positively correlated with the number of medications regularly taken and the beneficiary rating of the delivered services.
Barnett, Mitchell J; Frank, Jessica; Wehring, Heidi; Newland, Brand; VonMuenster, Shannon; Kumbera, Patty; Halterman, Tom; Perry, Paul J
2009-01-01
Although community pharmacists have historically been paid primarily for drug distribution and dispensing services, medication therapy management (MTM) services evolved in the 1990s as a means for pharmacists and other providers to assist physicians and patients in managing clinical, service, and cost outcomes of drug therapy. The Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA 2003) and the subsequent implementation of Medicare Part D in January 2006 for the more than 20 million Medicare beneficiaries enrolled in the Part D benefit formalized MTM services for a subset of high-cost patients. Although Medicare Part D has provided a new opportunity for defining the value of pharmacist-provided MTM services in the health care system, few publications exist which quantify changes in the provision of pharmacist-provided MTM services over time. To (a) describe the changes over a 7-year period in the primary types of MTM services provided by community pharmacies that have contracted with drug plan sponsors through an MTM administrative services company, and (b) quantify potential MTM-related cost savings based on pharmacists' self-assessments of the likely effects of their interventions on health care utilization. Medication therapy management claims from a multistate MTM administrative services company were analyzed over the 7-year period from January 1, 2000, through December 31, 2006. Data extracted from each MTM claim included patient demographics (e.g., age and gender), the drug and type that triggered the intervention (e.g., drug therapeutic class and therapy type as either acute, intermittent, or chronic), and specific information about the service provided (e.g., Reason, Action, Result, and Estimated Cost Avoidance [ECA]). ECA values are derived from average national health care utilization costs, which are applied to pharmacist self-assessment of the "reasonable and foreseeable" outcome of the intervention. 
ECA values are updated annually for medical care inflation. From a database of nearly 100,000 MTM claims, a convenience sample of 50 plan sponsors was selected. After exclusion of claims with missing or potentially duplicate data, there were 76,148 claims for 23,798 patients from community pharmacy MTM providers in 47 states. Over the 7-year period from January 1, 2000, through December 31, 2006, the mean ([SD] median) pharmacy reimbursement was $8.44 ([$5.19] $7.00) per MTM service, and the mean ([SD] median) ECA was $93.78 ([$1,022.23] $5.00). During the 7-year period, pharmacist-provided MTM interventions shifted from primarily education and monitoring for new or changed prescription therapies to prescriber consultations regarding cost-efficacy management (Pearson chi-square, P < 0.001). Services also shifted from claims involving acute medications (e.g., penicillin antibiotics, macrolide antibiotics, and narcotic analgesics) to services involving chronic medications (e.g., lipid-lowering agents, angiotensin-converting enzyme [ACE] inhibitors, and beta-blockers; P < 0.001), resulting in significant changes in the therapeutic classes associated with MTM claims and an increase in the proportion of older patients served (P < 0.001). These trends resulted in higher pharmacy reimbursements and greater ECA per claim over time (P < 0.001). MTM interventions over the 7-year period evolved from primarily the provision of patient education involving acute medications toward consultation-type services for chronic medications. These changes were associated with increases in reimbursement amounts and pharmacist-estimated cost savings. It is uncertain whether this shift in service type is a result of clinical need, documentation requirements, or reimbursement opportunities.
Wang, Junling; Hong, Song Hee
2015-01-01
Pharmacists' acceptable level of compensation for medication therapy management (MTM) services needs to be determined using various economic evaluation techniques. Using contingent valuation method, determine pharmacists' acceptable levels of compensation for MTM services. A mailing survey was used to elicit Tennessee (U.S.) pharmacists' acceptable levels of compensation for a 30-minute MTM session for a new patient with 2 medical conditions, 8 medications, and an annual drug cost of $2000. Three versions of a series of double-bounded, closed-ended, binary discrete choice questions were asked of pharmacists for their willingness to accept (WTA) for an original monetary value ($30, $60, or $90) and then follow-up higher or lower value depending on their responses to the original value. A Kaplan-Meier approach was taken to analyze pharmacists' WTA, and Cox's proportional hazards model was used to examine the effects of pharmacist characteristics on their WTA. Three hundred and forty-eight pharmacists responded to the survey. Pharmacists' WTA for the given MTM session had a mean of $63.31 and median of $60. The proportions of pharmacists willing to accept $30, $60, and $90 for the given MTM session were 30.61%, 85.19%, and 91.01%, respectively. Pharmacists' characteristics had statistically significant association with their WTA rates. Pharmacists' WTA for the given MTM session is higher than current Medicare MTM programs' compensation levels of $15-$50 and patients' willingness to pay of less than $40. Besides advocating for higher MTM compensation levels by third-party payers, pharmacists also may need to charge patients to reach sufficient compensation levels for MTM services. Copyright © 2015 Elsevier Inc. All rights reserved.
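The reported acceptance proportions behave like a survival-style curve over offer amounts, and the median WTA can be read off as the smallest offer that at least half of respondents would accept. A minimal sketch using only the figures quoted in the abstract (the function name and this step-function reading are illustrative, not the study's Kaplan-Meier procedure):

```python
# Acceptance proportions quoted in the abstract:
# offer ($) -> share of pharmacists willing to accept that amount.
acceptance = {30: 0.3061, 60: 0.8519, 90: 0.9101}

def median_wta(acceptance):
    """Smallest offer accepted by at least half of respondents,
    reading the acceptance curve as a step function."""
    for offer in sorted(acceptance):
        if acceptance[offer] >= 0.5:
            return offer
    return None

print(median_wta(acceptance))  # 60, matching the reported median WTA
```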
2005-01-01
To develop a model framework of Medication Therapy Management (MTM) in community pharmacy designed to improve care, enhance communication among patients and providers, improve collaboration among providers, and optimize medication use that leads to improved patient outcomes. Peer-reviewed literature, structured discussions with community pharmacy leaders and representatives from pharmacy benefit providers and health plans, and input from pharmacists and pharmacy associations. Building on an MTM consensus definition adopted by 11 national pharmacy organizations in July 2004, this model describes core elements of an MTM service that can be provided by pharmacists across the spectrum of community pharmacy. The model is structured for pharmacists to use with all patients in need of MTM services, in both the private and public sectors. The model describes five core elements of MTM in the community pharmacy setting: medication therapy review (MTR), a personal medication record (PMR), a medication action plan (MAP), intervention and referral, and documentation and follow-up. The MTR can be comprehensive or targeted, depending on the needs of the patient. The PMR and MAP are patient-centered documents intended to be used by the patient to improve medication self-management. A collaborative approach to patient care involving patients, pharmacists, and physicians and other health care providers is advocated in the model. General patient eligibility considerations are also described. A model framework for consideration by community pharmacists in developing MTM services is described. The model consists of five core elements for MTM service delivery in community pharmacy practice.
Comparison of two Medication Therapy Management Practice Models on Return on Investment.
Gazda, Nicholas P; Berenbrok, Lucas A; Ferreri, Stefanie P
2017-06-01
To compare the return on investment (ROI) of an integrated practice model versus a "hub and spoke" practice model of pharmacist-provided medication therapy management (MTM). A retrospective cohort analysis of MTM claims billed in 76 pharmacies in North Carolina in the 2010 hub and spoke practice model and the 2012 "integrated" practice model was performed to calculate the ROI. In 2010, 4089 patients received MTM services, resulting in 8757 claims in the hub and spoke model. In 2012, 4896 patients received MTM services, resulting in 13 730 claims in the integrated model. In 2010, US$165 897.26 was invested in pharmacist salary and US$173 498.00 was received in reimbursement, resulting in an ROI of +US$7600.74 (+4.6%). In 2012, US$280 890.09 was invested in pharmacist salary and US$302 963 was received in reimbursement, resulting in an ROI of +US$22 072.91 (+7.9%). The integrated model of MTM showed an increase in number of claims submitted and in number of patients receiving MTM services, ultimately resulting in a higher ROI. While a higher ROI was evident in the integrated model, both models resulted in positive ROI (1:12-1:21), highlighting that MTM programs can be cost effective with different strategies of execution.
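The ROI figures above follow from simple arithmetic on the reported salary investment and reimbursement totals. A quick check (the function name is illustrative, not from the study):

```python
def roi(salary_invested, reimbursement):
    """Return (net gain, ROI as a percent of the salary invested)."""
    net = reimbursement - salary_invested
    return net, 100 * net / salary_invested

# 2010 hub-and-spoke model
net_2010, pct_2010 = roi(165_897.26, 173_498.00)
# 2012 integrated model
net_2012, pct_2012 = roi(280_890.09, 302_963.00)

print(f"2010: +${net_2010:,.2f} ({pct_2010:.1f}%)")  # +$7,600.74 (4.6%)
print(f"2012: +${net_2012:,.2f} ({pct_2012:.1f}%)")  # +$22,072.91 (7.9%)
```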
Impact of medication therapy management on underserved, primarily Hispanic patients with diabetes.
Congdon, Heather B; Dowling, Thomas C; Cheng, Iliana; Truong, Hoai-An
2013-05-01
Diabetes-related complications are more pronounced in Hispanic patients versus patients of other ethnicities. It is documented that medication therapy management (MTM) can improve diabetes outcomes; however, data regarding Hispanic patients are limited. To evaluate the impact of MTM on hemoglobin A1c (A1C), blood pressure (BP), and low-density lipoprotein cholesterol (LDL-C) in underserved, primarily Hispanic patients who use a safety-net clinic as their medical home. A retrospective, observational study of uninsured, primarily Hispanic patients with diabetes who received MTM from October 2009 through March 2011. Patients were stratified into 2 cohorts: A1C less than 9% and A1C greater than or equal to 9%. Patients were also stratified by frequency of MTM visits and insulin use, regardless of A1C. A chart review was conducted to evaluate diabetes-related outcomes pre- and postimplementation of MTM. The primary study outcome was reduction of A1C. Secondary outcomes included reduction of BP and LDL-C and reduction of A1C based on MTM visit frequency or insulin use. Sixty-four patients with at least 1 MTM visit and pre- and postimplementation A1C data were included. In the cohort with A1C greater than or equal to 9%, mean (SD) A1C values decreased from 10.9% (1.4%) to 8.8% (1.5%) versus the cohort with A1C less than 9%, whose A1C changed minimally, from 7.2% (0.9%) to 7.4% (1.4%). Regardless of their A1C, patients who were using insulin at baseline had a change in A1C of -0.8% (1.5%) versus -0.1% (1.6%) in those who were not using insulin at baseline (p = 0.04); patients who participated in multiple MTM visits had a significant reduction in A1C, from 9% to 8.3% (95% CI -1.26 to -0.03; p = 0.02) compared with patients participating in only 1 MTM visit. Pharmacist-provided MTM can significantly improve diabetes control in uninsured, primarily Hispanic patients with poorly controlled diabetes and in those who are using insulin. 
Multiple MTM visits also yielded significant A1C reductions.
Racial and ethnic disparities in meeting MTM eligibility criteria among patients with asthma.
Lu, Degan; Qiao, Yanru; Johnson, Karen C; Wang, Junling
2017-06-01
Asthma is one of the most frequently targeted chronic diseases in the medication therapy management (MTM) programs of the Medicare prescription drug (Part D) benefits. Although racial and ethnic disparities in meeting eligibility criteria for MTM services have been reported, little is known about whether there would be similar disparities among adults with asthma in the United States. Adult patients with asthma (age ≥ 18) from the Medical Expenditure Panel Survey (2011-2012) were analyzed. Bivariate analyses were conducted to compare the proportions of patients who would meet Medicare MTM eligibility criteria between non-Hispanic Blacks (Blacks), Hispanics, and non-Hispanic Whites (Whites). Survey-weighted logistic regression was performed to adjust for patient characteristics. Main and sensitivity analyses were conducted to cover the entire range of the eligibility thresholds used by Part D plans in 2011-2012. The sample included 4,455 patients with asthma, including 2,294 Whites, 1,218 Blacks, and 943 Hispanics. Blacks and Hispanics had lower proportions of meeting MTM eligibility criteria than did Whites (P < 0.001). According to the main analysis, Blacks and Hispanics had a 36% and 32% lower likelihood of MTM eligibility than Whites, respectively (odds ratio [OR]: 0.64, 95% confidence interval [CI]: 0.45-0.90; OR: 0.68, 95% CI: 0.47-0.98, respectively). Similar results were obtained in sensitivity analyses. There are racial and ethnic disparities in meeting Medicare Part D MTM eligibility criteria among adult patients with asthma. Future studies should examine the implications of such disparities on health outcomes of patients with asthma and explore alternative MTM eligibility criteria.
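As a reading aid for the odds ratios above: the quoted "36% and 32% lower" figures are simply one minus the odds ratio, expressed as a percentage (helper name is illustrative):

```python
def pct_lower_odds(odds_ratio):
    """Percent reduction in odds implied by an odds ratio below 1."""
    return round(100 * (1 - odds_ratio))

print(pct_lower_odds(0.64))  # 36 (Blacks vs. Whites)
print(pct_lower_odds(0.68))  # 32 (Hispanics vs. Whites)
```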
An introductory pharmacy practice experience based on a medication therapy management service model.
Agness, Chanel F; Huynh, Donna; Brandt, Nicole
2011-06-10
To implement and evaluate an introductory pharmacy practice experience (IPPE) based on the medication therapy management (MTM) service model. Patient Care 2 is an IPPE that introduces third-year pharmacy students to the MTM service model. Students interacted with older adults to identify medication-related problems and develop recommendations using core MTM elements. Course outcome evaluations were based on number of documented medication-related problems, recommendations, and student reviews. Fifty-seven older adults participated in the course. Students identified 52 medication-related problems and 66 medical problems, and documented 233 recommendations relating to health maintenance and wellness, pharmacotherapy, referrals, and education. Students reported having adequate experience performing core MTM elements. Patient Care 2 may serve as an experiential learning model for pharmacy schools to teach the core elements of MTM and provide patient care services to the community.
Minutia Tensor Matrix: A New Strategy for Fingerprint Matching
Fu, Xiang; Feng, Jufu
2015-01-01
Establishing correspondences between two minutia sets is a fundamental issue in fingerprint recognition. This paper proposes a new tensor matching strategy. First, the concept of the minutia tensor matrix (simplified as MTM) is proposed. It describes the first-order features and second-order features of a matching pair. In the MTM, the diagonal elements indicate similarities of minutia pairs and non-diagonal elements indicate pairwise compatibilities between minutia pairs. Correct minutia pairs are likely to establish both large similarities and large compatibilities, so they form a dense sub-block. Minutia matching is then formulated as recovering the dense sub-block in the MTM. This is a new tensor matching strategy for fingerprint recognition. Second, as fingerprint images show both local rigidity and global nonlinearity, we design two different kinds of MTMs: local MTM and global MTM. Meanwhile, a two-level matching algorithm is proposed. At the local matching level, the local MTM is constructed and a novel local similarity calculation strategy is proposed. It makes full use of local rigidity in fingerprints. At the global matching level, the global MTM is constructed to calculate similarities of entire minutia sets. It makes full use of global compatibility in fingerprints. The proposed method has stronger descriptive ability and better robustness to noise and nonlinearity. Experiments conducted on the Fingerprint Verification Competition databases (FVC2002 and FVC2004) demonstrate its effectiveness and efficiency. PMID:25822489
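The abstract does not spell out how the dense sub-block is recovered; a common approach to this kind of affinity-matrix formulation is spectral matching, scoring candidate pairs by the principal eigenvector of the matrix. A toy sketch under that assumption, with an invented 4-pair affinity matrix (none of the numbers come from the paper):

```python
# Toy MTM-style affinity matrix over 4 candidate minutia pairs: diagonal
# entries are pair similarities, off-diagonal entries are pairwise
# compatibilities. Pairs 0-2 are mutually compatible (a dense sub-block);
# pair 3 is spurious.
M = [
    [0.9, 0.8, 0.7, 0.1],
    [0.8, 0.8, 0.9, 0.0],
    [0.7, 0.9, 0.9, 0.1],
    [0.1, 0.0, 0.1, 0.5],
]

def principal_eigenvector(M, iters=200):
    """Power iteration; large entries flag candidate pairs that belong
    to the dense, mutually compatible sub-block."""
    v = [1.0] * len(M)
    for _ in range(iters):
        w = [sum(a * b for a, b in zip(row, v)) for row in M]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

v = principal_eigenvector(M)
print([round(x, 2) for x in v])  # the spurious pair gets the smallest score
```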
Sound Medication Therapy Management Programs, Version 2.0 with validation study.
2008-01-01
The Academy of Managed Care Pharmacy (AMCP, the Academy) contracted with the National Committee for Quality Assurance (NCQA) to conduct a field study to validate and assess the 2006 Sound Medication Therapy Management Programs, Version 1.0 document. Version 1.0 posits several principles of sound medication therapy management (MTM) programs: they (1) recruit patients whose data show they may need assistance with managing medications; (2) have health professionals who intervene with patients and their physicians to improve medication regimens; and (3) measure their results. The validation study determined the extent to which the principles identified in version 1.0 are incorporated in MTM programs. The method was designed to determine to what extent the important features and operational elements of sound MTM programs as described in version 1.0 are (1) acceptable and seen as comprehensive to users, (2) incorporated into MTM programs in the field, (3) reflective of the consensus group's intentions, and (4) in need of modification or updating. NCQA first conducted Phase One, in which NCQA gathered perspectives on the principles in the consensus document from a mixed group of stakeholders representing both providers and users of MTM programs. Phase Two involved a deeper analysis of existing programs related to the consensus document, in which NCQA conducted a Web-based survey of 20 varied MTM programs and conducted in-depth site visits with 5 programs. NCQA selected programs offered by a range of MTM-providing organizations -- health plans, pharmacy benefit management companies, disease management organizations, and stand-alone MTM providers. NCQA analyzed the results of both phases. The Phase Two survey asked specific questions of the programs and found that some programs perform beyond the principles listed in version 1.0. 
NCQA found that none of the elements of the consensus document should be eliminated because programs cannot perform them, although NCQA suggested some areas where the document could be more expansive or more specific, given the state of MTM operations in the field. The important features and operational elements in the document were categorized into the following 3 overall categories, which NCQA used to structure the survey and conduct the site visits in Phase Two: (1) eligibility and enrollment, (2) operations, and (3) quality management. NCQA found that the original consensus document was realistic in identifying the elements of sound MTM. In the current project, NCQA's purpose was not to make judgments about the effectiveness of MTM programs in general or any individual program in particular. NCQA recommended that the consensus document could be made stronger and more specific in 3 areas: (1) specifically state that the Patient Identification and Recruitment section advocates use of various eligibility criteria that may include, but are not limited to, Medicare-defined MTM eligibility criteria; (2) reframe or remove the statement in Appendix A of the consensus document that the preferred modality for MTM is face-to-face interaction between patient and pharmacist, unless there are comparative data to support it as currently written; and (3) specifically recommend that programs measure performance across the entire populations in their plans in addition to measuring results for those patients selected into MTM. This will make benchmarking among programs possible and will lead to substantiated best practices in this growing field.
Oladapo, Abiola O; Rascati, Karen L
2012-08-01
To provide a summary of published survey articles regarding the provision of medication therapy management (MTM) services in the United States. A literature search was conducted to identify original articles on MTM-related surveys conducted in the United States, involving community and outpatient pharmacists, physicians, patients, or pharmacy students and published by the primary researchers who conducted the study. Search engines used included PubMed, Medline, and International Pharmaceutical Abstracts (IPA). If MTM was in the keyword list, mesh heading, title, or abstract, the article was reviewed. References from these articles were searched to determine whether other relevant articles were available. A total of 405 articles were initially reviewed; however, only 32 articles met the study requirements. Of the 32 articles, 17 surveyed community/outpatient pharmacists, 3 surveyed pharmacy students, 4 surveyed physicians, and 8 surveyed patients. The survey periods varied across the different studies, with the earliest survey conducted in 2004 and the most recent survey conducted in 2009. The surveys were conducted via the telephone, US mail, interoffice mail, e-mails, Internet/Web sites, hand-delivered questionnaires, and focus groups. Despite the identified barriers to the provision of MTM services, pharmacists reportedly found it professionally rewarding to provide these services. Pharmacists claimed to have adequate clinical knowledge, experience, and access to information required to provide MTM services. Pharmacy students were of the opinion that the provision of MTM services was important to the advancement of the pharmacy profession and in providing patients with a higher level of care. Physicians supported having pharmacists adjust patients’ drug therapy and educate patients on general drug information but not in selecting patients’ drug therapy. 
Finally, patients suggested that alternative ways of describing and marketing MTM services need to be explored to make these services appealing to them.
Denvir, Paul M; Cardone, Katie E; Parker, Wendy M; Cerulli, Jennifer
2018-02-01
Medication therapy management (MTM) is a comprehensive, patient-centered approach to improving medication use, reducing the risk of adverse events and improving medication adherence. Given the service delivery model and required outputs of MTM services, communication skills are of utmost importance. The objectives of this study were to identify and describe communication principles and instructional practices to enhance MTM training. Drawing on formative assessment data from interviews of both pharmacy educators and alumni, this article identifies and describes communication principles and instructional practices that pharmacy educators can use to enhance MTM training initiatives to develop student communication strategies. Analysis revealed five key communication challenges of MTM service delivery, two communication principles that pharmacy teachers and learners can use to address those challenges, and a range of specific strategies, derived from communication principles, that students can use when challenges emerge. Implications of the analysis for pharmacy educators and researchers are described. Proactive communication training provided during MTM advanced pharmacy practice experiences enabled students to apply the principles and instructional strategies to specific patient interactions during the advanced pharmacy practice experiences and in their post-graduation practice settings. Copyright © 2017 Elsevier Inc. All rights reserved.
Isotopic imprints of mountaintop mining contaminants.
Vengosh, Avner; Lindberg, T Ty; Merola, Brittany R; Ruhl, Laura; Warner, Nathaniel R; White, Alissa; Dwyer, Gary S; Di Giulio, Richard T
2013-09-03
Mountaintop mining (MTM) is the primary procedure for surface coal exploration within the central Appalachian region of the eastern United States, and it is known to contaminate streams in local watersheds. In this study, we measured the chemical and isotopic compositions of water samples from MTM-impacted tributaries and streams in the Mud River watershed in West Virginia. We systematically document the isotopic compositions of three major constituents: sulfur isotopes in sulfate (δ(34)SSO4), carbon isotopes in dissolved inorganic carbon (δ(13)CDIC), and strontium isotopes ((87)Sr/(86)Sr). The data show that δ(34)SSO4, δ(13)CDIC, Sr/Ca, and (87)Sr/(86)Sr measured in saline- and selenium-rich MTM impacted tributaries are distinguishable from those of the surface water upstream of mining impacts. These tracers can therefore be used to delineate and quantify the impact of MTM in watersheds. High Sr/Ca and low (87)Sr/(86)Sr characterize tributaries that originated from active MTM areas, while tributaries from reclaimed MTM areas had low Sr/Ca and high (87)Sr/(86)Sr. Leaching experiments of rocks from the watershed show that pyrite oxidation and carbonate dissolution control the solute chemistry with distinct (87)Sr/(86)Sr ratios characterizing different rock sources. We propose that MTM operations that access the deeper Kanawha Formation generate residual mined rocks in valley fills from which effluents with distinctive (87)Sr/(86)Sr and Sr/Ca imprints affect the quality of the Appalachian watersheds.
Millonig, Marsha K
2009-01-01
To convene a diverse group of stakeholders to discuss medication therapy management (MTM) documentation and billing standardization and its interoperability within the health care system. More than 70 stakeholders from pharmacy, health information systems, insurers/payers, quality, and standard-setting organizations met on October 7-8, 2008, in Bethesda, MD. The American Pharmacists Association (APhA) organized the invitational conference to facilitate discussion on strategic directions for meeting current market need for MTM documentation and billing interoperability and future market needs for MTM integration into electronic health records (EHRs). APhA recently adopted policy that specifically addresses technology barriers and encourages the use and development of standardized systems for the documentation and billing of MTM services. Day 1 of the conference featured six foundational presentations on health information technology (HIT) trends, perspectives on MTM from the profession and the Centers for Medicare & Medicaid Services, health care quality and medication-related outcome measures, integrating MTM workflow in EHRs, and the current state of MTM operationalization in practice. After hearing presentations on day 1 and having the opportunity to pose questions to each speaker, conference participants were divided into three breakout groups on day 2. Each group met three times for 60 minutes each and discussed five questions from the perspective of a patient, provider, or payer. Three facilitators met with each of the groups and led discussion from one perspective (i.e., patient, provider, payer). Participants then reconvened as a complete group to participate in a discussion on next steps. HIT is expected to assist in delivering safe, effective, efficient, coordinated care as health professionals strive to improve the quality of care and outcomes for individual patients.
The pharmacy profession is actively contributing to quality patient care through MTM services focused on identifying and preventing medication-related problems, improving medication use, and optimizing individual therapeutic outcomes. As MTM programs continue to expand within the health care system, one important limiting factor is the lack of standardization for documentation and billing of MTM services. This lack of interoperability between technology systems, software, and system platforms is presenting as a barrier to MTM service delivery for patients. APhA convened this invitational conference to identify strategic directions to address MTM documentation and billing standardization and interoperability. Participants viewed the meeting as highly successful in bringing together a unique, wide-ranging set of stakeholders, including the government, regulators, standards organizations, other health professions, technology firms, professional organizations, and practitioners, to share perspectives. They strongly encouraged the Association to continue this unique stakeholder dialogue. Participants provided a number of next-step suggestions for APhA to consider as an outcome of the event. Participants noted the pharmacy profession's success in building information technology systems for product transactions with systematic, organized, methodical thinking and the need to apply this success to patient services. A unique opportunity exists for the profession to influence and lead the HIT community in creating a workable health technology solution for MTM services. Reaching consensus on minimum data sets for each functional area--clinical, billing, quality improvement--would be a very important short-term gain. Further, participants said it was imperative for pharmacists and the pharmacy community at large to become actively engaged in HIT standards development efforts.
Trajectory-based morphological operators: a model for efficient image processing.
Jimeno-Morenilla, Antonio; Pujol, Francisco A; Molina-Carmona, Rafael; Sánchez-Romero, José L; Pujol, Mar
2014-01-01
Mathematical morphology has been an area of intensive research over the last few years. Although many remarkable advances have been achieved throughout these years, there is still great interest in accelerating morphological operations in order for them to be implemented in real-time systems. In this work, we present a new model for computing mathematical morphology operations, the so-called morphological trajectory model (MTM), in which a morphological filter is divided into a sequence of basic operations. Then, a trajectory-based morphological operation (such as dilation and erosion) is defined as the set of points resulting from the ordered application of the instant basic operations. The MTM approach allows working with different structuring elements, such as disks. The experiments show that our method is independent of the structuring element size and can be easily applied to industrial systems and high-resolution images.
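For reference, the classical set-theoretic dilation that the MTM decomposes into trajectory steps can be sketched directly: the dilation of a binary image is the union of the structuring element translated to every foreground pixel. The disk construction and pixel coordinates below are illustrative, not from the paper:

```python
def disk(radius):
    """Disk structuring element as a set of (dy, dx) offsets."""
    return {(dy, dx)
            for dy in range(-radius, radius + 1)
            for dx in range(-radius, radius + 1)
            if dy * dy + dx * dx <= radius * radius}

def dilate(foreground, element):
    """Binary dilation: union of the element translated to each
    foreground pixel. foreground is a set of (y, x) coordinates."""
    return {(y + dy, x + dx) for (y, x) in foreground for (dy, dx) in element}

pixels = {(5, 5)}
print(sorted(dilate(pixels, disk(1))))
# a single pixel grows into a radius-1 disk (a 5-point cross)
```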
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bosserman, Mary A.; Downey, Theresa; Noinaj, Nicholas
Baeyer–Villiger monooxygenases (BVMOs) have been shown to play key roles in the biosynthesis of important natural products. MtmOIV, a homodimeric FAD- and NADPH-dependent BVMO, catalyzes the key frame-modifying steps of the mithramycin biosynthetic pathway, including an oxidative C–C bond cleavage, by converting its natural substrate premithramycin B into mithramycin DK, the immediate precursor of mithramycin. The drastically improved protein structure of MtmOIV along with the high-resolution structure of MtmOIV in complex with its natural substrate premithramycin B are reported here, revealing previously undetected key residues that are important for substrate recognition and catalysis. Kinetic analyses of selected mutants allowed us to probe the substrate binding pocket of MtmOIV and also to discover the putative NADPH binding site. This is the first substrate-bound structure of MtmOIV, providing new insights into substrate recognition and catalysis, which paves the way for the future design of a tailored enzyme for the chemo-enzymatic preparation of novel mithramycin analogues.
Pilot study: incorporation of pharmacogenetic testing in medication therapy management services.
Haga, Susanne B; Allen LaPointe, Nancy M; Moaddeb, Jivan; Mills, Rachel; Patel, Mahesh; Kraus, William E
2014-11-01
Aim: To describe the rationale and design of a pilot study evaluating the integration of pharmacogenetic (PGx) testing into pharmacist-delivered medication therapy management (MTM). Study rationale: Clinical delivery approaches of PGx testing involving pharmacists may overcome barriers of limited physician knowledge about and experience with testing. Study design: We will assess the addition of PGx testing to MTM services for cardiology patients taking three or more medications including simvastatin or clopidogrel. We will measure the impact of MTM plus PGx testing on drug/dose adjustment and clinical outcomes. Factors associated with delivery, such as the time to prepare and conduct MTM and to consult with physicians, will be recorded. Additionally, patient interest and satisfaction will be measured. Anticipated results: We anticipate that PGx testing can be practically integrated into a standard MTM service, providing a viable delivery model for testing. Conclusion: Given the lack of evidence on effective PGx delivery models, this study will provide preliminary evidence regarding a pharmacist-delivered approach.
McGivney, Melissa Somma; Meyer, Susan M; Duncan-Hewitt, Wendy; Hall, Deanne L; Goode, Jean-Venable R; Smith, Randall B
2007-01-01
To delineate the relationship, including similarities and differences, between medication therapy management (MTM) and contemporary pharmacist-provided services, including patient counseling, disease management, and pharmaceutical care, to facilitate the continued evolution of commonly used language and a standard of practice across geographic areas and practice environments. Incorporation of MTM services into the array of Medicare-funded services affords an opportunity for pharmacists to develop direct patient care services in the community. Defining the role of MTM within the scope of pharmacist-provided patient care activities, including patient counseling, disease management, and all currently provided pharmacy services is essential to the delineation of a viable and sustainable practice model for pharmacists. The definitions of each of these services are offered, as well as comparisons and contrasts of the individual services. In addition to Medicare-eligible patients, MTM services are appropriate for anyone with medication-related needs. MTM is offered as an all-encompassing model that incorporates the philosophy of pharmaceutical care, techniques of patient counseling, and disease management in an environment that facilitates the direct collaboration of patients, pharmacists, and other health professionals. Defining the role of MTM within current patient care models, including patient counseling, disease management, and all currently provided pharmacy services, is essential in delineating a viable and sustainable practice model for pharmacists.
A natural history study of X-linked myotubular myopathy.
Amburgey, Kimberly; Tsuchiya, Etsuko; de Chastonay, Sabine; Glueck, Michael; Alverez, Rachel; Nguyen, Cam-Tu; Rutkowski, Anne; Hornyak, Joseph; Beggs, Alan H; Dowling, James J
2017-09-26
To define the natural history of X-linked myotubular myopathy (MTM). We performed a cross-sectional study that included an online survey (n = 35) and a prospective, 1-year longitudinal investigation using a phone survey (n = 33). We ascertained data from 50 male patients with MTM and performed longitudinal assessments on 33 affected individuals. Consistent with existing knowledge, we found that MTM is a disorder associated with extensive morbidities, including wheelchair (86.7% nonambulant) and ventilator (75% requiring >16 hours of support) dependence. However, unlike previous reports and despite the high burden of disease, mortality was lower than anticipated (approximate rate 10%/y). Seventy-six percent of patients with MTM enrolled (mean age 10 years 11 months) were alive at the end of the study. Nearly all deaths in the study were associated with respiratory failure. In addition, the disease course was more stable than expected, with few adverse events reported during the prospective survey. Few non-muscle-related morbidities were identified, although an unexpectedly high incidence of learning disability (43%) was noted. Conversely, MTM was associated with substantial burdens on patient and caregiver daily living, reflected by missed days of school and lost workdays. MTM is one of the most severe neuromuscular disorders, with affected individuals requiring extensive mechanical interventions for survival. However, among study participants, the disease course was more stable than predicted, with more individuals surviving infancy and early childhood. These data reflect the disease burden of MTM but offer hope in terms of future therapeutic intervention. Copyright © 2017 The Author(s). Published by Wolters Kluwer Health, Inc. on behalf of the American Academy of Neurology.
Schneider, Michael; Haas, Mitchell; Glick, Ronald; Stevans, Joel; Landsittel, Doug
2015-02-15
Randomized controlled trial with follow-up to 6 months. This was a comparative effectiveness trial of manual-thrust manipulation (MTM) versus mechanical-assisted manipulation (MAM), and of manipulation versus usual medical care (UMC). Low back pain (LBP) is one of the most common conditions seen in primary care and physical medicine practice. MTM is a common treatment for LBP. Claims that MAM is an effective alternative to MTM have yet to be substantiated. There is also question about the effectiveness of manipulation in acute and subacute LBP compared with UMC. A total of 107 adults with onset of LBP within the past 12 weeks were randomized to 1 of 3 treatment groups: MTM, MAM, or UMC. Outcome measures included the Oswestry LBP Disability Index (0-100 scale) and numeric pain rating (0-10 scale). Participants in the manipulation groups were treated twice weekly for 4 weeks; subjects in UMC were seen for 3 visits during this time. Outcome measures were captured at baseline, 4 weeks, 3 months, and 6 months. Linear regression showed a statistically significant advantage of MTM at 4 weeks compared with MAM (disability = -8.1, P = 0.009; pain = -1.4, P = 0.002) and UMC (disability = -6.5, P = 0.032; pain = -1.7, P < 0.001). Responder analysis, defined as 30% and 50% reductions in Oswestry LBP Disability Index scores, revealed a significantly greater proportion of responders at 4 weeks in MTM (76%; 50%) compared with MAM (50%; 16%) and UMC (48%; 39%). Similar between-group results were found for pain: MTM (94%; 76%); MAM (69%; 47%); and UMC (56%; 41%). No statistically significant group differences were found between MAM and UMC, or for any comparison at 3 or 6 months. MTM provides greater short-term reductions in self-reported disability and pain scores compared with UMC or MAM. Level of Evidence: 2.
Lawlor, Michael W.; Viola, Marissa G.; Meng, Hui; Edelstein, Rachel V.; Liu, Fujun; Yan, Ke; Luna, Elizabeth J.; Lerch-Gaggl, Alexandra; Hoffmann, Raymond G.; Pierson, Christopher R.; Buj-Bello, Anna; Lachey, Jennifer L.; Pearsall, Scott; Yang, Lin; Hillard, Cecilia J.; Beggs, Alan H.
2015-01-01
X-linked myotubular myopathy is a congenital myopathy caused by deficiency of myotubularin. Patients often present with severe perinatal weakness, requiring mechanical ventilation to prevent death from respiratory failure. We recently reported that an activin receptor type IIB inhibitor produced hypertrophy of type 2b myofibers and modest increases of strength and life span in the severely myopathic Mtm1δ4 mouse model of X-linked myotubular myopathy. We have now performed a similar study in the less severely symptomatic Mtm1 p.R69C mouse in hopes of finding greater treatment efficacy. Activin receptor type IIB inhibitor treatment of Mtm1 p.R69C animals produced behavioral and histological evidence of hypertrophy in gastrocnemius muscles but not in quadriceps or triceps. The ability of the muscles to respond to activin receptor type IIB inhibitor treatment correlated with treatment-induced increases in satellite cell number and several muscle-specific abnormalities of hypertrophic signaling. Treatment-responsive Mtm1 p.R69C gastrocnemius muscles displayed lower levels of phosphorylated ribosomal protein S6 and higher levels of phosphorylated eukaryotic elongation factor 2 kinase than were observed in Mtm1 p.R69C quadriceps muscle or in muscles from wild-type littermates. Hypertrophy in the Mtm1 p.R69C gastrocnemius muscle was associated with increased levels of phosphorylated ribosomal protein S6. Our findings indicate that muscle-, fiber type-, and mutation-specific factors affect the response to hypertrophic therapies that will be important to assess in future therapeutic trials. PMID:24726641
NASA Astrophysics Data System (ADS)
Martin Zurdo, M. J.
2012-07-01
BepiColombo is a space mission to Mercury (ESA in cooperation with the Japan Aerospace Exploration Agency). The spacecraft consists of three structures: two orbiters responsible for the scientific mission (MPO and MMO) and one service module, the Mercury Transfer Module (MTM), which provides propulsion and services during the journey to Mercury. For the MTM structure, the companies involved are ASTRIUM GERMANY, acting as the prime contractor, and ASTRIUM UK, acting as the co-prime contractor. EADS CASA Espacio (ECE) in Spain is the company responsible for the final design, manufacturing, and qualification of the MTM structure. The test campaign specimen is the MTM core structure, which corresponds to the central cone with the structure floors, shear panels, and tank support structure. This test campaign qualifies the primary load path and its primary interfaces; the rest of the MTM structure is qualified by a system-level vibration test. To qualify the MTM structure, three kinds of qualification tests were performed: a stiffness test, a global strength test, and local tests in specific areas. The most relevant test of the campaign is the global strength test case, in which several external loads are introduced at different interfaces, simulating the load introduction for a selected critical flight case. Two items are important in the qualification test campaign: 1. The instrumentation of the structure, with two main functions: to control the specimen under test loads, and to demonstrate the qualification of the structure. 2. The set-up structure, designed by ECE to allow the correct load introduction in each test case during the whole test campaign. This paper describes the MTM structure test campaign, from the definition of the loads applied in each test to the qualification of the complete structure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dahl, N.; Mandel, J.L.; Chery, M.
1995-05-01
A young girl with a clinically moderate form of myotubular myopathy was found to carry a cytogenetically detectable deletion in Xq27-q28. The deletion had occurred de novo on the paternal X chromosome. It encompasses the fragile X (FRAXA) and Hunter syndrome (IDS) loci, and the DXS304 and DXS455 markers, in Xq27.3 and proximal Xq28. Other loci from the proximal half of Xq28 (DXS49, DXS256, DXS258, DXS305, and DXS497) were found intact. As the X-linked myotubular myopathy locus (MTM1) was previously mapped to Xq28 by linkage analysis, the present observation suggested that MTM1 is included in the deletion. However, a significant clinical phenotype is unexpected in a female MTM1 carrier. Analysis of inactive X-specific methylation at the androgen receptor gene showed that the deleted X chromosome was active in approximately 80% of leukocytes. Such unbalanced inactivation may account for the moderate MTM1 phenotype and for the mental retardation that later developed in the patient. This observation is discussed in relation to the hypothesis that a locus modulating X inactivation may lie in the region. Comparison of this deletion with that carried by a male patient with a severe Hunter syndrome phenotype but no myotubular myopathy, in light of recent linkage data on recombinant MTM1 families, led to a considerable refinement of the position of the MTM1 locus, to a region of approximately 600 kb, between DXS304 and DXS497. 46 refs., 4 figs.
Community pharmacy-based medication therapy management services: financial impact for patients.
Dodson, Sarah E; Ruisinger, Janelle F; Howard, Patricia A; Hare, Sarah E; Barnes, Brian J
2012-07-01
To determine the direct financial impact for patients resulting from medication therapy management (MTM) interventions made by community pharmacists. Secondary objectives included evaluating the patient and physician acceptance rates of the community pharmacists' recommended MTM interventions. This was a retrospective observational study conducted at 20 Price Chopper and Hen House grocery store chain pharmacies in the Kansas City metro area from January 1, 2010, to December 31, 2010. Study patients were Medicare Part D beneficiaries eligible for MTM services. The primary outcome was the change in patient out-of-pocket prescription medication expense as a result of MTM services. Of 128 patients included in this study, 68% experienced no out-of-pocket financial impact on their medication expenses as a result of MTM services. A total of 27% of the patients realized cost savings ($440.50 per year, SD = $289.69), while another 5% of patients saw a cost increase in out-of-pocket expense ($255.66 per year, SD = $324.48). The net financial impact for all 128 patients who participated in MTM services was an average savings of $102.83 per patient per year (SD = $269.18; p < 0.0001). Pharmacists attempted a total of 732 recommendations; 391 (53%) were accepted by both the patient and their prescriber. A total of 341 (47%) recommendations were not accepted because of patient refusal (290, 85%) or prescriber refusal (51, 15%). Patient participation in MTM services reduces patient out-of-pocket medication expense. However, this savings is driven by only the 32% of subjects who experienced a financial impact on out-of-pocket medication expense. Additionally, the majority of the pharmacists' recommended interventions (53%) were accepted by patients and prescribers.
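As a quick sanity check, the reported average savings can be approximately reconstructed from the subgroup proportions and means stated in the abstract. The short sketch below is illustrative only (not part of the original study); the small discrepancy from the published $102.83 reflects rounding of the percentages:

```python
# Approximate reconstruction of the reported net per-patient savings,
# using the rounded proportions and subgroup means from the abstract.
share_savings = 0.27     # 27% of the 128 patients realized cost savings
mean_savings = 440.50    # mean annual savings in that subgroup (USD)
share_increase = 0.05    # 5% of patients saw a cost increase
mean_increase = 255.66   # mean annual increase in that subgroup (USD)

net_per_patient = share_savings * mean_savings - share_increase * mean_increase
print(round(net_per_patient, 2))  # roughly $106, close to the reported $102.83
```

The residual difference (about $3 per patient) is consistent with the rounded percentages: the exact patient counts behind "27%" and "5%" of 128 are not given in the abstract.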
McDonough, Randal P; Harthan, Aaron A; McLeese, Kelly E; Doucette, William R
2010-01-01
To determine the net financial gain or loss for medication therapy management (MTM) services provided to patients by an independent community pharmacy during 16 months of operation. Retrospective study. Independent community pharmacy in Iowa City, IA, from September 1, 2006, to December 31, 2007. Patients receiving MTM services during the specified period who had proper documentation of reimbursement for the services. MTM services were provided to the patient and documented by the pharmacist or student pharmacist. Net financial gains or losses for providing MTM services. Sensitivity analyses included costs that might be incurred under various conditions of operation. 103 initial and 88 follow-up MTM visits were conducted during a 16-month time period. The total cost for these services to the pharmacy was $11,191.72. Total revenue from these services was $11,195.00; therefore, the pharmacy experienced a net financial gain of $3.28. Sensitivity analyses were conducted, revealing the net gain/loss to the pharmacy if a student pharmacist was used and the net gain/loss if the pharmacist needed extra training to provide the services. Using a student pharmacist resulted in a net gain of $6,308.48, while extra training for the pharmacist resulted in a net loss of $1,602.72. The MTM service programs showed a positive financial gain after 16 months of operation, which should encourage pharmacists to incorporate these services into their practice.
Chang, Alex R; Evans, Michael; Yule, Christina; Bohn, Larissa; Young, Amanda; Lewis, Meredith; Graboski, Elisabeth; Gerdy, Bethany; Ehmann, William; Brady, Jonathan; Lawrence, Leah; Antunes, Natacha; Green, Jamie; Snyder, Susan; Kirchner, H Lester; Grams, Morgan; Perkins, Robert
2016-11-08
Measurement of albuminuria to stratify risk in chronic kidney disease (CKD) is not done universally in the primary care setting despite recommendation in KDIGO (Kidney Disease Improving Global Outcomes) guidelines. Pharmacist medication therapy management (MTM) may be helpful in improving CKD risk stratification and management. We conducted a pragmatic, cluster-randomized trial using seven primary care clinic sites in the Geisinger Health System to evaluate the feasibility of pharmacist MTM in patients with estimated glomerular filtration rate (eGFR) 45-59 ml/min/1.73 m² and uncontrolled blood pressure (≥150/85 mmHg). In the three pharmacist MTM sites, pharmacists were instructed to follow a protocol aimed to improve adherence to KDIGO guidelines on testing for proteinuria and lipids, and statin and blood pressure medical therapy. In the four control clinics, patients received usual care. The primary outcome was proteinuria screening over a follow-up of 1 year. A telephone survey was administered to physicians, pharmacists, and patients in the pharmacist MTM arm at the end of the trial. Baseline characteristics were similar between pharmacist MTM (n = 24) and control (n = 23) patients, although pharmacist MTM patients tended to be younger (64 vs. 71 y; p = 0.06) and less likely to have diabetes (17% vs. 35%; p = 0.2) or baseline proteinuria screening (41.7% vs. 60.9%, p = 0.2). Mean eGFR was 54 ml/min/1.73 m² in both groups. The pharmacist MTM intervention did not significantly improve total proteinuria screening at the population level (OR 2.6, 95% CI: 0.5-14.0; p = 0.3). However, it tended to increase screening of previously unscreened patients (78.6% in the pharmacist MTM group compared to 33.3% in the control group; OR 7.3, 95% CI: 0.96-56.3; p = 0.05). In general, the intervention was well-received by patients, pharmacists, and providers, who agreed that pharmacists could play an important role in CKD management.
A few patients contacted the research team to express anxiety about having a CKD diagnosis without prior knowledge. Pharmacist MTM may be useful in improving risk stratification and management of CKD in the primary care setting, although implementation requires ongoing education, multidisciplinary collaboration, and careful communication regarding the CKD diagnosis. Future studies are needed to establish the effectiveness of pharmacist MTM in slowing CKD progression and improving cardiovascular outcomes. ClinicalTrials.gov NCT02208674; registered August 1, 2014; first patient enrolled September 30, 2014.
Vande Griend, Joseph P; Rodgers, Melissa; Nuffer, Wesley
2017-05-01
Medication therapy management (MTM) delivery is increasingly important in managed care. Successful delivery positively affects patient health and improves Centers for Medicare & Medicaid Services star ratings, a measure of health plan quality. As MTM services continue to grow, there is an increased need for efficient and effective care models. The primary objectives of this project were to describe the delivery of MTM services by fourth-year Advanced Pharmacy Practice Experience (APPE) students in a centralized retail pharmacy system and to evaluate and quantify the clinical and financial contributions of the students. The secondary objective was to describe the engagement needed to complete comprehensive medication reviews (CMRs) and targeted interventions. From May 2015 to December 2015, thirty-five APPE students from the University of Colorado Skaggs School of Pharmacy provided MTM services at Albertsons Companies using the OutcomesMTM and Mirixa platforms. Students delivered patient care services by phone at the central office and provided face-to-face visits at pharmacies in the region. With implementation of the MTM APPE in 2015, the team consisted of 2 MTM pharmacists and pharmacy students, as compared with 1 MTM pharmacist in 2014. The number of CMRs and targeted interventions completed and the estimated additional revenue generated during the 2015 time period were compared with those completed from May through December 2014. The patient and provider engagement needed to complete the CMRs and targeted interventions was summarized. 125 CMRs and 1,918 targeted interventions were billed in 2015, compared with 13 CMRs and 767 targeted interventions in 2014. An estimated $16,575-$49,272 of additional revenue was generated in 2015. To complete the interventions in 2015, the team engaged in 1,714 CMR opportunities and 4,686 targeted intervention opportunities. 
In this MTM rotation, students provided real-life care to patients, resulting in financial and clinical contributions. This model of education and care delivery can be replicated in the community pharmacy or managed care setting. APPE students are an important component of this model of care delivery, particularly when considering the level of patient engagement needed to complete MTM interventions. No outside funding supported this research. The authors have no conflicts of interest to disclose related to this work. All authors contributed to study concept and design. Rodgers collected the data, and data interpretation was performed by Vande Griend, along with Rodgers and Nuffer. The manuscript was written and revised primarily by Vande Griend, along with Nuffer and Rodgers. This project was presented at the Pharmacy Quality Alliance Annual Meeting in Arlington, Virginia, in May 2016.
Minnesota Department of Human Services audit of medication therapy management programs.
Smith, Stephanie; Cell, Penny; Anderson, Lowell; Larson, Tom
2013-01-01
To inform medication therapy management (MTM) providers of findings of the Minnesota Department of Human Services review of claims submitted to Minnesota Health Care Programs (MHCP) for patients receiving MTM services and to discuss the impact of the audit on widespread MTM services and future audits. A retrospective review was completed on MTM claims submitted to MHCP from 2008 to 2010. The auditor verified that the Current Procedural Terminology codes billed matched the actual number of medications, conditions, and drug therapy problems assessed during an encounter. 190 claims were reviewed for 57 distinct pharmacies that billed for MTM services from 2008 to 2010, representing 4.5% of all claims submitted. The auditor reported that, generally, documentation within the electronic medical record had the least "up-coding" of all documentation systems. A total of 18 claims were coded at a higher level than appropriate, but only 10 notices were sent out to recover money because the others did not meet the minimum $50 threshold. The auditor expressed concern that a number of claims billed at the highest complexity level were only 15 minutes long. Providers will need to be cautious about the conditions that they bill as complex and about how they define drug therapy problems. Everything that is billed must be clearly assessed or justified in the documentation note. The auditor noted that, overall, documentation was well done; however, many MTM providers are now asking how to prepare internally for future audits.
Kochumalayil, Joby J; Morimune, Seira; Nishino, Takashi; Ikkala, Olli; Walther, Andreas; Berglund, Lars A
2013-11-11
Nacre-mimetic bionanocomposites with high montmorillonite (MTM) clay content, prepared from hydrocolloidal suspensions, suffer from reduced strength and stiffness at high relative humidity. We address this problem by chemical modification of xyloglucan (XG) in XG/MTM nacre-mimetic nanocomposites: the XG is subjected to regioselective periodate oxidation of its side chains, enabling it to form covalent cross-links to hydroxyl groups in neighboring XG chains or to the MTM surface. The resulting materials are analyzed by FTIR spectroscopy, thermogravimetric analysis, carbohydrate analysis, calorimetry, X-ray diffraction, scanning electron microscopy, tensile tests, and oxygen barrier properties. We compare the resulting mechanical properties at low and high relative humidity. The periodate oxidation leads to a strong increase in the modulus and strength of the materials. A modulus of 30 GPa for the cross-linked composite at 50% relative humidity, compared with 13.7 GPa for neat XG/MTM, demonstrates that periodate oxidation of the XG side chains leads to crucially improved stress transfer at the XG/MTM interface, possibly through covalent bond formation. This enhanced interfacial adhesion and internal cross-linking of the matrix moreover preserves the mechanical properties under high-humidity conditions and leads to a Young's modulus of 21 GPa at 90% RH.
School of pharmacy-based medication therapy management program: development and initial experience.
Lam, Annie; Odegard, Peggy Soule; Gardner, Jacqueline
2012-01-01
To describe a school of pharmacy-community pharmacy collaborative model for medication therapy management (MTM) service and training. University of Washington (UW) School of Pharmacy (Seattle), from July to December 2008. MTM services and training. A campus-based MTM pharmacy was established for teaching, practice, and collaboration with community pharmacies to provide comprehensive medication reviews (CMRs) and MTM training. Number of collaborating pharmacies, number of patients contacted, number of CMRs conducted, and estimated cost avoidance (ECA). UW Pharmacy Cares was licensed as a Class A pharmacy (nondispensing) and signed "business associate" agreements with six community pharmacies. During July to December 2008, 10 faculty pharmacists completed training and 5 provided CMR services to 17 patients (5 telephonic and 12 face-to-face interviews). A total of 67 claims (17 CMRs and 50 CMR-generated claims) were submitted for reimbursement of $1,642 ($96.58/CMR case). Total ECA was $54,250, averaging $3,191.19 per patient. Seven student pharmacists gained CMR interview training. Interest in collaboration by community pharmacies was lower than expected; however, the campus-community practice model addressed unmet patient care needs, reduced outstanding MTM CMR case loads, increased ECA, and facilitated faculty development and training of student pharmacists.
HIV medication therapy management services in community pharmacies
Kauffman, Yardlee; Nair, Vidya; Herist, Keith; Thomas, Vasavi; Weidle, Paul J.
2015-01-01
Objectives: To present a rationale and a proposed structure to support pharmacist-delivered medication therapy management (MTM) for human immunodeficiency virus (HIV) disease and to outline challenges to implementing and sustaining the service. Data sources: Professional literature. Summary: Historically, the effect of pharmacy services for HIV-infected persons has been demonstrated in inpatient and clinic-based settings. Developing similar programs adapted for community pharmacists could be a model of care to improve patient adherence to antiretroviral therapy and retention in care. Initiation of antiretroviral therapy and regular monitoring of CD4+ cell count, HIV RNA viral load, adverse drug events, and adherence form the backbone of successful medical management of HIV infection. Support for these services can be provided to HIV-infected patients through pharmacist-managed HIV MTM programs in community pharmacy settings in collaboration with primary providers and other health care professionals. Conclusion: Community pharmacists can help meet the growing need for HIV care through provision of MTM services. Although resources have been developed, including the general MTM framework, challenges of adequate training, education, and support of community pharmacists need to be addressed in order for HIV MTM to be a successful model. PMID:23229993
Integrating home-based medication therapy management (MTM) services in a health system.
Reidt, Shannon; Holtan, Haley; Stender, Jennifer; Salvatore, Toni; Thompson, Bruce
2016-01-01
To describe the integration of home-based Medication Therapy Management (MTM) into the ambulatory care infrastructure of a large urban health system and to discuss the outcomes of this service. Minnesota from September 2012 to December 2013. The health system has more than 50 primary care and specialty clinics. Eighteen credentialed MTM pharmacists are located in 16 different primary care and specialty settings, with the greatest number of pharmacists providing services in the internal medicine clinic. Home-based MTM was promoted throughout the clinics within the health system. Physicians, advanced practice providers, nurses, and pharmacists could refer patients to receive MTM in their homes. A home visit had the components of a clinic-based visit and was documented in the electronic health record (EHR); however, providing the service in the home allowed for a more direct assessment of environmental factors affecting medication use. Number of home MTM referrals, reason for referral and type of referring provider, number and type of medication-related problems (MRPs). In the first 15 months, 74 home visits were provided to 53 patients. Sixty-six percent of the patients were referred from the Internal Medicine Clinic. Referrals were also received from the senior care, coordinated care, and psychiatry clinics. Approximately 50% of referrals were made by physicians. More referrals (23%) were made by pharmacists compared with advanced practice providers, who made 21% of referrals. The top 3 reasons for referral were: nonadherence, transportation barriers, and the need for medication reconciliation with a home care nurse. Patients had a median of 3 MRPs with the most common (40%) MRP related to compliance. Home-based MTM is feasibly delivered within the ambulatory care infrastructure of a health system with sufficient provider engagement as demonstrated by referrals to the service. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Odeniyi, Michael Ayodele; Oyedokun, Babatunde Mukhtar; Bamiro, Oluyemisi Adebowale
2017-01-01
Hydrophilic polymers provide a means of sustaining drug delivery. Native gums may be limited in function, but modification may improve their activity. The aim of the study was to evaluate native and modified forms of Terminalia mantaly gum for their sustained-release and bioadhesive properties. The native gum (NTM) was modified by microwave irradiation for 20 seconds (MTM20) and 60 seconds (MTM60) and characterized using microscopy, Fourier transform infrared spectroscopy (FTIR), and packing properties. The effects of the thermally induced molecular reorientation were determined. Tablet formulations of naproxen were produced by direct compression. The mechanical, bioadhesive, and release properties of the formulations were determined. Irradiation of NTM improved the gum's flow properties, resulting in Carr's index and Hausner's ratio values lower than 16% and 1.25, respectively. Swelling studies showed that MTM20 and MTM60 had lower water absorption capacity and swelling index values, while packing properties improved upon irradiation, as depicted by lower tapped density values. FTIR spectra of samples showed that the irradiated gums were distinct from the native gum and did not interact with naproxen sodium. The gum's mechanical properties improved with MTM20 and MTM60, and sustained-release action of up to 12 h was obtained. Inclusion of hydroxypropyl methylcellulose (HPMC) in the tablet formulations proved critical for bioadhesion. Microwave irradiation of native Terminalia mantaly gum improved the flow, mechanical, and sustained-release properties of naproxen tablets, and the addition of HPMC increased bioadhesion properties. The tablet properties of the native gum were significantly improved after 20 s of microwave irradiation.
75 FR 438 - Notice of Invitation-Coal Exploration License Application MTM 99242
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
...] Notice of Invitation--Coal Exploration License Application MTM 99242 AGENCY: Bureau of Land Management... Energy Company in a program for the exploration of coal deposits owned by the United States of America in... section [[Page 439
Sharma, Manoj; Khubchandani, Jagdish; Nahar, Vinayak K
2017-01-01
Background: Smoking continues to be a public health problem worldwide. Smoking and tobacco use are associated with cardiovascular diseases that include coronary heart disease, atherosclerosis, cerebrovascular disease, and abdominal aortic aneurysm. Programs for quitting smoking have played a significant role in reduction of smoking in the United States. The smoking cessation interventions include counseling, nicotine replacement therapy, buproprion therapy, and varenicline therapy. The success rates with each of these approaches vary with clear need for improvement. Moreover, there is a need for a robust theory that can guide smoking cessation counseling interventions and increase the success rates. A fourth generation approach using multi-theory model (MTM) of health behavior change is introduced in this article for smoking cessation. An approach for developing and evaluating an intervention for smoking cessation is presented along with a measurement tool. Methods: A literature review reifying the MTM of health behavior change for smoking cessation has been presented. An instrument designed to measure constructs of MTM and associated smoking cessation behavior has been developed. Results: The instrument developed is available for validation, reliability and prediction study pertaining to smoking cessation. The intervention is available for testing in a randomized control trial involving smokers. Conclusion: MTM is a robust theory that holds promise for testing and application to smoking cessation.
Long-term effects of systemic gene therapy in a canine model of myotubular myopathy.
Elverman, Matthew; Goddard, Melissa A; Mack, David; Snyder, Jessica M; Lawlor, Michael W; Meng, Hui; Beggs, Alan H; Buj-Bello, Ana; Poulard, Karine; Marsh, Anthony P; Grange, Robert W; Kelly, Valerie E; Childers, Martin K
2017-11-01
X-linked myotubular myopathy (XLMTM), a devastating pediatric disease caused by the absence of the protein myotubularin, results from mutations in the MTM1 gene. While there is no cure for XLMTM, we previously reported effects of MTM1 gene therapy using adeno-associated virus (AAV) vector on muscle weakness and pathology in MTM1-mutant dogs. Here, we followed 2 AAV-infused dogs over 4 years. We evaluated gait, strength, respiration, neurological function, muscle pathology, AAV vector copy number (VCN), and transgene expression. Four years following AAV-mediated gene therapy, gait, respiratory performance, neurological function and pathology in AAV-infused XLMTM dogs remained comparable to their healthy littermate controls despite a decline in VCN and muscle strength. AAV-mediated gene transfer of MTM1 in young XLMTM dogs results in long-term expression of myotubularin transgene with normal muscular performance and neurological function in the absence of muscle pathology. These findings support a clinical trial in patients. Muscle Nerve 56: 943-953, 2017. © 2017 Wiley Periodicals, Inc.
Challenges to Integrating Pharmacogenetic Testing into Medication Therapy Management
Allen LaPointe, Nancy M.; Moaddeb, Jivan
2015-01-01
Background: Some have proposed the integration of pharmacogenetic (PGx) testing into medication therapy management (MTM) to enable further refinement of treatment(s) to reduce risk of adverse responses and improve efficacy. PGx testing involves the analysis of genetic variants associated with therapeutic or adverse response and may be useful in enhancing the ability to identify ineffective and/or harmful drugs or drug combinations. This “enhanced” MTM might also reduce patient concerns about side effects and increase confidence that the medication is effective, addressing two key factors that impact patient adherence: concern and necessity. However, the feasibility and effectiveness of integrating PGx testing into MTM in clinical practice have not yet been determined. Objectives: In this paper, we consider some of the challenges to the integration and delivery of PGx testing in MTM services. What is already known about this subject: While the addition of pharmacogenetic testing has been suggested, little literature exists exploring the challenges or feasibility of doing so. PMID:25803768
2014-04-30
15, 2014, 11:15 a.m. – 12:45 p.m. Chair: Ken Mitchell Jr., Director, Research and Analysis, Defense Logistics Agency. Mixture Distributions for... DOC = direct operating cost; MTM/D = million ton-miles per day; Op,i = indicates if airport i is the initial...highest level of modeled strategic airlift demand, required 32.7 million ton-miles per day (MTM/D). MTM/D values for each type of aircraft are
Rodríguez-Rodríguez, Carlos E; Madrigal-León, Karina; Masís-Mora, Mario; Pérez-Villanueva, Marta; Chin-Pampillo, Juan Salvador
2017-01-01
The use of fungal bioaugmentation represents a promising way to improve the performance of biomixtures for the elimination of pesticides. The ligninolytic fungus Trametes versicolor was employed for the removal of three carbamates (aldicarb, ALD; methomyl, MTM; and methiocarb, MTC) in defined liquid medium; in this matrix ALD and MTM showed similar half-lives (14 d), whereas MTC exhibited a faster removal, with a half-life of 6.5 d. The fungus was then employed in the bioaugmentation of an optimized biomixture to remove the aforementioned carbamates plus carbofuran (CFN). Bioaugmented and non-bioaugmented systems removed over 99% of ALD and MTM after 8 d of treatment, although a slight initial delay in the removal was observed in the bioaugmented biomixtures (removal after 3 d: ALD 87%/97%; MTM 86%/99%, in bioaugmented/non-bioaugmented systems). The elimination of the other carbamates was slower, but independent of the presence of the fungus: >98% for MTC after 35 d and >99.5% for CFN after 22 d. Although the bioaugmentation did not improve the removal capacity of the biomixture, it favored a lower production of transformation products at the first stages of the treatment, and in both cases a marked decrease in the toxicity of the matrix was swiftly achieved along the process (from 435-448 TU to values <1 TU in 16 d). Copyright © 2016 Elsevier Inc. All rights reserved.
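The half-lives quoted above imply first-order removal kinetics, so the fraction of pesticide remaining after any time follows directly. A small sketch of that decay arithmetic (illustrative only, not refit to the study's data):

```python
import math

def remaining_fraction(t_days, half_life_days):
    """Fraction of compound left after t days under first-order decay."""
    k = math.log(2) / half_life_days  # first-order rate constant (1/day)
    return math.exp(-k * t_days)

# With the 6.5 d half-life reported for methiocarb (MTC) in liquid medium,
# four half-lives (~26 d) leave 1/16 of the initial amount.
print(round(remaining_fraction(26, 6.5), 4))  # -> 0.0625
```

The same function with a 14 d half-life reproduces the slower disappearance reported for ALD and MTM in the liquid-medium experiments.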
Huet, Alison L; Frail, Caitlin K; Lake, Leslie M; Snyder, Margie E
2015-01-01
To assess the impact of passive and active promotional strategies on patient acceptance of medication therapy management (MTM) services, and to identify reasons for patient acceptance or refusal. Four promotional approaches were developed to offer MTM services to eligible patients: letters and bag stuffers ("passive" approaches), and face-to-face offers and telephone calls ("active" approaches). Thirty pharmacies in a grocery store chain were randomized to one of the four approaches. Patient acceptance rates were compared among the four groups, and between active and passive approaches, using hierarchical logistic regression. Depending on their decision to accept or decline the service, patients were invited to take part in one of two brief telephone surveys. No significant differences in acceptance were identified among the four promotional methods or between active and passive methods. Patients' most frequent reasons for accepting MTM services were potential cost savings, a review of how their medications were working, the expert opinion of the pharmacist, and education about medications. Patients' most frequent reasons for declining MTM services were that they already felt comfortable with their medications and felt their pharmacist provided these services on a regular basis. Further research is warranted to identify strategies for improving patient engagement in MTM services.
The objectives of this poster were 1) to evaluate the impact of MTM/VF on the functional attributes SOD, soil/sediment respiration rate, soil/sediment DEA, and dissolved trace gas concentrations across gradients of mining disturbance and hydrology and 2) to compare these functional attr...
NASA Astrophysics Data System (ADS)
Figueiredo, Cosme Alexandre O. B.; Buriti, Ricardo A.; Paulino, Igo; Meriwether, John W.; Makela, Jonathan J.; Batista, Inez S.; Barros, Diego; Medeiros, Amauri F.
2017-08-01
The midnight temperature maximum (MTM) has been observed in the lower thermosphere by two Fabry-Pérot interferometers (FPIs) at São João do Cariri (7.4° S, 36.5° W) and Cajazeiras (6.9° S, 38.6° W) during 2011, when the solar activity was moderate and the solar flux was between 90 and 155 SFU (1 SFU = 10^-22 W m^-2 Hz^-1). The MTM is studied in detail using measurements of neutral temperature, wind, and airglow relative intensity of OI 630.0 nm (referred to as OI6300), and ionospheric parameters, such as virtual height (h'F), the peak height of the F2 region (hmF2), and critical frequency of the F region (foF2), which were measured by a Digisonde instrument (DPS) at Eusébio (3.9° S, 38.4° W; geomagnetic coordinates 7.31° S, 32.40° E for 2011). The MTM peak was observed during most of the year, except in May, June, and August. The amplitudes of the MTM varied from 64 ± 46 K in April up to 144 ± 48 K in October. The monthly temperature averages showed the MTM peak occurring before midnight, shifted by about 0.25 h in September and up to 2.5 h in December, whereas in February, March, and April the MTM peak occurred around midnight. The International Reference Ionosphere 2012 (IRI-2012) model was compared to the neutral temperature observations and failed to reproduce the MTM peaks. The zonal component of the neutral wind flowed eastward the whole night regardless of the month, and its magnitude was typically within the range of 50 to 150 m s^-1 during the early evening. The meridional component of the neutral wind changed direction over the months: from November to February, the meridional wind in the early evening flowed equatorward with a magnitude between 25 and 100 m s^-1; in contrast, during the winter months, the meridional wind flowed poleward within the range of 0 to -50 m s^-1.
Our results indicate that the reversal (change from equatorward to poleward flow) or abatement of the meridional winds is an important factor in the MTM generation. From February to April and from September to December, the h'F and the hmF2 showed an increase around 18:00-20:00 LT within a range between 300 and 550 km and reached a minimum height of about 200-300 km close to midnight; the layer then rose again by about 40 km or, sometimes, remained at a constant height. During the winter months, the h'F and hmF2 showed a different behavior: the signature of the pre-reversal enhancement did not appear as in other months, and the heights did not exceed 260 and 350 km, respectively. Our observations indicated that the midnight collapse of the F region was a consequence of the MTM signature in the meridional wind, which was reflected in the height of the F region. Lastly, from February to April and from September to December, the OI6300 intensity increased around midnight or 1 h before, which was associated with the MTM, whereas from May to August the relative intensity was strongest in the early evening and decayed during the night.
Lee, Janet S; Yang, Jianing; Stockl, Karen M; Lew, Heidi; Solow, Brian K
2016-01-01
General eligibility criteria used by the Centers for Medicare & Medicaid Services (CMS) to identify patients for medication therapy management (MTM) services include having multiple chronic conditions, taking multiple Part D drugs, and being likely to incur annual drug costs that exceed a predetermined threshold. The performance of these criteria in identifying patients in greatest need of MTM services is unknown. Although there are numerous possible versions of MTM identification algorithms that satisfy these criteria, there are limited data that evaluate the performance of MTM services using eligibility thresholds representative of those used by the majority of Part D sponsors. To (a) evaluate the performance of the 2013 CMS MTM eligibility criteria thresholds in identifying Medicare Advantage Prescription Drug (MAPD) plan patients with at least 2 drug therapy problems (DTPs) relative to alternative criteria threshold levels and (b) identify additional patient risk factors significantly associated with the number of DTPs for consideration as potential future MTM eligibility criteria. All patients in the Medicare Advantage Part D population who had pharmacy eligibility as of December 31, 2013, were included in this retrospective cohort study. Study outcomes included 7 different types of DTPs: use of high-risk medications in the elderly, gaps in medication therapy, medication nonadherence, drug-drug interactions, duplicate therapy, drug-disease interactions, and brand-to-generic conversion opportunities. DTPs were identified for each member based on 6 months of most recent pharmacy claims data and 14 months of most recent medical claims data. Risk factors examined in this study included patient demographics and prior health care utilization in the most recent 6 months. Descriptive statistics were used to summarize patient characteristics and to evaluate unadjusted relationships between the average number of DTPs identified per patient and each risk factor. 
Quartile values identified in the study population for number of diseases, number of drugs, and annual spend were used as potential new criteria thresholds, resulting in 27 new MTM criteria combinations. The performance of each eligibility criterion was evaluated using sensitivity, specificity, positive predictive values (PPVs), and negative predictive values (NPVs). Patients identified with at least 2 DTPs were defined as those who would benefit from MTM services and were used as the gold standard. As part of a sensitivity analysis, patients identified with at least 1 DTP were used as the gold standard. Lastly, a multivariable negative binomial regression model was used to evaluate the relationship between each risk factor and the number of identified DTPs per patient while controlling for the patients' number of drugs, number of chronic diseases, and annual drug spend. A total of 2,578,336 patients were included in the study. The sensitivity, specificity, PPV, and NPV of CMS MTM criteria for the 2013 plan year were 15.3%, 95.6%, 51.3%, and 78.8%, respectively. Sensitivity and PPV improved when the drug count threshold increased from 8 to 10, and when the annual drug cost decreased from $3,144 to $2,239 or less. Results were consistent when at least 1 DTP was used as the gold standard. The adjusted rate of DTPs was significantly greater among patients identified with higher drug and disease counts, annual drug spend, and prior ER or outpatient or hospital visits. Patients with higher median household incomes who were male, younger, or white had significantly lower rates of DTPs. The performance of MTM eligibility criteria can be improved by increasing the threshold values for drug count while decreasing the threshold value for annual drug spend. Furthermore, additional risk factors, such as a recent ER or hospital visit, may be considered as potential MTM eligibility criteria.
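The four screening metrics used to evaluate each eligibility criterion above are the standard confusion-matrix quantities. A minimal sketch of their computation (the counts below are hypothetical, not drawn from the study's 2.5 million-patient dataset, though they are chosen to land near the reported 2013 values):

```python
def screening_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # share of truly MTM-needy patients flagged
        "specificity": tn / (tn + fp),  # share of non-needy patients not flagged
        "ppv": tp / (tp + fp),          # flagged patients who truly need MTM
        "npv": tn / (tn + fn),          # unflagged patients who truly do not
    }

# Hypothetical counts; "needs MTM" = has at least 2 drug therapy problems (DTPs)
m = screening_metrics(tp=150, fp=140, fn=850, tn=3860)
print({k: round(v, 3) for k, v in m.items()})
```

Raising the drug-count threshold shrinks both tp and fp; sensitivity and PPV can still improve together when the flagged group becomes more concentrated in true DTP cases, which is the trade-off the study reports.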
Chen, Xiaodong; Sadineni, Vikram; Maity, Mita; Quan, Yong; Enterline, Matthew; Mantri, Rao V
2015-12-01
Lyophilization is an approach commonly undertaken to formulate drugs that are too unstable to be commercialized as ready-to-use (RTU) solutions. One of the important aspects of commercializing a lyophilized product is transferring the process parameters developed in a lab-scale lyophilizer to commercial scale without a loss in product quality. This is often accomplished by costly engineering runs or through an iterative process at the commercial scale. Here, we highlight a combination of computational and experimental approaches to predict commercial process parameters for the primary drying phase of lyophilization. Heat and mass transfer coefficients are determined experimentally, either by manometric temperature measurement (MTM) or by sublimation tests, and used as inputs for the finite element model (FEM)-based software PASSAGE, which computes primary drying parameters such as primary drying time and product temperature. Because the heat and mass transfer coefficients vary between lyophilization scales, we present an approach for applying appropriate scaling factors when moving from lab scale to commercial scale, so that commercial-scale primary drying time can be predicted from these parameters. Additionally, the model-based approach presented in this study provides a process to monitor pharmaceutical product robustness and accidental process deviations during lyophilization to support commercial supply-chain continuity. The approach presented here provides a robust lyophilization scale-up strategy; because of its simple and minimalistic nature, it is also a less capital-intensive path with minimal use of expensive drug substance/active material.
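A back-of-the-envelope version of the primary-drying estimate can be sketched from the standard vial heat balance, in which the shelf-to-vial heat flow supplies the enthalpy of ice sublimation. This is not the paper's FEM model; every parameter value below is an illustrative assumption:

```python
# Steady-state primary drying estimate: shelf heat input drives ice sublimation.
DH_SUB = 2840.0  # J/g, approximate heat of sublimation of ice

def drying_time_hours(ice_mass_g, kv, area_cm2, t_shelf_c, t_product_c):
    """Estimate primary drying time from the vial heat-transfer coefficient Kv.

    Q = Kv * Av * (Tshelf - Tproduct) gives the heat flow into the vial (W);
    dividing the total sublimation enthalpy of the ice load by Q gives time.
    """
    q_watts = kv * area_cm2 * (t_shelf_c - t_product_c)  # Kv in W/(cm^2*K)
    return ice_mass_g * DH_SUB / q_watts / 3600.0

# Illustrative inputs: 3 g ice, Kv = 4e-4 W/(cm^2*K), 3.8 cm^2 vial base,
# shelf at -5 C, product at -35 C -> roughly two days of primary drying
print(round(drying_time_hours(3.0, 4e-4, 3.8, -5.0, -35.0), 1))
```

The point of MTM or sublimation tests in the abstract is precisely to supply a measured Kv (and a dried-layer resistance, omitted in this simplified heat-only sketch) at each scale, since Kv differs between lab and commercial dryers.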
77 FR 71822 - Notice of Invitation-Coal Exploration License Application MTM 103852, MT
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-04
...] Notice of Invitation--Coal Exploration License Application MTM 103852, MT AGENCY: Bureau of Land... Ambre Energy on a pro rata cost sharing basis in a program for the exploration of coal deposits owned by... program is to gain additional geologic knowledge of the coal underlying the exploration area for the...
Peiretti, Pier Giorgio; Gai, Francesco; Ortoffi, Marco; Aigotti, Riccardo; Medana, Claudio
2012-01-01
The effects of three concentrations (0.2%, 1% and 3%) of rosemary oil (RO) on the freshness indicators, oxidative stability, fatty acid (FA) and biogenic amine (BA) contents of minced rainbow trout muscle (MTM) were investigated after different periods of storage (three and nine days) at 4 ± 1 °C. Moreover, the terpene and sesquiterpene contents in the treated MTM were also measured. RO treatment improved the pH, the oxidative stability of the lipids and the FA profile, which resulted in a significant extension of MTM shelf-life. Storage time influenced all freshness indicators, with the exception of yellowness and chroma. Treatment with RO had a positive effect, leading to low BA content, especially putrescine, cadaverine, tyramine and histamine. Differences in BA content were also found to be due to storage time, with the exception of spermidine, which was not influenced by time. Moreover, the presence of the terpenoid fraction of RO in MTM improved the quality of this ready-to-cook fish food. PMID:28239089
Donating money is not the only way to sustain cooperation in public goods game
NASA Astrophysics Data System (ADS)
Chen, Tong; Wu, Zheng-Hong; Wang, Le
Most previous studies of cooperation in social public goods games are based mainly on donating money. In daily life, owing to a lack of income, some people prefer to donate time instead of money to support an activity. Motivated by this fact, we investigate the influence of encouraging time donation on the evolution of cooperation, based on the example of a village opera. In our study, we set up two models: one is a money-only model (MOM), in which donating money is the only choice; the other is a money-time model (MTM), in which donating time is an alternative to donating money. Through numerical simulations, we find that, compared to MOM, MTM reaches cooperation equilibrium faster and sustains the same cooperation level at lower cost, without the effects of income, reputation, satisfaction, emotion and maximum nonmonetary input. However, it should be noted that MTM outperforms MOM only in a moderate interval of the general budget V. Our results provide stark evidence that encouraging the donation of time can promote and sustain cooperation better than donating money alone.
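As a rough illustration of the MOM/MTM distinction (not the authors' actual simulation; the synergy factor, contribution size, and the assumption that a time donation adds the same value to the pool at zero monetary cost are all invented here), a single round of a public goods game can be computed under both models:

```python
# One illustrative round of a public goods game under the two donation models.
# r is the synergy factor, c the monetary contribution; a time donation is
# assumed to add the same value c to the pool while costing the donor no money.

def round_payoffs(n_money, n_time, n_defect, r=3.0, c=1.0):
    """Return (public-good share per player, total monetary cost of the group)."""
    n = n_money + n_time + n_defect
    pool = c * (n_money + n_time)   # time donations valued like money donations
    share = r * pool / n            # multiplied pool redistributed equally
    money_cost = c * n_money        # only money donors spend cash
    return share, money_cost

# MOM: 6 money donors, 4 defectors; MTM: 3 money + 3 time donors, 4 defectors
mom = round_payoffs(n_money=6, n_time=0, n_defect=4)
mtm = round_payoffs(n_money=3, n_time=3, n_defect=4)
print(mom, mtm)  # same per-player share, but MTM needs half the cash
```

Under these toy assumptions the two groups produce an identical public good while the MTM group spends half the money, which mirrors the paper's "cost advantage to sustain the same cooperation level".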
Potential cost savings of medication therapy management in safety-net clinics.
Truong, Hoai-An; Groves, C Nicole; Congdon, Heather B; Dang, Diem-Thanh Tanya; Botchway, Rosemary; Thomas, Jennifer
2015-01-01
To evaluate potential cost savings based on estimated cost avoidance from medication therapy management (MTM) services delivered in safety-net clinics over 4 years. High-risk patients taking multiple medications and with chronic conditions were referred for MTM services in primary care safety-net clinics in Maryland from October 1, 2009, to September 30, 2013. Medication-related problems (MRPs) were identified and pharmacists' costs determined to evaluate the estimated cost savings and return on investment (ROI). A range of potential economic outcomes for each MRP identified was assigned to a cost avoidance for outpatient visit, urgent care visit, emergency department visit, and/or hospitalization. Over 4 years, 246 patients received MTM, nearly 2,100 medications were reviewed, and 814 MRPs were identified. The most common MRPs identified were subtherapeutic doses, nonadherence, and untreated indications, with respective prevalences of 38%, 19%, and 16%. The corresponding costs of medical services were estimated at $115,220-$614,570 for all MRPs identified, yielding a mean of $141.55-$755.00 per identified MRP. Pharmacists' expenses for encounters were calculated at a total expenditure of $57,307.50 for 16,965 minutes. ROI based on the time spent during billable face-to-face encounters ranged from 1:5 to 1:25. Pharmacist-provided MTM in safety-net clinics yielded potential economic benefits to the organization. The Primary Care Coalition of Montgomery County plans to expand MTM services to additional clinics to improve patient care and increase cost savings through preventable medical services.
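The per-MRP figures above follow directly from the reported totals, and the check is plain division. (The 1:5 to 1:25 ROI range itself cannot be reproduced here, because it was based only on billable face-to-face time, which the abstract does not break out.)

```python
# Reproduce the per-MRP cost-avoidance means from the reported study totals.
mrps = 814
low_total, high_total = 115_220, 614_570  # estimated cost avoidance range ($)
pharmacist_cost = 57_307.50               # total pharmacist expense ($)
minutes = 16_965                          # total encounter time (minutes)

low_per_mrp = low_total / mrps    # -> 141.55
high_per_mrp = high_total / mrps  # -> 755.00
cost_per_minute = pharmacist_cost / minutes
print(round(low_per_mrp, 2), round(high_per_mrp, 2), round(cost_per_minute, 2))
```

Dividing either end of the cost-avoidance range by the 814 identified MRPs reproduces the abstract's $141.55-$755.00 per-MRP mean exactly.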
Geologic mapping of Argyre Planitia
NASA Technical Reports Server (NTRS)
Gorsline, Donn S.; Parker, Timothy J.
1995-01-01
This report describes the results from the geologic mapping of the central and southern Argyre basin of Mars. At the Mars Geologic Mapper's Meeting in Flagstaff during July, 1993, Dave Scott (United States Geological Survey, Mars Geologic Mapping Steering Committee Chair) recommended that all four quadrangles be combined into a single 1:1,000,000 scale map for publication. It was agreed that this would be cost-effective and that the decrease in scale would not compromise the original science goals of the mapping. Tim Parker completed mapping on the 1:500,000 scale base maps, for which all the necessary materials had already been produced, and included the work as a chapter in his dissertation, which was completed in the fall of 1994. Geologic mapping of the two southernmost quadrangles (MTM -55036 and MTM -55043; MTM=Mars Transverse Mercator) was completed as planned during the first year of work. These maps and a detailed draft of the map text were given a preliminary review by Dave Scott during summer, 1993. Geologic mapping of the remaining two quadrangles (MTM -50036 and MTM -50043) was completed by summer, 1994. Results were described at the Mars Geologic Mappers Meeting, held in Pocatello, Idaho, during July, 1994. Funds for the third and final year of the project have been transferred to the Jet Propulsion Laboratory, where Tim Parker will revise and finalize all maps and map text for publication by the United States Geological Survey at the 1:1,000,000 map scale.
Condensation Heat Transfer of Steam on a Single Horizontal Tube.
1983-06-01
Approved for public release; distribution unlimited. ...steam side data were taken at atmospheric pressure to test the data acquisition/reduction computer programs.
78 FR 76319 - Notice of Invitation-Coal Exploration License Application MTM 106757, Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-17
...] Notice of Invitation--Coal Exploration License Application MTM 106757, Montana AGENCY: Bureau of Land... Signal Peak Energy, LLC on a pro rata cost sharing basis in a program for the exploration of coal... Office coal Web site at http://www.blm.gov/mt/st/en/prog/energy/coal.html . A written notice to...
76 FR 31976 - Notice of Invitation-Coal Exploration License Application MTM 101688
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-02
...] Notice of Invitation--Coal Exploration License Application MTM 101688 AGENCY: Bureau of Land Management... Creek Coal Company on a pro rata cost sharing basis in a program for the exploration of coal deposits... notice to both the Bureau of Land Management (BLM) and the Spring Creek Coal Company as provided in the...
Chronic Cardiovascular Disease Mortality in Mountaintop Mining Areas of Central Appalachian States
ERIC Educational Resources Information Center
Esch, Laura; Hendryx, Michael
2011-01-01
Purpose: To determine if chronic cardiovascular disease (CVD) mortality rates are higher among residents of mountaintop mining (MTM) areas compared to mining and nonmining areas, and to examine the association between greater levels of MTM surface mining and CVD mortality. Methods: Age-adjusted chronic CVD mortality rates from 1999 to 2006 for…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-07
...; MTM-99236] Notice of Public Meeting; Proposed Alluvial Valley Floor Coal Exchange Public Interest... (SMCRA) of 1977. This exchange (serial number MTM-99236) has been proposed by Jay Nance, Brett A... Director. [FR Doc. 2010-25060 Filed 10-6-10; 8:45 am] BILLING CODE 4310-DN-P ...
Lawlor, Michael W.; Read, Benjamin P.; Edelstein, Rachel; Yang, Nicole; Pierson, Christopher R.; Stein, Matthew J.; Wermer-Colan, Ariana; Buj-Bello, Anna; Lachey, Jennifer L.; Seehra, Jasbir S.; Beggs, Alan H.
2011-01-01
X-linked myotubular myopathy (XLMTM) is a congenital disorder caused by deficiency of the lipid phosphatase, myotubularin. Patients with XLMTM often have severe perinatal weakness that requires mechanical ventilation to prevent death from respiratory failure. Muscle biopsy specimens from patients with XLMTM exhibit small myofibers with central nuclei and central aggregations of organelles in many cells. It was postulated that therapeutically increasing muscle fiber size would cause symptomatic improvement in myotubularin deficiency. Recent studies have elucidated an important role for the activin-receptor type IIB (ActRIIB) in regulation of muscle growth and have demonstrated that ActRIIB inhibition results in significant muscle hypertrophy. To evaluate whether promoting muscle hypertrophy can attenuate symptoms resulting from myotubularin deficiency, the effect of ActRIIB-mFC treatment was determined in myotubularin-deficient (Mtm1δ4) mice. Compared with wild-type mice, untreated Mtm1δ4 mice have decreased body weight, skeletal muscle hypotrophy, and reduced survival. Treatment of Mtm1δ4 mice with ActRIIB-mFC produced a 17% extension of lifespan, with transient increases in weight, forelimb grip strength, and myofiber size. Pathologic analysis of Mtm1δ4 mice during treatment revealed that ActRIIB-mFC produced marked hypertrophy restricted to type 2b myofibers, which suggests that oxidative fibers in Mtm1δ4 animals are incapable of a hypertrophic response in this setting. These results support ActRIIB-mFC as an effective treatment for the weakness observed in myotubularin deficiency. PMID:21281811
Spivey, Christina A; Wang, Junling; Qiao, Yanru; Shih, Ya-Chen Tina; Wan, Jim Y; Kuhle, Julie; Dagogo-Jack, Samuel; Cushman, William C; Chisholm-Burns, Marie
2018-02-01
Previous research found racial and ethnic disparities in meeting medication therapy management (MTM) eligibility criteria implemented by the Centers for Medicare & Medicaid Services (CMS) in accordance with the Medicare Modernization Act (MMA). To examine whether alternative MTM eligibility criteria based on the CMS Part D star ratings quality evaluation system can reduce racial and ethnic disparities. This study analyzed the Beneficiary Summary File and claims files for Medicare beneficiaries linked to the Area Health Resource File. Three million Medicare beneficiaries with continuous Parts A, B, and D enrollment in 2012-2013 were included. Proposed star ratings criteria included 9 existing medication safety and adherence measures developed mostly by the Pharmacy Quality Alliance. Logistic regression and the Blinder-Oaxaca approach were used to test disparities in meeting MMA and star ratings eligibility criteria across racial and ethnic groups. Multinomial logistic regression was used to examine whether there was a disparity reduction by comparing individuals who were MTM-eligible under MMA but not under star ratings criteria and those who were MTM-eligible under star ratings criteria but not under the MMA. Concerning MMA-based MTM criteria, main and sensitivity analyses were performed to represent the entire range of the MMA eligibility thresholds reported by plans in 2009 and 2013 and proposed by CMS in 2015. Regarding star ratings criteria, meeting any 1 of the 9 measures was examined as the main analysis, and various measure combinations were examined as the sensitivity analyses. In the main analysis, adjusted odds ratios for non-Hispanic blacks (blacks) and Hispanics relative to non-Hispanic whites (whites) were 1.394 (95% CI = 1.375-1.414) and 1.197 (95% CI = 1.176-1.218), respectively, under star ratings. Blacks were 39.4% and Hispanics were 19.7% more likely to be MTM-eligible than whites.
Blacks and Hispanics were less likely to be MTM-eligible than whites in some sensitivity analyses. Disparities were not completely explained by differences in patient characteristics based on the Blinder-Oaxaca approach. The multinomial logistic regression of each main analysis found significant adjusted relative risk ratios (RRR) between whites and blacks for 2009 (RRR = 0.459, 95% CI = 0.438-0.481); 2013 (RRR = 0.449, 95% CI = 0.434-0.465); and 2015 (RRR = 0.436, 95% CI = 0.425-0.446) and between whites and Hispanics for 2009 (RRR = 0.559, 95% CI = 0.528-0.593); 2013 (RRR = 0.544, 95% CI = 0.521-0.569); and 2015 (RRR = 0.503, 95% CI = 0.488-0.518). These findings indicate a significant reduction in racial and ethnic disparities when using star ratings eligibility criteria; for example, black-white disparities in the likelihood of meeting MTM eligibility criteria were reduced by 55.1% based on star ratings compared with MMA in 2013. Similar patterns were found in most sensitivity and disease-specific analyses. This study found that minorities were more likely than whites to be MTM-eligible under the star ratings criteria. In addition, MTM eligibility criteria based on star ratings would reduce racial and ethnic disparities associated with MMA in the general Medicare population and those with specific chronic conditions. Research reported in this publication was supported by the National Institute on Aging of the National Institutes of Health under award number R01AG049696. The content of this study is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. Cushman reports an Eli Lilly grant and uncompensated consulting for Takeda Pharmaceuticals outside this work. The other authors have no potential conflicts of interest to report. Study concept and design were contributed by Wang and Shih, along with Wan, Kuhle, Spivey, and Cushman. 
Wang, Qiao, and Wan took the lead in data collection, with assistance from the other authors. Data interpretation was performed by Wang, Kuhle, and Qiao, with assistance from the other authors. The manuscript was written by Spivey and Qiao, along with the other authors, and revised by Cushman, Dagogo-Jack, and Chisholm-Burns, along with the other authors.
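The "39.4% more likely" phrasing in the abstract is the adjusted odds ratio re-expressed as a percentage increase in odds (strictly an increase in odds rather than in probability). The conversion is simple arithmetic:

```python
def odds_increase_pct(odds_ratio):
    """Express an odds ratio as the percent increase in odds vs. the reference group."""
    return (odds_ratio - 1.0) * 100.0

# Adjusted odds ratios reported under the star ratings criteria
print(round(odds_increase_pct(1.394), 1))  # blacks vs. whites -> 39.4
print(round(odds_increase_pct(1.197), 1))  # Hispanics vs. whites -> 19.7
```

For odds ratios this close to 1, the percent increase in odds is a reasonable shorthand for the relative eligibility gap the study is describing.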
Design, Fabrication and Testing of Two Dimensional Radio-Frequency Metamaterials
2014-03-03
metasurfaces. These antennas are either MTM-based or utilize MTMs to increase performance. The benefits of these antennas are: reduced size, lower...reduced size and increased quality factor. Finally, the antennas loaded with metasurfaces are similar to MTM loading, in that the... metasurface enhances the antenna performance instead of performing the antenna function. This type of antenna has shown increased directionality and
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-17
... lease terms for rentals and royalties of $10 per acre and 16 2/3 percent. The lessee paid the $500 administration fee for the reinstatement of each lease and the $163 cost of publishing this Notice. The lessee met... increased royalty of 16 2/3 percent; and The $163 cost of publishing this Notice. FOR FURTHER INFORMATION...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-17
... agrees to new lease terms for rentals and royalties of $10 per acre and 16 2/3 percent. The lessee paid the $500 administration fee for the reinstatement of each lease and the $163 cost of publishing this... $10 per acre; The increased royalty of 16 2/3 percent; and The $163 cost of publishing this Notice...
NASA Astrophysics Data System (ADS)
Nippgen, F.; Ross, M. R. V.; Bernhardt, E. S.; McGlynn, B. L.
2017-12-01
Mountaintop mining (MTM) is an especially destructive form of surface coal mining. It is widespread in Central Appalachia and is practiced around the world. In the process of accessing coal seams up to several hundred meters below the surface, mountaintops and ridges are removed via explosives and heavy machinery, with the resulting overburden pushed into nearby valleys. This broken-up rock and soil material represents a largely unknown amount of storage for incoming precipitation that facilitates enhanced chemical weathering rates and increased dissolved solids exports to streams. However, assessing the independent impact of MTM can be difficult in the presence of other forms of mining, especially underground mining. Here, we evaluate the effect of MTM on water quantity and quality on annual, seasonal, and event time scales in two sets of paired watersheds in southwestern West Virginia impacted by MTM. On an annual timescale, the mined watersheds sustained baseflow throughout the year, while the first-order watersheds ceased flowing during the latter parts of the growing season. In fractionally mined watersheds that continued to flow, the water in the stream was generated exclusively from the mined portions of the watersheds, leading to elevated total dissolved solids in the stream water. On the event time scale, we analyzed 50 storm events over a water year for a range of hydrologic response metrics. The mined watersheds exhibited smaller runoff ratios and longer response times during the wet dormant season, but responded similarly to rainfall events during the growing season or even exceeded the runoff magnitude of the reference watersheds. Our research demonstrates clear differences in hydrologic response between mined and unmined watersheds during the growing season and the dormant season that are detectable at annual, seasonal, and event time scales. At larger spatial scales (up to 2,000 km²), the effect of MTM on water quantity is not as easily detectable, as other land uses can mask possible alterations in hydrology or the percentage of MTM-disturbed area becomes negligible.
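The runoff ratios referenced above are a standard event-scale metric; a minimal sketch with hypothetical event values (not the study's measurements):

```python
# Illustrative runoff-ratio calculation (hypothetical event values, not the
# study's data): the ratio of event streamflow depth to event rainfall depth.
event_rainfall_mm = 25.0      # total precipitation for the storm event
event_runoff_mm = 5.5         # storm runoff depth over the watershed area
runoff_ratio = event_runoff_mm / event_rainfall_mm
print(round(runoff_ratio, 2))  # 0.22
```

Smaller ratios, as reported for the mined watersheds in the dormant season, indicate that a smaller fraction of rainfall leaves the watershed as storm runoff.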
Ferdosi, Hamid; Lamm, Steve H; Afari-Dwamena, Nana Ama; Dissen, Elisabeth; Chen, Rusan; Li, Ji; Feinleib, Manning
2018-01-01
To identify risk factors for small-for-gestational-age (SGA) births in counties of the central Appalachian states (Kentucky (KY), Tennessee (TN), Virginia (VA), and West Virginia (WV)) with varied coal mining activities. Live birth certificate files (1990-2002) were used to obtain SGA prevalence rates for mothers based on the coal mining activities of their counties of residence: mountain-top mining (MTM) activities, underground mining activities but no mountain-top mining activity (non-MTM), or no mining activities (non-mining). Co-variable information, including maternal tobacco use, was also obtained from the live birth certificate. Adjusted odds ratios were obtained using multivariable logistic regression, comparing SGA prevalence rates for counties with coal mining activities to those without, and comparing SGA prevalence rates among mining counties with and without mountain-top mining activities. Comparisons were also made between those who had reported tobacco use and those who had not. Both tobacco use prevalence and SGA prevalence were significantly greater for mining counties than for non-mining counties, and for MTM counties than for non-MTM counties. Adjustment for tobacco use alone explained 50% of the increased SGA risk for mining counties and 75% of the risk for MTM counties; adding demographic and pre-natal care co-variables explained 75% of the increased SGA risk for mining counties and 100% of the risk for MTM counties. The increased risk of SGA was limited to third-trimester births among tobacco users and was independent of the mining activities of their counties of residence. This study demonstrates that the increased prevalence of SGA among residents of counties with mining activity was primarily explained by differences in maternal tobacco use prevalence, an effect that was itself gestational-age dependent.
Self-reported tobacco use marked the population at increased risk for SGA in the central Appalachian states. Int J Occup Med Environ Health 2018;31(1):11-23. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
Hitt, Nathaniel P.; Chambers, Douglas B.
2014-01-01
Mountaintop mining (MTM) affects chemical, physical, and hydrological properties of receiving streams, but the long-term consequences for fish-assemblage structure and function are poorly understood. We sampled stream fish assemblages using electrofishing techniques in MTM exposure sites and reference sites within the Guyandotte River basin, USA, during 2010–2011. We calculated indices of taxonomic diversity (species richness, abundance, Shannon diversity) and functional diversity (functional richness, functional evenness, functional divergence) to compare exposure and reference assemblages between seasons (spring and autumn) and across years (1999–2011). We based temporal comparisons on 2 sites that were sampled during 1999–2001 by Stauffer and Ferreri (2002). Exposure assemblages had lower taxonomic and functional diversity than reference assemblages or simulated assemblages that accounted for random variation. Differences in taxonomic composition between reference and exposure assemblages were associated with conductivity and aqueous Se concentrations. Exposure assemblages had fewer species, lower abundances, and less biomass than reference assemblages across years and seasons. Green Sunfish (Lepomis cyanellus) and Creek Chub (Semotilus atromaculatus) became numerically dominant in exposure assemblages over time because of their persistence and losses of other taxa. In contrast, species richness increased over time in reference assemblages, a result that may indicate recovery from drought. Mean individual biomass increased as fish density decreased and most obligate invertivores were apparently extirpated at MTM exposure sites. Effects of MTM were not related to physical-habitat conditions but were associated with water-quality variables, which may limit quality and availability of benthic macroinvertebrate prey. Simulations revealed effects of MTM that could not be attributed to random variation in fish assemblage structure.
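The taxonomic indices named above (species richness, Shannon diversity) follow standard definitions; a short sketch with illustrative counts, not the Guyandotte River samples:

```python
# Shannon diversity H' = -sum(p_i * ln(p_i)) over species proportions.
# The counts below are illustrative, not from the study's electrofishing data.
import math

counts = [30, 12, 7, 1]                        # individuals per species
total = sum(counts)
p = [c / total for c in counts]                # relative abundances
shannon = -sum(pi * math.log(pi) for pi in p)  # Shannon diversity index
richness = len(counts)                         # species richness
```

Assemblages dominated by one or two tolerant species (as at the MTM exposure sites) yield lower H' than even assemblages with the same richness, since H' is maximized at log(richness) when abundances are equal.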
The Professional Culture of Community Pharmacy and the Provision of MTM Services.
Rosenthal, Meagen M; Holmes, Erin R
2018-03-21
The integration of advanced pharmacy services into community pharmacy practice is not complete. According to implementation research, understanding professional culture, as a part of context, may provide insights for accelerating this process. This study had three objectives. The first was to validate an adapted version of an organizational culture measure in a sample of United States (US) community pharmacists. The second was to examine potential relationships between the cultural factors identified using the validated instrument and a number of socialization and education variables. The third was to examine any relationships between the scores on the identified cultural factors and the provision of MTM services. This study was a cross-sectional online survey of community pharmacists in the southeastern US. The survey contained questions on socialization/education, respondents' self-reported provision of medication therapy management (MTM) services, and the organizational culture profile (OCP). Analyses included descriptive statistics, a principal component analysis (PCA), independent-samples t-tests, and multivariate ordinal regression. A total of 303 surveys were completed. The PCA revealed a six-factor structure: social responsibility, innovation, people orientation, competitiveness, attention to detail, and reward orientation. Further analysis revealed significant relationships between social responsibility and years in practice, and between people orientation and attention to detail and pharmacists' training and practice setting. Significant positive relationships were observed between social responsibility, innovation, and competitiveness and the increased provision of MTM services.
The significant relationships identified between the OCP factors and community pharmacist respondents' provision of MTM services provide an important starting point for developing interventions to improve the uptake of practice change opportunities.
Nonlinear surface waves at ferrite-metamaterial waveguide structure
NASA Astrophysics Data System (ADS)
Hissi, Nour El Houda; Mokhtari, Bouchra; Eddeqaqi, Noureddine Cherkaoui; Shabat, Mohammed Musa; Atangana, Jacques
2016-09-01
A new ferrite slab made of a metamaterial (MTM), surrounded by a nonlinear cover cladding and a ferrite substrate, was shown to support unusual types of electromagnetic surface waves. We impose the boundary conditions to derive the dispersion relation and the other equations needed to formulate the proposed structure. We analyse the dispersion properties of the nonlinear surface waves and calculate the associated propagation index and the film-cover interface nonlinearity. In the calculation, several sets of values for the permeability of the MTM are considered. Results show that the wave behaviour depends on the permeability of the MTM, the thickness of the waveguide, and the film-cover interface nonlinearity. It is also shown that the use of the singular solutions to the electric field equation allows the identification of several new properties of surface waves that do not exist in conventional waveguides.
Modeling the microstructurally dependent mechanical properties of poly(ester-urethane-urea)s.
Warren, P Daniel; Sycks, Dalton G; McGrath, Dominic V; Vande Geest, Jonathan P
2013-12-01
Poly(ester-urethane-urea) (PEUU) is one of many synthetic biodegradable elastomers under scrutiny for biomedical and soft-tissue applications. The goal of this study was to investigate the effect of experimental parameters on the mechanical properties of PEUUs following exposure to different degrading environments, similar to that of the human body, using linear regression to produce one predictive model. The model uses two independent variables, poly(caprolactone) (PCL) type and copolymer crystallinity, to predict the dependent variable, maximum tangential modulus (MTM). Results indicate that comparisons between PCLs at different degradation states are statistically different (p < 0.0003), while the difference between experimental and predicted average MTM is statistically negligible (p < 0.02). The linear correlation between experimental and predicted MTM values is R² = 0.75. Copyright © 2013 Wiley Periodicals, Inc., a Wiley Company.
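A hedged sketch of the kind of two-predictor ordinary-least-squares model the abstract describes; the PCL-type coding, crystallinity values, and MTM responses below are illustrative placeholders, not the study's data:

```python
# Hypothetical sketch (not the study's data): a two-predictor linear model,
# as in the abstract, where PCL type and copolymer crystallinity predict
# maximum tangential modulus (MTM). All values below are illustrative.
import numpy as np

# columns: PCL type (coded 0/1), crystallinity (%)
X = np.array([[0, 20.0], [0, 35.0], [1, 25.0],
              [1, 40.0], [0, 30.0], [1, 33.0]])
y = np.array([4.1, 6.8, 3.2, 5.9, 5.7, 4.8])   # MTM (MPa), illustrative

A = np.column_stack([np.ones(len(X)), X])       # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)    # ordinary least squares

y_hat = A @ coef
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot                        # coefficient of determination
```

With an intercept included, R² falls between 0 and 1 and plays the role of the reported 0.75 correlation between experimental and predicted MTM.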
Manufacturing Work Measurement System Evaluation. Reference Guide
1987-04-01
discussions with users of the MTM-MEK predetermined time system. The comments listed below are based on these discussions and, while a broad range...discussions with users of the MTM-V predetermined time system. The comments listed below are based on these discussions and, while a broad range of...industries were sampled, comments should not be considered universal and therefore may not be applicable to all manufacturing environments: o Easy to
Ye, Xuan; Cui, Zhiguo; Fang, Huajun; Li, Xide
2017-01-01
We report a novel material testing system (MTS) that uses hierarchical designs for in-situ mechanical characterization of multiscale materials. This MTS is adaptable for use in optical microscopes (OMs) and scanning electron microscopes (SEMs). The system consists of a microscale material testing module (m-MTM) and a nanoscale material testing module (n-MTM). The MTS can measure mechanical properties of materials with characteristic lengths ranging from millimeters to tens of nanometers, while load capacity can vary from several hundred micronewtons to several nanonewtons. The m-MTM is integrated using piezoelectric motors and piezoelectric stacks/tubes to form coarse and fine testing modules, with specimen length from millimeters to several micrometers, and displacement distances of 12 mm with 0.2 µm resolution for coarse level and 8 µm with 1 nm resolution for fine level. The n-MTM is fabricated using microelectromechanical system technology to form active and passive components and realizes material testing for specimen lengths ranging from several hundred micrometers to tens of nanometers. The system’s capabilities are demonstrated by in-situ OM and SEM testing of the system’s performance and mechanical properties measurements of carbon fibers and metallic microwires. In-situ multiscale deformation tests of Bacillus subtilis filaments are also presented. PMID:28777341
[Better multidisciplinary team meetings are linked to better care].
van Drielen, Eveline; de Vries, Antoinette W; Ottevanger, P B Nelleke; Hermens, Rosella P M G
2012-01-01
Discussing a patient in an oncology multidisciplinary team meeting (MTM) adds value to the quality of the treatment chosen. MTMs are increasingly mentioned in guidelines and indicator sets. Based on a literature review and observations, the Comprehensive Cancer Centre Netherlands (CCCNL), in collaboration with IQ Healthcare and the Department of Medical Oncology of the UMC St Radboud Nijmegen in the Netherlands, has conducted research into the quality criteria for a good MTM. Two of our studies show that the organisation of MTMs can be significantly improved. Based on the results, we developed a checklist to accomplish this. The most significant areas of improvement for optimising the organisation of MTMs are: (a) the presence of specialists from all relevant disciplines; (b) a capable chairman who promotes the efficiency of the MTM; and (c) the reduction of disruptive factors, such as mobile phones and participants walking in and out.
Patel, Rajul A.; Thai, Huong K.; Phou, Christine M.; Walberg, Mark P.; Woelfel, Joseph A.; Carr-Lopez, Sian M.; Chan, Emily K.
2012-01-01
Objective. To determine the impact of an elective course on pharmacy students’ perceptions, knowledge, and confidence regarding Medicare Part D, medication therapy management (MTM), and immunizations. Design. Thirty-three pharmacy students were enrolled in a Medicare Part D elective course that included both classroom instruction and experiential training. Assessment. Students’ self-reported confidence in and knowledge of Part D significantly improved upon course completion. End-of-course student perceptions about the relative importance of various aspects of MTM interventions and their confidence in performing MTM services significantly improved from those at the beginning of the course. Students’ confidence in performing immunizations also increased significantly from the start of the course. Conclusion. A classroom course covering Medicare Part D with an experiential requirement serving beneficiaries can improve students’ attitudes and knowledge about Medicare Part D and their confidence in providing related services to beneficiaries in the community. PMID:22761532
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, Lan; Saunders, R. Jesse; Drobná, Zuzana
2012-10-01
Arsenic (+3 oxidation state) methyltransferase (AS3MT) is the key enzyme in the pathway for methylation of arsenicals. A common polymorphism in the AS3MT gene that replaces a threonyl residue in position 287 with a methionyl residue (AS3MT/M287T) occurs at a frequency of about 10% among populations worldwide. Here, we compared catalytic properties of recombinant human wild-type (wt) AS3MT and AS3MT/M287T in reaction mixtures containing S-adenosylmethionine, arsenite (iAs(III)) or methylarsonous acid (MAs(III)) as substrates, and endogenous or synthetic reductants, including glutathione (GSH), a thioredoxin reductase (TR)/thioredoxin (Trx)/NADPH reducing system, or tris(2-carboxyethyl)phosphine hydrochloride (TCEP). With either TR/Trx/NADPH or TCEP, wtAS3MT or AS3MT/M287T catalyzed conversion of iAs(III) to MAs(III), methylarsonic acid (MAs(V)), dimethylarsinous acid (DMAs(III)), and dimethylarsinic acid (DMAs(V)); MAs(III) was converted to DMAs(III) and DMAs(V). Although neither enzyme required GSH to support methylation of iAs(III) or MAs(III), addition of 1 mM GSH decreased Km and increased Vmax estimates for either substrate in reaction mixtures containing TR/Trx/NADPH. Without GSH, Vmax and Km values were significantly lower for AS3MT/M287T than for wtAS3MT. In the presence of 1 mM GSH, significantly more DMAs(III) was produced from iAs(III) in reactions catalyzed by the M287T variant than in wtAS3MT-catalyzed reactions. Thus, 1 mM GSH modulates AS3MT activity, increasing both methylation rates and the yield of DMAs(III). AS3MT genotype, exemplified by differences in the regulation of wtAS3MT- and AS3MT/M287T-catalyzed reactions by GSH, may contribute to differences in the phenotype for arsenic methylation and, ultimately, to differences in disease susceptibility in individuals chronically exposed to inorganic arsenic.
Highlights: Human AS3MT and AS3MT(M287T) require a dithiol reductant for optimal activity. Both enzymes methylate arsenite to tri- and pentavalent methylated metabolites. Neither enzyme requires glutathione (GSH) to methylate arsenite or methylarsonite. However, in the presence of a dithiol, addition of 1 mM GSH increases methylation rates. In the presence of 1 mM GSH, AS3MT(M287T) produces more dimethylarsinite than AS3MT.
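Since the abstract's kinetic comparison is stated in terms of Km and Vmax, it can be illustrated with the Michaelis-Menten rate law; the parameter values below are purely illustrative, not the study's estimates:

```python
# Illustrative Michaelis-Menten sketch (hypothetical parameters, not the
# study's data): lowering Km and raising Vmax, as reported for GSH addition,
# increases the methylation rate at a given substrate concentration.
def mm_rate(s, vmax, km):
    """Michaelis-Menten rate v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

s = 5.0                                         # substrate conc., arbitrary units
v_without_gsh = mm_rate(s, vmax=1.0, km=10.0)   # hypothetical baseline
v_with_gsh = mm_rate(s, vmax=2.0, km=4.0)       # hypothetical: Km down, Vmax up
```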
NASA Astrophysics Data System (ADS)
Guaje, Javier; Molina, Juan; Rudas, Jorge; Demertzi, Athena; Heine, Lizette; Tshibanda, Luaba; Soddu, Andrea; Laureys, Steven; Gómez, Francisco
2015-12-01
Functional magnetic resonance imaging in the resting state (fMRI-RS) constitutes an informative protocol to investigate several pathological and pharmacological conditions. A common approach to studying this data source is through the analysis of changes in the so-called resting state networks (RSNs). These networks correspond to well-defined functional entities that have been associated with different low- and high-order brain functions. RSNs may be characterized by using Independent Component Analysis (ICA). ICA provides a decomposition of the fMRI-RS signal into sources of brain activity, but it lacks information about the nature of the signal, i.e., whether the source is artifactual or not. Recently, a multiple template-matching (MTM) approach was proposed to automatically recognize RSNs in a set of Independent Components (ICs). This method provides valuable information to assess subjects at the individual level. Nevertheless, it lacks a mechanism to quantify how much certainty there is about the existence/absence of each network. This information may be important for the assessment of patients with severely damaged brains, in whom RSNs may be greatly affected as a result of the pathological condition. In this work we propose a set of changes to the original MTM that improves the RSN recognition task and also extends the functionality of the method. The key points of this improvement are a standardization strategy and a modification of the method's constraints that adds flexibility to the approach. Additionally, we introduce an analysis of the trustworthiness measurement of each RSN obtained by using the template-matching approach. This analysis consists of a thresholding strategy applied over the computed Goodness-of-Fit (GOF) between the set of templates and the ICs. The proposed method was validated on two independent studies (Baltimore, 23 healthy subjects; Liege, 27 healthy subjects) with different configurations of MTM.
Results suggest that the method will provide complementary information for the characterization of RSNs at the individual level.
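A minimal sketch of the GOF-thresholding idea described above, with hypothetical sizes, scores, and cutoff (the paper's actual templates and threshold are not reproduced here):

```python
# Hypothetical sketch: each independent component (IC) receives a
# goodness-of-fit (GOF) score against each RSN template; an RSN is reported
# as "present" only when its best-matching IC exceeds a chosen GOF cutoff.
# Sizes, random scores, and the 0.7 threshold are all illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_templates, n_ics = 10, 30
gof = rng.random((n_templates, n_ics))   # GOF matrix: templates x ICs
threshold = 0.7                          # illustrative trustworthiness cutoff

best_ic = gof.argmax(axis=1)             # best-matching IC per RSN template
present = gof.max(axis=1) > threshold    # per-RSN presence/absence decision
```

Reporting `present` alongside the matched IC gives exactly the kind of per-network certainty statement the abstract argues is missing from plain template matching.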
Analyzing the costs to deliver medication therapy management services.
Rupp, Michael T
2011-01-01
To provide pharmacy managers and consultant pharmacists with a step-by-step approach for analyzing the costs of delivering medication therapy management (MTM) services and to describe the use of a free online software application for determining the costs of delivering MTM. The process described is applicable to community pharmacies and to consultant pharmacists who provide MTM services from nonpharmacy settings. The PharmAccount Service Cost Calculator is an Internet-based software application that uses a guided online interview to collect the information needed to conduct a comprehensive cost analysis of any specialized pharmacy service. In addition to direct variable and fixed costs, the software automatically allocates indirect and overhead costs to the service and generates an itemized report that details the components of service delivery costs. The service cost calculator is sufficiently flexible to support the analysis of virtually any specialized pharmacy service, irrespective of whether the service is being delivered from a physical pharmacy. The software application allows users to perform sensitivity analyses to quickly determine the potential impact that alternate scenarios would have on service delivery cost. It is therefore particularly well suited to assist in the design and planning of a new pharmacy service. Good management requires that the cost implications of service delivery decisions be known and considered. Analyzing the cost of an MTM service is an important step in developing a sustainable business model.
[Prostate cancer: Quality assessment of clinical management in the Midi-Pyrenean region in 2011].
Daubisse-Marliac, L; Lamy, S; Lunardi, P; Tollon, C; Thoulouzan, M; Latorzeff, I; Bauvin, E; Grosclaude, P
2017-02-01
To assess the quality of the clinical management of prostate cancer in the Midi-Pyrenean region in 2011. The study population was randomly selected among new cases of prostate cancer presented in a Multidisciplinary Team Meeting (MTM) in 2011. Indicators defined with the professionals evaluated the quality of care during the diagnostic period, at treatment initiation, and at the time of the MTM. Six hundred and thirty-three new patients were included (median age at diagnosis = 69 years; min: 48; max: 93). In the diagnostic period, 92% of patients had a prostate biopsy. Pelvic MRI, abdomino-pelvic CT, and bone scintigraphy were performed in 53%, 55%, and 61% of intermediate- or high-risk patients, respectively. The Gleason score, surgical margins, and pathological stage were included in over 98% of the records of patients treated by radical prostatectomy. A PSA assay within 3 months after prostatectomy was found for 59% of surgical patients. The MTM took place before treatment for 83% of patients. About three-quarters of surgical patients with stage pT≥3 or pN1, or without healthy margins, were discussed in an MTM after surgery. Most of the studied indicators reach a high level. However, the lower level of performance of complementary examinations raises questions about their real place, accessibility, and traceability. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Medication therapy management and condition care services in a community-based employer setting.
Johannigman, Mark J; Leifheit, Michael; Bellman, Nick; Pierce, Tracey; Marriott, Angela; Bishop, Cheryl
2010-08-15
A program in which health-system pharmacists and pharmacy technicians provide medication therapy management (MTM), wellness, and condition care (disease management) services under contract with local businesses is described. The health-system pharmacy department's Center for Medication Management contracts directly with company benefits departments for defined services to participating employees. The services include an initial wellness and MTM session and, for certain patients identified during the initial session, ongoing condition care. The initial appointment includes a medication history, point-of-care testing for serum lipids and glucose, body composition analysis, and completion of a health risk assessment. The pharmacist conducts a structured MTM session, reviews the patient's test results and risk factors, provides health education, discusses opportunities for cost savings, and documents all activities on the patient's medication action plan. Eligibility for the condition care program is based on a diagnosis of diabetes, hypertension, asthma, heart failure, or hyperlipidemia or elevation of lipid or glucose levels. Findings are summarized for employers after the initial wellness screening and at six-month intervals. Patients receiving condition care sign a customized contract, establish goals, attend up to four MTM sessions per year, and track their information on a website; employers may offer incentives for participation. When pharmacists recommend adjustments to therapy or cost-saving changes, it is up to patients to discuss these with their physician. A survey completed by each patient after the initial wellness session has indicated high satisfaction. Direct cost savings related to medication changes have averaged $253 per patient per year. Total cost savings to companies in the first year of the program averaged $1011 per patient. For the health system, the program has been financially sustainable. 
Key laboratory values indicate positive clinical outcomes. A business model in which health-system pharmacists provide MTM and condition care services for company employees has demonstrated successful outcomes in terms of patient satisfaction, cost savings, and clinical benefits.
Compact Single-Layer Traveling-Wave Antenna DesignUsing Metamaterial Transmission Lines
NASA Astrophysics Data System (ADS)
Alibakhshikenari, Mohammad; Virdee, Bal Singh; Limiti, Ernesto
2017-12-01
This paper presents a single-layer traveling-wave antenna (TWA) based on a composite right/left-handed (CRLH) metamaterial (MTM) transmission-line (TL) structure, implemented using a combination of interdigital capacitors and dual-spiral inductive slots. Embedding the dual-spiral inductive slots inside the CRLH MTM-TL results in a compact TWA. The dimensions of the proposed CRLH MTM-TL TWA are 21.5 × 30.0 mm², or 0.372λ0 × 0.520λ0 at the 5.2 GHz center frequency. The fabricated TWA operates over 1.8-8.6 GHz with a fractional bandwidth greater than 120%, and it exhibits a peak gain and radiation efficiency of 4.2 dBi and 81%, respectively, at 5 GHz. By avoiding the use of lumped components, via-holes, or defected ground structures, the proposed TWA design is economical for mass production as well as easy to integrate with wireless communication systems.
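The quoted electrical size can be checked directly from the stated dimensions and center frequency:

```python
# Sketch: verify the reported electrical size of the CRLH MTM-TL TWA
# (21.5 mm x 30.0 mm at a 5.2 GHz center frequency); values from the abstract.
c = 299_792_458.0         # speed of light, m/s
f0 = 5.2e9                # center frequency, Hz
lam0_mm = c / f0 * 1e3    # free-space wavelength, ~57.65 mm

w_mm, l_mm = 21.5, 30.0   # antenna dimensions, mm
w_el = w_mm / lam0_mm     # ~0.373 wavelengths, matching 0.372 lambda0 to rounding
l_el = l_mm / lam0_mm     # ~0.520 wavelengths, matching 0.520 lambda0
```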
Xu, He-Xiu; Wang, Guang-Ming; Qi, Mei-Qing; Zeng, Hui-Yong
2012-09-24
We report the design, fabrication, and measurement of waveguided electric metamaterials (MTMs) used in the design of closely spaced microstrip antenna arrays with reduced mutual coupling. Complementary spiral ring resonators (CSRs), which exhibit a single negative resonant permittivity around 3.5 GHz, are used as the basic electric MTM element. For verification, two CSRs with two and three concentric rings are considered, respectively. By properly arranging these well-engineered waveguided MTMs between two H-plane coupled patch antennas, both numerical and measured results indicate that more than 8.4 dB of mutual coupling reduction is obtained. The mechanism has been studied from a physical insight. The electric MTM element is electrically small, enabling the resultant antenna array to exhibit a small separation (λ0/8 at the operating wavelength) and thus a high directivity. The proposed strategy opens an avenue to new types of antennas with superior performance and can be generalized to other electric resonators.
A New Mechanism of Sound Generation in Songbirds
NASA Astrophysics Data System (ADS)
Goller, Franz; Larsen, Ole N.
1997-12-01
Our current understanding of the sound-generating mechanism in the songbird vocal organ, the syrinx, is based on indirect evidence and theoretical treatments. The classical avian model of sound production postulates that the medial tympaniform membranes (MTM) are the principal sound generators. We tested the role of the MTM in sound generation and studied the songbird syrinx more directly by filming it endoscopically. After we surgically incapacitated the MTM as a vibratory source, zebra finches and cardinals were not only able to vocalize, but sang nearly normal song. This result shows clearly that the MTM are not the principal sound source. The endoscopic images of the intact songbird syrinx during spontaneous and brain stimulation-induced vocalizations illustrate the dynamics of syringeal reconfiguration before phonation and suggest a different model for sound production. Phonation is initiated by rostrad movement and stretching of the syrinx. At the same time, the syrinx is closed through movement of two soft tissue masses, the medial and lateral labia, into the bronchial lumen. Sound production always is accompanied by vibratory motions of both labia, indicating that these vibrations may be the sound source. However, because of the low temporal resolution of the imaging system, the frequency and phase of labial vibrations could not be assessed in relation to that of the generated sound. Nevertheless, in contrast to the previous model, these observations show that both labia contribute to aperture control and strongly suggest that they play an important role as principal sound generators.
Puntumetakul, Rungthip; Suvarnnato, Thavatchai; Werasirirat, Phurichaya; Uthaikhup, Sureeporn; Yamauchi, Junichiro; Boucaut, Rose
2015-01-01
Background Thoracic spine manipulation has become a popular alternative to local cervical manipulative therapy for mechanical neck pain. This study investigated the acute effects of single-level and multiple-level thoracic manipulations on chronic mechanical neck pain (CMNP). Methods Forty-eight patients with CMNP were randomly allocated to single-level thoracic manipulation (STM) at T6–T7 or multiple-level thoracic manipulation (MTM), or to a control group (prone lying). Cervical range of motion (CROM), visual analog scale (VAS), and the Thai version of the Neck Disability Index (NDI-TH) scores were measured at baseline, and at 24-hour and at 1-week follow-up. Results At 24-hour and 1-week follow-up, neck disability and pain levels were significantly (P<0.05) improved in the STM and MTM groups compared with the control group. CROM in flexion and left lateral flexion were increased significantly (P<0.05) in the STM group when compared with the control group at 1-week follow-up. The CROM in right rotation was increased significantly after MTM compared to the control group (P<0.05) at 24-hour follow-up. There were no statistically significant differences in neck disability, pain level at rest, and CROM between the STM and MTM groups. Conclusion These results suggest that both single-level and multiple-level thoracic manipulation improve neck disability, pain levels, and CROM at 24-hour and 1-week follow-up in patients with CMNP. PMID:25624764
Fidani, L; Karagianni, P; Tsakalidis, C; Mitsiako, G; Hatziioannidis, I; Biancalana, V; Nikolaidis, N
2011-01-01
X-linked myotubular myopathy (XLMTM) is a rare congenital myopathy, usually characterized by severe hypotonia and respiratory insufficiency at birth, in affected, male infants. The disease is causally associated with mutations in the MTM1 gene, coding for phosphatase myotubularin. We report a severe case of XLMTM with a novel mutation, at a donor splicing site (c.1467+1G) previously associated with severe phenotype. The mutation was also identified in the patient's mother, providing an opportunity for sound genetic counseling. PMID:22435031
Topographic Map of the West Candor Chasma Region of Mars, MTM 500k -05/282E OMKT
2004-01-01
This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km. The datum (the 0-km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by Mars Orbiter Laser Altimeter. The projection is part of a Mars Transverse Mercator (MTM) system with 20° wide zones. For the area covered by this map sheet the central meridian is at 290° E. (70° W.). The scale factor at the central meridian of the zone containing this quadrangle is 0.9960 relative to a nominal scale of 1:500,000. Longitude increases to the east and latitude is planetocentric as allowed by IAU/IAG standards and in accordance with current NASA and USGS standards. A secondary grid (printed in red) has been added to the map as a reference to the west longitude/planetographic latitude system that is also allowed by IAU/IAG standards and has been used for previous Mars maps.
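The two Martian coordinate conventions mentioned in the map description are related directly by the spheroid parameters it quotes. A minimal Python sketch (function names are illustrative) converts between them, using the standard oblate-spheroid relation tan(φ_graphic) = (a/b)² tan(φ_centric) and the east/west longitude complement:

```python
import math

# Mars spheroid from the map sheet: equatorial radius a, polar radius b (km).
A_KM = 3396.0
B_KM = 3376.8

def planetocentric_to_planetographic(lat_centric_deg):
    """Convert planetocentric latitude to planetographic latitude.
    For an oblate spheroid: tan(phi_graphic) = (a/b)**2 * tan(phi_centric)."""
    return math.degrees(math.atan((A_KM / B_KM) ** 2 *
                                  math.tan(math.radians(lat_centric_deg))))

def east_to_west_longitude(lon_east_deg):
    """Convert east-positive longitude to the older west-positive system."""
    return (360.0 - lon_east_deg) % 360.0

# The sheet's central meridian: 290 deg E corresponds to 70 deg W.
print(east_to_west_longitude(290.0))                    # 70.0
# Planetographic latitude is slightly poleward of planetocentric (about -5.06).
print(round(planetocentric_to_planetographic(-5.0), 3))
```

Note that the quoted flattening is consistent with the radii: (a - b)/a = 19.2/3396.0 = 1/176.875 exactly.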
Topographic Map of the Ophir and Central Candor Chasmata Region of Mars MTM 500k -05/287E OMKT
2004-01-01
This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km. The datum (the 0-km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by Mars Orbiter Laser Altimeter. The projection is part of a Mars Transverse Mercator (MTM) system with 20° wide zones. For the area covered by this map sheet the central meridian is at 290° E. (70° W.). The scale factor at the central meridian of the zone containing this quadrangle is 0.9960 relative to a nominal scale of 1:500,000. Longitude increases to the east and latitude is planetocentric as allowed by IAU/IAG standards and in accordance with current NASA and USGS standards. A secondary grid (printed in red) has been added to the map as a reference to the west longitude/planetographic latitude system that is also allowed by IAU/IAG standards and has been used for previous Mars maps.
Topographic map of the Tithonium Chasma Region of Mars, MTM 500k -05/277E OMKT
2004-01-01
This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km. The datum (the 0-km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by Mars Orbiter Laser Altimeter. The projection is part of a Mars Transverse Mercator (MTM) system with 20° wide zones. For the area covered by this map sheet the central meridian is at 270° E. (90° W.). The scale factor at the central meridian of the zone containing this quadrangle is 0.9960 relative to a nominal scale of 1:500,000. Longitude increases to the east and latitude is planetocentric as allowed by IAU/IAG standards and in accordance with current NASA and USGS standards. A secondary grid (printed in red) has been added to the map as a reference to the west longitude/planetographic latitude system that is also allowed by IAU/IAG standards and has been used for previous Mars maps.
Topographic map of the Parana Valles region of Mars MTM 500k -25/337E OMKT
2003-01-01
This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. MTM 500k –25/347E OMKT: Abbreviation for Mars Transverse Mercator; 1:500,000 series; center of sheet latitude 25° S., longitude 347.5° E. in planetocentric coordinate system (this corresponds to –25/012; latitude 25° S., longitude 12.5° W. in planetographic coordinate system); orthophotomosaic (OM) with color coded (K) topographic contours and nomenclature (T) [Greeley and Batson, 1990]. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km (Kirk and others, 2000). The datum (the 0-km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by Mars Orbiter Laser Altimeter (Smith and others, 2001). The image base for this map employs Viking Orbiter images from orbit 651. An orthophotomosaic was created on the digital photogrammetric workstation using the DTM compiled from stereo models. Integrated Software for Imagers and Spectrometers (ISIS) (Torson and Becker, 1997) provided the software to project the orthophotomosaic into the Transverse Mercator Projection.
Topographic Map of the Northwest Loire Valles Region of Mars MTM 500k -15/337E OMKT
2003-01-01
This map, compiled photogrammetrically from Viking Orbiter stereo image pairs, is part of a series of topographic maps of areas of special scientific interest on Mars. MTM 500k –15/337E OMKT: Abbreviation for Mars Transverse Mercator; 1:500,000 series; center of sheet latitude 15° S., longitude 337.5° E. in planetocentric coordinate system (this corresponds to –15/022; latitude 15° S., longitude 22.5° W. in planetographic coordinate system); orthophotomosaic (OM) with color coded (K) topographic contours and nomenclature (T) [Greeley and Batson, 1990]. The figure of Mars used for the computation of the map projection is an oblate spheroid (flattening of 1/176.875) with an equatorial radius of 3396.0 km and a polar radius of 3376.8 km (Kirk and others, 2000). The datum (the 0–km contour line) for elevations is defined as the equipotential surface (gravitational plus rotational) whose average value at the equator is equal to the mean radius as determined by Mars Orbiter Laser Altimeter (Smith and others, 2001). The image base for this map employs Viking Orbiter images from orbit 651. An orthophotomosaic was created on the digital photogrammetric workstation using the DTM compiled from stereo models. Integrated Software for Imagers and Spectrometers (ISIS) (Torson and Becker, 1997) provided the software to project the orthophotomosaic into the Transverse Mercator Projection.
Integrating pharmacogenomics into pharmacy practice via medication therapy management.
Reiss, Susan M
2011-01-01
To explore the application and integration of pharmacogenomics in pharmacy clinical practice via medication therapy management (MTM) to improve patient care. Department of Health & Human Services (HHS) Personalized Health Care Initiative, Food and Drug Administration (FDA) pharmacogenomics activity, and findings from the Utilizing E-Prescribing Technologies to Integrate Pharmacogenomics into Prescribing and Dispensing Practices Stakeholder Workshop, convened by the American Pharmacists Association (APhA) on March 5, 2009. Participants at the Stakeholder Workshop included diverse representatives from pharmacy, medicine, pathology, health information technology (HIT), standards, science, academia, government, and others with a key interest in the clinical application of pharmacogenomics. In 2006, HHS initiated the Personalized Health Care Initiative with the goal of building the foundation for the delivery of gene-based care, which may prove to be more effective for large patient subpopulations. In the years since the initiative was launched, drug manufacturers and FDA have begun to incorporate pharmacogenomic data and applications of this information into the drug development, labeling, and approval processes. New applications and processes for using this emerging pharmacogenomics data are needed to effectively integrate this information into clinical practice. Building from the findings of a stakeholder workshop convened by APhA and the advancement of the pharmacist's collaborative role in patient care through MTM, emerging roles for pharmacists using pharmacogenomic information to improve patient care are taking hold. Realizing the potential role of the pharmacist in pharmacogenomics through MTM will require connectivity of pharmacists into the electronic health record infrastructure to permit the exchange of pertinent health information among all members of a patient's health care team. 
Addressing current barriers, concerns, and system limitations and developing an effective infrastructure will be necessary for pharmacogenomics to achieve its true potential. To achieve integration of pharmacogenomics into clinical practice via MTM, the pharmacy profession must define a process for the application of pharmacogenomic data into pharmacy clinical practice that is aligned with MTM service delivery, develop a viable business model for these practices, and encourage and direct the development of HIT solutions that support the pharmacist's role in this emerging field.
Tiret, Laurent; Blot, Stéphane; Kessler, Jean-Louis; Gaillot, Hugues; Breen, Matthew; Panthier, Jean-Jacques
2003-09-01
Myotubular/centronuclear myopathies are a nosological group of hereditary disorders characterised by severe architectural and metabolic remodelling of skeletal muscle fibres. In most myofibres, nuclei are found at an abnormal central position within a halo devoid of myofibrillar proteins. The X-linked form (myotubular myopathy) is the most prevalent and severe form in human, leading to death during early postnatal life. Maturation of fibres is not completed and fibres resemble myotubes. Linkage analysis in human has helped to identify MTM1 as the morbid gene. MTM1 encodes myotubularin, a dual protein phosphatase. In families in which myotubular myopathy segregates, detected mutations in MTM1 abolish the specific phosphatase activity targeting the second messenger phosphatidylinositol 3-phosphate. Autosomal forms (centronuclear) have a later onset and are often compatible with life. At birth, fibres are normally constituted but progressively follow remodelling with a secondary centralisation of nuclei. Their prevalence is low; hence, no linkage analysis can be performed and no molecular aetiology is known. In the Labrador Retriever, a spontaneous disorder strikingly mimics the clinical evolution of the human centronuclear myopathy. We have established a canine pedigree and show that the disorder segregates as an autosomal recessive trait in that pedigree. We have further mapped the dog locus to a region on chromosome 2 that is orthologous to human chromosome 10p. To date, no human MTM1 gene member has been mapped to this genetic region. This report thus describes the first spontaneous mammalian model of centronuclear myopathy and defines a new locus for this group of diseases.
Development of a Selective Culture Medium for Primary Isolation of the Main Brucella Species▿
De Miguel, M. J.; Marín, C. M.; Muñoz, P. M.; Dieste, L.; Grilló, M. J.; Blasco, J. M.
2011-01-01
Bacteriological diagnosis of brucellosis is performed by culturing animal samples directly on both Farrell medium (FM) and modified Thayer-Martin medium (mTM). However, despite inhibiting most contaminating microorganisms, FM also inhibits the growth of Brucella ovis and some B. melitensis and B. abortus strains. In contrast, mTM is adequate for growth of all Brucella species but only partially inhibitory for contaminants. Moreover, the performance of both culture media for isolating B. suis has never been established properly. We first determined the performance of both media for B. suis isolation, proving that FM significantly inhibits B. suis growth. We also determined the susceptibility of B. suis to the antibiotics contained in both selective media, proving that nalidixic acid and bacitracin are highly inhibitory, thus explaining the reduced performance of FM for B. suis isolation. Based on these results, a new selective medium (CITA) containing vancomycin, colistin, nystatin, nitrofurantoin, and amphotericin B was tested for isolation of the main Brucella species, including B. suis. CITA's performance was evaluated using reference contaminant strains but also field samples taken from brucella-infected animals or animals suspected of infection. CITA inhibited most contaminant microorganisms but allowed the growth of all Brucella species, to levels similar to those for both the control medium without antibiotics and mTM. Moreover, CITA medium was more sensitive than both mTM and FM for isolating all Brucella species from field samples. Altogether, these results demonstrate the adequate performance of CITA medium for the primary isolation of the main Brucella species, including B. suis. PMID:21270216
Luder, Heidi R; Frede, Stacey M; Kirby, James A; Epplen, Kelly; Cavanaugh, Teresa; Martin-Boone, Jill E; Conrad, Wayne F; Kuhlmann, Diane; Heaton, Pamela C
2015-01-01
To determine if a community pharmacy-based transition of care (TOC) program that included the full scope of medication therapy management (MTM) services (TransitionRx) decreased hospital readmissions, resolved medication-related problems, and increased patient satisfaction. Prospective, quasi-experimental study. Nine Kroger Pharmacies located in Western Cincinnati. Patients older than 18 years of age and discharged from two local hospitals with a diagnosis of congestive heart failure, chronic obstructive pulmonary disease, or pneumonia. Patients were recruited from two local hospitals and referred to the community pharmacy for MTM services with the pharmacist within 1 week of discharge. Pharmacists reconciled the patients' medications, identified drug therapy problems, recommended changes to therapy, and provided self-management education. At 30 days after discharge, research personnel conducted telephone surveys, using a previously validated survey instrument, to assess hospital readmissions and patient satisfaction. Pharmacist interventions and medication-related problems were documented. A total of 90 patients completed the study. Of these, 20% of patients in the usual care group were admitted to the hospital within 30 days compared with 6.9% of patients in the intervention group (P = 0.019). In the 30 patients who received MTM services from the pharmacist, 210 interventions were made. The overall mean patient satisfaction with the TOC process was not significantly different between patients who were seen by the pharmacist and those who were not seen by the pharmacist. Community pharmacies successfully collaborated with hospitals to develop a referral process for TOC interventions. Patients who received MTM services from the pharmacist experienced significantly fewer readmissions than patients who received usual care.
Fang, Zhiwei; Peng, Lele; Qian, Yumin; Zhang, Xiao; Xie, Yujun; Cha, Judy J; Yu, Guihua
2018-04-18
Seeking earth-abundant electrocatalysts with high efficiency and durability has become the frontier of energy conversion research. Mixed-transition-metal (MTM)-based electrocatalysts, owing to the desirable electrical conductivity, synergistic effect of bimetal atoms, and structural stability, have recently emerged as new-generation hydrogen evolution reaction (HER) electrocatalysts. However, the correlation between anion species and their intrinsic electrocatalytic properties in MTM-based electrocatalysts is still not well understood. Here we present a novel approach to tuning the anion-dependent electrocatalytic characteristics in MTM-based catalysts for HER, using holey Ni/Co-based phosphides/selenides/oxides (Ni-Co-A, A = P, Se, O) as the model materials. The electrochemical results, combined with the electrical conductivity measurement and DFT calculation, reveal that P substitution could modulate the electron configuration, lower the hydrogen adsorption energy, and facilitate the desorption of hydrogen on the active sites in Ni-Co-A holey nanostructures, resulting in superior HER catalytic activity. Accordingly, we fabricate the NCP holey nanosheet electrocatalyst for HER with an ultralow onset overpotential of nearly zero, an overpotential of 58 mV, and long-term durability, along with an applied potential of 1.56 V to boost overall water splitting at 10 mA cm⁻², among the best electrocatalysts reported for non-noble-metal catalysts to date. This work not only presents a deeper understanding of the intrinsic HER electrocatalytic properties for MTM-based electrocatalysts with various anion species but also offers new insights to better design efficient and durable water-splitting electrocatalysts.
Comparison of a new transport medium with universal transport medium at a tropical field site.
Schlaudecker, Elizabeth P; Heck, Joan P; MacIntyre, Elizabeth T; Martinez, Ruben; Dodd, Caitlin N; McNeal, Monica M; Staat, Mary A; Heck, Jeffery E; Steinhoff, Mark C
2014-10-01
Limited data are available in rural Honduran settings describing the etiology of respiratory infections, partially due to limited specimen transport. A new molecular transport medium (MTM) preserves released nucleic acid at ambient temperature for later detection. Prospective surveillance was conducted in a Honduran clinic to identify 233 children less than 5 years of age presenting with respiratory symptoms. We obtained 2 nasopharyngeal samples and stored 1 in PrimeStore® MTM at room temperature and 1 in universal transport medium (UTM) at -80 °C. The specimens were then transported to Cincinnati Children's Hospital and tested for 16 respiratory viruses using a multiplex PCR panel. The 2 specimen collection systems were similar for detecting the 4 most common viruses: influenza (Kappa = 0.7676, P < 0.0001), human metapneumovirus (Kappa = 0.8770, P < 0.0001), respiratory syncytial virus (Kappa = 0.6849, P < 0.0001), and parainfluenza (Kappa = 0.8796, P < 0.0001). These results suggest that clinical specimens transported via PrimeStore® MTM and UTM yield similar viral multiplex PCR results. Copyright © 2014 Elsevier Inc. All rights reserved.
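The agreement values quoted above are Cohen's kappa statistics. As a quick illustration of how kappa is computed from a 2x2 agreement table, here is a minimal Python sketch; the counts below are invented for illustration and are not the study's data:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table (list of rows of counts)."""
    n = sum(sum(row) for row in table)
    # Observed agreement: fraction of cases on the diagonal.
    p_observed = sum(table[i][i] for i in range(len(table))) / n
    # Expected agreement if the two methods classified independently.
    p_expected = sum(
        (sum(table[i]) / n) * (sum(row[i] for row in table) / n)
        for i in range(len(table))
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical table: rows = MTM detection (+/-), cols = UTM detection (+/-).
table = [[40, 5],
         [3, 52]]
print(round(cohens_kappa(table), 4))  # prints 0.8377
```

Kappa corrects raw percent agreement for the agreement expected by chance, which is why it is preferred over simple concordance when comparing two detection methods.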
Houle, Sherilyn K D; Chuck, Anderson W; Tsuyuki, Ross T
2012-01-01
To develop an economic model based on the use of pharmacy-based blood pressure kiosks for case finding of remunerable medication therapy management (MTM) opportunities. Descriptive, exploratory, nonexperimental study. Ontario, Canada, between January 2010 and September 2011. More than 7.5 million blood pressure kiosk readings were taken from 341 pharmacies. A model was developed to estimate revenues achievable by using blood pressure kiosks for 1 month to identify a cohort of patients with blood pressure of 130/80 mm Hg or more and caring for those patients during 1 year. Revenue generated from MTM programs. Pharmacies could generate an average of $12,270 (range $4,523-24,420) annually in revenue from billing for MTM services. Blood pressure kiosks can be used to identify patients with elevated blood pressure who may benefit from reimbursable pharmacist cognitive services. Revenue can be reinvested to purchase automated dispensing technology or offset pharmacy technician costs to free pharmacists to provide pharmaceutical care. Improved patient outcomes, increased patient loyalty, and improved adherence are additional potential benefits.
Towards a deterministic KPZ equation with fractional diffusion: the stationary problem
NASA Astrophysics Data System (ADS)
Abdellaoui, Boumediene; Peral, Ireneo
2018-04-01
In this work, we investigate the solvability of the fractional quasilinear problem (-Δ)^s u = |∇u|^q + f in Ω, with u = 0 in ℝ^N ∖ Ω, where Ω ⊂ ℝ^N is a bounded regular domain (C² regularity is sufficient), s ∈ (1/2, 1), q > 1, and f is a measurable non-negative function with suitable hypotheses. The analysis is done separately in three cases: subcritical (1 < q < 2s), critical (q = 2s), and supercritical (q > 2s). The authors were partially supported by Ministerio de Economía y Competitividad under grants MTM2013-40846-P and MTM2016-80474-P (Spain).
NASA Astrophysics Data System (ADS)
Di Matteo, S.; Villante, U.
2017-05-01
The occurrence of waves at discrete frequencies in the solar wind (SW) parameters has been reported in the scientific literature with some controversial results, mostly concerning the existence (and stability) of favored sets of frequencies. On the other hand, the experimental results might be influenced by the analytical methods adopted for the spectral analysis. We focused attention on the fluctuations of the SW dynamic pressure (PSW) occurring in the leading edges of streams following interplanetary shocks and compared the results of the Welch method (WM) with those of the multitaper method (MTM). The results of a simulation analysis demonstrate that the identification of the wave occurrence and the frequency estimate might be strongly influenced by the signal characteristics and analytical methods, especially in the presence of multicomponent signals. In SW streams, PSW oscillations are routinely detected in the entire range f ≈ 1.2-5.0 mHz; nevertheless, the WM/MTM agreement in the identification and frequency estimate occurs in ≈50% of events and different sets of favored frequencies would be proposed for the same set of events by the WM and MTM analysis. The histogram of the frequency distribution of the events identified by both methods suggests more relevant percentages between f ≈ 1.7-1.9, f ≈ 2.7-3.4, and f ≈ 3.9-4.4 (with a most relevant peak at f ≈ 4.2 mHz). Extremely severe thresholds select a small number (14) of remarkable events, with a one-to-one correspondence between WM and MTM: interestingly, these events reveal a tendency for a favored occurrence in bins centered at f ≈ 2.9 and at f ≈ 4.2 mHz.
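The two spectral estimators compared above can be sketched side by side. The following Python example (using SciPy; the synthetic two-tone signal and its frequencies are illustrative, not the solar wind data) contrasts a Welch estimate with a basic multitaper estimate built from DPSS tapers:

```python
import numpy as np
from scipy.signal import welch
from scipy.signal.windows import dpss

fs = 1.0  # sampling frequency (arbitrary units)
n = 4096
t = np.arange(n) / fs
rng = np.random.default_rng(0)
# Synthetic multicomponent signal: two closely spaced tones plus noise.
x = (np.sin(2 * np.pi * 0.10 * t) + 0.5 * np.sin(2 * np.pi * 0.13 * t)
     + 0.5 * rng.standard_normal(n))

# Welch method (WM): averaged periodograms over overlapping segments.
f_w, p_w = welch(x, fs=fs, nperseg=1024)

# Basic multitaper method (MTM): average of periodograms taken with
# orthogonal DPSS tapers over the full record.
nw = 4  # time-bandwidth product
tapers = dpss(n, NW=nw, Kmax=2 * nw - 1)
f_m = np.fft.rfftfreq(n, d=1 / fs)
p_m = np.mean([np.abs(np.fft.rfft(x * tap)) ** 2 for tap in tapers], axis=0)

print(f_w[np.argmax(p_w)])  # dominant peak near 0.10
print(f_m[np.argmax(p_m)])  # dominant peak near 0.10
```

The design trade-off the abstract alludes to is visible here: Welch buys variance reduction by segmenting (coarser frequency resolution), while the multitaper keeps the full record length and reduces variance by averaging over tapers, so the two can localize closely spaced components differently.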
Shah, Mohammad Tahir; Khan, Sardar; Saddique, Umar; Gul, Nida; Khan, Muhammad Usman; Malik, Riffat Naseem; Farooq, Muhammad; Naz, Alia
2013-01-01
This study investigates wild plant species for their potential to phytoremediate macro and trace metals (MTM). For this purpose, soil and wild plant samples were collected along mafic and ultramafic terrain in the Jijal, Dubair, and Alpuri areas of the Kohistan region, northern Pakistan. These samples were analyzed for concentrations of MTM (Na, K, Ca, Mg, Fe, Mn, Pb, Zn, Cd, Cu, Cr, Ni, and Co) using an atomic absorption spectrometer (AAS-PEA-700). Soils showed significant (P < .001) contamination levels, while plants had greater variability in metal uptake from the contaminated sites. Plant species such as Selaginella jacquemontii, Rumex hastatus, and Plectranthus rugosus showed multifold enrichment factors (EF) of Fe, Mn, Cr, Ni, and Co compared to the background area. The results revealed that these wild plant species are able to take up and accumulate high metal concentrations; they may therefore be used for phytoremediation of metal-contaminated soil. However, the high MTM concentrations in the wild plants could pose environmental hazards in the study area, as several of the metals (Fe, Mn, Cr, Ni, Co, and Pb) are of toxicological concern. PMID:24078907
Experimental Testing of a Metamaterial Slow Wave Structure for High-Power Microwave Generation
NASA Astrophysics Data System (ADS)
Shipman, K.; Prasad, S.; Andreev, D.; Fisher, D. M.; Reass, D. B.; Schamiloglu, E.; Gilmore, M.
2017-10-01
A high-power L band source has been developed using a metamaterial (MTM) to produce a double negative slow wave structure (SWS) for interaction with an electron beam. The beam is generated by a 700 kV, 6 kA short pulse (10 ns) accelerator. The design of the SWS consists of a cylindrical waveguide, loaded with alternating split-rings that are arrayed axially down the waveguide. The beam is guided down the center of the rings, where electrons interact with the MTM-SWS producing radiation. Power is extracted axially via a circular waveguide, and radiated by a horn antenna. Microwaves are characterized by an external detector placed in a waveguide. Mode characterization is performed using a neon bulb array. The bulbs are lit by the electric field, resulting in an excitation pattern that resembles the field pattern. This is imaged using an SLR camera. The MTM structure has electrically small features so breakdown is a concern. In addition to high speed cameras, a fiber-optic-fed, sub-ns photomultiplier tube array diagnostic has been developed and used to characterize breakdown light. Work supported by the Air Force Office of Scientific Research, MURI Grant FA9550-12-1-0489.
The stratospheric QBO signal in the NCEP reanalysis, 1948-2001
NASA Astrophysics Data System (ADS)
Ribera, P.; Gallego, D.; Pena-Ortiz, C.; Gimeno, L.; Garcia, R.; Hernandez, E.; Calvo, N.
2003-04-01
The spatiotemporal evolution of the zonal wind in the stratosphere is analyzed using the NCEP reanalysis dataset (1948-2001). MTM-SVD, a frequency-domain analysis method, is applied to isolate significant spatially coherent variability with a narrowband, oscillatory character. A quasi-biennial oscillation is detected as the most intense coherent signal throughout the mid and high stratosphere, with the signal less intense at lower levels, closer to the troposphere. There is a clear downward propagation of the signal with time over low latitudes, from 10 to 100 hPa, that is not evident over mid and high latitudes. The signal behaves differently over the Northern and Southern Hemispheres. In the NH, an anomaly in the zonal wind field, in phase with the equatorial signal, runs around the whole hemisphere at 60°, and two subtropical regions show wind anomalies with their sign opposed to that of the equator. In the SH, no signal is detected in extratropical areas.
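MTM-SVD couples multitaper spectral estimation with a singular value decomposition across spatial grid points to find spatially coherent narrowband signals. As a deliberately simplified illustration of the SVD half of that idea (a plain SVD of a synthetic space-time field, not the full frequency-domain method), consider:

```python
import numpy as np

rng = np.random.default_rng(2)
n_t, n_x = 600, 40  # time steps, spatial grid points
t = np.arange(n_t)
# Synthetic field: one spatially coherent oscillation with a ~28-step period
# (quasi-biennial for monthly data) plus noise. All values are illustrative.
pattern = np.sin(np.linspace(0, np.pi, n_x))  # smooth spatial structure
field = (np.outer(np.sin(2 * np.pi * t / 28.0), pattern)
         + 0.5 * rng.standard_normal((n_t, n_x)))

# SVD of the anomaly matrix: the leading mode captures the dominant
# spatially coherent variability.
anom = field - field.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
lead_pc = u[:, 0] * s[0]  # leading principal component time series

# Recover the period of the leading mode from its spectral peak (skip DC).
spec = np.abs(np.fft.rfft(lead_pc)) ** 2
freqs = np.fft.rfftfreq(n_t)
period = 1.0 / freqs[np.argmax(spec[1:]) + 1]
print(period)  # close to 28
```

The full MTM-SVD method performs this decomposition frequency by frequency on multitaper eigenspectra, which is what lets it attach statistical significance to coherent oscillations at specific frequencies.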
Gieseler, Henning; Kramer, Tony; Pikal, Michael J
2007-12-01
This report provides, for the first time, a summary of experiments using SMART Freeze Dryer technology during a 9-month testing period. A minimum ice sublimation area of about 300 cm(2) for the laboratory freeze dryer, with a chamber volume of 107.5 L, was found consistent with data obtained during previous experiments with a smaller freeze dryer (52 L). Good reproducibility was found for cycle design with the different types of excipients, formulations, and vials used. SMART primary drying end point estimates were accurate in the majority of the experiments, but showed an overprediction of primary cycle time when the product did not fully achieve steady-state conditions before the first MTM measurement was performed. Product resistance data for 5% sucrose mixtures at varying fill depths were very reproducible. Product temperature determined by SMART was typically in good agreement with thermocouple data through about 50% of primary drying time, with significant deviations occurring near the end of primary drying, as expected, but showing a bias much earlier in primary drying for high-solid-content formulations (16.6% Pfizer product) and polyvinylpyrrolidone (40 kDa), likely due to water "re-adsorption" by the amorphous product during the MTM test. (c) 2007 Wiley-Liss, Inc.
Systemic AAV8-Mediated Gene Therapy Drives Whole-Body Correction of Myotubular Myopathy in Dogs.
Mack, David L; Poulard, Karine; Goddard, Melissa A; Latournerie, Virginie; Snyder, Jessica M; Grange, Robert W; Elverman, Matthew R; Denard, Jérôme; Veron, Philippe; Buscara, Laurine; Le Bec, Christine; Hogrel, Jean-Yves; Brezovec, Annie G; Meng, Hui; Yang, Lin; Liu, Fujun; O'Callaghan, Michael; Gopal, Nikhil; Kelly, Valerie E; Smith, Barbara K; Strande, Jennifer L; Mavilio, Fulvio; Beggs, Alan H; Mingozzi, Federico; Lawlor, Michael W; Buj-Bello, Ana; Childers, Martin K
2017-04-05
X-linked myotubular myopathy (XLMTM) results from MTM1 gene mutations and myotubularin deficiency. Most XLMTM patients develop severe muscle weakness leading to respiratory failure and death, typically within 2 years of age. Our objective was to evaluate the efficacy and safety of systemic gene therapy in the p.N155K canine model of XLMTM by performing a dose escalation study. A recombinant adeno-associated virus serotype 8 (rAAV8) vector expressing canine myotubularin (cMTM1) under the muscle-specific desmin promoter (rAAV8-cMTM1) was administered by simple peripheral venous infusion in XLMTM dogs at 10 weeks of age, when signs of the disease are already present. A comprehensive analysis of survival, limb strength, gait, respiratory function, neurological assessment, histology, vector biodistribution, transgene expression, and immune response was performed over a 9-month study period. Results indicate that systemic gene therapy was well tolerated, prolonged lifespan, and corrected the skeletal musculature throughout the body in a dose-dependent manner, defining an efficacious dose in this large-animal model of the disease. These results support the development of gene therapy clinical trials for XLMTM. Copyright © 2017 The American Society of Gene and Cell Therapy. Published by Elsevier Inc. All rights reserved.
The Effects of Mountaintop Mines and Valley Fills on Aquatic ...
This report assesses the state of the science on the environmental impacts of mountaintop mines and valley fills (MTM-VF) on streams in the Central Appalachian Coalfields. Our review focused on the aquatic impacts of mountaintop removal coal mining, which, as its name suggests, involves removing all or some portion of the top of a mountain or ridge to expose and mine one or more coal seams. The excess overburden is disposed of in constructed fills in small valleys or hollows adjacent to the mining site. MTM-VF lead directly to five principal alterations of stream ecosystems: (1) springs, intermittent streams, and small perennial streams are permanently lost with the removal of the mountain and from burial under fill, (2) concentrations of major chemical ions are persistently elevated downstream, (3) degraded water quality reaches levels that are acutely lethal to standard laboratory test organisms, (4) selenium concentrations are elevated, reaching concentrations that have caused toxic effects in fish and birds and (5) macroinvertebrate and fish communities are consistently and significantly degraded. The draft report will be externally peer reviewed by EPA's Science Advisory Board in early 2010.
Srimath-Tirumula-Peddinti, Ravi Chandra Pavan Kumar; Neelapu, Nageswara Rao Reddy; Sidagam, Naresh
2015-01-01
Background The incidence, severity, dynamics and distribution of malaria are strongly determined by climatic factors, i.e., temperature, precipitation, and relative humidity. The objectives of the current study were to analyse and model the relationships among climate, vector and malaria disease in the district of Visakhapatnam, India, to understand the malaria transmission mechanism (MTM). Methodology Epidemiological, vector and climate data for the years 2005 to 2011 in Visakhapatnam were analysed to understand the magnitude, trends and seasonal patterns of the disease. The statistical software MINITAB ver. 14 was used for correlation, linear and multiple regression analysis. Results/Findings Perennial malaria incidence and mosquito populations were observed in the district of Visakhapatnam, with seasonal peaks. All the climatic variables had a significant influence on disease incidence as well as on mosquito populations. Correlation coefficient analysis, seasonal index and seasonal analysis demonstrated significant relationships among climatic factors, mosquito population and malaria incidence in the district of Visakhapatnam, India. Multiple regression and ARIMA (I) models were the best-suited models for modeling and predicting disease incidence and mosquito population. Predicted values of average temperature, mosquito population and malarial cases increased year over year. The developed MTM algorithm identified a major MTM cycle following the June to August rains, occurring between June and September, and minor MTM cycles following the March to April rains, occurring between March and April, in the district of Visakhapatnam. Fluctuations in climatic factors favored an increase in mosquito populations and thereby increased the number of malarial cases. Rainfall, temperatures (20°C to 33°C) and humidity (66% to 81%) maintained a warmer, wetter climate favourable for mosquito growth, parasite development and malaria transmission.
Conclusions/Significance Changes in climatic factors influence malaria directly by modifying the behaviour and geographical distribution of vectors and by changing the length of the life cycle of the parasite. PMID:26110279
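The climate-lag regression described in this abstract can be sketched in a few lines. The monthly series below are synthetic stand-ins, not the study's data, and the one-month rainfall lag is an illustrative modeling assumption:

```python
import numpy as np

# Hypothetical monthly series (illustrative values, not the study's data):
# rainfall (mm), mean temperature (deg C), and malaria case counts.
rng = np.random.default_rng(0)
months = 72  # six years of monthly data, as in the 2005-2011 window
rain = 80 + 60 * np.sin(2 * np.pi * (np.arange(months) - 5) / 12) + rng.normal(0, 10, months)
temp = 27 + 4 * np.sin(2 * np.pi * (np.arange(months) - 3) / 12)
cases = 200 + 1.5 * np.roll(rain, 1) + 10 * temp + rng.normal(0, 20, months)

# Multiple linear regression: cases_t ~ rain_{t-1} + temp_t (+ intercept),
# mirroring the climate-to-incidence reasoning in the abstract.
X = np.column_stack([np.ones(months - 1), rain[:-1], temp[1:]])
y = cases[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Variance explained by the climatic predictors
resid = y - X @ beta
r2 = 1 - resid.var() / y.var()
print(f"coefficients: {beta.round(2)}, R^2 = {r2:.2f}")
```

A seasonal ARIMA fit (as used in the study) would replace the ordinary least-squares step, but the lagged-predictor structure is the same.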
Migliozzi, Daniel R; Zullo, Andrew R; Collins, Christine; Elsaid, Khaled A
2015-11-15
The implementation and outcomes of a program combining electronic home blood pressure monitoring (HBPM) and pharmacist-provided medication therapy management (MTM) services in a renal transplantation clinic are described. Patients enrolled in the program were provided with a computer-enabled blood pressure monitor. A dedicated renal transplantation pharmacist was integrated into the renal transplantation team under a collaborative care practice agreement, which allowed the pharmacist to authorize medication additions, deletions, and dosage changes. Comprehensive disease and blood pressure education was provided by a clinical pharmacist. In the pretransplantation setting, the pharmacist interviewed the renal transplant candidate, documented allergies, verified the patient's medication profile, and identified and assessed barriers to medication adherence. A total of 50 renal transplant recipients with at least one recorded home blood pressure reading and at least one year of follow-up were included in our analysis. A significant reduction in mean systolic and diastolic blood pressure values was observed at 30, 90, 180, and 360 days after enrollment in the program (p < 0.05). Pharmacist interventions were documented for 37 patients. Medication-related problems accounted for 46% of these interventions and included dosage modifications, regimen changes, and mitigation of barriers to medication access and adherence. Implementation of electronic HBPM and pharmacist-provided MTM services in a renal transplant clinic was associated with sustained improvements in blood pressure control. Incorporation of a pharmacist in the renal transplant clinic resulted in the detection and resolution of medication-related problems. Copyright © 2015 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Analytical performance bounds for multi-tensor diffusion-MRI.
Ahmed Sid, Farid; Abed-Meraim, Karim; Harba, Rachid; Oulebsir-Boumghar, Fatima
2017-02-01
To examine the effects of MR acquisition parameters on brain white matter fiber orientation estimation and on parameters of clinical interest in crossing-fiber areas, based on the Multi-Tensor Model (MTM). We compute the Cramér-Rao Bound (CRB) for the MTM and for parameters of clinical interest such as the Fractional Anisotropy (FA) and the dominant fiber orientations, assuming that the diffusion MRI data are recorded by a multi-coil, multi-shell acquisition system. Considering the sum-of-squares method for the reconstructed magnitude image, we introduce an approximate closed-form formula for the Fisher Information Matrix that has the advantages of simplicity and easy interpretation. In addition, we propose to generalize the FA and the mean diffusivity to the multi-tensor model. We show how the CRB can be applied to reduce scan time while preserving good estimation precision. We provide results showing how increasing the number of acquisition coils compensates for decreasing the number of diffusion gradient directions. We analyze the impact of the b-value and the Signal-to-Noise Ratio (SNR). The analysis shows that the estimation error variance decreases quadratically with the SNR, and that the optimum b-values are not unique but depend on the target parameter, the context, and, possibly, the target cost function. In this study we highlight the importance of choosing appropriate acquisition parameters, especially when dealing with crossing-fiber areas. We also provide a methodology for the optimal tuning of these parameters using the CRB. Copyright © 2016 Elsevier Inc. All rights reserved.
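The quadratic SNR dependence noted in the abstract can be reproduced with a toy one-parameter model; a mono-exponential diffusion signal stands in for the full multi-tensor model here, and all numbers are illustrative:

```python
import numpy as np

# Toy mono-exponential diffusion signal S(b) = S0 * exp(-b * d),
# a one-parameter stand-in for the multi-tensor model (illustrative only).
S0, d = 1.0, 1e-3                                # a.u., mm^2/s
bvals = np.array([0, 500, 1000, 2000, 3000.0])   # s/mm^2

def crb_d(sigma):
    # Fisher information for d under i.i.d. Gaussian noise of std sigma:
    # I(d) = (1/sigma^2) * sum_b (dS/dd)^2, with dS/dd = -b * S0 * exp(-b*d).
    dS = -bvals * S0 * np.exp(-bvals * d)
    return 1.0 / (np.sum(dS**2) / sigma**2)      # CRB = I(d)^{-1}

# Doubling the SNR (halving sigma) cuts the variance bound by a factor of 4,
# i.e., the estimation error variance decreases quadratically with the SNR.
ratio = crb_d(0.02) / crb_d(0.01)
print(f"CRB ratio when SNR doubles: {ratio:.1f}")
```

The full MTM case replaces the scalar derivative with a Jacobian and the scalar inverse with a matrix inverse, but the SNR scaling is identical.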
NASA Astrophysics Data System (ADS)
Mancho, Ana M.; Wiggins, Stephen; Curbelo, Jezabel; Mendoza, Carolina
2013-11-01
Lagrangian descriptors are a recent technique that reveals geometrical structures in phase space and is valid for aperiodically time-dependent dynamical systems. We discuss a general methodology for constructing them and a ``heuristic argument'' that explains why the method is successful, and we support this argument with explicit calculations on a benchmark problem. Several other benchmark examples are considered that allow us to assess the performance of Lagrangian descriptors against both finite time Lyapunov exponents (FTLEs) and finite time averages of certain components of the vector field (``time averages''). In all cases Lagrangian descriptors are shown to be both more accurate and more computationally efficient than these methods. We thank CESGA for computing facilities. This research was supported by MINECO grants: MTM2011-26696, I-Math C3-0104, ICMAT Severo Ochoa project SEV-2011-0087, and CSIC grant OCEANTECH. SW acknowledges the support of the ONR (Grant No. N00014-01-1-0769).
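A minimal sketch of the arc-length Lagrangian descriptor M is given below. The Duffing system is our own choice of illustrative example (the paper's benchmarks may differ), and the integration horizon and step are arbitrary assumptions:

```python
import numpy as np

# Arc-length Lagrangian descriptor M: integrate the norm of the velocity
# field along trajectories forward and backward in time from each seed point.
# Illustrated on the autonomous Duffing system x' = y, y' = x - x^3.

def f(state):
    x, y = state
    return np.array([y, x - x**3])

def M(x0, y0, tau=5.0, dt=0.01):
    total = 0.0
    for sign in (+1.0, -1.0):                 # forward, then backward in time
        s = np.array([x0, y0])
        for _ in range(int(tau / dt)):
            # classic RK4 step (autonomous field, so no explicit t dependence)
            k1 = f(s); k2 = f(s + 0.5 * dt * sign * k1)
            k3 = f(s + 0.5 * dt * sign * k2); k4 = f(s + dt * sign * k3)
            step = (dt * sign / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
            total += np.linalg.norm(step)     # accumulate arc length
            s = s + step
    return total

# Evaluating M on a grid of seed points and plotting it reveals the stable
# and unstable manifolds of the saddle at the origin as sharp ridges.
print(M(0.1, 0.0), M(1.2, 0.0))
```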
Patel, Sajal M; Pikal, Michael J
2010-07-01
This study aimed to characterize and understand the different modes of heat and mass transfer in glass syringes in order to develop a robust freeze-drying process. Two different holder systems were used to freeze-dry in syringes: an aluminum (Al) block and a plexiglass holder. The syringe heat transfer coefficient was characterized by a sublimation test using pure water. Mannitol and sucrose (5% w/v) were also freeze-dried, as model systems, in both assemblies. Dry layer resistance was determined from manometric temperature measurement (MTM), and product temperature was measured using thermocouples and also determined from MTM. Further, the freeze-drying process was also designed using the Smart freeze-dryer to assess its application to novel container systems. Heat and mass transfer in syringes were compared against the traditional container system (i.e., the glass tubing vial). In the Al block, heat transfer occurred via three modes: contact conduction, gas conduction, and radiation, with gas conduction being the dominant mode. In the plexiglass holder, heat transfer was mostly via radiation; convection was not involved. MTM/Smart freeze-drying worked reasonably well for freeze-drying in syringes. Compared with tubing vials, product temperature decreases, and hence drying time increases, in syringes. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
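The sublimation test mentioned above determines the container heat transfer coefficient from a steady-state energy balance: the shelf-to-container heat flow equals the heat consumed by ice sublimation. A sketch with hypothetical numbers (not the study's measurements):

```python
# Heat transfer coefficient from a sublimation test (hypothetical values):
# Kv = dHs * (dm/dt) / (Av * (Ts - Tp)), the standard steady-state balance
# between shelf-to-container heat flow and the heat of sublimation.

dHs = 2.8e3            # J/g, heat of sublimation of ice (approx.)
dm_dt = 0.10 / 3600    # g/s, measured sublimation rate (0.10 g/h, illustrative)
Av = 2.0               # cm^2, container cross-sectional area (illustrative)
Ts, Tp = 268.0, 248.0  # K, shelf and product temperatures (illustrative)

Kv = dHs * dm_dt / (Av * (Ts - Tp))   # J / (s * cm^2 * K)
print(f"Kv = {Kv:.2e} J s^-1 cm^-2 K^-1")
```

Repeating the test at several chamber pressures separates the pressure-dependent gas-conduction contribution from the contact-conduction and radiation terms, which is how the dominant mode is identified.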
El-Guebaly, Laila; Rowcliffe, Arthur; Menard, Jonathan; ...
2016-08-11
The qualification and validation of nuclear technologies are daunting tasks for fusion demonstration (DEMO) and power plants. This is particularly true for advanced designs that involve a harsh radiation environment with 14 MeV neutrons and high-temperature operating regimes. This paper outlines the unique qualification and validation processes developed in the U.S., offering the only access to the complete fusion environment, focusing on the most prominent U.S. blanket concept (the dual cooled PbLi (DCLL)) along with testing new generations of structural and functional materials in dedicated test modules. The venue for such activities is the proposed Fusion Nuclear Science Facility (FNSF), which is viewed as an essential element of the U.S. fusion roadmap. A staged blanket testing strategy has been developed to test and enhance the DCLL blanket performance during each phase of FNSF D-T operation. A materials testing module (MTM) is critically important to include in the FNSF as well, to test a broad range of specimens of future, more advanced generations of materials in a relevant fusion environment. Here, the most important attributes for the MTM are the relevant He/dpa ratio (10-15) and the much larger specimen volumes compared to the 10-500 mL range available in the International Fusion Materials Irradiation Facility (IFMIF) and the European DEMO-Oriented Neutron Source (DONES).
Sharma, Manoj; Catalano, Hannah Priest; Nahar, Vinayak K; Lingam, Vimala C; Johnson, Paul; Ford, M Allison
2017-02-25
A substantial proportion of college students do not drink enough water and instead consume sugar-sweetened beverages (SSBs). Consumption of SSBs is associated with weight gain, obesity, type 2 diabetes mellitus, dental caries, and increased risk for cardiovascular disease. Hence, the purpose of this study was to use the multi-theory model (MTM) to predict initiation and sustenance of plain water consumption instead of sugar-sweetened beverages among college students. In this cross-sectional study, a 37-item valid and reliable MTM-based survey was administered to college students in 2016 via Qualtrics at a large public university in the Southeastern United States. Overall, 410 students responded to the survey; of those, 174 were eligible for the study and completed it. Stepwise multiple regression analysis revealed that 61.8% of the variance in the initiation of drinking plain water instead of SSBs was explained by behavioral confidence (P<0.001) and changes in the physical environment (P<0.001). Further, 58.3% of the variance in the sustenance of drinking plain water instead of SSBs was explained by emotional transformation (P<0.001) and practice for change (P=0.001). The multi-theory model of health behavior change is a robust theory for predicting plain water consumption instead of SSBs in college students. Interventions for this target population should be developed based on this theory.
Evaluation of Sandwich Structure Bonding In Out-of-Autoclave Processing
NASA Technical Reports Server (NTRS)
Hou, Tan-Hung; Baughman, James M.; Zimmerman, Thomas J.; Sutter, James K.; Gardner, John M.
2010-01-01
The out-of-autoclave-vacuum-bag-only (OOA-VBO) process is low in capital expenditure compared with the traditional autoclave; however, the material challenges for workable OOA-VBO material systems are high. Presently, few such aerospace-grade prepreg materials are available commercially. In this study, we evaluated the processing and properties of honeycomb sandwich structure (HC/SS) panels fabricated by co-curing composite face sheets with adhesives by the OOA-VBO process in an oven. The prepreg materials were IM7/MTM 45-1 and T40-800B/5320. The adhesives studied were AF-555M, XMTA-241/PM15, FM-309-1M and FM-300K. Aluminum honeycomb cores with and without perforations were included. The adhesives in the IM7/MTM 45-1/AF-555M, T40-800B/5320/FM 309-1M and T40-800B/5320/FM-300K panels all foamed but yielded high flatwise tensile (FWT) strength values, above 8,275 kPa (1,200 psi). IM7/MTM 45-1/XMTA-241/PM15 did not foam, yet yielded a low FWT strength. SEM photomicrographs revealed that the origin of this low strength was poor adhesion at the interfaces between the adhesive and the face sheet composite, due to poor wetting associated with the high initial viscosity of the XMTA-241/PM15 adhesive.
Schneid, Stefan C; Johnson, Robert E; Lewis, Lavinia M; Stärtzel, Peter; Gieseler, Henning
2015-05-01
Process analytical technology (PAT) and quality by design have gained importance in all areas of pharmaceutical development and manufacturing. One important method for monitoring critical product attributes and optimizing the process in laboratory-scale freeze-drying is manometric temperature measurement (MTM). A drawback of this innovative technology is that problems are encountered when processing high-concentration amorphous materials, particularly protein formulations. In this study, a model solution of bovine serum albumin and sucrose was lyophilized at both conservative and aggressive primary drying conditions. Different temperature sensors were employed to monitor product temperatures. The residual moisture content at the primary drying endpoints indicated by temperature sensors and by batch PAT methods was quantified from extracted sample vials. The data from the temperature probes were then used to recalculate critical product parameters, and the results were compared with MTM data. In contrast to the endpoints from the batch methods, the endpoints indicated by the temperature sensors were not suitable for endpoint determination. The accuracy of the MTM ice vapor pressure (Pice) data was found to be influenced by water reabsorption. Recalculation of product resistance (Rp) and Pice values based on data from temperature sensors and weighed vials was possible. Overall, extensive information about critical product parameters could be obtained using data from complementary PAT tools. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
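The recalculation of Rp rests on the usual pseudo-steady-state mass transfer relation between sublimation rate and the ice-to-chamber vapor pressure difference. A sketch with hypothetical values (not the study's data):

```python
# Product dry-layer resistance from sublimation data (hypothetical values):
# Rp = Ap * (Pice - Pc) / (dm/dt), the pseudo-steady-state relation used
# when recalculating Rp from MTM-type pressure and temperature data.

Ap = 3.8               # cm^2, product area per vial (illustrative)
Pice, Pc = 0.78, 0.10  # Torr; ice vapor pressure at the front, chamber pressure
dm_dt = 0.20           # g/h, sublimation rate per vial (illustrative)

Rp = Ap * (Pice - Pc) / dm_dt   # Torr * h * cm^2 / g
print(f"Rp = {Rp:.2f} Torr h cm^2 / g")
```

In practice Pice is either returned directly by the MTM fit or computed from a measured product temperature via the ice vapor pressure curve, which is why water reabsorption biasing Pice propagates into Rp.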
Molina, D.; Pérez-Beteta, J.; Martínez-González, A.; Velásquez, C.; Martino, J.; Luque, B.; Revert, A.; Herruzo, I.; Arana, E.; Pérez-García, V. M.
2017-01-01
Introduction: Textural analysis refers to a variety of mathematical methods used to quantify the spatial variations in grey levels within images. In brain tumors, textural features have great potential as imaging biomarkers, having been shown to correlate with survival, tumor grade, tumor type, etc. However, these measures should be reproducible under dynamic range and matrix size changes for clinical use. Our aim is to study this robustness in brain tumors with 3D magnetic resonance imaging, not previously reported in the literature. Materials and methods: 3D T1-weighted images of 20 patients with glioblastoma (64.80 ± 9.12 years old) obtained from a 3T scanner were analyzed. Tumors were segmented using an in-house semi-automatic 3D procedure. A set of 16 3D textural features of the most common types (co-occurrence and run-length matrices) were selected, providing regional (run-length based measures) and local (co-occurrence matrices) information on tumor heterogeneity. Feature robustness was assessed by means of the coefficient of variation (CV) under dynamic range (16, 32 and 64 gray levels) and/or matrix size (256x256 and 432x432) changes. Results: None of the textural features considered were robust under dynamic range changes. The co-occurrence matrix feature Entropy was the only textural feature robust (CV < 10%) under spatial resolution changes. Conclusions: In general, textural measures of three-dimensional brain tumor images are robust under neither dynamic range nor matrix size changes. It therefore becomes mandatory to fix standards for image rescaling after acquisition, before textural features are computed, if they are to be used as imaging biomarkers. For T1-weighted images, a dynamic range of 16 grey levels and a matrix size of 256x256 (with isotropic voxels) is found to provide reliable and comparable results and is feasible with current MRI scanners.
The implications of this work go beyond the specific tumor type and MRI sequence studied here and pose the need for standardization in textural feature calculation of oncological images. FUNDING: James S. McDonnell Foundation (USA) 21st Century Science Initiative in Mathematical and Complex Systems Approaches for Brain Cancer [Collaborative award 220020450 and planning grant 220020420], MINECO/FEDER [MTM2015-71200-R], JCCM [PEII-2014-031-P].
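The robustness test described above can be sketched in miniature. The block below uses a synthetic 2D random image in place of segmented tumor volumes and a single horizontal-offset co-occurrence matrix (the study used 16 features on real 3D data), computing the entropy feature at 16, 32 and 64 gray levels and the coefficient of variation across dynamic ranges:

```python
import numpy as np

# Coefficient of variation of a texture feature under dynamic-range changes,
# in the spirit of the robustness test above (synthetic data, illustrative).

rng = np.random.default_rng(1)
img = rng.random((64, 64))   # stand-in for a segmented tumor region, values in [0, 1)

def glcm_entropy(image, levels):
    # Quantize, build a horizontal-offset gray-level co-occurrence matrix,
    # normalize it to a probability distribution, and return its entropy.
    q = np.minimum((image * levels).astype(int), levels - 1)
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)
    p = glcm / glcm.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

vals = np.array([glcm_entropy(img, L) for L in (16, 32, 64)])
cv = 100 * vals.std() / vals.mean()   # CV in percent across dynamic ranges
print(f"entropies: {vals.round(2)}, CV = {cv:.1f}%")
```

Even this toy case shows a CV well above the 10% robustness threshold used in the study, illustrating why gray-level quantization must be standardized before features are compared.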
Getting It Right Matters: Climate Spectra and Their Estimation
NASA Astrophysics Data System (ADS)
Privalsky, Victor; Yushkov, Vladislav
2018-06-01
In many recent publications, climate spectra estimated with different methods from observed, GCM-simulated, and reconstructed time series contain many peaks at time scales from a few years to many decades and even centuries. However, respective spectral estimates obtained with the autoregressive (AR) and multitapering (MTM) methods showed that spectra of climate time series are smooth and contain no evidence of periodic or quasi-periodic behavior. Four order selection criteria for the autoregressive models were studied and proven sufficiently reliable for 25 time series of climate observations at individual locations or spatially averaged at local-to-global scales. As time series of climate observations are short, an alternative reliable nonparametric approach is Thomson's MTM. These results agree with both the earlier climate spectral analyses and the Markovian stochastic model of climate.
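The contrast drawn above between peaky spectral estimates and smooth Markovian spectra can be illustrated with a short sketch: fit an AR(1) ("red noise") model by Yule-Walker to a simulated series and evaluate its parametric spectrum, which is smooth and peak-free by construction. All data here are synthetic; none of the study's 25 observational series are used:

```python
import numpy as np

# A Markovian (AR(1)) climate model has a smooth, monotone spectrum with no
# periodic peaks. Simulate red noise, fit AR(1) by Yule-Walker, and evaluate
# the parametric spectral estimate (illustrative sketch).

rng = np.random.default_rng(2)
n, phi_true = 2048, 0.7
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.normal()

# Yule-Walker for AR(1): phi equals the lag-1 autocorrelation.
x = x - x.mean()
phi = (x[1:] @ x[:-1]) / (x @ x)
sigma2 = (x @ x) / n * (1 - phi**2)   # innovation variance

# Parametric AR(1) spectrum S(f) = sigma2 / |1 - phi * exp(-2*pi*i*f)|^2
f = np.linspace(0, 0.5, 256)
S = sigma2 / np.abs(1 - phi * np.exp(-2j * np.pi * f))**2

print(f"phi_hat = {phi:.2f}, spectrum decreasing: {bool(np.all(np.diff(S) < 0))}")
```

A multitaper estimate of the same series would scatter around this smooth curve; apparent peaks in raw periodograms of short climate records are typically within the sampling variability of such a red-noise background.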
New results on equatorial thermospheric winds and temperatures from Ethiopia, Africa
NASA Astrophysics Data System (ADS)
Tesema, Fasil; Mesquita, Rafael; Meriwether, John; Damtie, Baylie; Nigussie, Melessew; Makela, Jonathan; Fisher, Daniel; Harding, Brian; Yizengaw, Endawoke; Sanders, Samuel
2017-03-01
Measurements of equatorial thermospheric winds, temperatures, and 630 nm relative intensities were obtained using an imaging Fabry-Perot interferometer (FPI), which was recently deployed at Bahir Dar University in Ethiopia (11.6° N, 37.4° E, 3.7° N magnetic). The results obtained in this study cover 6 months (53 nights of useable data) between November 2015 and April 2016. The monthly-averaged values, which include local winter and equinox seasons, show that the magnitude of the maximum monthly-averaged zonal wind is typically within the range of 70 to 90 m/s and is eastward between 19:00 and 21:00 LT. Compared to prior studies of the equatorial thermospheric wind for this local time period, the magnitude is considerably weaker than the maximum zonal wind speed observed in the Peruvian sector but comparable to Brazilian FPI results. During the early evening, the meridional wind speeds are 30 to 50 m/s poleward during the winter months and 10 to 25 m/s equatorward in the equinox months. The direction of the poleward wind during the winter months is believed to be mainly caused by interhemispheric wind flow from the summer to the winter hemisphere. An equatorial wind surge is observed later in the evening, shifted to later local times during the winter months and to earlier local times during the equinox months. Significant night-to-night variations are also observed in the maximum speed of both zonal and meridional winds. The temperature observations show the midnight temperature maximum (MTM) to be generally present between 00:30 and 02:00 LT. The amplitude of the MTM was ~110 K in January 2016, with smaller values in the other months. The local time difference between the appearance of the MTM and the pre-midnight equatorial wind surge was generally 60 to 180 min. A meridional wind reversal was also observed after the appearance of the MTM (after 02:00 LT).
The climatological models HWM14 and MSIS-00 were compared to the observations. The HWM14 model generally predicted the zonal wind observations well, except that model values were higher by 25 m/s in the winter months, and its meridional wind showed generally good agreement with the observations. Finally, the MSIS-00 model overestimated the temperature by 50 to 75 K during the early evening hours of local winter months. Otherwise, the agreement was generally good, although, in line with prior studies, the model failed to reproduce the MTM peak for any of the 6 months compared with the FPI data.
Disparity Implications of the Medicare MTM Eligibility Criteria: A Literature Review
Munshi, Kiraat D.; Shih, Ya-Chen Tina; Brown, Lawrence M.; Dagogo-Jack, Samuel; Wan, Jim Y.; Wang, Junling
2013-01-01
The emphasis on eliminating racial and ethnic disparities in health care has received national attention, with various policy initiatives addressing this problem and proposing solutions. However, in the current economic era requiring tight monetary constraints, emphasis is increasingly being placed on economic efficiency, which often conflicts with the equality doctrine upon which many policies have been framed. Our review aims to highlight the disparity implications of one such policy provision—the predominantly utilization-based eligibility criteria for medication therapy management (MTM) services under Medicare Part D—by identifying studies that have documented racial and ethnic disparities in health status and in the use of and spending on prescription medications. Future design and evaluation of regulations and legislation employing utilization-based eligibility criteria must use caution in order to strike an equity-efficiency balance. PMID:23570431
Geologic Mapping of MTM -30247, -35247 and -40247 Quadrangles, Reull Vallis Region, Mars
NASA Technical Reports Server (NTRS)
Mest, S. C.; Crown, D. A.
2009-01-01
Geologic mapping of MTM -30247, -35247, and -40247 quadrangles is being used to characterize Reull Vallis (RV) and to determine the history of the eastern Hellas region of Mars. Studies of RV examine the roles and timing of volatile-driven erosional and depositional processes and provide constraints on potential associated climatic changes. This study complements earlier investigations of the eastern Hellas region, including regional analyses [1-6], mapping studies of circum-Hellas canyons [7-10], and volcanic studies of Hadriaca and Tyrrhena Paterae [11-13]. Key scientific objectives include 1) characterizing RV in its "fluvial zone," 2) analysis of channels in the surrounding plains and potential connections to and interactions with RV, 3) examining young, presumably sedimentary plains along RV, and 4) determining the nature of the connection between the segments of RV.
Medication therapy management and complex patients with disability: a randomized controlled trial.
Chrischilles, Elizabeth A; Doucette, William; Farris, Karen; Lindgren, Scott; Gryzlak, Brian; Rubenstein, Linda; Youland, Kelly; Wallace, Robert B
2014-02-01
Background: Drug therapy problems, adverse drug events (ADEs), and symptom burden are high among adults with disabilities. Objective: To compare the effects of a modified medication therapy management (MTM) program within a self-efficacy workshop versus the workshop alone or usual care on symptom burden among adults with activity limitations. Design: Three-group randomized controlled trial among adults (age 40 and older) with self-reported activity limitations in community practice. Intervention: 8 weekly Living Well With a Disability (LWD) 2-hour workshop sessions with and without a collaborative medication management (CMM) module. Main outcome measure: Mean number of moderate to very severe symptoms from a list of 11 physical and mental symptoms. Process measures: changes in medication regimens and self-reported ADEs. Analysis: General linear mixed models (continuous outcomes) and generalized estimating equations (categorical outcomes). Results: Participants had high symptom burden, low physical health, and took many medications. There was a significant increase in ADE reporting in the LWD + CMM group relative to the other 2 groups (Study group × Time P = .014), and there were significantly more changes in medication regimens in the LWD + CMM group (P = .013, LWD only vs LWD + CMM). The oldest third of participants had significantly fewer mean symptoms but received more intense CMM. There was no difference between the LWD-only, LWD + CMM, and usual care groups in symptom burden over time. Conclusion: Pharmacist MTM practices and MTM guidelines may need to be modified to affect symptom burden in a population with physical activity limitations.
Ross, Leigh Ann; Bloodworth, Lauren S
2012-01-01
Objective: To describe and provide preliminary clinical and economic outcomes from a pharmacist-delivered patient-centered health care (PCHC) model implemented in the Mississippi Delta. Setting: Mississippi, between July 2008 and June 2010; 13 community pharmacies in nine Mississippi Delta counties. Practice description: This PCHC model implements a comprehensive medication therapy management (MTM) program with pharmacist training, individualized patient encounters and group education, provider outreach, integration of pharmacists into health information technology, and on-site support in community pharmacies in a medically underserved region with a large burden of chronic disease and health disparities. The program also expands on traditional MTM services through initiatives in health literacy/cultural competency and efforts to increase the provider network and improve access to care. Main outcome measures: Criteria-based clinical outcomes, quality indicator reports, and cost avoidance. Results: PCHC services have been implemented in 13 pharmacies in nine counties in this underserved region, and 78 pharmacists and 177 students have completed the American Pharmacists Association's MTM Certificate Training Program. Preliminary data from 468 patients showed 681 encounters in which 1,471 drug therapy problems were identified and resolved. Preliminary data for clinical indicators and economic outcome measures are trending in a positive direction. Conclusion: Preliminary data analyses suggest that pharmacist-provided PCHC is beneficial and has the potential to be replicated in similar rural communities that are plagued with chronic disease and traditional primary care provider shortages. This effort aligns with national priorities to reduce medication errors, improve health outcomes, and reduce health care costs in underserved communities.
Kutchukian, Candice; Lo Scrudato, Mirella; Tourneur, Yves; Poulard, Karine; Vignaud, Alban; Berthier, Christine; Allard, Bruno; Lawlor, Michael W.; Buj-Bello, Ana; Jacquemond, Vincent
2016-01-01
Mutations in the gene encoding the phosphoinositide 3-phosphatase myotubularin (MTM1) are responsible for a pediatric disease of skeletal muscle named myotubular myopathy (XLMTM). Muscle fibers from MTM1-deficient mice present defects in excitation–contraction (EC) coupling likely responsible for the disease-associated fatal muscle weakness. However, the mechanism leading to EC coupling failure remains unclear. During normal skeletal muscle EC coupling, transverse (t) tubule depolarization triggers sarcoplasmic reticulum (SR) Ca2+ release through ryanodine receptor channels gated by conformational coupling with the t-tubule voltage-sensing dihydropyridine receptors. We report that MTM1 deficiency is associated with a 60% depression of global SR Ca2+ release over the full range of voltage sensitivity of EC coupling. SR Ca2+ release in the diseased fibers is also slower than in normal fibers, or delayed following voltage activation, consistent with the contribution of Ca2+-gated ryanodine receptors to EC coupling. In addition, we found that SR Ca2+ release is spatially heterogeneous within myotubularin-deficient muscle fibers, with focally defective areas recapitulating the global alterations. Importantly, we found that pharmacological inhibition of phosphatidylinositol 3-kinase (PtdIns 3-kinase) activity rescues the Ca2+ release defects in isolated muscle fibers and increases the lifespan and mobility of XLMTM mice, providing proof of concept for the use of PtdIns 3-kinase inhibitors in myotubular myopathy and suggesting that unbalanced PtdIns 3-kinase activity plays a critical role in the pathological process. PMID:27911767
Another Program For Generating Interactive Graphics
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl;
1991-01-01
VAX/Ultrix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. When used throughout company for wide range of applications, makes both application program and computer seem transparent, with noticeable improvements in learning curve. Available in form suitable for following six different groups of computers: DEC VAX station and other VMS VAX computers, Macintosh II computers running AUX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running Sun OS and IBM RT/PC's and PS/2 computers running AIX, and HP 9000 S
Schenk, Robert J; Schenk, Jenna
2011-01-01
A pharmacist-delivered, outpatient-focused medication therapy management (MTM) program is using a remote blood glucose (BG) meter upload device to provide better care and to improve outcomes for its patients with diabetes. Sharing uploaded BG meter data, presented in easily comprehensible graphs and charts, enables patients, caregivers, and the medical team to better understand how the patients' diabetes care is progressing. Pharmacists are becoming increasingly active in helping to manage patients' complex medication regimens in an effort to help detect and avoid medication-related problems. Working together with patients and their physicians as part of an interdisciplinary health care team, pharmacists are helping to improve medication outcomes. This article focuses on two case studies highlighting the Diabetes Monitoring Program, one component of the Meridian Pharmacology Institute MTM service, and discusses the clinical application of a unique BG meter upload device. © 2010 Diabetes Technology Society.
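The kind of review described above can be sketched in a few lines. The function name, field names, and the 70-180 mg/dL target range below are illustrative assumptions, not part of the program or device described in the article:

```python
# Hypothetical sketch: summarizing uploaded blood-glucose (BG) readings
# the way an MTM reviewer might before a patient visit. The target
# range (70-180 mg/dL) is an assumed example, not the program's setting.

def summarize_bg(readings_mg_dl, low=70, high=180):
    """Return simple statistics for a list of BG readings in mg/dL."""
    n = len(readings_mg_dl)
    mean = sum(readings_mg_dl) / n
    in_range = sum(low <= r <= high for r in readings_mg_dl)
    return {
        "mean": round(mean, 1),                          # average reading
        "pct_in_range": round(100.0 * in_range / n, 1),  # time-in-range proxy
        "lows": sum(r < low for r in readings_mg_dl),    # hypoglycemic count
        "highs": sum(r > high for r in readings_mg_dl),  # hyperglycemic count
    }

stats = summarize_bg([95, 142, 201, 68, 110, 155])
```

A real MTM workflow would chart these values over time; the point here is only that once meter data are uploaded, such summaries are trivial to compute and share with the care team.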
High-order rogue wave solutions of the classical massive Thirring model equations
NASA Astrophysics Data System (ADS)
Guo, Lijuan; Wang, Lihong; Cheng, Yi; He, Jingsong
2017-11-01
The nth-order solutions of the classical massive Thirring model (MTM) equations are derived by using the n-fold Darboux transformation. These solutions are expressed as ratios of two determinants consisting of 2n eigenfunctions under the reduction conditions. Using this method, rogue waves are constructed explicitly up to the third order. Three patterns of the rogue waves, i.e., fundamental, triangular, and circular patterns, are discussed. The parameter μ in the MTM model plays the role of the mass in relativistic field theory, while in optics it is related to the periodic constant of the medium; it also results in a significant rotation and a remarkable lengthening of the first-order rogue wave. These results provide new opportunities to observe rogue waves by using a combination of electromagnetically induced transparency and Bragg-scattering four-wave mixing, because of the large amplitudes involved.
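For reference, the coupled system usually called the classical massive Thirring model can be written, in one common laboratory-frame normalization, as below. The exact placement and scaling of the mass parameter μ varies between papers, so this is a sketch of the standard form rather than the precise system used by the authors:

```latex
% Classical massive Thirring model, laboratory coordinates
% (one common normalization; conventions for \mu differ between references)
\begin{aligned}
  i\,(u_t + u_x) + \mu\, v + |v|^2 u &= 0,\\
  i\,(v_t - v_x) + \mu\, u + |u|^2 v &= 0,
\end{aligned}
```

where u and v are the two components of the spinor field; rescaling to μ = 1 gives a dimensionless form convenient for Darboux-transformation constructions.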
Geologic Mapping of MTM -30247, -35247 and -40247 Quadrangles, Reull Vallis Region of Mars
NASA Technical Reports Server (NTRS)
Mest, S. C.; Crown, D. A.
2008-01-01
Geologic mapping and stratigraphic analyses of MTM -30247, -35247, and -40247 quadrangles are being used to characterize the Reull Vallis (RV) system and to determine the history of the eastern Hellas region of Mars. Studies of RV examine the roles and timing of volatile-driven erosional and depositional processes and provide constraints on potential associated climatic changes. This study complements earlier investigations of the eastern Hellas region, including regional analyses [1-6], mapping studies of circum-Hellas canyons [7-10], and volcanic studies of Hadriaca and Tyrrhena Paterae [11-13]. Key scientific objectives for these quadrangles include 1) characterization of RV in its "fluvial zone," 2) analysis of channels in the surrounding plains and potential connections to and interactions with RV, 3) examination of young (?), presumably sedimentary plains along RV that embay the surrounding highlands, and 4) determination of the nature of the connection between segments 1 and 2 of RV.
Epithelial self-healing is recapitulated by a 3D biomimetic E-cadherin junction.
Cohen, Daniel J; Gloerich, Martijn; Nelson, W James
2016-12-20
Epithelial monolayers undergo self-healing when wounded. During healing, cells collectively migrate into the wound site, and the converging tissue fronts collide and form a stable interface. To heal, migrating tissues must form cell-cell adhesions and reorganize from the front-rear polarity characteristic of cell migration to the apical-basal polarity of an epithelium. However, identifying the "stop signal" that induces colliding tissues to cease migrating and heal remains an open question. Epithelial cells form integrin-based adhesions to the basal extracellular matrix (ECM) and E-cadherin-mediated cell-cell adhesions on the orthogonal, lateral surfaces between cells. Current biological tools have been unable to probe this multicellular 3D interface to determine the stop signal. We addressed this problem by developing a unique biointerface that mimicked the 3D organization of epithelial cell adhesions. This "minimal tissue mimic" (MTM) comprised a basal ECM substrate and a vertical surface coated with purified extracellular domain of E-cadherin, and was designed for collision with the healing edge of an epithelial monolayer. Three-dimensional imaging showed that adhesions formed between cells, and the E-cadherin-coated MTM resembled the morphology and dynamics of native epithelial cell-cell junctions and induced the same polarity transition that occurs during epithelial self-healing. These results indicate that E-cadherin presented in the proper 3D context constitutes a minimum essential stop signal to induce self-healing. That the Ecad:Fc MTM stably integrated into an epithelial tissue and reduced migration at the interface suggests that this biointerface is a complementary approach to existing tissue-material interfaces.
Epithelial self-healing is recapitulated by a 3D biomimetic E-cadherin junction
Cohen, Daniel J.; Gloerich, Martijn; Nelson, W. James
2016-01-01
Abstract identical to the preceding record of the same title. PMID:27930308
DOE Office of Scientific and Technical Information (OSTI.GOV)
Laporte, J.; Hu, Ling-Jia; Kretz, C.
1997-05-01
We have identified a novel human gene that is entirely deleted in two boys with abnormal genital development and myotubular myopathy (MTM1). The gene, F18, is located in proximal Xq28, approximately 80 kb centromeric to the recently isolated MTM1 gene. Northern analysis of mRNA showed a ubiquitous pattern and suggested high levels of expression in skeletal muscle, brain, and heart. A transcript of 4.6 kb was detected in a range of tissues, and additional alternate forms of 3.8 and 2.6 kb were present in placenta and pancreas, respectively. The gene extends over 100 kb and is composed of at least seven exons, of which two are non-coding. Sequence analysis of a 4.6-kb cDNA contig revealed two overlapping open reading frames (ORFs) that encode putative proteins of 701 and 424 amino acids, respectively. Two alternatively spliced transcripts affecting the large open reading frame were identified that, together with the Northern blot results, suggest that distinct proteins are derived from the gene. No significant homology to other known proteins was detected, but segments of the first ORF encode polyglutamine tracts and proline-rich domains, which are frequently observed in DNA-binding proteins. The F18 gene is a strong candidate for being implicated in the intersexual genitalia present in the two MTM1-deleted patients. The gene also serves as a candidate for other disorders that map to proximal Xq28. 15 refs., 3 figs., 1 tab.
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SILICON GRAPHICS VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. 
ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format.
The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and Open Windows are trademarks of Sun Microsystems, Inc. MIT X Window System is licensed by Massachusetts Institute of Technology.
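The "non-standard" element sizes the ELAS description mentions (16- or 32-bit integers rather than 8-bit bytes) boil down to how a flat raster buffer is unpacked. A minimal sketch, not ELAS code; the function name and byte-order default are assumptions for illustration:

```python
# Illustrative sketch of reading a band of 16-bit signed raster data,
# the kind of wider-than-8-bit element ELAS handles alongside bytes
# and floating-point values. Not part of ELAS itself.
import struct

def read_int16_band(raw_bytes, big_endian=True):
    """Unpack a flat buffer of 16-bit signed integers into pixel values."""
    count = len(raw_bytes) // 2
    fmt = (">" if big_endian else "<") + "%dh" % count  # 'h' = signed 16-bit
    return list(struct.unpack(fmt, raw_bytes))

# Two big-endian pixels: 0x0102 = 258 and 0xFFFF = -1 when read as signed
pixels = read_int16_band(b"\x01\x02\xff\xff")
```

The same buffer read little-endian would yield different values (0x0201 = 513 for the first pixel), which is why element size and byte order both have to be declared before nonimage data can be processed correctly.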
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (CONCURRENT VERSION)
NASA Technical Reports Server (NTRS)
Pearson, R. W.
1994-01-01
The program description, capabilities, system requirements, and distribution media for this version are identical to those given above for the Silicon Graphics version of ELAS.
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (SUN VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
The program description, capabilities, system requirements, and distribution media for this version are identical to those given above for the Silicon Graphics version of ELAS.
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (MASSCOMP VERSION)
NASA Technical Reports Server (NTRS)
Walters, D.
1994-01-01
The program description, capabilities, system requirements, and distribution media for this version are identical to those given above for the Silicon Graphics version of ELAS.
ELAS - SCIENCE & TECHNOLOGY LABORATORY APPLICATIONS SOFTWARE (DEC VAX VERSION)
NASA Technical Reports Server (NTRS)
Junkin, B. G.
1994-01-01
The program description, capabilities, system requirements, and distribution media for this version are identical to those given above for the Silicon Graphics version of ELAS.
REMOTE SENSING AND MOUNTAINTOP MINING
Coal mining in Appalachia has undergone dramatic changes in the past decade. Modern mining practices known as Mountaintop Mining (MTM) and Valley Fills (VF) are at the center of an environmental and legal controversy that has spawned lawsuits and major environmental investigations....
Evaluation of 3M(TM) Scotchlite linear delineation system : final report.
DOT National Transportation Integrated Search
2004-09-01
Major construction projects present many hazards for drivers to negotiate. Detours, lane shifts and confusing curves present unique challenges to all drivers. At night, the difficulties in negotiating these obstacles are amplified due to reduced ...
Investigations of Volcanic and Volatile-Driven Processes Northeast of Hellas Basin, Mars
NASA Astrophysics Data System (ADS)
Mest, S. C.; Crown, D. A.; Michalski, J.; Chuang, F. C.; Price Blount, K.; Bleamaster, L. F.
2018-06-01
We are mapping the geologic units and features in three MTM quadrangles northeast of Hellas basin at 1:1M scale. The area displays evidence for volcanism and widespread volatile-related modification of the surface.
2013-12-01
[Report front-matter residue: table-of-contents entries (A. METHODOLOGY, 45; B. ANALYSIS, 46; C. ...) and an acronym list: MFOM, MLRS family of munitions; MLRS, Multiple Launch Rocket System; MMW, millimeter wave; MT, material test; MTM ...]
Geologic Map of the MTM -30262 and -30267 Quadrangles, Hadriaca Patera Region of Mars
Crown, David A.; Greeley, Ronald
2007-01-01
Mars Transverse Mercator (MTM) -30262 and -30267 quadrangles cover the summit region and east margin of Hadriaca Patera, one of the Martian volcanoes designated highland paterae. MTM -30262 quadrangle includes volcanic deposits from Hadriaca Patera and Tyrrhena Patera (summit northeast of map area) and floor deposits associated with the Dao and Niger Valles canyon systems (south of map area). MTM -30267 quadrangle is centered on the caldera of Hadriaca Patera. The highland paterae are among the oldest central-vent volcanoes on Mars and exhibit evidence for explosive eruptions, which makes a detailed study of their geology an important component in understanding the evolution of Martian volcanism. Photogeologic mapping at 1:500,000 scale from analysis of Viking Orbiter images complements volcanological studies of Hadriaca Patera, geologic investigations of the other highland paterae, and an analysis of the styles and evolution of volcanic activity east of Hellas Planitia in the ancient, cratered highlands of Mars. This photogeologic study is an extension of regional geologic mapping east of Hellas Planitia. The Martian highland paterae are low-relief, areally extensive volcanoes exhibiting central calderas and radial channels and ridges. Four of these volcanoes, Hadriaca, Tyrrhena, Amphitrites, and Peneus Paterae, are located in the ancient cratered terrains surrounding Hellas Planitia and are thought to be located on inferred impact basin rings or related fractures. Based on analyses of Mariner 9 images, Potter (1976), Peterson (1977), and King (1978) suggested that the highland paterae were shield volcanoes formed by eruptions of fluid lavas. Later studies noted morphologic similarities between the paterae and terrestrial ash shields and the lack of primary lava flow features on the flanks of the volcanoes.
The degraded appearances of Hadriaca and Tyrrhena Paterae and the apparently easily eroded materials composing their low, broad shields further suggest that the highland paterae are composed predominantly of pyroclastic deposits. Analyses of eruption and flow processes indicate that the distribution of units at Hadriaca and Tyrrhena Paterae is consistent with emplacement by gravity-driven pyroclastic flows. Detailed geologic study of the summit caldera and flanks of Hadriaca Patera is essential to determine the types of volcanic materials exposed, the nature of the processes forming these deposits, and the role of volcanism in the evolution of the cratered highlands that are characteristic of the southern hemisphere of Mars.
The Weekly Fab Five: Things You Should Do Every Week To Keep Your Computer Running in Tip-Top Shape.
ERIC Educational Resources Information Center
Crispen, Patrick
2001-01-01
Describes five steps that school librarians should follow every week to keep their computers running at top efficiency. Explains how to update virus definitions; run Windows Update; run ScanDisk to repair errors on the hard drive; run a disk defragmenter; and back up all data. (LRW)
Program For Generating Interactive Displays
NASA Technical Reports Server (NTRS)
Costenbader, Jay; Moleski, Walt; Szczur, Martha; Howell, David; Engelberg, Norm; Li, Tin P.; Misra, Dharitri; Miller, Philip; Neve, Leif; Wolf, Karl
1991-01-01
Sun/Unix version of Transportable Applications Environment Plus (TAE+) computer program provides integrated, portable software environment for developing and running interactive window, text, and graphical-object-based application software systems. Enables programmer or nonprogrammer to construct easily custom software interface between user and application program and to move resulting interface program and its application program to different computers. TAE Plus viewed as productivity tool for application developers and application end users, who benefit from resultant consistent and well-designed user interface sheltering them from intricacies of computer. Available in form suitable for following six different groups of computers: DEC VAXstation and other VMS VAX computers, Macintosh II computers running A/UX, Apollo Domain Series 3000, DEC VAX and reduced-instruction-set-computer workstations running Ultrix, Sun 3- and 4-series workstations running SunOS, and IBM RT/PC and PS/2 computers.
A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations
NASA Astrophysics Data System (ADS)
Demir, I.; Agliamzanov, R.
2014-12-01
Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications, and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational chunks. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
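The dispatch pattern described in this abstract (a central queue of small work units pulled by volunteer nodes, with results merged back) can be sketched in a few lines. This is a hypothetical stand-in, not the authors' JavaScript framework; the task layout, the toy runoff kernel, and all names are illustrative assumptions.

```python
from collections import deque

def make_tasks(n_cells, chunk):
    """Split a 1-D model domain of n_cells cells into small work units."""
    return deque(
        {"task_id": i, "start": s, "end": min(s + chunk, n_cells)}
        for i, s in enumerate(range(0, n_cells, chunk))
    )

def run_task(task, rainfall):
    """Toy 'hydrologic' kernel: accumulate runoff over the task's cells."""
    return sum(rainfall[c] * 0.4 for c in range(task["start"], task["end"]))

def coordinate(n_cells=10, chunk=3, rainfall=None):
    """Central queue dispatches tasks and merges partial results."""
    rainfall = rainfall or [1.0] * n_cells
    queue, results = make_tasks(n_cells, chunk), {}
    while queue:                      # each iteration = one volunteer pull
        task = queue.popleft()
        results[task["task_id"]] = run_task(task, rainfall)
    return sum(results.values())
```

A real deployment would replace the loop with asynchronous pulls from browser clients and persist the queue in the relational database the abstract mentions.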
NASA Astrophysics Data System (ADS)
Gerbault, Muriel; Schneider, Julie; Reverso-Peila, Alexandre; Corsini, Michel
2016-04-01
The Maures-Tanneron Massif (MTM), together with Corsica and Sardinia, hosted the South-Eastern Variscan belt and records a continuous evolution from continental collision to exhumation. We present a synthesis of the available geological and geochronological data that explores the transition from convergence to perpendicular Permian extension in the MTM (at ~325 Ma ± 25 My). The migmatitic Internal Zone that composes the Western MTM displays structural clues such as backthrusting and magmatic foliations, and metamorphic data indicating exhumation of deep-seated partially molten rocks at an apparent heating rate of 1-2 °C/km/My from ca. 345 Ma to 320 Ma. This suggests vertical advective heat transport during continued N140° convergence (D2 phase). In contrast, at the same time, the low-grade External Zone composing the Eastern part of the MTM recorded exhumation with more conductive thermal patterns, at an apparent rate of 0.3-0.6 °C/km/My. It is only from ca. 320 Ma that transcurrent motion dominates in the Internal Zone and progressively gives way to N-S stretching (D3 phase), indicative of orogenic collapse and extension, in association with the emplacement of larger volumes of magma in the crust. Thermo-mechanical modeling complements this synthesis in order to highlight the conditions under which deep-seated HP units could melt and massively start to exhume during maintained convergence (phase D2). Accounting for temperature-dependent elasto-visco-plastic rheologies, our models explore the dynamics of an orogenic prism starting from a disequilibrated state just after slab break-off or delamination, at ca. 350 Ma. We simulate the development of gravitational instabilities in partially melting crust, a process that is already well known to depend on strain rate, heat sources, and strength layering.
In order to reproduce the exhumation patterns of rocks from ~50 km depth over the appropriate time-scale (>20 My) and spatial extent (>100 km), a best fit was obtained with a mean convergence rate of 0.5 cm/yr and no exceptional surface processes. Internal heating has a crucial effect and mostly resulted from the radiogenic decay of stacked felsic crustal units. However, alternation with mafic units is also necessary in order to prevent lateral spreading of the orogen. A low-viscosity partially molten (e.g., felsic) crust also permits mechanical decoupling of surface deformation from the deeper mantle domains, thus reducing the differences due to either a shallow asthenosphere or a competent mantle lithosphere beneath the orogen. A shallow asthenosphere produces too warm and too fast exhumation. The bulk viscosity of the partially molten orogenic crust controls the timing of exhumation, pointing to the need for further constraints linking the behaviour of partially molten crust at different scales. The MTM witnesses the typical competition between far-field plate convergence and internal body forces, and we argue for a subsequent progressive evolution from transpression to perpendicular extension (still to be tackled with 3D modeling).
Nahar, Vinayak K; Sharma, Manoj; Catalano, Hannah Priest; Ickes, Melinda J; Johnson, Paul; Ford, M Allison
2016-01-01
Most college students do not adequately participate in enough physical activity (PA) to attain health benefits. A theory-based approach is critical in developing effective interventions to promote PA. The purpose of this study was to examine the utility of the newly proposed multi-theory model (MTM) of health behavior change in predicting initiation and sustenance of PA among college students. Using a cross-sectional design, a valid and reliable survey was administered in October 2015 electronically to students enrolled at a large Southern US University. The internal consistency Cronbach alphas of the subscales were acceptable (0.65-0.92). Only those who did not engage in more than 150 minutes of moderate to vigorous intensity aerobic PA during the past week were included in this study. Of the 495 respondents, 190 met the inclusion criteria of which 141 completed the survey. The majority of participants were females (72.3%) and Caucasians (70.9%). Findings of the confirmatory factor analysis (CFA) confirmed construct validity of subscales (initiation model: χ2 = 253.92 [df = 143], P < 0.001, CFI = 0.91, RMSEA = 0.07, SRMR = 0.07; sustenance model: χ2= 19.40 [df = 22], P < 0.001, CFI = 1.00, RMSEA = 0.00, SRMR = 0.03). Multivariate regression analysis showed that 26% of the variance in the PA initiation was explained by advantages outweighing disadvantages, behavioral confidence, work status, and changes in physical environment. Additionally, 29.7% of the variance in PA sustenance was explained by emotional transformation, practice for change, and changes in social environment. Based on this study's findings, MTM appears to be a robust theoretical framework for predicting PA behavior change. Future research directions and development of suitable intervention strategies are discussed.
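The variance-explained figures reported above (26% and 29.7%) come from multivariate linear regression. As a minimal illustration of that machinery, the sketch below fits ordinary least squares via the normal equations in pure Python and computes R², the proportion of variance explained. The synthetic data and function names are hypothetical, not the study's.

```python
def fit_ols(X, y):
    """Ordinary least squares for y = b0 + b1*x1 + ... (normal equations)."""
    A = [[1.0] + list(row) for row in X]          # prepend intercept column
    n, k = len(A), len(A[0])
    # Normal equations: (A^T A) b = A^T y
    ata = [[sum(A[i][p] * A[i][q] for i in range(n)) for q in range(k)]
           for p in range(k)]
    aty = [sum(A[i][p] * y[i] for i in range(n)) for p in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, k):
            f = ata[r][col] / ata[col][col]
            ata[r] = [u - f * v for u, v in zip(ata[r], ata[col])]
            aty[r] -= f * aty[col]
    b = [0.0] * k
    for r in range(k - 1, -1, -1):
        b[r] = (aty[r] - sum(ata[r][c] * b[c] for c in range(r + 1, k))) / ata[r][r]
    return b

def r_squared(X, y, b):
    """R^2: proportion of variance in y explained by the fitted model."""
    preds = [b[0] + sum(c * x for c, x in zip(b[1:], row)) for row in X]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - p) ** 2 for yi, p in zip(y, preds))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot
```

On noiseless synthetic data the fit recovers the generating coefficients exactly and R² is 1; with real survey constructs, R² values like 0.26 arise from unexplained residual variance.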
High power long pulse microwave generation from a metamaterial structure with reverse symmetry
NASA Astrophysics Data System (ADS)
Lu, Xueying; Stephens, Jacob C.; Mastovsky, Ivan; Shapiro, Michael A.; Temkin, Richard J.
2018-02-01
Experimental operation of a high power microwave source with a metamaterial (MTM) structure is reported at power levels to 2.9 MW at 2.4 GHz in full 1 μs pulses. The MTM structure is formed by a waveguide that is below cutoff for TM modes. The waveguide is loaded by two axial copper plates machined with complementary split ring resonators, allowing two backward wave modes to propagate in the S-Band. A pulsed electron beam of up to 490 kV, 84 A travels down the center of the waveguide, midway between the plates. The electron beam is generated by a Pierce gun and is focused by a lens into a solenoidal magnetic field. The MTM plates are mechanically identical but are placed in the waveguide with reverse symmetry. Theory indicates that both Cherenkov and Cherenkov-cyclotron beam-wave interactions can occur. High power microwave generation was studied by varying the operating parameters over a wide range, including the electron beam voltage, the lens magnetic field, and the solenoidal field. Frequency tuning with a magnetic field and beam voltage was studied to discriminate between operation in the Cherenkov mode and the Cherenkov-cyclotron mode. Both modes were observed, but pulses above 1 MW of output power were only seen in the Cherenkov-cyclotron mode. A pair of steering coils was installed prior to the interaction space to initiate the cyclotron motion of the electron beam and thus encourage the Cherenkov-cyclotron high power mode. This successfully increased the output power from 2.5 MW to 2.9 MW (450 kV, 74 A, 9% efficiency).
Kurth, Laura; Kolker, Allan; Engle, Mark A.; Geboy, Nicholas J.; Hendryx, Michael; Orem, William H.; McCawley, Michael; Crosby, Lynn M.; Tatu, Calin A.; Varonka, Matthew S.; DeVera, Christina A.
2015-01-01
Mountaintop removal mining (MTM) is a widely used approach to surface coal mining in the US Appalachian region whereby large volumes of coal overburden are excavated using explosives, removed, and transferred to nearby drainages below MTM operations. To investigate the air quality impact of MTM, the geochemical characteristics of atmospheric particulate matter (PM) from five surface mining sites in south central West Virginia, USA, and five in-state study control sites having only underground coal mining or no coal mining whatsoever were determined and compared. Epidemiologic studies show increased rates of cancer, respiratory disease, cardiovascular disease, and overall mortality in Appalachian surface mining areas compared to Appalachian non-mining areas. In the present study, 24-h coarse (>2.5 µm) and fine (≤2.5 µm) PM samples collected from two surface mining sites in June 2011 showed pronounced enrichment in elements having a crustal affinity (Ga, Al, Ge, Rb, La, Ce) contributed by local sources, relative to controls. Follow-up sampling in August 2011 lacked this enrichment, suggesting that PM input from local sources is intermittent. Using passive samplers, dry deposition total PM elemental fluxes calculated for three surface mining sites over multi-day intervals between May and August 2012 were 5.8 ± 1.5 times higher for crustal elements than at controls. Scanning microscopy of 2,249 particles showed that primary aluminosilicate PM was prevalent at surface mining sites compared to secondary PM at controls. Additional testing is needed to establish any link between input of lithogenic PM and disease rates in the study area.
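An enrichment figure like "5.8 ± 1.5 times higher" is a mean and spread of per-element flux ratios between mining and control sites. A small sketch of that summary calculation, using entirely hypothetical flux values (not the study's measurements), is:

```python
from statistics import mean, stdev

def enrichment(mining_flux, control_flux):
    """Per-element ratio of dry-deposition flux at mining vs. control sites,
    summarized as (mean ratio, sample standard deviation)."""
    ratios = [mining_flux[el] / control_flux[el] for el in mining_flux]
    return mean(ratios), stdev(ratios)

# Hypothetical crustal-element fluxes (arbitrary units), illustration only.
mining = {"Al": 60.0, "La": 7.0, "Ce": 14.0, "Rb": 4.5}
control = {"Al": 10.0, "La": 1.0, "Ce": 2.0, "Rb": 1.0}
```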
Factors associated with independent pharmacy owners' satisfaction with Medicare Part D contracts.
Zhang, Su; Doucette, William R; Urmie, Julie M; Xie, Yang; Brooks, John M
2010-06-01
As Medicare Part D contracts apply pressure on the profitability of independent pharmacies, there is concern about their owners' willingness to sign such contracts. Identifying factors affecting independent pharmacy owners' satisfaction with Medicare Part D contracts could inform policy makers in managing Medicare Part D. (1) To identify influences on independent pharmacy owners' satisfaction with Medicare Part D contracts and (2) to characterize comments made by independent pharmacy owners about Medicare Part D. This cross-sectional study used a mail survey of independent pharmacy owners in 15 states comprising 6 Medicare regions to collect information on their most- and least-favorable Medicare Part D contracts, including satisfaction, contract management activities, market position, pharmacy operation, and specific payment levels on brand and generic drugs. Of the 1649 surveys mailed, 296 surveys were analyzed. The regression models for satisfaction with both the least- and the most-favorable Part D contracts were significant (P<0.05). A different set of significant influences on satisfaction was identified for each regression model. For the most-favorable contract, the influences were contending and equity. For the least-favorable contract, the influences were negotiation, equity, generic rate bonus, and medication therapy management (MTM) payment. About one-third of the survey respondents made at least 1 comment. The most frequent themes in the comments were that the Medicare Part D reimbursement rate is too low (28%) and that contracts are offered without negotiation in a "take it or leave it" manner (20%). Equity, contending, negotiation, generic rate bonus, and MTM payment were identified as influences on independent pharmacy owners' satisfaction with Medicare Part D contracts.
Generic rate bonus and MTM payment provide additional financial incentives to less financially favorable contracts and, in turn, contribute to independent pharmacy owners' satisfaction with these contracts. Copyright 2010 Elsevier Inc. All rights reserved.
Wilson, Jennifer A; Pegram, Angela H; Battise, Dawn M; Robinson, April M
2017-11-01
To determine if traditional didactic lecture or the jigsaw learning method is more effective to teach the medication therapy management (MTM) core elements in a first year pharmacy course. Traditional didactic lecture and a pre-class reading assignment were used in the fall semester cohort, and the jigsaw method was used in the spring semester cohort. Jigsaw is a cooperative learning strategy requiring students to assume responsibility for learning, and subsequently teaching peers. The students were responsible for reading specific sections of the pre-class reading, and then teaching other students in small groups about their specific reading assignments. To assess potential differences, identical pre- and post-tests were administered before and after the MTM section. Additionally, grade performance on an in-class project and final exam questions were compared, and students were surveyed on perceptions of teaching method used. A total of 45 and 43 students completed both the pre- and post-test in the fall and spring (96% and 93% response rate), respectively. Improvement in post-test scores favored the traditional method (p = 0.001). No statistical differences were noted between groups with grade performance on the in-class project and final exam questions. However, students favored the jigsaw method over traditional lecture and perceived improvements in problem solving skills, listening/communication skills and encouragement of cooperative learning (p = 0.018, 0.025 and 0.031). Although students favored the jigsaw learning method, traditional didactic lecture was more effective for the pre- and post-knowledge test performance. This may indicate that traditional didactic lecture is more effective for more foundational content. Copyright © 2017 Elsevier Inc. All rights reserved.
76 FR 37372 - Notice of Proposed Withdrawal Extension and Notification of a Public Meeting; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-27
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [61510-8451-0000; MTM 80092] Notice of Proposed Withdrawal Extension and Notification of a Public Meeting; Montana AGENCY: Bureau of Land...; 8:45 am] BILLING CODE 4310-55-P ...
Carosio, F; Kochumalayil, J; Cuttica, F; Camino, G; Berglund, L
2015-03-18
The toxicity of the most efficient fire retardant additives is a major problem for polymeric materials. Cellulose nanofiber (CNF)/clay nanocomposites, with a unique brick-and-mortar structure and prepared by simple filtration, are characterized from the morphological point of view by scanning electron microscopy and X-ray diffraction. These nanocomposites have fire protection properties superior to those of other clay nanocomposites and fiber composites. The corresponding mechanisms are evaluated in terms of flammability (reaction to a flame) and cone calorimetry (exposure to heat flux). These two tests provide a wide-spectrum characterization of fire protection properties in CNF/montmorillonite (MTM) materials. The morphology of the collected residues after flammability testing is investigated. In addition, thermal and thermo-oxidative stability are evaluated by thermogravimetric analyses performed in inert (nitrogen) and oxidative (air) atmospheres. Physical and chemical mechanisms are identified and related to the unique nanostructure and its low thermal conductivity, high gas-barrier properties, and CNF/MTM interactions for char formation.
Broadbanding of circularly polarized patch antenna by waveguided magneto-dielectric metamaterial
NASA Astrophysics Data System (ADS)
Yang, Xin Mi; Wen, Juan; Liu, Chang Rong; Liu, Xue Guan; Cui, Tie Jun
2015-12-01
Design of bandwidth-enhanced circularly polarized (CP) patch antenna using artificial magneto-dielectric substrate was investigated. The artificial magneto-dielectric material adopted here takes the form of waveguided metamaterial (WG-MTM). In particular, the embedded meander line (EML) structure was employed as the building element of the WG-MTM. As verified by the retrieved effective medium parameters, the EML-based waveguided magneto-dielectric metamaterial (WG-MDM) exhibits two-dimensionally isotropic magneto-dielectric property with respect to TEM wave excitations applied in two orthogonal directions. A CP patch antenna loaded with the EML-based WG-MDM (WG-MDM antenna) has been proposed and its design procedure is described in detail. Simulation results show that the impedance and axial ratio bandwidths of the WG-MDM antenna have increased by 125% and 133%, respectively, compared with those obtained with pure dielectric substrate offering the same patch size. The design of the novel WG-MDM antenna was also validated by measurement results, which show good agreement with their simulated counterparts.
Statistical Analyses of Raw Material Data for MTM45-1/CF7442A-36% RW: CMH Cure Cycle
NASA Technical Reports Server (NTRS)
Coroneos, Rula; Pai, Shantaram, S.; Murthy, Pappu
2013-01-01
This report describes statistical characterization of physical properties of the composite material system MTM45-1/CF7442A, which has been tested and is currently being considered for use on spacecraft structures. This composite system is made of 6K plain weave graphite fibers in a highly toughened resin system. This report summarizes the distribution types and statistical details of the tests and the conditions for the experimental data generated. These distributions will be used in multivariate regression analyses to help determine material and design allowables for similar material systems and to establish a procedure for other material systems. Additionally, these distributions will be used in future probabilistic analyses of spacecraft structures. The specific properties that are characterized are the ultimate strength, modulus, and Poisson's ratio by using a commercially available statistical package. Results are displayed using graphical and semigraphical methods and are included in the accompanying appendixes.
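As a hedged illustration of fitting a distribution to strength data like that characterized above: the sketch below fits a normal model and reads off a low percentile. Actual design allowables (e.g., A- and B-basis values) use one-sided tolerance limits and careful distribution selection, which this toy calculation does not attempt; the sample values are invented.

```python
from statistics import NormalDist, mean, stdev

def lower_percentile_strength(samples, p=0.01):
    """Fit a normal model to strength data and return its p-th percentile,
    a simplified stand-in for an allowable-type calculation."""
    model = NormalDist(mean(samples), stdev(samples))
    return model.inv_cdf(p)

# Hypothetical ultimate-strength measurements (arbitrary units).
strengths = [100.0, 102.0, 98.0, 101.0, 99.0, 100.0, 103.0, 97.0]
```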
System and method for controlling power consumption in a computer system based on user satisfaction
Yang, Lei; Dick, Robert P; Chen, Xi; Memik, Gokhan; Dinda, Peter A; Shy, Alex; Ozisikyilmaz, Berkin; Mallik, Arindam; Choudhary, Alok
2014-04-22
Systems and methods for controlling power consumption in a computer system. For each of a plurality of interactive applications, the method changes a frequency at which a processor of the computer system runs, receives an indication of user satisfaction, determines a relationship between the changed frequency and the user satisfaction of the interactive application, and stores the determined relationship information. The determined relationship can distinguish between different users and different interactive applications. A frequency may be selected from the discrete frequencies at which the processor of the computer system runs based on the determined relationship information for a particular user and a particular interactive application running on the processor of the computer system. The processor may be adapted to run at the selected frequency.
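The claimed method stores a learned (user, application) relationship between processor frequency and satisfaction, then selects a frequency from it. A minimal sketch of that selection step, with a made-up satisfaction target and history (all names and values are hypothetical, not the patent's implementation), might look like:

```python
def best_frequency(history, user, app, available_freqs):
    """Pick the lowest available frequency whose recorded satisfaction for
    this (user, app) pair meets a target, falling back to the fastest."""
    TARGET = 0.8                               # assumed satisfaction threshold
    ratings = history.get((user, app), {})
    for f in sorted(available_freqs):          # prefer low power first
        if ratings.get(f, 0.0) >= TARGET:
            return f
    return max(available_freqs)                # no data: run at full speed

# Hypothetical learned satisfaction (0-1) per frequency, per user and app.
history = {("alice", "browser"): {1.0e9: 0.9, 2.0e9: 0.95},
           ("alice", "game"):    {1.0e9: 0.3, 2.0e9: 0.85}}
```

The key idea the abstract describes is that the mapping distinguishes users and applications, so the same hardware runs slower for a satisfied browser user than for a demanding game.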
VizieR Online Data Catalog: MV Lyr - bursts and periods (Pavlenko, 1998)
NASA Astrophysics Data System (ADS)
Pavlenko, E. P.
2001-01-01
An analysis of the photometric behavior of the nova-like star MV Lyr in the B band in its low brightness state in 1995-1996 is presented, based on observations performed at the Crimean Astrophysical Observatory with the MTM-500 television system. (1 data file).
Theising, Katie M; Fritschle, Traci L; Scholfield, Angelina M; Hicks, Emily L; Schymik, Michelle L
2015-11-01
Our objective was to describe the implementation and clinical outcomes of an employer-sponsored, pharmacist-provided medication therapy management (MTM) program for health plan beneficiaries with diabetes mellitus and/or hypertension. We conducted a single-center retrospective medical record review. The setting was a Pharmacy MTM Clinic at a self-insured health system consisting of six hospitals and several ancillary facilities. A total of 161 health plan beneficiaries with diabetes identified during annual wellness screenings for the health plan in 2012 and 225 health plan beneficiaries with diabetes and/or hypertension identified during annual wellness screenings for the health plan in 2013 were referred to the MTM clinic based on specific criteria. In 2012 the health system expanded its existing wellness program by implementing a voluntary diabetes care program for health plan beneficiaries with uncontrolled diabetes (hemoglobin A1c [A1C] 7% or higher); a similar program was added for hypertension for the 2013 plan year. All participants' A1C and blood pressure results were tracked from the date of their wellness screening through the end of the plan year. The pharmacists involved had the capability to directly implement drug regimen changes according to hospital protocol or provide recommendations to the physician, as specified by the referring physician. For the 2012-2013 plan year, the mean difference in A1C from baseline to program completion was -0.38% (95% confidence interval [CI] -0.58 to -0.18%, p<0.05). For beneficiaries with a baseline A1C of 7% or higher, the mean difference was -0.69% (95% CI -0.99 to -0.39%, p<0.05). For the 2013-2014 plan year, the mean difference in A1C from baseline to program completion was -0.62% (95% CI -0.81 to -0.44%, p<0.05). In that year, the mean difference in A1C for beneficiaries with A1C 7% or higher was -0.97% (95% CI -1.23 to -0.72%, p<0.05).
For those referred for hypertension, a mean difference of -13 mm Hg (95% CI -18.59 to -7.73, p<0.05) from baseline to program completion was seen in systolic blood pressure, and the mean difference in diastolic blood pressure was -7 mm Hg (95% CI -9.92 to -4.04, p<0.05). This study demonstrated that health plan beneficiaries who participated in the employer-sponsored, pharmacist-provided MTM program had significant decreases in A1C and blood pressure. © 2015 Pharmacotherapy Publications, Inc.
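The outcome measures above are mean baseline-to-completion differences with 95% confidence intervals. A simplified sketch of that calculation on made-up paired values (using a normal-approximation critical value; a t-critical value would be appropriate for small samples) is:

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

def paired_change_ci(baseline, final, conf=0.95):
    """Mean of per-participant change (final - baseline) with an
    approximate confidence interval."""
    diffs = [f - b for b, f in zip(baseline, final)]
    m = mean(diffs)
    se = stdev(diffs) / sqrt(len(diffs))       # standard error of the mean
    z = NormalDist().inv_cdf(0.5 + conf / 2)   # ~1.96 for 95%
    return m, (m - z * se, m + z * se)

# Hypothetical A1C values (%) at baseline and program completion.
baseline = [8.0, 7.5, 9.0, 8.5]
final = [7.4, 7.1, 8.2, 7.9]
```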
Becoming a Leader: Finding My Voice
ERIC Educational Resources Information Center
Finchum, Tabetha R.
2014-01-01
In this article, fourth-grade teacher Tabetha Finchum describes how a program called Arizona Master Teacher of Mathematics (AZ-MTM), a Noyce grant funded through the National Science Foundation, helped boost her confidence and broaden her understanding of the philosophies, curricula, and instructional decisions being implemented by other teachers.…
Geologic Mapping of Athabasca Valles
NASA Technical Reports Server (NTRS)
Keszthelyi, L. P.; Jaeger, W. L.; Tanaka, K.; Hare, T.
2009-01-01
We are approaching the end of the third year of mapping the Athabasca Valles region of Mars. The linework has been adjusted in response to new CTX images and we are on schedule to submit the 4 MTM quads (05202, 05207, 10202, 10207) and accompanying paper by the end of this fiscal year.
Midnight Temperature Maximum (MTM) in Whole Atmosphere Model (WAM) Simulations
2016-04-14
...a naturally strongly dissipative medium, eliminating the need for "sponge layers" and extra numerical dissipation often imposed in upper layers to stabilize atmospheric model codes. WAM employs no "sponge layers" and remains stable using a substantially reduced numerical Rayleigh friction coefficient.
75 FR 59741 - Notice of Proposed Withdrawal and Opportunity for Public Meeting; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-28
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMTB072000-L14300000-ET0000; MTM 98499] Notice of Proposed Withdrawal and Opportunity for Public Meeting; Montana AGENCY: Bureau of Land.... [FR Doc. 2010-24281 Filed 9-27-10; 8:45 am] BILLING CODE 4310-$$-P ...
75 FR 63856 - Public Land Order No. 7753; Extension of Public Land Order No. 7464; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-18
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMTL06000-L14300000.ET0000; MTM 89170] Public Land Order No. 7753; Extension of Public Land Order No. 7464; Montana AGENCY: Bureau of Land...:45 am] BILLING CODE 4310-DN-P ...
77 FR 53226 - Public Land Order No. 7792; Partial Revocation, Power Site Reserve No. 109; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-31
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [MT-LLB05000-LL14300000-FQ0000; MTM 40412] Public Land Order No. 7792; Partial Revocation, Power Site Reserve No. 109; Montana Correction In notice...:45 am] BILLING CODE 1505-01-D ...
Efficacy of Two Mathematics Interventions for Enhancing Fluency with Elementary Students
ERIC Educational Resources Information Center
Mong, Michael D.; Mong, Kristi W.
2010-01-01
An alternating treatments design was used to evaluate two curriculum-based mathematics interventions designed to enhance fluency with three elementary school students. Results indicate that both the Math to Mastery (MTM) intervention and the Cover, Copy, Compare (CCC) intervention were effective at increasing mathematics fluency, as measured by…
Simulating three dimensional wave run-up over breakwaters covered by antifer units
NASA Astrophysics Data System (ADS)
Najafi-Jilani, A.; Niri, M. Zakiri; Naderi, Nader
2014-06-01
The paper presents a numerical analysis of wave run-up over rubble-mound breakwaters covered by antifer units, using a technique integrating Computer-Aided Design (CAD) and Computational Fluid Dynamics (CFD) software. Direct application of the Navier-Stokes equations within the armour blocks provides a more reliable approach to simulating wave run-up over breakwaters. A well-tested Reynolds-averaged Navier-Stokes (RANS) Volume of Fluid (VOF) code (Flow-3D) was adopted for the CFD computations. The computed results were compared with experimental data to check the validity of the model. Numerical results showed that the direct three-dimensional (3D) simulation method can deliver accurate results for wave run-up over rubble-mound breakwaters. The results also showed that the placement pattern of the antifer units has a great impact on wave run-up: changing the placement pattern from regular to double pyramid reduced wave run-up by approximately 30%. Analysis was done to investigate the influences of surface roughness, energy dissipation in the pores of the armour layer, and reduced wave run-up due to inflow into the armour and stone layers.
WinSCP for Windows File Transfers | High-Performance Computing | NREL
WinSCP can be used to securely transfer files between your local computer running Microsoft Windows and a remote computer running Linux.
RAPPORT: running scientific high-performance computing applications on the cloud.
Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt
2013-01-28
Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.
GEANT4 distributed computing for compact clusters
NASA Astrophysics Data System (ADS)
Harrawood, Brian P.; Agasthya, Greeshma A.; Lakshmanan, Manu N.; Raterman, Gretchen; Kapadia, Anuj J.
2014-11-01
A new technique for distribution of GEANT4 processes is introduced to simplify running a simulation in a parallel environment such as a tightly coupled computer cluster. Using a new C++ class derived from the GEANT4 toolkit, multiple runs forming a single simulation are managed across a local network of computers with a simple inter-node communication protocol. The class is integrated with the GEANT4 toolkit and is designed to scale from a single symmetric multiprocessing (SMP) machine to compact clusters ranging in size from tens to thousands of nodes. User designed 'work tickets' are distributed to clients using a client-server work flow model to specify the parameters for each individual run of the simulation. The new g4DistributedRunManager class was developed and well tested in the course of our Neutron Stimulated Emission Computed Tomography (NSECT) experiments. It will be useful for anyone running GEANT4 for large discrete data sets such as covering a range of angles in computed tomography, calculating dose delivery with multiple fractions or simply speeding the through-put of a single model.
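The "work ticket" client-server flow described above can be sketched in miniature. In the following Python example, threads and an in-process queue stand in for the cluster network (this illustrates the pattern only, not the actual g4DistributedRunManager protocol): the server side fills a queue with per-run parameters, and workers pull tickets until a sentinel ends the run.

```python
import queue
import threading

def worker(tickets, results):
    """Pull work tickets until the sentinel (None) is seen."""
    while True:
        ticket = tickets.get()
        if ticket is None:                    # sentinel: shut down
            break
        run_id, angle = ticket
        results.put((run_id, angle * 2))      # stand-in for one simulation run

tickets, results = queue.Queue(), queue.Queue()
for t in enumerate(range(0, 180, 45)):        # four tickets: (run_id, angle)
    tickets.put(t)

threads = [threading.Thread(target=worker, args=(tickets, results))
           for _ in range(2)]
for th in threads:
    th.start()
for _ in threads:                             # one sentinel per worker
    tickets.put(None)
for th in threads:
    th.join()

runs = sorted(results.get() for _ in range(4))
print(runs)                                   # each ticket processed exactly once
```

In the real system each worker would be a GEANT4 process on a cluster node, and the ticket would carry the run's full parameter set.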
Computational Methods for Feedback Controllers for Aerodynamics Flow Applications
2007-08-15
Iteration #, and y-translation by:
>> Fy=[unf(:,8);runA(:,8);runB(:,8);runC(:,8);runD(:,8);runE(:,8)];
>> Oy=[unf(:,23);runA(:,23);runB(:,23);runC(:,23);runD(:,23);runE(:,23)];
>> Iter=[unf(:,1);runA(:,1);runB(:,1);runC(:,1);runD(:,1);runE(:,1)];
>> plot(Fy)
(Cobalt version 4.0)
Proposal for grid computing for nuclear applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Idris, Faridah Mohamad; Ismail, Saaidi; Haris, Mohd Fauzi B.
2014-02-12
The use of computer clusters for the computational sciences, including computational physics, is vital as it provides the computing power to crunch big numbers at a faster rate. In compute-intensive applications that require high resolution, such as Monte Carlo simulation, the use of computer clusters in a grid form, which supplies computational power to any node within the grid that needs it, has become a necessity. In this paper, we describe how clusters running a specific application could use resources within the grid to speed up the computing process.
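The divide-and-gather pattern behind such grid Monte Carlo runs can be illustrated compactly. The Python sketch below estimates pi by splitting trials across a worker pool (threads here for simplicity; for CPU-bound work a real deployment would use separate processes or grid nodes):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def count_hits(seed, n):
    """Count random points falling inside the unit quarter-circle."""
    rng = random.Random(seed)                 # per-task seed, as on a grid node
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

n_tasks, n_per_task = 8, 20_000
with ThreadPoolExecutor(max_workers=4) as pool:
    # One independent task per "node"; results are summed on gather.
    hits = sum(pool.map(count_hits, range(n_tasks), [n_per_task] * n_tasks))

pi_est = 4 * hits / (n_tasks * n_per_task)
print(f"pi approx {pi_est:.3f}")
```

The key property the grid exploits is that the tasks share no state: each node only needs its seed and trial count, and the gather step is a single sum.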
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-29
... James M. Sparks, Manager, Billings Field Office, 5001 Southgate Drive, Billings, Montana 59101-4669... CONTACT: Craig Drake, Assistant Manager, Billings Field Office, 5001 Southgate Drive, Billings, Montana... Application MTM 97988. DATES: The public hearing will be held in the BLM Montana State Office's main...
This report assesses the state of the science on the environmental impacts of mountaintop mines and valley fills (MTM-VF) on streams in the Central Appalachian Coalfields. Our review focused on the aquatic impacts of mountaintop removal coal mining, which, as its name suggests, ...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-12
... environmental permitting requirements for Appalachian mountaintop removal and other surface coal mining projects.../guidance/mining.html ). Both documents will be reviewed by an independent review panel convened by EPA's... the state of the science on the ecological impacts of Mountaintop Mining and Valley Fill (MTM-VF...
Mountaintop mining consequences
M.A. Palmer; E.S. Bernhardt; W.H. Schlesinger; K.N. Eshleman; E. Foufoula-Georgiou; M.S. Hendryx; A.D. Lemly; G.E. Likens; O.L. Loucks; M.E. Power; P.S. White; P.R. Wilcock
2010-01-01
There has been a global, 30-year increase in surface mining (1), which is now the dominant driver of land-use change in the central Appalachian ecoregion of the United States (2). One major form of such mining, mountaintop mining with valley fills (MTM/VF) (3), is widespread throughout eastern Kentucky, West Virginia (WV), and southwestern Virginia. Upper elevation...
76 FR 9358 - Notice of Proposed Withdrawal Extension and Opportunity for Public Meeting; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-17
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [MTM 067221] Notice of Proposed Withdrawal Extension and Opportunity for Public Meeting; Montana AGENCY: Bureau of Land Management, Interior. ACTION.... Cynthia Staszak, Chief, Branch of Land Resources. [FR Doc. 2011-3617 Filed 2-16-11; 8:45 am] BILLING CODE...
2014-03-27
[Table-of-contents and figure-list residue omitted. Acronyms defined include: MTM, multiple taper method; MUSIC, multiple signal classification; MVDR, minimum variance distortionless response; PSK, phase shift keying; QAM, quadrature amplitude modulation.]
Detection of the secondary meridional circulation associated with the quasi-biennial oscillation
NASA Astrophysics Data System (ADS)
Ribera, P.; PeñA-Ortiz, C.; Garcia-Herrera, R.; Gallego, D.; Gimeno, L.; HernáNdez, E.
2004-09-01
The quasi-biennial oscillation (QBO) signal in stratospheric zonal and meridional wind, temperature, and geopotential height fields is analyzed based on the use of the National Centers for Environmental Prediction (NCEP) reanalysis (1958-2001). The multitaper method-singular value decomposition (MTM-SVD), a multivariate frequency domain analysis method, is used to detect significant and spatially coherent narrowband oscillations. The QBO is found as the most intense signal in the stratospheric zonal wind. Then, the MTM-SVD method is used to determine the patterns induced by the QBO at every stratospheric level and data field. The secondary meridional circulation associated with the QBO is identified in the obtained patterns. This circulation can be characterized by negative (positive) temperature anomalies associated with adiabatic rising (sinking) motions over zones of easterly (westerly) wind shear and over the subtropics and midlatitudes, while meridional convergence and divergence levels are found separated by a level of maximum zonal wind shear. These vertical and meridional motions form quasi-symmetric circulation cells over both hemispheres, though less intense in the Southern Hemisphere.
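The multitaper half of MTM-SVD can be sketched as follows. This Python example uses sine tapers (a common, simpler alternative to the Slepian/DPSS tapers usually used in MTM) and a naive DFT; it is illustrative only, and omits the singular value decomposition across spatial fields that MTM-SVD adds on top of the per-series spectra.

```python
import cmath
import math

def sine_tapers(n, k):
    """First k members of the sine-taper multitaper family."""
    return [[math.sqrt(2 / (n + 1)) * math.sin(math.pi * (j + 1) * (t + 1) / (n + 1))
             for t in range(n)] for j in range(k)]

def multitaper_psd(x, k=4):
    """Average the eigenspectra of k tapered copies of x (naive DFT)."""
    n = len(x)
    psd = [0.0] * (n // 2)
    for taper in sine_tapers(n, k):
        y = [w * v for w, v in zip(taper, x)]
        for f in range(n // 2):
            s = sum(y[t] * cmath.exp(-2j * math.pi * f * t / n) for t in range(n))
            psd[f] += abs(s) ** 2 / k
    return psd

# A pure tone at frequency bin 10 should dominate the averaged spectrum.
n = 128
x = [math.sin(2 * math.pi * 10 * t / n) for t in range(n)]
psd = multitaper_psd(x)
print(max(range(len(psd)), key=psd.__getitem__))  # peak near bin 10
```

Averaging over several orthogonal tapers is what gives MTM its reduced spectral-estimate variance relative to a single periodogram, at the cost of slightly broadened peaks.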
SSL - THE SIMPLE SOCKETS LIBRARY
NASA Technical Reports Server (NTRS)
Campbell, C. E.
1994-01-01
The Simple Sockets Library (SSL) allows C programmers to develop systems of cooperating programs using Berkeley streaming Sockets running under the TCP/IP protocol over Ethernet. The SSL provides a simple way to move information between programs running on the same or different machines and does so with little overhead. The SSL can create three types of Sockets: namely a server, a client, and an accept Socket. The SSL's Sockets are designed to be used in a fashion reminiscent of the use of FILE pointers so that a C programmer who is familiar with reading and writing files will immediately feel comfortable with reading and writing with Sockets. The SSL consists of three parts: the library, PortMaster, and utilities. The user of the SSL accesses it by linking programs to the SSL library. The PortMaster initializes connections between clients and servers. The PortMaster also supports a "firewall" facility to keep out socket requests from unapproved machines. The "firewall" is a file which contains Internet addresses for all approved machines. There are three utilities provided with the SSL. SKTDBG can be used to debug programs that make use of the SSL. SPMTABLE lists the servers and port numbers on requested machine(s). SRMSRVR tells the PortMaster to forcibly remove a server name from its list. The package also includes two example programs: multiskt.c, which makes multiple accepts on one server, and sktpoll.c, which repeatedly attempts to connect a client to some server at one second intervals. SSL is a machine independent library written in the C-language for computers connected via Ethernet using the TCP/IP protocol. 
It has been successfully compiled and implemented on a variety of platforms, including Sun series computers running SunOS, DEC VAX series computers running VMS, SGI computers running IRIX, DECstations running ULTRIX, DEC alpha AXPs running OSF/1, IBM RS/6000 computers running AIX, IBM PC and compatibles running BSD/386 UNIX and HP Apollo 3000/4000/9000/400T computers running HP-UX. SSL requires 45K of RAM to run under SunOS and 80K of RAM to run under VMS. For use on IBM PC series computers and compatibles running DOS, SSL requires Microsoft C 6.0 and the Wollongong TCP/IP package. Source code for sample programs and debugging tools are provided. The documentation is available on the distribution medium in TeX and PostScript formats. The standard distribution medium for SSL is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format and a 5.25 inch 360K MS-DOS format diskette. The SSL was developed in 1992 and was updated in 1993.
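The server/client/accept flow and the file-like read/write style described above look roughly as follows when sketched with Python's standard socket module. (The SSL itself is a C library; this mirrors only the pattern, not its API.)

```python
import socket
import threading

def echo_server(server_sock):
    conn, _ = server_sock.accept()        # the "accept Socket"
    with conn:
        data = conn.recv(1024)            # read, much like reading a file...
        conn.sendall(data.upper())        # ...and write back the reply

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))             # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello sockets")
reply = client.recv(1024)
client.close()
server.close()
print(reply)
```

In the SSL the PortMaster plays the role the OS-assigned port number plays here: clients locate servers by name rather than by hard-coded port.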
Margolis, Amanda R; Martin, Beth A; Mott, David A
2016-01-01
To determine the feasibility and fidelity of student pharmacists collecting patient medication list information using a structured interview tool and the accuracy of documenting the information. The medication lists were used by a community pharmacist to provide a targeted medication therapy management (MTM) intervention. Descriptive analysis of patient medication lists collected with telephone interviews. Ten trained student pharmacists collected the medication lists. Trained student pharmacists conducted audio-recorded telephone interviews with 80 English-speaking, community-dwelling older adults using a structured interview tool to collect and document medication lists. Feasibility was measured using the number of completed interviews, the time student pharmacists took to collect the information, and pharmacist feedback. Fidelity to the interview tool was measured by assessing student pharmacists' adherence to asking all scripted questions and probes. Accuracy was measured by comparing the audio-recorded interviews to the medication list information documented in an electronic medical record. On average, it took student pharmacists 26.7 minutes to collect the medication lists. The community pharmacist said the medication lists were complete and that having the medication lists saved time and allowed him to focus on assessment, recommendations, and education during the targeted MTM session. Fidelity was high, with an overall proportion of asked scripted probes of 83.75% (95% confidence interval [CI], 80.62-86.88%). Accuracy was also high for both prescription (95.1%; 95% CI, 94.3-95.8%) and nonprescription (90.5%; 95% CI, 89.4-91.4%) medications. Trained student pharmacists were able to use an interview tool to collect and document medication lists with a high degree of fidelity and accuracy. 
This study suggests that student pharmacists or trained technicians may be able to collect patient medication lists to facilitate MTM sessions in the community pharmacy setting. Evaluating the sustainability of using student pharmacists or trained technicians to collect medication lists is needed. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
Geologic map of the MTM 85200 quadrangle, Olympia Rupes region of Mars
Skinner, James A.; Herkenhoff, Kenneth E.
2012-01-01
The north polar region of Mars is dominated by Planum Boreum, a roughly circular, domical plateau that rises >2,500 m above the surrounding lowland. Planum Boreum is >1,500 km in diameter, contains deep, curvilinear troughs and chasmata, isolated cavi, and marginal scarps and slopes. The north polar plateau is surrounded by low-lying and nearly horizontal plains of various surface texture, geologic origin, and stratigraphic significance. The MTM 85200 quadrangle spans 5° of latitude (lat 82.5° to 87.5° N.) and 40° of longitude (long 140° to 180° E.) within the eastern hemisphere of Mars. The quadrangle includes the high-standing Planum Boreum, curvilinear troughs of Boreales Scopuli, deep, sinuous scarps of Olympia Rupes, isolated and coalesced depressions of Olympia Cavi, margins of the circular polar erg Olympia Undae, and low-standing Olympia Planum. The surface of Planum Boreum within the MTM 85200 quadrangle is characterized by smoothly sculptured landforms with shallow slopes and variable relief at kilometer scales. Areas that are perennially covered with bright frost are generally smooth and planar at 100-m scales. However, MGS MOC and MRO HiRISE images show that much of the icy polar plateau is rough at decameter scale. The Martian polar plateaus are likely to contain a record of global climate history for >10^7 to as much as ~3 x 10^9 years. This record is partly observable as rhythmically layered deposits exposed in the curvilinear troughs of the north polar plateau, Planum Boreum. The north polar layered deposits are widely interpreted to be among the most youthful bedrock deposits on the Martian surface. These materials and their stratigraphic and structural relations provide a glimpse into some of the more recent geologic processes that have occurred on Mars. The ability of the massive polar deposits to periodically trap and release both volatiles and lithic particles may represent a globally important, recurring geologic process for Mars.
Pinto, Sharrel L; Kumar, Jinender; Partha, Gautam; Bechtol, Robert A
2013-01-01
Background The purpose of this study was to determine the cost savings of a pharmacist-led, employer-sponsored medication therapy management (MTM) program for diabetic patients and to assess for any changes in patient satisfaction and self-reported medication adherence for enrollees. Methods Participants in this study were enrollees of an employer-sponsored MTM program. They were included if their primary medical insurance and prescription coverage was from the City of Toledo, they had a diagnosis of type 2 diabetes, and whether or not they had been on medication or had been given a new prescription for diabetes treatment. The data were analyzed on a prospective, pre-post longitudinal basis, and tracked for one year following enrollment. Outcomes included economic costs, patient satisfaction, and self-reported patient adherence. Descriptive statistics were used to characterize the population, calculate the number of visits, and determine the mean costs for each visit. Friedman’s test was used to determine changes in outcomes due to the nonparametric nature of the data. Results The mean number of visits to a physician’s office decreased from 10.22 to 7.07. The mean cost of these visits for patients increased from $47.70 to $66.41, but use of the emergency room and inpatient visits decreased by at least 50%. Employer spending on emergency room visits decreased by $24,214.17 and inpatient visit costs decreased by $166,610.84. Office visit spending increased by $11,776.41. A total cost savings of $179,047.80 was realized by the employer at the end of the program. Significant improvements in patient satisfaction and adherence were observed. Conclusion Pharmacist interventions provided through the employer-sponsored MTM program led to substantial cost savings to the employer with improved patient satisfaction and adherence on the part of employees at the conclusion of the program. PMID:23610526
Simulation of LHC events on a million threads
NASA Astrophysics Data System (ADS)
Childers, J. T.; Uram, T. D.; LeCompte, T. J.; Papka, M. E.; Benjamin, D. P.
2015-12-01
Demand for Grid resources is expected to double during LHC Run II as compared to Run I; the capacity of the Grid, however, will not double. The HEP community must consider how to bridge this computing gap by targeting larger compute resources and using the available compute resources as efficiently as possible. Argonne's Mira, the fifth fastest supercomputer in the world, can run roughly five times the number of parallel processes that the ATLAS experiment typically uses on the Grid. We ported Alpgen, a serial x86 code, to run as a parallel application under MPI on the Blue Gene/Q architecture. By analysis of the Alpgen code, we reduced the memory footprint to allow running 64 threads per node, utilizing the four hardware threads available per core on the PowerPC A2 processor. Event generation and unweighting, typically run as independent serial phases, are coupled together in a single job in this scenario, reducing intermediate writes to the filesystem. By these optimizations, we have successfully run LHC proton-proton physics event generation at the scale of a million threads, filling two-thirds of Mira.
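Coupling event generation and unweighting in a single pass, as described above, amounts to accept-reject sampling without an intermediate weighted-event file. A minimal Python sketch follows; the weight function is hypothetical and stands in for Alpgen's matrix-element weights.

```python
import random

def generate_and_unweight(n_trials, max_weight, rng):
    """Generate weighted 'events' and unweight them in the same pass
    (accept-reject), instead of writing weighted events to disk first."""
    accepted = []
    for _ in range(n_trials):
        event = rng.uniform(0.0, 1.0)         # stand-in for a generated event
        weight = event ** 2 * max_weight      # hypothetical event weight
        if rng.uniform(0.0, max_weight) < weight:
            accepted.append(event)            # kept with probability w / w_max
    return accepted

rng = random.Random(42)
events = generate_and_unweight(10_000, 3.0, rng)
print(f"kept {len(events)} of 10000 weighted events")
```

Because each thread only ever holds one candidate event, the intermediate filesystem writes the abstract mentions disappear, which matters at the scale of a million concurrent threads.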
Running Jobs on the Peregrine System | High-Performance Computing | NREL
Learn about running jobs on the Peregrine high-performance computing (HPC) system: running different types of jobs, batch job scheduling policies (queue names, limits, etc.), requesting different node types, and sample batch scripts.
Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories
NASA Technical Reports Server (NTRS)
Ng, Hok Kwan; Sridhar, Banavar
2016-01-01
This study examines three possible approaches to improving the speed of generating wind-optimal routes for air traffic at the national or global level. They are: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers, and (c) implementing those same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); each is compared with a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization, using various numbers of CPUs ranging from 80 to 10,240 units, are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers, to assess the potential computational enhancement from parallel processing on computer clusters. This study also re-implements the trajectory optimization algorithm to further reduce computational time through algorithm modifications, and integrates it with FACET to enable the new features, which calculate time-optimal routes between worldwide airport pairs in a wind field, for use with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations compare computational efficiency and consider the potential applications of the optimized trajectories. The paper shows that, in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.
76 FR 31977 - Public Land Order No. 7768; Extension of Public Land Order No. 6861; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-02
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMTB01000-L14300000.ET0000; MTM 79264] Public Land Order No. 7768; Extension of Public Land Order No. 6861; Montana AGENCY: Bureau of Land... and Minerals Management. [FR Doc. 2011-13720 Filed 6-1-11; 8:45 am] BILLING CODE P ...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-14
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [61510-8451-0000; MTM 80092] Notice of Proposed Withdrawal Extension and Notification of a Public Meeting; Montana; Correction AGENCY: Bureau of... Staszak, Chief, Branch of Land Resources. [FR Doc. 2011-17716 Filed 7-13-11; 8:45 am] BILLING CODE 4310-55...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-03
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMTB072000-L14300000-ET0000; MTM 98499] Public Land Order No. 7803; Withdrawal of Public Lands for the Limestone Hills Training Area; MT AGENCY... Filed 10-2-12; 8:45 am] BILLING CODE 1430-DN-P ...
75 FR 55604 - Notice of Proposed Withdrawal Extension and Opportunity for Public Meeting; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-13
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMTM01000-L14300000.ET0000; MTM 79264] Notice of Proposed Withdrawal Extension and Opportunity for Public Meeting; Montana AGENCY: Bureau of... Miller, Acting Chief, Branch of Land Resources. [FR Doc. 2010-22740 Filed 9-10-10; 8:45 am] BILLING CODE...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-10
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [LLMT924000-L14300000.FR0000; MTM 99415] Notice of Correction to Notice of Realty Action; Application for Recordable Disclaimer of Interest... ``disclaimer''. Cindy Staszak, Chief, Branch of Land Resources. [FR Doc. 2010-2851 Filed 2-9-10; 8:45 am...
78 FR 35957 - Public Land Order No. 7815; Extension of Public Land Order No. 6997; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-14
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [61510-8451-0000; MTM 80092] Public Land Order No. 7815; Extension of Public Land Order No. 6997; Montana AGENCY: Bureau of Land Management.... Suh, Assistant Secretary--Policy, Management and Budget. [FR Doc. 2013-14115 Filed 6-13-13; 8:45 am...
WinHPC System | High-Performance Computing | NREL
NREL's WinHPC system is a computing cluster running the Microsoft Windows operating system. It allows users to run jobs requiring a Windows environment, such as ANSYS and MATLAB.
Analyzing Spacecraft Telecommunication Systems
NASA Technical Reports Server (NTRS)
Kordon, Mark; Hanks, David; Gladden, Roy; Wood, Eric
2004-01-01
Multi-Mission Telecom Analysis Tool (MMTAT) is a C-language computer program for analyzing proposed spacecraft telecommunication systems. MMTAT utilizes parameterized input and computational models that can be run on standard desktop computers to perform fast and accurate analyses of telecommunication links. MMTAT is easy to use and can easily be integrated with other software applications and run as part of almost any computational simulation. It is distributed as either a stand-alone application program with a graphical user interface or a linkable library with a well-defined set of application programming interface (API) calls. As a stand-alone program, MMTAT provides both textual and graphical output. The graphs make it possible to understand, quickly and easily, how telecommunication performance varies with variations in input parameters. A delimited text file that can be read by any spreadsheet program is generated at the end of each run. The API in the linkable-library form of MMTAT enables the user to control simulation software and to change parameters during a simulation run. Results can be retrieved either at the end of a run or by use of a function call at any time step.
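At the core of any such telecommunication link analysis is a link budget built on free-space path loss. The sketch below implements the standard formula; MMTAT's actual computational models are not given here, and the transmit power, antenna gains, and distance are hypothetical numbers chosen for illustration.

```python
import math

def free_space_path_loss_db(distance_km, freq_ghz):
    """Free-space path loss in dB: 92.45 + 20*log10(d_km) + 20*log10(f_GHz)."""
    return 92.45 + 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz)

def received_power_dbm(tx_dbm, tx_gain_dbi, rx_gain_dbi, distance_km, freq_ghz):
    """Simple link budget: P_rx = P_tx + G_tx + G_rx - FSPL (other losses ignored)."""
    return (tx_dbm + tx_gain_dbi + rx_gain_dbi
            - free_space_path_loss_db(distance_km, freq_ghz))

# Hypothetical deep-space X-band downlink, for illustration only
p_rx = received_power_dbm(tx_dbm=43.0, tx_gain_dbi=25.0, rx_gain_dbi=68.0,
                          distance_km=2.0e8, freq_ghz=8.4)
print(f"received power: {p_rx:.1f} dBm")
```

Sweeping a parameter such as distance through this function and plotting the result is exactly the kind of graphical sensitivity output the abstract describes.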
2011-08-01
[Figure-list residue omitted. Recoverable captions: Figure 4, architectural diagram of running Blender on Amazon EC2 through Nimbis; classification of streaming data, with example input images and digit prototypes (cluster centers) sized in proportion to frequency.]
Design for Run-Time Monitor on Cloud Computing
NASA Astrophysics Data System (ADS)
Kang, Mikyung; Kang, Dong-In; Yun, Mira; Park, Gyung-Leen; Lee, Junghoon
Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications, as well as infrastructure, as services over the Internet. A cloud is a type of parallel and distributed system consisting of a collection of interconnected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive service-based software with the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design a Run-Time Monitor (RTM), system software that monitors application behavior at run-time, analyzes the collected information, and optimizes resources on cloud computing. RTM monitors application software through library instrumentation, as well as the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.
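The library-instrumentation idea can be illustrated with a toy monitor. The Python sketch below wraps functions to record call counts and cumulative wall time; the paper's RTM also reads hardware performance counters and drives resource optimization, which this omits.

```python
import functools
import time
from collections import defaultdict

class RunTimeMonitor:
    """Toy run-time monitor: wraps functions (library instrumentation)
    and records call counts and cumulative wall time per function."""

    def __init__(self):
        self.stats = defaultdict(lambda: {"calls": 0, "seconds": 0.0})

    def instrument(self, fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:                          # record even if fn raises
                rec = self.stats[fn.__name__]
                rec["calls"] += 1
                rec["seconds"] += time.perf_counter() - start
        return wrapper

rtm = RunTimeMonitor()

@rtm.instrument
def workload(n):
    return sum(i * i for i in range(n))

for _ in range(3):
    workload(10_000)
print(rtm.stats["workload"]["calls"])  # 3 instrumented calls recorded
```

An adaptive system would periodically inspect `rtm.stats` and reconfigure resources, closing the monitor-analyze-adapt loop the abstract describes.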
Cloud Computing for Complex Performance Codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin
This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.
NASA Astrophysics Data System (ADS)
Myre, Joseph M.
Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special-purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large-scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general-purpose processors and special-purpose accelerators, the speed and problem size of many simulations can be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH, a framework for reducing the complexity of programming heterogeneous computer systems; 2) geophysical inversion routines which can be used to characterize physical systems; and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes.
Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that this environment provides scientists and engineers with means to reduce the programmatic complexity of their applications, to perform geophysical inversions for characterizing physical systems, and to determine high-performing run-time configurations of heterogeneous computing systems using a run-time autotuner.
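The run-time autotuning component can be illustrated with a deliberately simplified sketch: benchmark a kernel under several candidate configurations and keep the fastest. The "block size" knob and the toy kernel below are stand-ins for the hardware configuration parameters discussed above, not the autotuner described in the work.

```python
import time

# Hedged sketch of run-time autotuning: time a kernel under each candidate
# configuration and keep the one that runs fastest on this machine.

def kernel(data, block):
    # Toy workload whose performance depends on a tunable "block size".
    total = 0
    for i in range(0, len(data), block):
        total += sum(data[i:i + block])
    return total

def autotune(data, candidates, trials=3):
    best, best_t = None, float("inf")
    for block in candidates:
        t0 = time.perf_counter()
        for _ in range(trials):
            kernel(data, block)
        t = time.perf_counter() - t0
        if t < best_t:
            best, best_t = block, t
    return best

data = list(range(100_000))
best_block = autotune(data, candidates=[16, 256, 4096])
print(best_block)
```

A real autotuner would explore a much larger configuration space (thread counts, memory layouts, device choices) and could re-tune as conditions change at run-time.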
Nonlinear Analysis of a Bolted Marine Riser Connector Using NASTRAN Substructuring
NASA Technical Reports Server (NTRS)
Fox, G. L.
1984-01-01
Results of an investigation of the behavior of a bolted, flange-type marine riser connector are reported. The method used to account for the nonlinear effect of connector separation due to bolt preload and axial tension load is described. The automated multilevel substructuring capability of COSMIC/NASTRAN was employed at considerable savings in computer run time. Simplified formulas for computer resources, i.e., computer run times for modules SDCOMP, FBS, and MPYAD, as well as disk storage space, are presented. Actual run time data on a VAX-11/780 are compared with the formulas presented.
Scalable computing for evolutionary genomics.
Prins, Pjotr; Belhachemi, Dominique; Möller, Steffen; Smant, Geert
2012-01-01
Genomic data analysis in evolutionary biology is becoming so computationally intensive that analysis of multiple hypotheses and scenarios takes too long on a single desktop computer. In this chapter, we discuss techniques for scaling computations through parallelization of calculations, after giving a quick overview of advanced programming techniques. Unfortunately, parallel programming is difficult and requires special software design. The alternative, especially attractive for legacy software, is to introduce poor man's parallelization by running whole programs in parallel as separate processes, using job schedulers. Such pipelines are often deployed on bioinformatics computer clusters. Recent advances in PC virtualization have made it possible to run a full computer operating system, with all of its installed software, on top of another operating system, inside a "box," or virtual machine (VM). Such a VM can flexibly be deployed on multiple computers, in a local network, e.g., on existing desktop PCs, and even in the Cloud, to create a "virtual" computer cluster. Many bioinformatics applications in evolutionary biology can be run in parallel, running processes in one or more VMs. Here, we show how a ready-made bioinformatics VM image, named BioNode, effectively creates a computing cluster, and pipeline, in a few steps. This allows researchers to scale up computations from their desktop, using available hardware, whenever required. BioNode is based on Debian Linux and can run on networked PCs and in the Cloud. Over 200 bioinformatics and statistical software packages of interest to evolutionary biology are included, such as PAML, Muscle, MAFFT, MrBayes, and BLAST. Most of these software packages are maintained through the Debian Med project. In addition, BioNode contains convenient configuration scripts for parallelizing bioinformatics software.
Where Debian Med encourages packaging free and open source bioinformatics software through one central project, BioNode encourages creating free and open source VM images, for multiple targets, through one central project. BioNode can be deployed on Windows, OSX, Linux, and in the Cloud. Next to the downloadable BioNode images, we provide tutorials online, which empower bioinformaticians to install and run BioNode in different environments, as well as information for future initiatives, on creating and building such images.
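The "poor man's parallelization" strategy described above, running whole programs in parallel as separate processes with no inter-process communication, can be sketched in a few lines. The job below is a trivial placeholder for invoking a real tool such as BLAST or MAFFT; the dispatch pattern is the point.

```python
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor

# Sketch of job-scheduler-style parallelism: each job is an independent OS
# process, so whole (possibly legacy) programs parallelize without any
# special software design. The command here is a stand-in for a real tool.

def run_job(arg):
    result = subprocess.run(
        [sys.executable, "-c", f"print({arg} * {arg})"],
        capture_output=True, text=True, check=True,
    )
    return int(result.stdout)

# Threads are fine for dispatch: the work happens in the child processes.
with ThreadPoolExecutor(max_workers=4) as pool:
    squares = list(pool.map(run_job, range(5)))

print(squares)  # [0, 1, 4, 9, 16]
```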
Fingerprinting Communication and Computation on HPC Machines
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean
2010-06-02
How do we identify what is actually running on high-performance computing systems? Names of binaries, dynamic libraries loaded, or other elements in a submission to a batch queue can give clues, but binary names can be changed, and libraries provide limited insight and resolution on the code being run. In this paper, we present a method for "fingerprinting" code running on HPC machines using elements of communication and computation. We then discuss how that fingerprint can be used to determine if the code is consistent with certain other types of codes, what a user usually runs, or what the user requested an allocation to do. In some cases, our techniques enable us to fingerprint HPC codes using runtime MPI data with a high degree of accuracy.
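One plausible reading of the fingerprinting idea is to represent each run by a profile of its communication events and compare profiles numerically. The sketch below is an illustrative assumption, not the paper's actual method: the MPI call counts are invented, and cosine similarity stands in for whatever classifier the authors used.

```python
import math

# Illustrative fingerprinting sketch: a run is summarized by its (here,
# hypothetical) MPI call-count vector, and runs are compared by cosine
# similarity. All counts below are invented for illustration.

CALLS = ["MPI_Send", "MPI_Recv", "MPI_Allreduce", "MPI_Bcast"]

def fingerprint(counts):
    return [counts.get(c, 0) for c in CALLS]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

known_code = fingerprint({"MPI_Send": 900, "MPI_Recv": 900, "MPI_Allreduce": 50})
observed   = fingerprint({"MPI_Send": 870, "MPI_Recv": 910, "MPI_Allreduce": 60})
unrelated  = fingerprint({"MPI_Bcast": 500, "MPI_Allreduce": 700})

# The observed run resembles the known code far more than the unrelated one.
print(cosine(known_code, observed) > cosine(known_code, unrelated))  # True
```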
METHYLATION OF ARSENIC BY RECOMBINANT HUMAN AS3MT/287M AND AS3MT/287T POLYMORPHS
Arsenic (+3 oxidation state) methyltransferase (AS3MT) is the key enzyme in the pathway for methylation of inorganic arsenic (iAs). AS3MT polymorphism is, in part, responsible for interindividual differences in iAs metabolism. AS3MT/M287T polymorphism that is found in ~ 10% of C...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, R.J.
1976-11-01
The FFTF fuel pin design analysis is shown to be conservative through comparison with pin irradiation experience in EBR-II. This comparison shows that the actual lifetimes of EBR-II fuel pins are either greater than 80,000 MWd/MTM or greater than the calculated allowable lifetimes based on thermal creep strain.
46 CFR Table I to Part 150 - Alphabetical List of Cargoes
Code of Federal Regulations, 2014 CFR
2014-10-01
... (C17+) alkanoic acid 34 CUS CFT Corn syrup 43 CSY Cottonseed oil, fatty acid 34 CFY Creosote 21 2 CCT... tar 33 COR OCT Coal tar distillate 33 CDL Coal tar, high temperature 33 CHH Coal tar pitch 33 CTP... MTM Formaldehyde solution 19 2 FMS Formamide 10 FAM Formic acid 4 2 FMA Fructose solution 43 Fumaric...
77 FR 46111 - Public Land Order No. 7792; Partial Revocation, Power Site Reserve No. 109; Montana
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-02
... DEPARTMENT OF THE INTERIOR Bureau of Land Management [MT-LLB05000-LL14300000-FQ0000; MTM 40412] Public Land Order No. 7792; Partial Revocation, Power Site Reserve No. 109; Montana AGENCY: Bureau of...--Policy, Management and Budget. [FR Doc. 2012-18888 Filed 8-1-12; 8:45 am] BILLING CODE 4310-DN-P ...
Geologic Map of the Olympia Cavi Region of Mars (MTM 85200): A Summary of Tactical Approaches
NASA Technical Reports Server (NTRS)
Skinner, J. A., Jr.; Herkenhoff, K.
2010-01-01
The 1:500K-scale geologic map of MTM 85200 - the Olympia Cavi region of Mars - has been submitted for peer review [1]. Physiographically, the quadrangle includes portions of Olympia Rupes, a set of sinuous scarps which elevate Planum Boreum 800 meters above Olympia Planum. The region includes the high-standing, spiral troughs of Boreales Scopuli, the rugged and deep depressions of Olympia Cavi, and the vast dune fields of Olympia Undae. Geologically, the mapped units and landforms reflect the recent history of repeated accumulation and degradation. The widespread occurrence of both weakly and strongly stratified units implicates the drape-like accumulation of ice, dust, and sand through climatic variations. Similarly, the occurrence of layer truncations, particularly at unit boundaries, implicates punctuated periods of both localized and regional erosion and surface deflation, whereby underlying units were exhumed and their material transported and re-deposited. Herein, we focus on the iterative mapping approaches that not only accommodated the burgeoning variety and volume of data sets, but also facilitated the efficient presentation of map information. Unit characteristics and their geologic history are detailed in past abstracts [2-3].
Pan, Bai Cao; Tang, Wen Xuan; Qi, Mei Qing; Ma, Hui Feng; Tao, Zui; Cui, Tie Jun
2016-07-22
Mutual coupling inside an antenna array usually arises via two routes: signal leakage through conducting currents on the metallic background or surface waves along substrates, and radio leakage received through the space between antenna elements. The former can be suppressed by changing the distribution of surface currents, as reported in the literature. For the latter, radiation-leakage-caused coupling, traditional approaches using circuit manipulation may be inefficient. In this article, we propose and design a new type of decoupling module, which is composed of coupled metamaterial (MTM) slabs. Two classes of MTM particles, the interdigital structure (IS) and the split-ring resonators (SRRs), are adopted to provide the first and second modulations of the signal. We validate its function to reduce the radiation leakage between two dual-polarized patch antennas. A prototype is fabricated in a volume with subwavelength scale (0.6λ × 0.3λ × 0.053λ) to provide 7 dB improvement for both co-polarization and cross-polarization isolations from 1.95 to 2.2 GHz. The design has good potential for wireless communication and radar systems.
Robot computer problem solving system
NASA Technical Reports Server (NTRS)
Becker, J. D.; Merriam, E. W.
1974-01-01
The conceptual, experimental, and practical phases of developing a robot computer problem solving system are outlined. Robot intelligence, conversion of the programming language SAIL to run under the THNEX monitor, and the use of the network to run several cooperating jobs at different sites are discussed.
Active Nodal Task Seeking for High-Performance, Ultra-Dependable Computing
1994-07-01
implementation. Figure 1 shows a hardware organization of ANTS: stand-alone computing nodes inter-connected by buses. 2.1 Run Time Partitioning The...nodes respond to changing loads [27] or system reconfiguration [26]. Existing techniques are all source-initiated or server-initiated [27]. 5.1...short-running task segments. The task segments must be short-running in order that processors will become available often enough to satisfy changing
Parallel computing for automated model calibration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burke, John S.; Danielson, Gary R.; Schulz, Douglas A.
2002-07-29
Natural resources model calibration is a significant burden on computing and staff resources in modeling efforts. Most assessments must consider multiple calibration objectives (for example, magnitude and timing of stream flow peak). An automated calibration process that allows real-time updating of data/models, allowing scientists to focus effort on improving models, is needed. We are in the process of building a fully featured multi-objective calibration tool capable of processing multiple models cheaply and efficiently using null cycle computing. Our parallel processing and calibration software routines have been written generically, but our focus has been on natural resources model calibration. So far, the natural resources models have been friendly to parallel calibration efforts in that they require no inter-process communication, need only a small amount of input data, and output only a small amount of statistical information for each calibration run. A typical auto calibration run might involve running a model 10,000 times with a variety of input parameters and summary statistical output. In the past, model calibration has been done against individual models for each data set. The individual model runs are relatively fast, ranging from seconds to minutes. The process was run on a single computer using a simple iterative process. We have completed two Auto Calibration prototypes and are currently designing a more feature-rich tool. Our prototypes have focused on running the calibration in a distributed computing cross-platform environment. They allow incorporation of "smart" calibration parameter generation (using artificial intelligence processing techniques). Null cycle computing, similar to SETI@home, has also been a focus of our efforts. This paper details the design of the latest prototype and discusses our plans for the next revision of the software.
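The calibration loop described above is embarrassingly parallel: each model run is independent, takes a small parameter set in, and returns a small score out. The sketch below illustrates that shape with an invented linear "model" and invented observations; a real tool would dispatch thousands of runs across distributed machines rather than a local thread pool.

```python
from concurrent.futures import ThreadPoolExecutor

# Minimal sketch of an automated-calibration sweep: run a model over many
# candidate parameter sets in parallel, score each run against observations,
# and keep the best-fitting parameters. Model and data are placeholders.

OBSERVED = [2.1, 4.0, 6.2, 7.9]  # hypothetical stream-flow observations

def model_run(slope):
    # Stand-in for a natural-resources model: predicted flow at times 1..4.
    return [slope * t for t in (1, 2, 3, 4)]

def score(slope):
    predicted = model_run(slope)
    sse = sum((p - o) ** 2 for p, o in zip(predicted, OBSERVED))
    return slope, sse

candidates = [s / 10 for s in range(10, 31)]  # slopes 1.0 .. 3.0
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(score, candidates))

best_slope, best_sse = min(results, key=lambda r: r[1])
print(best_slope)  # 2.0
```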
The Impact and Promise of Open-Source Computational Material for Physics Teaching
NASA Astrophysics Data System (ADS)
Christian, Wolfgang
2017-01-01
A computer-based modeling approach to teaching must be flexible because students and teachers have different skills and varying levels of preparation. Learning how to run the "software du jour" is not the objective for integrating computational physics material into the curriculum. Learning computational thinking, how to use computation and computer-based visualization to communicate ideas, how to design and build models, and how to use ready-to-run models to foster critical thinking is the objective. Our computational modeling approach to teaching is a research-proven pedagogy that predates computers. It attempts to enhance student achievement through the Modeling Cycle. This approach was pioneered by Robert Karplus and the SCIS Project in the 1960s and 70s and later extended by the Modeling Instruction Program led by Jane Jackson and David Hestenes at Arizona State University. This talk describes a no-cost open-source computational approach aligned with a Modeling Cycle pedagogy. Our tools, curricular material, and ready-to-run examples are freely available from the Open Source Physics Collection hosted on the AAPT-ComPADRE digital library. Examples will be presented.
Colt: an experiment in wormhole run-time reconfiguration
NASA Astrophysics Data System (ADS)
Bittner, Ray; Athanas, Peter M.; Musgrove, Mark
1996-10-01
Wormhole run-time reconfiguration (RTR) is an attempt to create a refined computing paradigm for high performance computational tasks. By combining concepts from field programmable gate array (FPGA) technologies with data flow computing, the Colt/Stallion architecture achieves high utilization of hardware resources, and facilitates rapid run-time reconfiguration. Targeted mainly at DSP-type operations, the Colt integrated circuit -- a prototype wormhole RTR device -- compares favorably to contemporary DSP alternatives in terms of silicon area consumed per unit computation and in computing performance. Although emphasis has been placed on signal processing applications, general purpose computation has not been overlooked. Colt is a prototype that defines an architecture not only at the chip level but also in terms of an overall system design. As this system is realized, the concept of wormhole RTR will be applied to numerical computation and DSP applications including those common to image processing, communications systems, digital filters, acoustic processing, real-time control systems and simulation acceleration.
Virtualization and cloud computing in dentistry.
Chow, Frank; Muftu, Ali; Shorter, Richard
2014-01-01
The use of virtualization and cloud computing has changed the way we use computers. Virtualization is a method of placing software called a hypervisor on the hardware of a computer or a host operating system. It allows a guest operating system to run on top of the physical computer as a virtual machine (i.e., a virtual computer). Virtualization allows multiple virtual computers to run on top of one physical computer and to share its hardware resources, such as printers, scanners, and modems. This increases the efficient use of the computer by decreasing costs (e.g., hardware, electricity, administration, and management) since only one physical computer is needed and running. This virtualization platform is the basis for cloud computing. It has expanded into areas of server and storage virtualization. One of the commonly used dental storage systems is cloud storage. Patient information is encrypted as required by the Health Insurance Portability and Accountability Act (HIPAA) and stored on off-site private cloud services for a monthly service fee. As computer costs continue to increase, so too will the need for more storage and processing power. Virtual and cloud computing will be a method for dentists to minimize costs and maximize computer efficiency in the near future. This article will provide some useful information on current uses of cloud computing.
Framework for architecture-independent run-time reconfigurable applications
NASA Astrophysics Data System (ADS)
Lehn, David I.; Hudson, Rhett D.; Athanas, Peter M.
2000-10-01
Configurable Computing Machines (CCMs) have emerged as a technology with the computational benefits of custom ASICs as well as the flexibility and reconfigurability of general-purpose microprocessors. Significant effort from the research community has focused on techniques to move this reconfigurability from a rapid application development tool to a run-time tool. This requires the ability to change the hardware design while the application is executing and is known as Run-Time Reconfiguration (RTR). Widespread acceptance of run-time reconfigurable custom computing depends upon the existence of high-level automated design tools. Such tools must reduce the designer's effort to port applications between different platforms as the architecture, hardware, and software evolve. A Java implementation of a high-level application framework, called Janus, is presented here. In this environment, developers create Java classes that describe the structural behavior of an application. The framework allows hardware and software modules to be freely mixed and interchanged. A compilation phase of the development process analyzes the structure of the application and adapts it to the target platform. Janus is capable of structuring the run-time behavior of an application to take advantage of the memory and computational resources available.
Progress in Machine Learning Studies for the CMS Computing Infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bonacorsi, Daniele; Kuznetsov, Valentin; Magini, Nicolo
Here, computing systems for LHC experiments have developed together with Grids worldwide. While a complete description of the original Grid-based infrastructure and services for LHC experiments and its recent evolutions can be found elsewhere, it is worth mentioning here the scale of the computing resources needed to fulfill the needs of LHC experiments in Run-1 and Run-2 so far.
Multitasking the code ARC3D. [for computational fluid dynamics
NASA Technical Reports Server (NTRS)
Barton, John T.; Hsiung, Christopher C.
1986-01-01
The CRAY multitasking system was developed in order to utilize all four processors and sharply reduce the wall clock run time. This paper describes the techniques used to modify the computational fluid dynamics code ARC3D for this run and analyzes the achieved speedup. The ARC3D code solves either the Euler or thin-layer N-S equations using an implicit approximate factorization scheme. Results indicate that multitask processing can be used to achieve wall clock speedup factors of over three times, depending on the nature of the program code being used. Multitasking appears to be particularly advantageous for large-memory problems running on multiple CPU computers.
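The reported wall-clock speedup of over three on four processors is consistent with Amdahl's law for a code whose serial fraction is small. The quick evaluation below just applies the law; the serial fractions shown are illustrative choices, not measured ARC3D values.

```python
# Amdahl's law: speedup = 1 / (f + (1 - f)/P), where f is the serial
# fraction of the code and P the number of processors. With P = 4, a
# speedup above 3 requires a serial fraction under about 8 percent.

def amdahl_speedup(serial_fraction, processors):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

for f in (0.01, 0.05, 0.10):
    print(f, round(amdahl_speedup(f, 4), 2))  # 3.88, 3.48, 3.08
```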
NASA Technical Reports Server (NTRS)
Eberhardt, D. S.; Baganoff, D.; Stevens, K.
1984-01-01
Implicit approximate-factored algorithms have certain properties that are suitable for parallel processing. A particular computational fluid dynamics (CFD) code, using this algorithm, is mapped onto a multiple-instruction/multiple-data-stream (MIMD) computer architecture. An explanation of this mapping procedure is presented, as well as some of the difficulties encountered when trying to run the code concurrently. Timing results are given for runs on the Ames Research Center's MIMD test facility which consists of two VAX 11/780's with a common MA780 multi-ported memory. Speedups exceeding 1.9 for characteristic CFD runs were indicated by the timing results.
Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing
Kang, Mikyung; Kang, Dong-In; Crago, Stephen P.; Park, Gyung-Leen; Lee, Junghoon
2011-01-01
Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet, as well as the underlying infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), system software that monitors application behavior at run-time, analyzes the collected information, and optimizes cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation, as well as underlying hardware through a performance counter, optimizing its computing configuration based on the analyzed data. PMID:22163811
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Tianzhen; Buhl, Fred; Haves, Philip
2008-09-20
EnergyPlus is a new-generation building performance simulation program offering many new modeling capabilities and more accurate performance calculations, integrating building components in sub-hourly time steps. However, EnergyPlus runs much slower than the current generation of simulation programs, which has become a major barrier to its widespread adoption by the industry. This paper analyzed EnergyPlus run time from comprehensive perspectives to identify key issues and challenges of speeding up EnergyPlus: studying the historical trends of EnergyPlus run time based on the advancement of computers and code improvements to EnergyPlus, comparing EnergyPlus with DOE-2 to understand and quantify the run time differences, identifying key simulation settings and model features that have significant impacts on run time, and performing code profiling to identify which EnergyPlus subroutines consume the most run time. This paper provides recommendations to improve EnergyPlus run time from the modeler's perspective and identifies adequate computing platforms. Suggestions of software code and architecture changes to improve EnergyPlus run time, based on the code profiling results, are also discussed.
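The code-profiling step described above can be illustrated with Python's standard profiler. The toy "subroutine" below merely stands in for simulation code; the workflow shown (profile a run, sort by cumulative time, inspect the top entries) is the general technique, not EnergyPlus specifics.

```python
import cProfile
import io
import pstats

# Sketch of code profiling to find where run time goes: run a workload under
# cProfile, then rank functions by cumulative time.

def surface_heat_balance(n):
    # Toy stand-in for an expensive simulation subroutine.
    return sum(i * i for i in range(n))

def run_simulation():
    for _ in range(200):
        surface_heat_balance(5000)

profiler = cProfile.Profile()
profiler.enable()
run_simulation()
profiler.disable()

stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream).sort_stats("cumulative")
stats.print_stats(5)  # top 5 entries by cumulative time
report = stream.getvalue()
print("surface_heat_balance" in report)  # True
```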
Compressed quantum computation using a remote five-qubit quantum computer
NASA Astrophysics Data System (ADS)
Hebenstreit, M.; Alsina, D.; Latorre, J. I.; Kraus, B.
2017-05-01
The notion of compressed quantum computation is employed to simulate the Ising interaction of a one-dimensional chain consisting of n qubits using the universal IBM cloud quantum computer running on log2(n) qubits. The external field parameter that controls the quantum phase transition of this model translates into particular settings of the quantum gates that generate the circuit. We measure the magnetization, which displays the quantum phase transition, on a two-qubit system, which simulates a four-qubit Ising chain, and show its agreement with the theoretical prediction within a certain error. We also discuss the relevant point of how to assess errors when using a cloud quantum computer with a limited number of runs. As a solution, we propose to use validating circuits, that is, to run independent controlled quantum circuits of similar complexity to the circuit of interest.
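The error-assessment question raised above, what a limited number of runs does to an estimated expectation value, can be illustrated with simple shot-noise statistics: a probability estimated from N shots carries a binomial standard error of roughly sqrt(p(1-p)/N). The sketch below simulates this; the true outcome probability is an arbitrary illustrative choice, unrelated to the paper's circuits.

```python
import math
import random

# Shot-noise sketch: estimate a measurement probability from a limited
# number of simulated runs and attach its binomial standard error.

random.seed(7)  # fixed seed for reproducibility

def estimate(p_true, shots):
    ones = sum(random.random() < p_true for _ in range(shots))
    p_hat = ones / shots
    std_err = math.sqrt(p_hat * (1 - p_hat) / shots) if 0 < p_hat < 1 else 0.0
    return p_hat, std_err

p_hat_small, se_small = estimate(0.8, 100)      # few runs: noisy estimate
p_hat_large, se_large = estimate(0.8, 10_000)   # many runs: tight estimate

print(p_hat_small, se_small)
print(p_hat_large, se_large)
```

A validating circuit of similar complexity, as the authors propose, would be characterized the same way, with its known expected outcome used to gauge how much of the observed deviation is hardware error rather than shot noise.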
ABSTRACT Arsenic (+3 oxidation state) methyltransferase (AS3MT) is the key enzyme in the pathway for methylation of arsenicals. A common polymorphism in the AS3MT gene that replaces a threonyl residue in position 287 with a methionyl residue (AS3MT/M287T) occurs at a frequency...
Geologic Mapping in Southern Margaritifer Terra
NASA Technical Reports Server (NTRS)
Irwin, R. P., III; Grant, J. A.
2010-01-01
Margaritifer Terra records a complex geologic history [1-5], and the area from Holden crater through Ladon Valles, Ladon basin, and up to Morava Valles is no exception [e.g., 6-13]. The 1:500,000 geologic map of MTM quadrangles -15027, -20027, -25027, and -25032 (Figs. 1 and 2 [14]) identifies a range of units that delineate the history of water-related activity and regional geologic context.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-15
... MTM 41534] Public Land Order No. 7745; Partial Revocation of Power Site Reserve Nos. 510 and No. 515... of public lands withdrawn for Power Site Reserve Nos. 510 and 515. This order also opens the lands to... INFORMATION: The Bureau of Land Management has determined that portions of Power Site Reserve Nos. 510 and 515...
Point-vortex stability under the influence of an external periodic flow
NASA Astrophysics Data System (ADS)
Ortega, Rafael; Ortega, Víctor; Torres, Pedro J.
2018-05-01
We provide sufficient conditions for the stability of the particle advection around a fixed vortex in a two-dimensional ideal fluid under the action of a periodic background flow. The proof relies on the identification of closed invariant curves around the origin by means of Moser’s invariant curve theorem. Partially supported by Spanish MINECO and ERDF project MTM2014-52232-P.
CASK and CaMKII function in the mushroom body α'/β' neurons during Drosophila memory formation.
Malik, Bilal R; Gillespie, John Michael; Hodge, James J L
2013-01-01
Ca(2+)/CaM serine/threonine kinase II (CaMKII) is a central molecule in mechanisms of synaptic plasticity and memory. A vital feature of CaMKII in plasticity is its ability to switch to a calcium (Ca(2+))-independent, constitutively active state after autophosphorylation at threonine 287 (T287). A second pair of sites, T306/T307 in the calmodulin (CaM) binding region, once autophosphorylated, prevents subsequent CaM binding and inactivates the kinase during synaptic plasticity and memory. Recently, a synaptic molecule called Ca(2+)/CaM-dependent serine protein kinase (CASK) has been shown to control both sets of CaMKII autophosphorylation events and hence is well poised to be a key regulator of memory. We show that deletion of full-length CASK, or of just its CaMK-like and L27 domains, disrupts middle-term memory (MTM) and long-term memory (LTM), with CASK function in the α'/β' subset of mushroom body neurons being required for memory. Likewise, directly changing the levels of CaMKII autophosphorylation in these neurons removed MTM and LTM. The requirement for CASK and CaMKII autophosphorylation was not developmental, as their manipulation in just the adult α'/β' neurons was sufficient to remove memory. Overexpression of CASK or CaMKII in the α'/β' neurons also occluded MTM and LTM. Overexpression of either Drosophila or human CASK in the α'/β' neurons of the CASK mutant completely rescued memory, confirming that CASK signaling in α'/β' neurons is necessary and sufficient for Drosophila memory formation and that the neuronal function of CASK is conserved between Drosophila and human. At the cellular level, CaMKII overexpression in the α'/β' neurons increased activity-dependent Ca(2+) responses, while reduction of CaMKII decreased them. Likewise, reducing CASK or directly expressing a phosphomimetic CaMKII T287D transgene in the α'/β' neurons similarly decreased Ca(2+) signaling.
Our results are consistent with CASK regulating CaMKII autophosphorylation in a pathway required for memory formation that involves activity-dependent changes in Ca(2+) signaling in the α'/β' neurons.
Taşlı, Nurdan Gamze; Çimen, Ferda Keskin; Karakurt, Yücel; Uçak, Turgay; Mammadov, Renad; Süleyman, Bahadır; Kurt, Nezahat; Süleyman, Halis
2018-01-01
To determine the effects of Rutin on methanol-induced optic neuropathy and compare the results with the effects of ethanol. In total, 30 rats were divided into 5 groups of 6 rats each: healthy controls (C), methotrexate (MTX), methotrexate+methanol (MTM), methotrexate+methanol+ethanol (MTME), and methotrexate+methanol+Rutin (MTMR). In all rats except those of the control group, MTX diluted in sterile physiologic serum was administered per os at 0.3 mg/kg for 7d via a tube. After this procedure, the rats of the MTM, MTME, and MTMR groups received 20% methanol per os at a dose of 3 g/kg via a tube. In the MTME group, 4h after the administration of methanol, 20% ethanol was given in the same way at a dose of 0.5 g/kg. In the MTMR group, 4h after the administration of methanol, Rutin dissolved in distilled water was given in the same way at a dose of 50 mg/kg. There were statistically significant differences in tissue 8-hydroxy-2'-deoxyguanosine (8-OHdG), interleukin-1β (IL-1β), tumor necrosis factor-alpha (TNF-α), malondialdehyde (MDA), myeloperoxidase (MPO), total glutathione (tGSH), and superoxide dismutase (SOD) levels between groups (P<0.001). In the MTMR group, tissue 8-OHdG, IL-1β, MDA, and MPO levels were similar to those of the healthy controls but significantly different from those of the other groups. In histopathological evaluations, the MTX group showed moderate focal destruction, hemorrhage, and a decrease in the number of astrocytes and oligodendrocytes; the MTM group showed severe destruction and edema with a decrease in the number of astrocytes and oligodendrocytes; the MTME group showed mild hemorrhage, mild edema, and mildly dilated blood vessels with congestion; while in the MTMR group, the optic nerve tissue resembled that of the healthy controls. Rutin may prevent methanol-induced optic neuropathy via anti-inflammatory effects and by decreasing oxidative stress.
New treatment options are warranted in this disease to avoid loss of vision in patients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weavers, P; Shu, Y; Tao, S
Purpose: A high-performance head-only magnetic resonance imaging gradient system with an acquisition volume of 26 cm, employing an asymmetric design for the transverse coils, has been developed. It is able to reach a magnitude of 85 mT/m at a slew rate of 700 T/m/s, but was operated at 80 mT/m and 500 T/m/s for this test. A challenge resulting from this asymmetric design is that the gradient nonlinearity exhibits both odd- and even-ordered terms, and as the full imaging field of view is often used, the nonlinearity is pronounced. The purpose of this work is to show that the system can produce clinically useful images after an on-site gradient nonlinearity calibration and correction, and to show that acoustic noise levels fall within non-significant risk (NSR) limits for standard clinical pulse sequences. Methods: The head-only gradient system was inserted into a standard 3T wide-bore scanner without acoustic damping. The ACR phantom was scanned in an 8-channel receive-only head coil and the standard American College of Radiology (ACR) MRI quality control (QC) test was performed. Acoustic noise levels were measured for several standard pulse sequences. Results: Images acquired with the head-only gradient system passed all ACR MR image quality tests; both even- and odd-order gradient distortion correction terms were required for the asymmetric gradients to pass. Acoustic noise measurements were within FDA NSR guidelines of 99 dBA A-weighted (with assumed 20 dBA hearing protection) and 140 dB peak for all but one sequence. Note that the gradient system was installed without any shroud or acoustic batting; we expect final system integration to greatly reduce the noise experienced by the patient. Conclusion: A high-performance head-only asymmetric gradient system operating at 80 mT/m and 500 T/m/s conforms to FDA acoustic noise limits in all but one case, and passes all the ACR MR image quality control tests. This work was supported in part by NIH grant 5R01EB010065.
Self-Scheduling Parallel Methods for Multiple Serial Codes with Application to WOPWOP
NASA Technical Reports Server (NTRS)
Long, Lyle N.; Brentner, Kenneth S.
2000-01-01
This paper presents a scheme for efficiently running a large number of serial jobs on parallel computers. Two examples are given of computer programs that run relatively quickly, but often they must be run numerous times to obtain all the results needed. It is very common in science and engineering to have codes that are not massive computing challenges in themselves, but due to the number of instances that must be run, they do become large-scale computing problems. The two examples given here represent common problems in aerospace engineering: aerodynamic panel methods and aeroacoustic integral methods. The first example simply solves many systems of linear equations. This is representative of an aerodynamic panel code where someone would like to solve for numerous angles of attack. The complete code for this first example is included in the appendix so that it can be readily used by others as a template. The second example is an aeroacoustics code (WOPWOP) that solves the Ffowcs Williams Hawkings equation to predict the far-field sound due to rotating blades. In this example, one quite often needs to compute the sound at numerous observer locations, hence parallelization is utilized to automate the noise computation for a large number of observers.
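The self-scheduling idea described above, handing each idle processor the next pending serial case, can be sketched with a Python worker pool. This is a generic illustration, not the paper's actual scheme; `serial_job` is a hypothetical stand-in for one panel-code solve or one observer's noise computation.

```python
from multiprocessing import Pool

def serial_job(angle_of_attack):
    # Hypothetical stand-in for one independent serial run, e.g. a
    # panel-code solve at one angle of attack.
    return angle_of_attack, angle_of_attack ** 2

def run_all(cases, workers=4):
    # imap_unordered with chunksize=1 hands each idle worker the next
    # pending case: a simple self-scheduling (dynamic) work queue.
    with Pool(workers) as pool:
        return dict(pool.imap_unordered(serial_job, cases, chunksize=1))
```

Because cases are claimed one at a time as workers free up, slow and fast cases balance out automatically, which is the point of self-scheduling over a static split.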
Counterfactual quantum computation through quantum interrogation
NASA Astrophysics Data System (ADS)
Hosten, Onur; Rakher, Matthew T.; Barreiro, Julio T.; Peters, Nicholas A.; Kwiat, Paul G.
2006-02-01
The logic underlying the coherent nature of quantum information processing often deviates from intuitive reasoning, leading to surprising effects. Counterfactual computation constitutes a striking example: the potential outcome of a quantum computation can be inferred, even if the computer is not run. Relying on similar arguments to interaction-free measurements (or quantum interrogation), counterfactual computation is accomplished by putting the computer in a superposition of `running' and `not running' states, and then interfering the two histories. Conditional on the as-yet-unknown outcome of the computation, it is sometimes possible to counterfactually infer information about the solution. Here we demonstrate counterfactual computation, implementing Grover's search algorithm with an all-optical approach. It was believed that the overall probability of such counterfactual inference is intrinsically limited, so that it could not perform better on average than random guesses. However, using a novel `chained' version of the quantum Zeno effect, we show how to boost the counterfactual inference probability to unity, thereby beating the random guessing limit. Our methods are general and apply to any physical system, as illustrated by a discussion of trapped-ion systems. Finally, we briefly show that, in certain circumstances, counterfactual computation can eliminate errors induced by decoherence.
4273π: Bioinformatics education on low cost ARM hardware
2013-01-01
Background Teaching bioinformatics at universities is complicated by typical computer classroom settings. As well as running software locally and online, students should gain experience of systems administration. For a future career in biology or bioinformatics, the installation of software is a useful skill. We propose that this may be taught by running the course on GNU/Linux running on inexpensive Raspberry Pi computer hardware, for which students may be granted full administrator access. Results We release 4273π, an operating system image for Raspberry Pi based on Raspbian Linux. This includes minor customisations for classroom use and includes our Open Access bioinformatics course, 4273π Bioinformatics for Biologists. This is based on the final-year undergraduate module BL4273, run on Raspberry Pi computers at the University of St Andrews, Semester 1, academic year 2012–2013. Conclusions 4273π is a means to teach bioinformatics, including systems administration tasks, to undergraduates at low cost.
4273π: bioinformatics education on low cost ARM hardware.
Barker, Daniel; Ferrier, David Ek; Holland, Peter Wh; Mitchell, John Bo; Plaisier, Heleen; Ritchie, Michael G; Smart, Steven D
2013-08-12
Teaching bioinformatics at universities is complicated by typical computer classroom settings. As well as running software locally and online, students should gain experience of systems administration. For a future career in biology or bioinformatics, the installation of software is a useful skill. We propose that this may be taught by running the course on GNU/Linux running on inexpensive Raspberry Pi computer hardware, for which students may be granted full administrator access. We release 4273π, an operating system image for Raspberry Pi based on Raspbian Linux. This includes minor customisations for classroom use and includes our Open Access bioinformatics course, 4273π Bioinformatics for Biologists. This is based on the final-year undergraduate module BL4273, run on Raspberry Pi computers at the University of St Andrews, Semester 1, academic year 2012-2013. 4273π is a means to teach bioinformatics, including systems administration tasks, to undergraduates at low cost.
Statistical fingerprinting for malware detection and classification
Prowell, Stacy J.; Rathgeb, Christopher T.
2015-09-15
A system detects malware in a computing architecture with an unknown pedigree. The system includes a first computing device having a known pedigree and operating free of malware. The first computing device executes a series of instrumented functions that, when executed, provide a statistical baseline representative of the time it takes a known software application to run on a computing device having a known pedigree. A second computing device executes a second series of instrumented functions that, when executed, provide an actual time representative of the time the known software application takes to run on the second computing device. The system detects malware when there is a difference in execution times between the first and the second computing devices.
HEP Computing Tools, Grid and Supercomputers for Genome Sequencing Studies
NASA Astrophysics Data System (ADS)
De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Novikov, A.; Poyda, A.; Tertychnyy, I.; Wenaus, T.
2017-10-01
PanDA, the Production and Distributed Analysis workload management system, has been developed to address the data processing and analysis challenges of the ATLAS experiment at the LHC. Recently, PanDA has been extended to run HEP scientific applications on Leadership Class Facilities and supercomputers. The success of the projects using PanDA beyond HEP and the Grid has drawn attention from other compute-intensive sciences such as bioinformatics. Recent advances in Next Generation Genome Sequencing (NGS) technology have led to increasing streams of sequencing data that need to be processed, analysed, and made available for bioinformaticians worldwide. Analysis of genome sequencing data using the popular software pipeline PALEOMIX can take a month, even when running on a powerful computing resource. In this paper we describe the adaptation of the PALEOMIX pipeline to run in a distributed computing environment powered by PanDA. To run the pipeline, we split the input files into chunks, which are processed separately on different nodes as separate PALEOMIX inputs, and finally merge the output files; this is very similar to how ATLAS processes and simulates data. We dramatically decreased the total walltime through automated job (re)submission and brokering within PanDA. Using software tools developed initially for HEP and the Grid can reduce payload execution time for mammoth DNA samples from weeks to days.
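The split-run-merge pattern described above can be sketched generically. This is an illustration of the pattern only, not PanDA or PALEOMIX code; record-level round-robin chunking is an assumption made for the example.

```python
def split_into_chunks(records, n_chunks):
    # Deal records round-robin into independent work units, each of
    # which would be submitted as a separate job on a separate node.
    chunks = [[] for _ in range(n_chunks)]
    for i, rec in enumerate(records):
        chunks[i % n_chunks].append(rec)
    return chunks

def merge_outputs(chunk_outputs):
    # Concatenate per-chunk results back into a single output.
    merged = []
    for out in chunk_outputs:
        merged.extend(out)
    return merged
```

The walltime win comes from the chunks being independent: each runs concurrently on its own node, and only the cheap merge step is serial.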
Running R Statistical Computing Environment Software on the Peregrine
R is a collaborative project with a large user base, widely used for the development of new statistical methodologies. This page describes running the R statistical computing environment on Peregrine; the CRAN task view for High Performance Computing lists packages and programming paradigms to better leverage modern HPC systems. Please consult the distribution details.
High Resolution Nature Runs and the Big Data Challenge
NASA Technical Reports Server (NTRS)
Webster, W. Phillip; Duffy, Daniel Q.
2015-01-01
NASA's Global Modeling and Assimilation Office at Goddard Space Flight Center is undertaking a series of very computationally intensive Nature Runs and a downscaled reanalysis. The Nature Runs use the GEOS-5 as an Atmospheric General Circulation Model (AGCM), while the reanalysis uses the GEOS-5 in data assimilation mode. This paper presents computational challenges from three runs, two of which are AGCM runs and one a downscaled reanalysis using the full DAS. The Nature Runs will be completed at two surface grid resolutions, 7 and 3 kilometers, with 72 vertical levels. The 7 km run spanned 2 years (2005-2006) and produced 4 PB of data, while the 3 km run will span one year and generate 4 PB of data. The downscaled reanalysis (MERRA-II, Modern-Era Reanalysis for Research and Applications) will cover 15 years and generate 1 PB of data. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS), a specialization of the concept of business process-as-a-service that is an evolving extension of IaaS, PaaS, and SaaS enabled by cloud computing. In this presentation, we describe two projects that demonstrate this shift. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS. MERRA/AS enables MapReduce analytics over the MERRA reanalysis data collection by bringing together high-performance computing, scalable data management, and a domain-specific climate data services API. NASA's High-Performance Science Cloud (HPSC) is an example of the type of compute-storage fabric required to support CAaaS. The HPSC comprises a high-speed InfiniBand network, high-performance file systems and object storage, and virtual system environments specific to data-intensive science applications.
These technologies provide a new tier in the data and analytic services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. In our experience, CAaaS lowers the barriers and risk to organizational change, fosters innovation and experimentation, and provides the agility required to meet our customers' increasing and changing needs.
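The MapReduce analytics MERRA/AS enables follow the standard pattern of per-chunk partial results combined by an associative reduction. The sketch below is generic, not the MERRA/AS API: it computes a global mean over data chunks, with the (sum, count) partials standing in for whatever per-chunk statistic an analysis needs.

```python
from functools import reduce

def map_chunk(chunk):
    # Per-chunk partial statistics (sum, count) over, e.g., one
    # slice of a reanalysis field.
    return sum(chunk), len(chunk)

def reduce_partials(a, b):
    # Combine two partial results; associative, so chunks may be
    # reduced in any order (and hence in parallel).
    return a[0] + b[0], a[1] + b[1]

def mapreduce_mean(chunks):
    total, count = reduce(reduce_partials, map(map_chunk, chunks))
    return total / count
```

Because the reduction is associative, the map step can run on many nodes close to the data and only small partials move over the network, which is what makes the pattern attractive for petabyte-scale collections.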
Mount, D W; Conrad, B
1986-01-01
We have previously described programs for a variety of types of sequence analysis (1-4). These programs have now been integrated into a single package. They are written in the standard C programming language and run on virtually any computer system with a C compiler, such as the IBM/PC and other computers running under the MS/DOS and UNIX operating systems. The programs are widely distributed and may be obtained from the authors as described below.
NASA Astrophysics Data System (ADS)
Decyk, Viktor K.; Dauger, Dean E.
We have constructed a parallel cluster consisting of Apple Macintosh G4 computers running both Classic Mac OS and the Unix-based Mac OS X, and have achieved very good performance on numerically intensive, parallel plasma particle-in-cell simulations. Unlike other Unix-based clusters, no special expertise in operating systems is required to build and run the cluster. This enables us to move parallel computing from the realm of experts to the mainstream of computing.
Topographic map of the Coronae Montes region of Mars - MTM 500k -35/087E OMKTT
Rosiek, Mark R.; Redding, Bonnie L.; Galuszca, Donna M.
2005-01-01
This map is part of a series of topographic maps of areas of special scientific interest on Mars. The topography was compiled photogrammetrically using Viking Orbiter stereo image pairs. The contour interval is 250 m. Horizontal and vertical control was established using the USGS Mars Digital Image Model 2.0 (MDIM 2.0) and data from the Mars Orbiter Laser Altimeter (MOLA).
Methods Engineering Workshop for the Shipbuilding Industry
1985-09-01
Fatigue is a physiological process that reduces the performance capacity and motivation of humans subjected to excessive or repeated work stresses; time standards account for it through a fatigue allowance. Improvement often involves basic business fundamentals, motivation, common sense, and the plain hard work of one person doing something. Serge A. Birn has a training school in Cincinnati for MTM, MSD, MCD, motivation-productivity, time study, and methods.
Topographic Map of the Northeast Ascraeus Mons Region of Mars - MTM 500k 15/257E OMKT
2004-01-01
This map is part of a series of topographic maps of areas of special scientific interest on Mars. The topography was compiled photogrammetrically using Viking Orbiter stereo image pairs. The contour interval is 250 meters. Horizontal and vertical control was established using the USGS Mars Digital Image Model 2.0 (MDIM 2.0) and data from the Mars Orbiter Laser Altimeter (MOLA).
Topographic Map of the Northwest Ascraeus Mons Region of Mars - MTM 500k 15/252E OMKT
2004-01-01
This map is part of a series of topographic maps of areas of special scientific interest on Mars. The topography was compiled photogrammetrically using Viking Orbiter stereo image pairs. The contour interval is 250 meters. Horizontal and vertical control was established using the USGS Mars Digital Image Model 2.0 (MDIM 2.0) and data from the Mars Orbiter Laser Altimeter (MOLA).
Topographic Map of the Southeast Ascraeus Mons Region of Mars - MTM 500k 10/257E OMKT
2004-01-01
This map is part of a series of topographic maps of areas of special scientific interest on Mars. The topography was compiled photogrammetrically using Viking Orbiter stereo image pairs. The contour interval is 250 meters. Horizontal and vertical control was established using the USGS Mars Digital Image Model 2.0 (MDIM 2.0) and data from the Mars Orbiter Laser Altimeter (MOLA).
Topographic Map of the Southwest Ascraeus Mons Region of Mars - MTM 500k 10/252E OMKT
2004-01-01
This map is part of a series of topographic maps of areas of special scientific interest on Mars. The topography was compiled photogrammetrically using Viking Orbiter stereo image pairs. The contour interval is 250 meters. Horizontal and vertical control was established using the USGS Mars Digital Image Model 2.0 (MDIM 2.0) and data from the Mars Orbiter Laser Altimeter (MOLA).
ERIC Educational Resources Information Center
Matsumoto, Yukihisa; Sandoz, Jean-Christophe; Devaud, Jean-Marc; Lormant, Flore; Mizunami, Makoto; Giurfa, Martin
2014-01-01
Memory is a dynamic process that allows encoding, storage, and retrieval of information acquired through individual experience. In the honeybee "Apis mellifera," olfactory conditioning of the proboscis extension response (PER) has shown that besides short-term memory (STM) and mid-term memory (MTM), two phases of long-term memory (LTM)…
Satisfaction With Medication Therapy Management Services at a University Ambulatory Care Clinic.
Kim, Shiyun; Martin, Michelle T; Pierce, Andrea L; Zueger, Patrick
2016-06-01
A survey was issued to patients enrolled in the Medication Therapy Management Clinic (MTMC) at University of Illinois Hospital and Health Sciences (June 2011-January 2012) in order to assess satisfaction with pharmacy services provided by pharmacists. A 23-item survey was offered to 65 patients in the MTMC program before or after clinic visits. Since there is a paucity of data indicating the level of satisfaction with MTM services provided by pharmacists, this survey may contribute to the process of building a greater collaboration between the pharmacist and patient. Sixty-two of 65 patients completed the survey; satisfaction with MTMC pharmacists was demonstrated to be significantly positively correlated with overall satisfaction with the MTMC. Patient satisfaction is not significantly different according to age, gender, ethnicity, or number of disease states. Satisfaction with the pillbox service is not significantly different between younger and older patients. It was also noted that patients taking a greater number of medications had higher levels of satisfaction. Most patients indicated that they were satisfied with the MTMC pharmacists and services; further study linking patient satisfaction with MTM services to improved patient outcomes may allow our MTMC to serve as a model for other pharmacist-managed MTMCs serving similar patient populations. © The Author(s) 2014.
Irwin, Rossman P.; Grant, John A.
2013-01-01
Mars Transverse Mercator (MTM) quadrangles −15027, −20027, −25027, and −25032 (lat 12.5°−28° S., long 330°−335° E. and lat 22.5°−28° S., long 324.5°−330° E.) in southwestern Margaritifer Terra include diverse erosional landforms, sedimentary deposits, and tectonic structures that record a long geologic and geomorphic history. The northeastern regional slope of the pre-Noachian crustal dichotomy (as expressed along the Chryse trough) and structures of the informally named Middle Noachian or older Holden and Ladon impact basins dominate the topography of the map area. A series of mesoscale outflow channels, Uzboi, Ladon, and Morava Valles, integrated these formerly enclosed basins by overflow and incision around the Noachian/Hesperian transition, although some flooding may have occurred earlier. The area includes excellent examples of Late Noachian to Hesperian valley networks, dissected crater rims, alluvial fans, deltas, and light-toned layered deposits, particularly in Holden and Eberswalde craters. Structural forms include Tharsis-radial grabens, Hesperian wrinkle ridges, floor-fractured impact craters, and severely disrupted chaotic terrains. These well-preserved landforms and sedimentary deposits represent multiple erosional epochs and discrete flooding events, which provide significant insight into the geomorphic processes and climate change on early Mars.
Large-scale, thick, self-assembled, nacre-mimetic brick-walls as fire barrier coatings on textiles
NASA Astrophysics Data System (ADS)
Das, Paramita; Thomas, Helga; Moeller, Martin; Walther, Andreas
2017-01-01
Highly loaded polymer/clay nanocomposites with layered structures are emerging as robust fire-retardant surface coatings. However, time-intensive sequential deposition processes, e.g. layer-by-layer strategies, hinder obtaining large coating thicknesses and complicate implementation into existing technologies. Here, we demonstrate a single-step, water-borne approach to prepare thick, self-assembling, hybrid fire barrier coatings of sodium carboxymethyl cellulose (CMC)/montmorillonite (MTM) with a well-defined, bioinspired brick-wall nanostructure, and showcase their application on textiles. The coating thickness on the textile is tailored using different concentrations of CMC/MTM (1-5 wt%) in the coating bath. While lower concentrations impart conformal coatings of fibers, thicker continuous coatings are obtained on the textile surface from the highest concentration. Comprehensive fire barrier and fire retardancy tests elucidate the increasing fire barrier and retardancy properties with increasing coating thickness. The materials are free of halogen and heavy-metal atoms and are sourced from sustainable, partly even renewable, building blocks. We further introduce an amphiphobic surface modification on the coating to impart oil and water repellency as well as self-cleaning features. Hence, our study presents a generic, environmentally friendly, scalable, one-pot coating approach that can be introduced into existing technologies to prepare bioinspired, thick, fire barrier nanocomposite coatings on diverse surfaces.
User's instructions for the cardiovascular Walters model
NASA Technical Reports Server (NTRS)
Croston, R. C.
1973-01-01
The model is a combined, steady-state cardiovascular and thermal model. It was originally developed for interactive use but was converted to batch-mode simulation for the Sigma 3 computer. The purpose of the model is to compute steady-state circulatory and thermal variables in response to exercise work loads and environmental factors. During a computer simulation run, several selected variables are printed at each time step. End conditions are also printed at the completion of the run.
A Quantum Computing Approach to Model Checking for Advanced Manufacturing Problems
2014-07-01
amount of time. In summary, the tool we developed succeeded in allowing us to produce good solutions for optimization problems that did not fit ...We compared the value of the objective obtained in each run with the known optimal value, and used this information to compute the probability of ...success for each given instance. Then we used this information to compute the expected number of repetitions (or runs) needed to obtain the optimal
NASA Astrophysics Data System (ADS)
Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock
2017-01-01
The suites of numerical models used for simulating the climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e. carrying out climate model simulations in a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Services (AWS) EC2, the cloud computing environment by Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup becomes nearly unchanged.
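Scaling claims like "nearly linear up to 64 cores" can be quantified with the usual speedup and parallel-efficiency formulas; a minimal sketch (the numbers in the usage note are illustrative, not the paper's measured timings):

```python
def speedup(t_ref, t_par):
    # Ratio of reference wall-clock time to parallel wall-clock time.
    return t_ref / t_par

def efficiency(t_ref, cores_ref, t_par, cores_par):
    # Parallel efficiency relative to the reference core count;
    # 1.0 means perfectly linear scaling.
    return speedup(t_ref, t_par) * cores_ref / cores_par
```

For example, if quadrupling cores from 16 to 64 only halved the wall-clock time, the efficiency relative to 16 cores would be 2 × 16/64 = 0.5, i.e. well short of linear; a nearly fourfold time reduction would put it close to 1.0.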
Running Batch Jobs on Peregrine | High-Performance Computing | NREL
Using Resource Feature to Request Different Node Types Peregrine has several types of compute nodes incompatibility and get the job running. More information about requesting different node types in Peregrine is available. Queues In order to meet the needs of different types of jobs, nodes on Peregrine are available
Host-Nation Operations: Soldier Training on Governance (HOST-G) Training Support Package
2011-07-01
restricted this webpage from running scripts or ActiveX controls that could access your computer. Click here for options…” • If this occurs, select that...scripts and ActiveX controls can be useful, but active content might also harm your computer. Are you sure you want to let this file run active
24 CFR 15.110 - What fees will HUD charge?
Code of Federal Regulations, 2013 CFR
2013-04-01
... duplicating machinery. The computer run time includes the cost of operating a central processing unit for that... Applies. (6) Computer run time (includes only mainframe search time not printing) The direct cost of... estimated fee is more than $250.00 or you have a history of failing to pay FOIA fees to HUD in a timely...
NASA Technical Reports Server (NTRS)
Roberts, Floyd E., III
1994-01-01
Software provides for control and acquisition of data from an optical pyrometer. There are six individual programs in the PYROLASER package. It provides a quick and easy way to set up, control, and program a standard Pyrolaser. Temperature and emissivity measurements are either collected as if the Pyrolaser were in manual operating mode or displayed on real-time strip charts and stored in standard spreadsheet format for posttest analysis. A shell is supplied to allow test-specific macros to be added to the system easily. Written using LabVIEW software for use on Macintosh-series computers running System 6.0.3 or later, Sun Sparc-series computers running OpenWindows 3.0 or MIT's X Window System (X11R4 or X11R5), and IBM PC or compatible computers running Microsoft Windows 3.1 or later.
NASA Technical Reports Server (NTRS)
1972-01-01
The IDAPS (Image Data Processing System) is a user-oriented, computer-based language and control system, which provides a framework or standard for implementing image data processing applications, simplifies set-up of image processing runs so that the system may be used without a working knowledge of computer programming or operation, streamlines operation of the image processing facility, and allows multiple applications to be run in sequence without operator interaction. The control system loads the operators, interprets the input, constructs the necessary parameters for each application, and calls the application. The overlay feature of the IBSYS loader (IBLDR) provides the means of running multiple operators which would otherwise overflow core storage.
Identification of Program Signatures from Cloud Computing System Telemetry Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, Nicole M.; Greaves, Mark T.; Smith, William P.
Malicious cloud computing activity can take many forms, including running unauthorized programs in a virtual environment. Detection of these malicious activities while preserving the privacy of the user is an important research challenge. Prior work has shown the potential viability of using cloud service billing metrics as a mechanism for proxy identification of malicious programs. Previously this novel detection method has been evaluated in a synthetic and isolated computational environment. In this paper we demonstrate the ability of billing metrics to identify programs in an active cloud computing environment, including multiple virtual machines running on the same hypervisor. The open-source cloud computing platform OpenStack is used for private cloud management at Pacific Northwest National Laboratory. OpenStack provides a billing tool (Ceilometer) to collect system telemetry measurements. We identify four different programs running on four virtual machines under the same cloud user account. Programs were identified with up to 95% accuracy. This accuracy is dependent on the distinctiveness of telemetry measurements for the specific programs we tested. Future work will examine the scalability of this approach for a larger selection of programs to better understand the uniqueness needed to identify a program. Additionally, future work should address the separation of signatures when multiple programs are running on the same virtual machine.
Providing Assistive Technology Applications as a Service Through Cloud Computing.
Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio
2015-01-01
Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on a PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.
MATH77 - A LIBRARY OF MATHEMATICAL SUBPROGRAMS FOR FORTRAN 77, RELEASE 4.0
NASA Technical Reports Server (NTRS)
Lawson, C. L.
1994-01-01
MATH77 is a high quality library of ANSI FORTRAN 77 subprograms implementing contemporary algorithms for the basic computational processes of science and engineering. The portability of MATH77 meets the needs of present-day scientists and engineers who typically use a variety of computing environments. Release 4.0 of MATH77 contains 454 user-callable and 136 lower-level subprograms. Usage of the user-callable subprograms is described in 69 sections of the 416 page users' manual. The topics covered by MATH77 are indicated by the following list of chapter titles in the users' manual: Mathematical Functions, Pseudo-random Number Generation, Linear Systems of Equations and Linear Least Squares, Matrix Eigenvalues and Eigenvectors, Matrix Vector Utilities, Nonlinear Equation Solving, Curve Fitting, Table Look-Up and Interpolation, Definite Integrals (Quadrature), Ordinary Differential Equations, Minimization, Polynomial Rootfinding, Finite Fourier Transforms, Special Arithmetic, Sorting, Library Utilities, Character-based Graphics, and Statistics. Besides subprograms that are adaptations of public domain software, MATH77 contains a number of unique packages developed by the authors of MATH77. Instances of the latter type include (1) adaptive quadrature, allowing for exceptional generality in multidimensional cases, (2) the ordinary differential equations solver used in spacecraft trajectory computation for JPL missions, (3) univariate and multivariate table look-up and interpolation, allowing for "ragged" tables, and providing error estimates, and (4) univariate and multivariate derivative-propagation arithmetic. MATH77 release 4.0 is a subroutine library which has been carefully designed to be usable on any computer system that supports the full ANSI standard FORTRAN 77 language.
It has been successfully implemented on a CRAY Y/MP computer running UNICOS, a UNISYS 1100 computer running EXEC 8, a DEC VAX series computer running VMS, a Sun4 series computer running SunOS, a Hewlett-Packard 720 computer running HP-UX, a Macintosh computer running MacOS, and an IBM PC compatible computer running MS-DOS. Accompanying the library is a set of 196 "demo" drivers that exercise all of the user-callable subprograms. The FORTRAN source code for MATH77 comprises 109K lines of code in 375 files with a total size of 4.5Mb. The demo drivers comprise 11K lines of code and 418Kb. Forty-four percent of the lines of the library code and 29% of those in the demo code are comment lines. The standard distribution medium for MATH77 is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 9-track 1600 BPI magnetic tape in VAX BACKUP format and a TK50 tape cartridge in VAX BACKUP format. An electronic copy of the documentation is included on the distribution media. Previous releases of MATH77 have been used over a number of years in a variety of JPL applications. MATH77 Release 4.0 was completed in 1992. MATH77 is a copyrighted work with all copyright vested in NASA.
Computer Simulation of Great Lakes-St. Lawrence Seaway Icebreaker Requirements.
1980-01-01
[List-of-figures residue; recoverable content: results of simulation Runs No. 1, 2, and 3 for the Taconite and Oil Can Task Commands, and the predicted icebreaker fleet by home port and period for each run.]
O'Shea, Terrence E; Zarowitz, Barbara J; Erwin, W Gary
2017-01-01
Since 2013, Part D sponsors have been required to offer comprehensive medication reviews (CMRs) to all beneficiaries enrolled in their medication therapy management (MTM) programs at least annually, including those in long-term care (LTC) settings. Since that time, MTM providers have found that accessing and completing CMRs with beneficiaries is frequently prohibitively complex, since the process often requires a live, face-to-face interactive interview where the beneficiary resides. However, with the migration of the CMR completion rate from a star ratings display measure to an active measure, coupled with the new CMR completion rate cutpoints for 2016, accessing this population for CMR completion has heightened importance. Our proprietary consultant pharmacist (CP) software was programmed in 2012 to produce a cover letter, medication action plan, and personal medication list per CMS standardized format specifications. Using this system, CPs were trained to perform and document CMRs and the interactive interviews. MTM-eligible Part D beneficiaries, identified by several contracted clients as residing in LTC serviced by Omnicare, were provided CMRs and summaries written in CMS standardized format by CPs. Residents with cognitive impairment were identified using 3 data elements in the Minimum Data Set (MDS). In 2015, 7,935 MTM-eligible beneficiaries were identified as receiving medications from an Omnicare pharmacy. After excluding those who were disenrolled by their prescription drug plans, discharged from the LTC facility, or resided in a LTC facility no longer serviced by Omnicare, 5,593 residents were available for CMR completion. Of these, only 3% refused the CMR offer, and 5,392 CMRs (96%) were completed successfully. Thirty-nine percent of residents had cognitive impairment per MDS assessments; in those instances, CMRs were conducted with someone other than the beneficiary. 
Based on the CMRs and interactive interviews, 7,527 drug therapy problem recommendations were made to prescribers, about 50% of which resulted in an alteration in therapy, including reductions in polypharmacy and high-risk medications. The CMR process and written summary in CMS standardized format works effectively for residents in LTC when performed by CPs in the facility, as evidenced by high completion rates and drug therapy problem identification/resolution. Part D plans should further consider using CPs to conduct CMRs in LTC settings. No outside funding supported this research. All authors are employees of Omnicare, a CVS Health Company, and are stockholders of CVS Health. O'Shea and Zarowitz have received research funding (unrelated to the submitted work) from Acadia, AstraZeneca, and Sunovion. The abstract for this article was presented as a research poster at the Academy of Managed Care and Specialty Pharmacy 2016 Annual Meeting; April 21, 2016; San Francisco, California. Study concept and design were contributed by O'Shea and Zarowitz, along with Erwin. O'Shea collected the data, and data interpretation was performed primarily by O'Shea, along with Zarowitz and Erwin. The manuscript was written by O'Shea, along with Zarowitz, and revised primarily by Zarowitz, along with O'Shea and Erwin.
The rid-redundant procedure in C-Prolog
NASA Technical Reports Server (NTRS)
Chen, Huo-Yan; Wah, Benjamin W.
1987-01-01
C-Prolog can conveniently be used for logical inferences on knowledge bases. However, as with many search methods that use backward chaining, a large number of redundant computations may be produced in recursive calls. To overcome this problem, the 'rid-redundant' procedure was designed to eliminate all redundant computations when running multi-recursive procedures. Experimental results obtained for C-Prolog on the VAX 11/780 computer show an order of magnitude improvement in the running time and solvable problem size.
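The redundancy-elimination idea behind such a procedure (caching the result of each subgoal so recursive calls never re-derive it, as in Prolog tabling) can be sketched in Python. The reachability relation below is a hypothetical example, not the knowledge base from the paper:

```python
from functools import lru_cache

# Hypothetical edge set forming a small DAG; reachable(x, y) is the
# transitive closure, derived by backward chaining as Prolog would do it.
EDGES = {("a", "b"), ("b", "c"), ("c", "d")}
NODES = {n for e in EDGES for n in e}

@lru_cache(maxsize=None)  # cache each subgoal so it is solved only once
def reachable(x, y):
    if (x, y) in EDGES:
        return True
    # otherwise try an intermediate node z with a direct edge x -> z
    return any((x, z) in EDGES and reachable(z, y) for z in NODES)

print(reachable("a", "d"))  # True
print(reachable("d", "a"))  # False
```

Without the cache, a query such as `reachable("a", "d")` re-derives the same subgoals along every search branch; with it, each subgoal is computed once, which is the kind of saving the rid-redundant procedure targets.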
An Upgrade of the Aeroheating Software ''MINIVER''
NASA Technical Reports Server (NTRS)
Louderback, Pierce
2013-01-01
Detailed computational modeling: CFD is often used to create and execute computational domains. Increasing complexity when moving from 2D to 3D geometries. Computational time increases as finer grids are used (accuracy). Strong tool, but takes time to set up and run. MINIVER: Uses theoretical and empirical correlations. Orders of magnitude faster to set up and run. Not as accurate as CFD, but gives reasonable estimations. MINIVER's drawbacks: Rigid command-line interface. Lackluster, unorganized documentation. No central control; multiple versions exist and have diverged.
A Functional Description of the Geophysical Data Acquisition System
1990-08-10
less than 50 SPS nor greater than 250 SPS. 3.0 SENSORS/TRANSDUCERS 3.1 CHAPTER OVERVIEW Most of the research supported by GDAS has primarily involved two... signal for the computer. The SRUN signal from the computer is fed to a retriggerable one-shot multivibrator on the board. SRUN consists of a pulse train... that is present when the computer is running. The one-shot output drives the RUN lamp on the front panel. Finally, one pin on the board edge connector is
Network support for system initiated checkpoints
Chen, Dong; Heidelberger, Philip
2013-01-29
A system, method, and computer program product for supporting system-initiated checkpoints in parallel computing systems. The system and method generate selective control signals to perform checkpointing of system-related data in the presence of messaging activity associated with a user application running at the node. The checkpointing is initiated by the system such that checkpoint data of a plurality of network nodes may be obtained even in the presence of user applications running on highly parallel computers that include ongoing user messaging activity.
Convergence properties of simple genetic algorithms
NASA Technical Reports Server (NTRS)
Bethke, A. D.; Zeigler, B. P.; Strauss, D. M.
1974-01-01
The essential parameters determining the behaviour of genetic algorithms were investigated. Computer runs were made while systematically varying the parameter values. Results based on the progress curves obtained from these runs are presented along with results based on the variability of the population as the run progresses.
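A minimal sketch of the kind of simple genetic algorithm whose parameters such runs vary (population size, mutation rate, number of generations); the parameter values and the OneMax fitness function below are illustrative assumptions, not those of the study:

```python
import random

def run_ga(fitness, n_bits=10, pop_size=20, mut_rate=0.02,
           generations=50, seed=0):
    """Minimal generational GA: tournament selection, one-point crossover,
    and bit-flip mutation. All parameter values are illustrative."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # tournament selection of size 2: keep the fitter of two parents
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)           # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < mut_rate) for b in child]  # mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# OneMax: fitness is the number of 1-bits; the GA should evolve mostly-1 strings.
best = run_ga(fitness=sum)
print(len(best), sum(best))
```

Progress curves of the sort the paper analyzes would be obtained by recording the best fitness at each generation while sweeping `pop_size` and `mut_rate`.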
Modeling Subsurface Reactive Flows Using Leadership-Class Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mills, Richard T; Hammond, Glenn; Lichtner, Peter
2009-01-01
We describe our experiences running PFLOTRAN - a code for simulation of coupled hydro-thermal-chemical processes in variably saturated, non-isothermal, porous media - on leadership-class supercomputers, including initial experiences running on the petaflop incarnation of Jaguar, the Cray XT5 at the National Center for Computational Sciences at Oak Ridge National Laboratory. PFLOTRAN utilizes fully implicit time-stepping and is built on top of the Portable, Extensible Toolkit for Scientific Computation (PETSc). We discuss some of the hurdles to 'at scale' performance with PFLOTRAN and the progress we have made in overcoming them on leadership-class computer architectures.
Williams, Paul T
2012-01-01
Current physical activity recommendations assume that different activities can be exchanged to produce the same weight-control benefits so long as total energy expended remains the same (exchangeability premise). To this end, they recommend calculating energy expenditure as the product of the time spent performing each activity and the activity's metabolic equivalents (MET), which may be summed to achieve target levels. The validity of the exchangeability premise was assessed using data from the National Runners' Health Study. Physical activity dose was compared to body mass index (BMI) and body circumferences in 33,374 runners who reported usual distance run and pace, and usual times spent running and other exercises per week. MET hours per day (METhr/d) from running was computed from: a) time and intensity, and b) reported distance run (1.02 MET·hours per km). When computed from time and intensity, the declines (slopes ± SE) per METhr/d were significantly greater (P < 10^-15) for running than non-running exercise for BMI (male: -0.12 ± 0.00 vs. 0.00 ± 0.00; female: -0.12 ± 0.00 vs. -0.01 ± 0.01 kg/m^2 per METhr/d) and waist circumference (male: -0.28 ± 0.01 vs. -0.07 ± 0.01; female: -0.31 ± 0.01 vs. -0.05 ± 0.01 cm per METhr/d). Reported METhr/d of running was 38% to 43% greater when calculated from time and intensity than from distance. Moreover, the declines per METhr/d run were significantly greater when estimated from reported distance for BMI (males: -0.29 ± 0.01; females: -0.27 ± 0.01 kg/m^2 per METhr/d) and waist circumference (males: -0.67 ± 0.02; females: -0.69 ± 0.02 cm per METhr/d) than when computed from time and intensity (cited above). The exchangeability premise was not supported for running vs. non-running exercise. Moreover, distance-based running prescriptions may provide better weight control than time-based prescriptions for running or other activities.
Additional longitudinal studies and randomized clinical trials are required to verify these results prospectively.
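The two dose computations the study compares can be sketched as follows; the example weekly time, distance, and the 8.0 MET intensity value are illustrative assumptions (only the 1.02 MET-hours-per-km factor comes from the abstract):

```python
def met_hours_per_day_from_time(hours_per_week, met_value):
    """Dose from reported time and intensity: MET x hours, averaged per day."""
    return met_value * hours_per_week / 7.0

def met_hours_per_day_from_distance(km_per_week, met_h_per_km=1.02):
    """Dose from reported distance: 1.02 MET-hours per km run (per the study)."""
    return met_h_per_km * km_per_week / 7.0

# Illustrative runner: 4 h/week at an assumed 8.0 MET pace, covering 40 km/week.
time_based = met_hours_per_day_from_time(4.0, 8.0)      # 32/7 ~ 4.57 METhr/d
distance_based = met_hours_per_day_from_distance(40.0)  # 40.8/7 ~ 5.83 METhr/d
print(round(time_based, 2), round(distance_based, 2))
```

The study's point is that these two estimates of the same runner's dose need not agree (self-reported time and intensity gave values 38% to 43% higher than distance), so which one is used changes the fitted slope per METhr/d.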
A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software
NASA Astrophysics Data System (ADS)
Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.
2017-10-01
Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.
Creating a Parallel Version of VisIt for Microsoft Windows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitlock, B J; Biagas, K S; Rawson, P L
2011-12-07
VisIt is a popular, free interactive parallel visualization and analysis tool for scientific data. Users can quickly generate visualizations from their data, animate them through time, manipulate them, and save the resulting images or movies for presentations. VisIt was designed from the ground up to work on many scales of computers, from modest desktops up to massively parallel clusters. VisIt is comprised of a set of cooperating programs. All programs can be run locally or in client/server mode, in which some run locally and some run remotely on compute clusters. The VisIt program most able to harness today's computing power is the VisIt compute engine. The compute engine is responsible for reading simulation data from disk, processing it, and sending results or images back to the VisIt viewer program. In a parallel environment, the compute engine runs several processes, coordinating using the Message Passing Interface (MPI) library. Each MPI process reads some subset of the scientific data and filters the data in various ways to create useful visualizations. By using MPI, VisIt has been able to scale well into the thousands of processors on large computers such as dawn and graph at LLNL. The advent of multicore CPUs has made parallelism the 'new' way to achieve increasing performance. With today's computers having at least 2 cores and in many cases up to 8 and beyond, it is more important than ever to deploy parallel software that can use that computing power not only on clusters but also on the desktop. We have created a parallel version of VisIt for Windows that uses Microsoft's MPI implementation (MSMPI) to process data in parallel on the Windows desktop as well as on a Windows HPC cluster running Microsoft Windows Server 2008. Initial desktop parallel support for Windows was deployed in VisIt 2.4.0. Windows HPC cluster support has been completed and will appear in the VisIt 2.5.0 release.
We plan to continue supporting parallel VisIt on Windows so our users will be able to take full advantage of their multicore resources.
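The data-parallel pattern described above, in which each MPI process reads some subset of the data, rests on partitioning the data across ranks. A minimal sketch of one common contiguous-block decomposition, not necessarily VisIt's actual partitioning scheme:

```python
def block_range(n_items, n_ranks, rank):
    """Contiguous block of item indices owned by `rank` when n_items are
    split across n_ranks, with any remainder spread over the first ranks.
    This is a common MPI convention, used here purely for illustration."""
    base, extra = divmod(n_items, n_ranks)
    start = rank * base + min(rank, extra)
    stop = start + base + (1 if rank < extra else 0)
    return start, stop

# 10 data chunks across 4 ranks -> block sizes 3, 3, 2, 2, no gaps or overlap.
print([block_range(10, 4, r) for r in range(4)])
```

In an actual MPI engine, each process would call this with its own rank and read only its `[start, stop)` slice of the dataset before filtering.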
NASA Astrophysics Data System (ADS)
Varela Rodriguez, F.
2011-12-01
The control system of each of the four major Experiments at the CERN Large Hadron Collider (LHC) is distributed over up to 160 computers running either Linux or Microsoft Windows. A quick response to abnormal situations of the computer infrastructure is crucial to maximize the physics usage. For this reason, a tool was developed to supervise, identify errors, and troubleshoot such a large system. Although monitoring of the performance of the Linux computers and their processes has been available since the first versions of the tool, it is only recently that the software package has been extended to provide similar functionality for the nodes running Microsoft Windows, as this platform is the most commonly used in the LHC detector control systems. In this paper, the architecture and the functionality of the Windows Management Instrumentation (WMI) client developed to provide centralized monitoring of the nodes running different flavours of the Microsoft platform, as well as the interface to the SCADA software of the control systems, are presented. The tool is currently being commissioned by the Experiments and has already proven to be very efficient in optimizing the running systems and detecting misbehaving processes or nodes.
Computational steering of GEM based detector simulations
NASA Astrophysics Data System (ADS)
Sheharyar, Ali; Bouhali, Othmane
2017-10-01
Gas-based detector R&D relies heavily on full simulation of detectors and their optimization before final prototypes can be built and tested. These simulations, in particular those with complex scenarios such as those involving high detector voltages or gases with larger gains, are computationally intensive and may take several days or weeks to complete. These long-running simulations usually run on high-performance computers in batch mode. If the results lead to unexpected behavior, then the simulation might be rerun with different parameters. However, the simulations (or jobs) may have to wait in a queue until they get a chance to run again, because the supercomputer is a shared resource that maintains a queue of other user programs as well and executes them as time and priorities permit. This may result in inefficient resource utilization and an increase in the turnaround time for the scientific experiment. To overcome this issue, monitoring the behavior of a simulation while it is running (or live) is essential. In this work, we employ the computational steering technique by coupling the detector simulations with a visualization package named VisIt to enable the exploration of the live data as it is produced by the simulation.
CERN openlab: Engaging industry for innovation in the LHC Run 3-4 R&D programme
NASA Astrophysics Data System (ADS)
Girone, M.; Purcell, A.; Di Meglio, A.; Rademakers, F.; Gunne, K.; Pachou, M.; Pavlou, S.
2017-10-01
LHC Run3 and Run4 represent an unprecedented challenge for HEP computing in terms of both data volume and complexity. New approaches are needed for how data is collected and filtered, processed, moved, stored, and analysed if these challenges are to be met with a realistic budget. To develop innovative techniques, we are fostering relationships with industry leaders. CERN openlab is a unique resource for public-private partnership between CERN and leading Information and Communication Technology (ICT) companies. Its mission is to accelerate the development of cutting-edge solutions to be used by the worldwide HEP community. In 2015, CERN openlab started its phase V with a strong focus on tackling the upcoming LHC challenges. Several R&D programs are ongoing in the areas of data acquisition, networks and connectivity, data storage architectures, computing provisioning, computing platforms and code optimisation, and data analytics. This paper gives an overview of the various innovative technologies that are currently being explored by CERN openlab V and discusses the long-term strategies that are pursued by the LHC communities with the help of industry in closing the technological gap in processing and storage needs expected in Run3 and Run4.
NASA Technical Reports Server (NTRS)
Yang, Guowei; Pasareanu, Corina S.; Khurshid, Sarfraz
2012-01-01
This paper introduces memoized symbolic execution (Memoise), a novel approach for more efficient application of forward symbolic execution, which is a well-studied technique for systematic exploration of program behaviors based on bounded execution paths. Our key insight is that application of symbolic execution often requires several successive runs of the technique on largely similar underlying problems, e.g., running it once to check a program to find a bug, fixing the bug, and running it again to check the modified program. Memoise introduces a trie-based data structure that stores the key elements of a run of symbolic execution. Maintenance of the trie during successive runs allows re-use of previously computed results of symbolic execution without the need for re-computing them as is traditionally done. Experiments using our prototype embodiment of Memoise show the benefits it holds in various standard scenarios of using symbolic execution, e.g., with iterative deepening of exploration depth, to perform regression analysis, or to enhance coverage.
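The trie-based reuse idea can be sketched as follows: record each fully explored sequence of branch decisions, and on a later run consult the trie before re-exploring a path. This is a simplified analogue for illustration, not Memoise's actual data structure:

```python
class PathTrie:
    """Trie over branch-decision sequences; a node marks whether the path
    ending there was fully explored in an earlier run (illustrative sketch)."""
    def __init__(self):
        self.children = {}
        self.explored = False

    def insert(self, path):
        # record one fully explored path of branch decisions
        node = self
        for decision in path:
            node = node.children.setdefault(decision, PathTrie())
        node.explored = True

    def needs_exploration(self, path):
        # a path must be (re)explored if any prefix is new territory,
        # or if it was never completed in a previous run
        node = self
        for decision in path:
            if decision not in node.children:
                return True
            node = node.children[decision]
        return not node.explored

trie = PathTrie()
trie.insert(("if1_true", "if2_false"))   # path explored during a first run
print(trie.needs_exploration(("if1_true", "if2_false")))  # False: reuse
print(trie.needs_exploration(("if1_true", "if2_true")))   # True: explore
```

On a regression run after a bug fix, only paths answering `True` would be re-executed, which is the kind of reuse Memoise exploits across successive symbolic-execution runs.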
E2 Cargo Transport--The Necessary Inclusion of Primary Oceanic Airlift
2013-06-01
Energy Initiatives (L: Lead-; F: Follow-; W: Watch-Industry) (Maybury, 2010:6) ...at the same time halting all supplies—including coal, food and fresh milk—which were drawn from the Soviet Sector... There was no longer any doubt... which consists of 223 C-17s and 111 C-5s, provides a capacity of 35.9 million ton-miles per day (MTM/D). The peak for MCRS Case 1, which
Universal Zero Specular Reflection Curves for MetaMaterials
2012-09-01
with their handedness, RH or LH. According to Holloway [7], we find that metasurfaces (or metafilms) can be used in place of MTMs in many... applications. Metafilm refers to a thin metamaterial that is only one unit-cell thick. "Metasurfaces have the advantage of taking up less physical space... than do full three-dimensional MTM structures." [7] Further studies were conducted on metasurface characterization, various applications, and how
NATO Mobilization and Reinforcement: Can We Get There from Here?
1990-05-30
of the USSR," Military Power, 1986, p. 93. 27. Pettavino, Paul J., Ibid. p. 1. 28. Carlucci, Frank C., Ibid, p. 29. 29. Rowden, William H., Ibid, p. ...mobility objectives (requirements) for the European theater: o Sixty-six million ton-miles-per-day (MTM/D) of cargo airlift. 25 o Sealift for one... Aircraft In Military Use," Independent Review of Economic, Political and Military Power, Vol. 33, Dec 88.
Simple, efficient allocation of modelling runs on heterogeneous clusters with MPI
Donato, David I.
2017-01-01
In scientific modelling and computation, the choice of an appropriate method for allocating tasks for parallel processing depends on the computational setting and on the nature of the computation. The allocation of independent but similar computational tasks, such as modelling runs or Monte Carlo trials, among the nodes of a heterogeneous computational cluster is a special case that has not been specifically evaluated previously. A simulation study shows that a method of on-demand (that is, worker-initiated) pulling from a bag of tasks in this case leads to reliably short makespans for computational jobs despite heterogeneity both within and between cluster nodes. A simple reference implementation in the C programming language with the Message Passing Interface (MPI) is provided.
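The on-demand, worker-initiated pull from a bag of tasks can be sketched without MPI using a thread-safe queue; in the MPI implementation the paper describes, the pull would instead be a message exchange with a coordinating rank. A minimal sketch:

```python
import queue
import threading

def pull_workers(tasks, n_workers, work_fn):
    """Bag-of-tasks allocation: each worker pulls the next task as soon as
    it finishes its current one, so faster workers naturally take more tasks.
    Illustrative thread-based sketch; MPI messaging is omitted."""
    bag = queue.Queue()
    for t in tasks:
        bag.put(t)
    results, lock = [], threading.Lock()

    def worker():
        while True:
            try:
                t = bag.get_nowait()   # pull on demand; no pre-assignment
            except queue.Empty:
                return                 # bag is empty: this worker is done
            r = work_fn(t)
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results

# 8 modelling runs on 3 heterogeneous workers; each task is done exactly once.
done = pull_workers(range(8), 3, lambda t: t * t)
print(sorted(done))
```

Because no tasks are pre-assigned, a slow node cannot strand a queue of work behind it, which is why worker-initiated pulling keeps makespans short on heterogeneous clusters.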
Evaluating the Efficacy of the Cloud for Cluster Computation
NASA Technical Reports Server (NTRS)
Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom
2012-01-01
Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. However, another industry development is Amazon's high-performance computing (HPC) instances on Elastic Compute Cloud (EC2), which promise improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 μs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29-instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.
Layer-by-layer assembly of nanostructured composites: Mechanics and applications
NASA Astrophysics Data System (ADS)
Podsiadlo, Paul
The development of efficient methods for preparation of nanometer-sized materials and our evolving ability to manipulate nanoscale objects have brought about a scientific and technological revolution called nanotechnology. This revolution has been especially driven by the discovery of unique nanoscale properties of nanomaterials, which are governed by their inherent size. Today, the total societal impact of nanotechnology is expected to be greater than the combined influences that the silicon integrated circuit, medical imaging, computer-aided engineering, and man-made polymers have had in the last century. Many nanomaterials were also found to possess exceptional mechanical properties. This led to tremendous interest in developing composite materials by exploiting the mechanical properties of these building blocks. In spite of a tremendous volume of work done in the field, preparation of such nanocomposites (NCs) has proven to be elusive due to the inability of traditional "top-down" fabrication approaches to effectively harness properties of the nanoscale building blocks. This thesis focuses on preparation of organic/inorganic and solely organic NCs via a bottom-up nano-manufacturing approach called layer-by-layer (LBL) assembly. Two natural and inexpensive nanoscale building blocks are explored: nanosheets of Na+-montmorillonite clay (MTM) and rod-shaped nanocrystals of cellulose (CNRs). In the first part of the thesis, we present results from a systematic study of the mechanics of MTM-based NCs. Different compositions are explored with the goal of understanding the nanoscale mechanics. Ultimately, development of a transparent composite with record-high strength and stiffness is presented. In the second part, we present results from LBL assembly of the CNRs. We demonstrate feasibility of assembly and mechanical properties of the resulting films. We also demonstrate preparation of LBL films with anti-reflective properties from tunicate (a sea animal) CNRs.
In the final part, we show preparation of high toughness and hierarchically organized NCs using two concepts: "exponential" LBL (e-LBL) assembly and charged polyurethanes. We show preparation of novel e-LBL structures and highly flexible LBL multilayers. We also demonstrate preparation of macro-scale composites from hierarchical, post-assembly consolidation of LBL sheets. This last result represents a potential paradigm change in the practice of LBL assembly by enabling transformation of the thin-films into macro-scale structures.
Controlling Laboratory Processes From A Personal Computer
NASA Technical Reports Server (NTRS)
Will, H.; Mackin, M. A.
1991-01-01
Computer program provides natural-language process control from IBM PC or compatible computer. Sets up process-control system that either runs without an operator or is run by workers who have limited programming skills. Includes three smaller programs. Two of them, written in FORTRAN 77, record data and control research processes. Third program, written in Pascal, generates FORTRAN subroutines used by other two programs to associate user commands with device-driving routines written by user. Also includes set of input data allowing user to define user commands to be executed by computer. Requires personal computer operating under MS-DOS with suitable hardware interfaces to all controlled devices. Also requires FORTRAN 77 compiler and device drivers written by user.
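The association between user-defined commands and device-driving routines that this entry describes can be pictured as a dispatch table. The sketch below is illustrative Python (the original system used FORTRAN 77 and Pascal); the command names and driver stubs are hypothetical:

```python
def make_dispatcher(commands):
    """Return a function that maps a one-line user command to the
    device-driving routine registered for its first word."""
    def dispatch(line):
        verb, *args = line.strip().upper().split()
        if verb not in commands:
            raise ValueError("unknown command: " + verb)
        return commands[verb](*args)
    return dispatch

# Hypothetical device-driver stubs standing in for user-written drivers.
log = []
drivers = {
    "OPEN": lambda valve: log.append(("open", valve)),
    "READ": lambda sensor: 21.5,  # stub; a real driver would poll hardware
}
run_command = make_dispatcher(drivers)
```

In the original system, the Pascal generator emitted the equivalent of this table as FORTRAN subroutines from the user's command-definition input data.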
WinHPC System Programming | High-Performance Computing | NREL
Learn how to build and run an MPI program on the WinHPC system, including where the message passing interface header (mpi.h) and library (msmpi.lib) are located. To build from the command line, run the C++ build environment via Start > Intel Software Development Tools > Intel C++ Compiler Professional... > C++ Build Environment for applications running on the cluster.
Computer-based testing of the modified essay question: the Singapore experience.
Lim, Erle Chuen-Hian; Seet, Raymond Chee-Seong; Oh, Vernon M S; Chia, Boon-Lock; Aw, Marion; Quak, Seng-Hock; Ong, Benjamin K C
2007-11-01
The modified essay question (MEQ), featuring an evolving case scenario, tests a candidate's problem-solving and reasoning ability, rather than mere factual recall. Although it is traditionally conducted as a pen-and-paper examination, our university has run the MEQ using computer-based testing (CBT) since 2003. We describe our experience with running the MEQ examination using the IVLE, or integrated virtual learning environment (https://ivle.nus.edu.sg), provide a blueprint for universities intending to conduct computer-based testing of the MEQ, and detail how our MEQ examination has evolved since its inception. An MEQ committee, comprising specialists in key disciplines from the departments of Medicine and Paediatrics, was formed. We utilized the IVLE, developed for our university in 1998, as the online platform on which we ran the MEQ. We calculated the number of man-hours (academic and support staff) required to run the MEQ examination, using either a computer-based or pen-and-paper format. With the support of our university's information technology (IT) specialists, we have successfully run the MEQ examination online, twice a year, since 2003. Initially, we conducted the examination with short-answer questions only, but have since expanded the MEQ examination to include multiple-choice and extended matching questions. A total of 1268 man-hours was spent in preparing for, and running, the MEQ examination using CBT, compared to 236.5 man-hours to run it using a pen-and-paper format. Despite being more labour-intensive, our students and staff prefer CBT to the pen-and-paper format. The MEQ can be conducted using a computer-based testing scenario, which offers several advantages over a pen-and-paper format. We hope to increase the number of questions and incorporate audio and video files, featuring clinical vignettes, to the MEQ examination in the near future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kopp, H.J.; Mortensen, G.A.
1978-04-01
The batch portion of the JOSHUA Modular System, which was developed by Savannah River Laboratory to run on an IBM computer, was examined for the feasibility of conversion to run on a Control Data Corporation (CDC) computer. Portions of the JOSHUA Precompiler were changed so as to be operable on the CDC computer. The Data Manager and Batch Monitor were also examined for conversion feasibility, but no changes were made in them. It appears to be feasible to convert the batch portion of the JOSHUA Modular System to run on a CDC computer with an estimated additional two to three man-years of effort. Approximately 60% of the full CDC 6600/7600 Datatran 2.0 capability was made operational on IBM 360/370 equipment. Sufficient capability was made operational to demonstrate adequate performance for modular program linking applications. Also demonstrated were the basic capabilities and performance required to support moderate-sized data base applications and moderately active scratch input/output applications. Approximately one to two calendar years are required to fully develop DATATRAN 2.0 capabilities for the entire spectrum of applications proposed. The next stage of conversion should include syntax checking and syntax conversion features that would foster greater FORTRAN compatibility between IBM- and CDC-developed modules. 9 tables.
Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.
Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P
2010-01-15
A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can Hoogsteen hydrogen bond to other guanines, and a tetrad of guanines can form a stable arrangement. Recently we have discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to pay a cloud computing provider only for what is used. Moreover, beyond financial efficiency, cloud computing is an ecologically friendly technology; it enables efficient data sharing, and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public-domain 3' arrays.
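Screening probe sequences for G-spots is simple string matching. A minimal sketch, assuming a run of four or more guanines as the flagging criterion (a quadruplex tetrad involves four guanines); the 25-mer probe sequences are hypothetical:

```python
import re

def has_g_spot(probe, min_run=4):
    """Flag probes containing a run of at least `min_run` guanines,
    which may form quadruplexes on the chip surface."""
    return re.search("G" * min_run, probe) is not None

# Hypothetical 25-mer probe sequences for illustration.
probes = {
    "probe_a": "ACGTACGTACGGGGTACGTACGTAC",  # contains GGGG -> flagged
    "probe_b": "ACGTACGTACGGTACGTACGTACGT",  # longest run is GG -> kept
}
flagged = {name: has_g_spot(seq) for name, seq in probes.items()}
```

Such a filter could be applied to every probe on an array before aggregating probe-level intensities into gene expression values.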
Katz, Jonathan E
2017-01-01
Laboratories tend to be amenable environments for long-term reliable operation of scientific measurement equipment. Indeed, it is not uncommon to find equipment 5, 10, or even 20+ years old still being routinely used in labs. Unfortunately, the Achilles heel of many of these devices is the control/data acquisition computer. Often these computers run older operating systems (e.g., Windows XP) and, while they might only use standard network, USB, or serial ports, they require proprietary software to be installed. Even if the original installation disks can be found, reinstallation is a burdensome process fraught with "gotchas" that can derail it: lost license keys, incompatible hardware, forgotten configuration settings, etc. If you have legacy instrumentation running, the computer is a ticking time bomb waiting to put a halt to your operation. In this chapter, I describe how to virtualize your currently running control computer. This virtualized computer "image" is easy to maintain, easy to back up, and easy to redeploy. I have used this approach multiple times in my own lab to greatly improve the robustness of my legacy devices. After completing the steps in this chapter, you will have your original control computer as well as a virtual instance of that computer, with all the software installed, ready to control your hardware should your original computer ever be decommissioned.
Resource Efficient Hardware Architecture for Fast Computation of Running Max/Min Filters
Torres-Huitzil, Cesar
2013-01-01
Running max/min filters on rectangular kernels are widely used in many digital signal and image processing applications. Filtering with a k × k kernel requires k² − 1 comparisons per sample for a direct implementation; thus, performance scales expensively with the kernel size k. Faster computation can be achieved by kernel decomposition and by using constant-time one-dimensional algorithms on custom hardware. This paper presents a hardware architecture for real-time computation of running max/min filters based on the van Herk/Gil-Werman (HGW) algorithm. The proposed architecture uses fewer computation and memory resources than previously reported architectures when targeted to Field Programmable Gate Array (FPGA) devices. Implementation results show that the architecture is able to compute max/min filters on 1024 × 1024 images with up to 255 × 255 kernels in around 8.4 milliseconds (120 frames per second) at a clock frequency of 250 MHz. The implementation is highly scalable in the kernel size, with a good performance/area tradeoff suitable for embedded applications. The applicability of the architecture is shown for local adaptive image thresholding. PMID:24288456
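The HGW algorithm achieves its constant cost per sample (about three comparisons, independent of k) by combining block-wise prefix and suffix maxima. A one-dimensional software sketch of the idea (the paper implements it in FPGA hardware):

```python
def running_max_hgw(x, k):
    """van Herk/Gil-Werman 1D running max: the max of each length-k
    window, at ~3 comparisons per sample regardless of k."""
    n = len(x)
    g = [0] * n  # forward (prefix) max within blocks of size k
    h = [0] * n  # backward (suffix) max within blocks of size k
    for i in range(n):
        g[i] = x[i] if i % k == 0 else max(g[i - 1], x[i])
    for i in range(n - 1, -1, -1):
        h[i] = x[i] if (i % k == k - 1 or i == n - 1) else max(h[i + 1], x[i])
    # A window starting at j spans at most two adjacent blocks:
    # its max is the suffix max of the first block combined with
    # the prefix max of the second.
    return [max(h[j], g[j + k - 1]) for j in range(n - k + 1)]
```

A 2D k × k filter is then obtained by kernel decomposition: run this pass over the rows and again over the columns of the row result.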
Energy Frontier Research With ATLAS: Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, John; Black, Kevin; Ahlen, Steve
2016-06-14
The Boston University (BU) group is playing key roles across the ATLAS experiment: in detector operations, the online trigger, the upgrade, computing, and physics analysis. Our team has been critical to the maintenance and operations of the muon system since its installation. During Run 1 we led the muon trigger group, and that responsibility continues into Run 2. BU maintains and operates the ATLAS Northeast Tier 2 computing center. We are actively engaged in the analysis of ATLAS data from Run 1 and Run 2. Physics analyses we have contributed to include Standard Model measurements (W and Z cross sections, tt̄ differential cross sections, WWW* production), evidence for the Higgs decaying to τ+τ-, and searches for new phenomena (technicolor, Z' and W', vector-like quarks, dark matter).
Automatic Data Filter Customization Using a Genetic Algorithm
NASA Technical Reports Server (NTRS)
Mandrake, Lukas
2013-01-01
This work predicts whether a retrieval algorithm will usefully determine CO2 concentration from an input spectrum of GOSAT (Greenhouse Gases Observing Satellite). This was done to eliminate needless runtime on atmospheric soundings that would never yield useful results. A space of 50 dimensions was examined for predictive power on the final CO2 results. Retrieval algorithms are frequently expensive to run, and wasted effort defeats requirements and expends needless resources. This algorithm could be used to help predict and filter unneeded runs in any computationally expensive regime. Traditional methods such as Fisher discriminant analysis and decision trees can attempt to predict whether a sounding will be properly processed. However, this work sought to detect a subsection of the dimensional space that can simply be filtered out to eliminate unwanted runs. LDAs (linear discriminant analyses) and other systems examine the entire dataset and judge a "best fit," giving equal weight to complex, problematic regions and to simple, clear-cut regions. In this implementation, a genetic space of "left" and "right" thresholds, outside of which all data are rejected, was defined. These left/right pairs are created for each of the 50 input dimensions. A genetic algorithm then runs through countless potential filter settings using a JPL computer cluster, optimizing the tossed-out data's yield (proper vs. improper run removal) and the number of points tossed. This solution is robust to an arbitrary decision boundary within the data and avoids the global optimization problem of whole-dataset fitting using LDA or decision trees. It filters out runs that would not have produced useful CO2 values, saving needless computation. This would be an algorithmic preprocessing improvement to any computationally expensive system.
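The threshold-evolving idea can be sketched as follows. This is a minimal illustrative version, with made-up fitness weighting, population size, and mutation parameters, not the JPL implementation:

```python
import random

def rejected(thresholds, x):
    """A sample is filtered out if any coordinate falls outside its
    per-dimension [left, right] acceptance interval."""
    return any(not (lo <= v <= hi) for v, (lo, hi) in zip(x, thresholds))

def evolve_filter(X, is_bad, dims, pop_size=40, gens=60, seed=1):
    """Evolve per-dimension (left, right) threshold pairs that reject
    'bad' samples while keeping 'good' ones."""
    rng = random.Random(seed)

    def fitness(ind):
        # +1 for each correctly handled sample, -1 for each mistake
        return sum(1 if rejected(ind, x) == b else -1
                   for x, b in zip(X, is_bad))

    def random_ind():
        return [sorted(rng.uniform(0.0, 1.0) for _ in range(2))
                for _ in range(dims)]

    pop = [random_ind() for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]        # elitist selection
        children = []
        for p in parents:
            child = [pair[:] for pair in p]
            d = rng.randrange(dims)           # mutate one boundary
            child[d][rng.randrange(2)] += rng.gauss(0.0, 0.05)
            child[d].sort()
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

On synthetic 1D data whose "bad" samples lie outside a hidden interval, the evolved left/right pair converges toward that interval, mirroring how the flight version prunes soundings before the expensive retrieval runs.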
Abstracts of the Annual Meeting of Planetary Geologic Mappers, Flagstaff, AZ, 2010
NASA Technical Reports Server (NTRS)
Bleamaster, Leslie F., III (Editor); Tanaka, Kenneth L. (Editor); Kelley, Michael S. (Editor)
2010-01-01
Topics covered include: Detailed Analysis of the Intra-Ejecta Dark Plains of Caloris Basin, Mercury; The Formation and Evolution of Tessera and Insights into the Beginning of Recorded History on Venus: Geology of the Fortuna Tessera Quadrangle (V-2); Geologic Map of the Snegurochka Planitia Quadrangle (V-1): Implications for the Volcanic History of the North Polar Region of Venus; Geological Map of the Fredegonade (V-57) Quadrangle, Venus: Status Report; Geologic Mapping of V-19; Geology of the Lachesis Tessera Quadrangle (V-18), Venus; Comparison of Mapping Tessera Terrain in the Phoebe Regio (V-41) and Tellus Tessera (V-10) Quadrangles; Geologic Mapping of the Devana Chasma (V-29) Quadrangle, Venus; Geologic Mapping of the Aristarchus Plateau Region on the Moon; Geologic Mapping of the Lunar South Pole Quadrangle (LQ-30); The Pilot Lunar Geologic Mapping Project: Summary Results and Recommendations from the Copernicus Quadrangle; Geologic Mapping of the Nili Fossae Region of Mars: MTM Quadrangles 20287, 20282, 25287, 25282, 30287, and 30282; Geologic Mapping of the Mawrth Vallis Region, Mars: MTM Quadrangles 25022, 25017, 25012, 20022, 20017, and 20012; Evidence for an Ancient Buried Landscape on the NW Rim of Hellas Basin, Mars; New Geologic Map of the Argyre Region of Mars: Deciphering the Geologic History Through Mars Global Surveyor, Mars Odyssey, and Mars Express Data; Geologic Mapping in the Hesperia Planum Region of Mars; Geologic Mapping of the Meridiani Region of Mars; Geologic Mapping in Southern Margaritifer Terra; Geology of -30247, -35247, and -40247 Quadrangles, Southern Hesperia Planum, Mars; The Interaction of Impact Melt, Impact-Derived Sediment, and Volatiles at Crater Tooting, Mars; Geologic Map of the Olympia Cavi Region of Mars (MTM 85200): A Summary of Tactical Approaches; Geology of the Terra Cimmeria-Utopia Planitia Highland Lowland Transitional Zone: Final Technical Approach and Scientific Results; Geology of Libya Montes and the 
Interbasin Plains of Northern Tyrrhena Terra, Mars: First Year Results and Second Year Work Plan; Mars Global Geologic Mapping Progress and Suggested Geographic-Based Hierarchal Systems for Unit Grouping and Naming; Progress in the Scandia Region Geologic Map of Mars; Geomorphic Mapping of MTMS -20022 and -20017; Geologic Mapping of the Medusae Fossae Formation, Mars, and the Northern Lowland Plains, Venus; Volcanism on Io: Results from Global Geologic Mapping; Employing Geodatabases for Planetary Mapping Conduct - Requirements, Concepts and Solutions; and Planetary Geologic Mapping Handbook - 2010.
CASK and CaMKII function in the mushroom body α′/β′ neurons during Drosophila memory formation
Malik, Bilal R.; Gillespie, John Michael; Hodge, James J. L.
2013-01-01
Ca2+/CaM serine/threonine kinase II (CaMKII) is a central molecule in mechanisms of synaptic plasticity and memory. A vital feature of CaMKII in plasticity is its ability to switch to a calcium (Ca2+)-independent, constitutively active state after autophosphorylation at threonine 287 (T287). A second pair of sites, T306/T307, in the calmodulin (CaM) binding region, once autophosphorylated, prevents subsequent CaM binding and inactivates the kinase during synaptic plasticity and memory. Recently, a synaptic molecule called Ca2+/CaM-dependent serine protein kinase (CASK) has been shown to control both sets of CaMKII autophosphorylation events and hence is well poised to be a key regulator of memory. We show that deletion of full-length CASK, or of just its CaMK-like and L27 domains, disrupts middle-term memory (MTM) and long-term memory (LTM), with CASK function in the α′/β′ subset of mushroom body neurons being required for memory. Likewise, directly changing the levels of CaMKII autophosphorylation in these neurons removed MTM and LTM. The requirement for CASK and CaMKII autophosphorylation was not developmental, as their manipulation in just the adult α′/β′ neurons was sufficient to remove memory. Overexpression of CASK or CaMKII in the α′/β′ neurons also occluded MTM and LTM. Overexpression of either Drosophila or human CASK in the α′/β′ neurons of the CASK mutant completely rescued memory, confirming that CASK signaling in α′/β′ neurons is necessary and sufficient for Drosophila memory formation and that the neuronal function of CASK is conserved between Drosophila and human. At the cellular level, CaMKII overexpression in the α′/β′ neurons increased activity-dependent Ca2+ responses, while reduction of CaMKII decreased them. Likewise, reducing CASK or directly expressing a phosphomimetic CaMKII T287D transgene in the α′/β′ neurons similarly decreased Ca2+ signaling.
Our results are consistent with CASK regulating CaMKII autophosphorylation in a pathway required for memory formation that involves activity dependent changes in Ca2+ signaling in the α′/β′ neurons. PMID:23543616
Spatial-frequency variability of the eddy kinetic energy in the South Atlantic Ocean
NASA Astrophysics Data System (ADS)
Cecilio, C. M.; Gherardi, D. F.; Souza, R.; Correa-Ramirez, M.
2013-05-01
In the South Atlantic Ocean (SAO), part of the inter-oceanic flow is accomplished through the shedding of anticyclonic eddies by the Agulhas Retroflection. This region, known as the Agulhas Leakage (AL), is responsible for the intermittent shedding of eddies into the SAO. The propagation of these eddies into the SAO induces wave processes that allow interaction between modes of variability of different basins, ranging from high to low frequency. Modelling studies suggest that the Indian-Atlantic inter-ocean exchange is strongly related to the structure of the wind field, in particular to the position of the maximum Southern Hemisphere westerly winds. This study investigates variations of the large-scale and regional mesoscale eddy field over the SAO using a frequency-domain technique, the Multiple Taper Method with Singular Value Decomposition (MTM-SVD). The MTM-SVD approach is applied to examine the individual and joint spatiotemporal variability modes of eddy kinetic energy (EKE) and wind stress. The EKE is estimated from geostrophic velocity anomaly data distributed by Aviso, and wind stress from the Cross-Calibrated Multi-Platform (CCMP) wind dataset from PO.DAAC. The impact of the AL on the SAO was assessed first for the entire region and subsequently for the regions of higher mesoscale activity: the Brazil-Malvinas Confluence (BMC), the AL, and the Brazil Current (BC) region. The local fractional variance (LFV) spectra of EKE obtained by the MTM-SVD method show strong, significant annual variability in the SAO and the BC region, while in the BMC and the AL this frequency is weaker. The patterns of variability in the two most energetic mesoscale regions (BMC and AL) are distinct: in the BMC region interannual variability dominates, while in the AL region most of the variability is associated with high frequencies.
The joint LFV spectrum of wind and EKE shows an out-of-phase relationship between the AL and BMC regions at interannual frequencies (3 to 5 years). The dominant frequencies appear in the 1.5 to 3 year period band and at intraseasonal frequencies of 0.3 to 0.5 years. The results suggest that EKE variability patterns differ across the SAO, which might be related to the influence of eddies from the AL.
Silverman, Michelle; Sherpa, Dawa Phuti; Naegle, Madeline A; Kim, Hyorim; Coffman, Donna L; Ferdschneider, Marcy
2017-01-01
Background An increasing number of mobile app interventions have been developed for problem drinking among college students; however, few studies have examined the integration of a mobile app with continuous physiological monitoring and alerting of affective states related to drinking behaviors. Objective The aim of this paper was to evaluate the acceptability and feasibility of Mind the Moment (MtM), a theoretically based intervention for female college students with problem drinking that combines brief, in-person counseling with ecological momentary intervention (EMI) on a mobile app integrated with a wearable sensorband. Methods We recruited 10 non-treatment-seeking female undergraduates from a university health clinic who scored 3 or higher on the Alcohol Use Disorders Identification Test–Consumption (AUDIT-C) to participate in this pilot study. Study activities involved an in-person baseline intake and 1 follow-up assessment, 2 in-person alcohol brief intervention counseling sessions, and use of MtM technology components (sensorband and EMI on a mobile app) for approximately 3-4 weeks. The intervention used motivational interviewing (MI) and cognitive behavioral therapy (CBT) strategies for reducing risks associated with drinking. We used both qualitative and quantitative assessments to measure acceptability of the intervention and feasibility of delivery. Use patterns of the sensorband and mobile app were also collected. Results Quantitative and qualitative data indicated high levels of acceptability for the MtM intervention. Altogether, participants made reports on the app on 26.7% (78/292) of the days the technology was available to them and completed a total of 325 reports, with wide variation between participants. Qualitative findings indicated that sensorband-elicited alerts promoted an increase in awareness of thoughts, feelings, and behaviors related to current environmental stressors and drinking behaviors in theoretically meaningful ways.
Specific challenges related to functionality and form of the sensorband were identified. Conclusions Delivering intervention material “just-in-time,” at the moment participants need to use behavioral strategies has great potential to individualize behavioral interventions for reducing problem drinking and other health behaviors. These findings provide initial evidence for the promise of wearable sensors for increasing potency of theoretically grounded mobile health interventions and point to directions for future research and uptake of these technologies. PMID:28687533
Open-source meteor detection software for low-cost single-board computers
NASA Astrophysics Data System (ADS)
Vida, D.; Zubović, D.; Šegon, D.; Gural, P.; Cupec, R.
2016-01-01
This work aims to overcome the current price threshold of meteor stations, which can sometimes deter meteor enthusiasts from owning one. In recent years, small card-sized computers have become widely available and are used for numerous applications. To utilize such computers for meteor work, software that can run on them is needed. In this paper we present a detailed description of newly developed open-source software for fireball and meteor detection, optimized to run on low-cost single-board computers. Furthermore, an update is given on the development of automated open-source software that will handle video capture, fireball and meteor detection, astrometry, and photometry.
How to Build an AppleSeed: A Parallel Macintosh Cluster for Numerically Intensive Computing
NASA Astrophysics Data System (ADS)
Decyk, V. K.; Dauger, D. E.
We have constructed a parallel cluster consisting of a mixture of Apple Macintosh G3 and G4 computers running the Mac OS, and have achieved very good performance on numerically intensive, parallel plasma particle-in-cell simulations. A subset of the MPI message-passing library was implemented in Fortran77 and C. This library enabled us to port code, without modification, from other parallel processors to the Macintosh cluster. Unlike Unix-based clusters, no special expertise in operating systems is required to build and run the cluster. This enables us to move parallel computing from the realm of experts to the mainstream of computing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkata, Manjunath Gorentla; Aderholdt, William F
The pre-exascale systems are expected to have a significant amount of hierarchical and heterogeneous on-node memory, and this trend in system architecture is expected to continue into the exascale era. Along with hierarchical-heterogeneous memory, such a system typically has a high-performing network and a compute accelerator. This system architecture is effective not only for running traditional High Performance Computing (HPC) applications (Big-Compute), but also for running data-intensive HPC applications and Big-Data applications. As a consequence, there is a growing desire to have a single system serve the needs of both Big-Compute and Big-Data applications. Though the system architecture supports the convergence of Big-Compute and Big-Data, the programming models and software layer have yet to evolve to support either hierarchical-heterogeneous memory systems or the convergence. We developed a programming abstraction to address this problem. The programming abstraction is implemented as a software library and runs on pre-exascale and exascale systems supporting current and emerging system architectures. Using distributed data structures as a central concept, it provides (1) a simple, usable, and portable abstraction for hierarchical-heterogeneous memory and (2) a unified programming abstraction for Big-Compute and Big-Data applications.
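The distributed data-structure idea can be illustrated with a toy hash-partitioned dictionary. Here plain Python dicts stand in for per-node memories, and the class is a hypothetical sketch, not the library's actual API:

```python
class PartitionedDict:
    """Toy hash-partitioned dictionary: each key lives on exactly one
    of n_nodes shards, the way a distributed data structure places
    entries across node memories (or memory-hierarchy tiers)."""

    def __init__(self, n_nodes):
        self.shards = [{} for _ in range(n_nodes)]

    def _home(self, key):
        # The owning "node" is a pure function of the key, so any
        # process can locate an entry without coordination.
        return hash(key) % len(self.shards)

    def put(self, key, value):
        self.shards[self._home(key)][key] = value

    def get(self, key):
        return self.shards[self._home(key)][key]
```

In a real library the shards would live in different memories or on different nodes, with `put`/`get` translated into local stores or one-sided network operations; the programming abstraction stays the same for Big-Compute and Big-Data codes alike.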
User's guide to the Octopus computer network (the SHOC manual)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, C.; Thompson, D.; Whitten, G.
1977-07-18
This guide explains how to enter, run, and debug programs on the Octopus network. It briefly describes the network's operation, and directs the reader to other documents for further information. It stresses those service programs that will be most useful in the long run; ''quick'' methods that have little flexibility are not discussed. The Octopus timesharing network gives the user access to four CDC 7600 computers, two CDC STAR computers, and a broad array of peripheral equipment, from any of 800 or so remote terminals. 16 figures, 7 tables.
User's guide to the Octopus computer network (the SHOC manual)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, C.; Thompson, D.; Whitten, G.
1976-10-07
This guide explains how to enter, run, and debug programs on the Octopus network. It briefly describes the network's operation, and directs the reader to other documents for further information. It stresses those service programs that will be most useful in the long run; ''quick'' methods that have little flexibility are not discussed. The Octopus timesharing network gives the user access to four CDC 7600 computers, two CDC STAR computers, and a broad array of peripheral equipment, from any of 800 or so remote terminals. 8 figures, 4 tables.
User's guide to the Octopus computer network (the SHOC manual)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, C.; Thompson, D.; Whitten, G.
1975-06-02
This guide explains how to enter, run, and debug programs on the Octopus network. It briefly describes the network's operation, and directs the reader to other documents for further information. It stresses those service programs that will be most useful in the long run; ''quick'' methods that have little flexibility are not discussed. The Octopus timesharing network gives the user access to four CDC 7600 computers and a broad array of peripheral equipment, from any of 800 remote terminals. Octopus will soon include the Laboratory's STAR-100 computers. 9 figures, 5 tables. (auth)
Massively parallel quantum computer simulator
NASA Astrophysics Data System (ADS)
De Raedt, K.; Michielsen, K.; De Raedt, H.; Trieu, B.; Arnold, G.; Richter, M.; Lippert, Th.; Watanabe, H.; Ito, N.
2007-01-01
We describe portable software to simulate universal quantum computers on massively parallel computers. We illustrate the use of the simulation software by running various quantum algorithms on different computer architectures, such as an IBM BlueGene/L, an IBM Regatta p690+, a Hitachi SR11000/J1, a Cray X1E, an SGI Altix 3700, and clusters of PCs running Windows XP. We study the performance of the software by simulating quantum computers containing up to 36 qubits, using up to 4096 processors and up to 1 TB of memory. Our results demonstrate that the simulator exhibits nearly ideal scaling as a function of the number of processors and suggest that the simulation software described in this paper may also serve as a benchmark for testing high-end parallel computers.
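A state-vector simulator of this kind stores all 2^n complex amplitudes (at 16 bytes per amplitude, 36 qubits is exactly 1 TB, matching the figure above) and applies gates by index arithmetic. A minimal single-process Python sketch of the approach, which the paper's software distributes across thousands of processors:

```python
import math

def apply_1q(state, gate, target, n):
    """Apply a 2x2 unitary `gate` to qubit `target` of an n-qubit state
    vector (qubit 0 is the least significant bit of the index)."""
    new = [0j] * (1 << n)
    bit = 1 << target
    for i in range(1 << n):
        if not i & bit:
            a, b = state[i], state[i | bit]
            new[i] = gate[0][0] * a + gate[0][1] * b
            new[i | bit] = gate[1][0] * a + gate[1][1] * b
    return new

def apply_cnot(state, control, target, n):
    """Swap amplitude pairs differing in the target bit wherever the
    control bit is set."""
    new = list(state)
    cbit, tbit = 1 << control, 1 << target
    for i in range(1 << n):
        if (i & cbit) and not (i & tbit):
            new[i], new[i | tbit] = state[i | tbit], state[i]
    return new

s = 1.0 / math.sqrt(2.0)
H = [[s, s], [s, -s]]  # Hadamard gate

# Prepare a 2-qubit Bell state: H on qubit 0, then CNOT(0 -> 1).
bell = apply_cnot(apply_1q([1 + 0j, 0, 0, 0], H, 0, 2), 0, 1, 2)
```

Parallelizing this amounts to splitting the amplitude array across processors; gates on high-order qubits then require exchanging amplitude halves between nodes, which is what limits scaling in practice.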
JAX Colony Management System (JCMS): an extensible colony and phenotype data management system.
Donnelly, Chuck J; McFarland, Mike; Ames, Abigail; Sundberg, Beth; Springer, Dave; Blauth, Peter; Bult, Carol J
2010-04-01
The Jackson Laboratory Colony Management System (JCMS) is a software application for managing data and information related to research mouse colonies, associated biospecimens, and experimental protocols. JCMS runs directly on computers that run one of the PC Windows operating systems, but can be accessed via web browser interfaces from any computer running a Windows, Macintosh, or Linux operating system. JCMS can be configured for a single user or multiple users in small- to medium-size work groups. The target audience for JCMS includes laboratory technicians, animal colony managers, and principal investigators. The application provides operational support for colony management and experimental workflows, sample and data tracking through transaction-based data entry forms, and date-driven work reports. Flexible query forms allow researchers to retrieve database records based on user-defined criteria. Recent advances in handheld computers with integrated barcode readers, middleware technologies, web browsers, and wireless networks add to the utility of JCMS by allowing real-time access to the database from any networked computer.
Fabrication of a First Article Lightweight Composite Technology Demonstrator - Exospine
2014-01-01
core, (b) 0/90, and (c) ±45 ply cuts of ACG MTM 45-1/CF0526 prepreg fabric ... onboard diagnostics. 2. Experimental. 2.1 Materials. Plain woven carbon fiber/epoxy prepreg and a low-density foam core were provided to ARL for the fabrication of the exospine technology demonstrator by UD-CCM. The prepreg was ACG MTM 45-1/CF0526 and has a cured ply thickness of 0.201 mm. It is
Large-scale, thick, self-assembled, nacre-mimetic brick-walls as fire barrier coatings on textiles
Das, Paramita; Thomas, Helga; Moeller, Martin; Walther, Andreas
2017-01-01
Highly loaded polymer/clay nanocomposites with layered structures are emerging as robust fire-retardant surface coatings. However, time-intensive sequential deposition processes, e.g. layer-by-layer strategies, hinder obtaining large coating thicknesses and complicate implementation into existing technologies. Here, we demonstrate a single-step, water-borne approach to prepare thick, self-assembling, hybrid fire barrier coatings of sodium carboxymethyl cellulose (CMC)/montmorillonite (MTM) with a well-defined, bioinspired brick-wall nanostructure, and showcase their application on textile. The coating thickness on the textile is tailored using different concentrations of CMC/MTM (1–5 wt%) in the coating bath. While lower concentrations impart conformal coatings of the fibers, thicker continuous coatings are obtained on the textile surface at the highest concentration. Comprehensive fire barrier and fire retardancy tests show that both barrier and retardancy properties increase with coating thickness. The materials are free of halogens and heavy metals, and are sourced from sustainable, partly even renewable, building blocks. We further introduce an amphiphobic surface modification on the coating to impart oil and water repellency as well as self-cleaning features. Hence, our study presents a generic, environmentally friendly, scalable, one-pot coating approach that can be introduced into existing technologies to prepare bioinspired, thick, fire barrier nanocomposite coatings on diverse surfaces. PMID:28054589
Steering of aggregating magnetic microparticles using propulsion gradients coils in an MRI Scanner.
Mathieu, Jean-Baptiste; Martel, Sylvain
2010-05-01
Upgraded gradient coils can effectively enhance the MRI steering of magnetic microparticles in a branching channel. Applications of this method include MRI targeting of magnetic embolization agents for oncologic therapy. A magnetic suspension of Fe(3)O(4) magnetic particles was injected inside a y-shaped microfluidic channel. Magnetic gradients of 0, 50, 100, 200, and 400 mT/m were applied to the magnetic particles perpendicularly to the flow by a custom-built gradient coil inside a 1.5-T MRI scanner. The steering ratio was measured both by video analyses and by quantification of the mass of the particles collected at each outlet of the microfluidic channel, using atomic absorption spectroscopy. Magnetic particle steering ratios of 0.99 and 0.75 were reached with 400 mT/m gradient amplitude, as measured by video analyses and atomic absorption spectroscopy, respectively. The experimental data show that the steering ratio increases with higher magnetic gradients. Moreover, theory suggests that larger particles (or aggregates), higher magnetizations, and lower flows can also improve the steering ratio. The technological limitation of the approach is that an increase of the MRI gradient amplitude to a few hundred milliteslas per meter is needed. A simple analytical method based on magnetophoretic velocity predictions and geometric considerations is proposed for steering ratio calculation. (c) 2010 Wiley-Liss, Inc.
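The closing sentence refers to a steering-ratio estimate built from magnetophoretic velocity predictions and channel geometry. A sketch of that style of calculation, assuming magnetically saturated particles, Stokes drag, and plug flow, is below; the function names and every number are illustrative, not the authors' model.

```python
import math

def magnetophoretic_velocity(radius_m, magnetization_A_per_m,
                             gradient_T_per_m, viscosity_Pa_s):
    """Terminal lateral velocity of a saturated magnetic particle:
    magnetic body force V*M*dB/dx balanced against Stokes drag 6*pi*eta*r*v."""
    volume = (4.0 / 3.0) * math.pi * radius_m ** 3
    force = volume * magnetization_A_per_m * gradient_T_per_m  # newtons
    return force / (6.0 * math.pi * viscosity_Pa_s * radius_m)  # m/s

def steering_ratio(v_lateral, flow_speed, channel_length, channel_width):
    """Fraction of uniformly distributed particles pushed past the dividing
    streamline before the bifurcation (geometric estimate, capped at 1)."""
    displacement = v_lateral * channel_length / flow_speed
    return min(1.0, 0.5 + displacement / channel_width)

# Illustrative numbers only: 5-um-radius aggregate, magnetite-like
# magnetization, 400 mT/m gradient, water-like viscosity.
v = magnetophoretic_velocity(5e-6, 4.8e5, 0.4, 1e-3)
print(steering_ratio(v, flow_speed=1e-2, channel_length=1e-2, channel_width=1e-4))
```

With no applied gradient the estimate reduces to 0.5, i.e. the particles split evenly between the two branches, which matches the intuition behind the measured trend.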
Chenal, C; Legue, F; Nourgalieva, K
2006-10-01
Over a period of 42 years, several hundred nuclear tests were performed by the former USSR at the Semipalatinsk Test Site (STS, Kazakhstan), of which more than 100 were conducted in the atmosphere. We report here the late genetic damage from external exposure to radiation and environmental radioactive contamination in people living in Dolon, a small settlement situated in the vicinity of the STS. The comet assay was applied to lymphocyte DNA from 20 exposed women and 32 non-exposed women living 500 km from the STS. We observed a statistically significant difference between the exposed and control groups for mean tail moment (MTM) and DNA% in the tail. The mean values of all comet assay parameters (MTM, DNA% in the tail, and score) were higher in the group of women born before 1949 as compared to those born after 1950, which could reflect an effect of external irradiation in 1949 due to the most contaminating explosion. These results suggest that people exposed 50 years ago to relatively small doses of external irradiation, and/or still living in an environment contaminated by small amounts of long-lived radionuclides, still exhibit DNA damage, in agreement with other cytogenetic studies performed at the same site on the same population.
The NEST Dry-Run Mode: Efficient Dynamic Analysis of Neuronal Network Simulation Code.
Kunkel, Susanne; Schenck, Wolfram
2017-01-01
NEST is a simulator for spiking neuronal networks that commits to a general purpose approach: It allows for high flexibility in the design of network models, and its applications range from small-scale simulations on laptops to brain-scale simulations on supercomputers. Hence, developers need to test their code for various use cases and ensure that changes to code do not impair scalability. However, running a full set of benchmarks on a supercomputer takes up precious compute-time resources and can entail long queuing times. Here, we present the NEST dry-run mode, which enables comprehensive dynamic code analysis without requiring access to high-performance computing facilities. A dry-run simulation is carried out by a single process, which performs all simulation steps except communication as if it was part of a parallel environment with many processes. We show that measurements of memory usage and runtime of neuronal network simulations closely match the corresponding dry-run data. Furthermore, we demonstrate the successful application of the dry-run mode in the areas of profiling and performance modeling.
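The dry-run idea — a single process that performs every simulation phase except communication, as if it were one rank of a large MPI job — can be caricatured in a few lines. This is a conceptual sketch, not NEST's implementation; every name below is invented.

```python
class DryRunSimulator:
    """Single-process mock of a distributed simulation: all phases run,
    but the exchange step is replaced by a stub."""

    def __init__(self, total_ranks, my_rank, neurons_per_rank):
        self.total_ranks = total_ranks
        self.my_rank = my_rank
        # Only this rank's slice of the network is actually built,
        # exactly as in a real MPI run with `total_ranks` processes.
        self.local_neurons = [0.0] * neurons_per_rank

    def update(self):
        # Local neuron-state update: identical to the parallel case.
        self.local_neurons = [v + 0.1 for v in self.local_neurons]

    def exchange_spikes(self):
        # In a real run this would be an MPI all-to-all; the dry run skips
        # it, so memory usage and local runtime can be measured on a laptop.
        pass

    def simulate(self, steps):
        for _ in range(steps):
            self.update()
            self.exchange_spikes()
        return len(self.local_neurons)

# Pretend to be rank 3 of a 1024-rank job, in one laptop process.
sim = DryRunSimulator(total_ranks=1024, my_rank=3, neurons_per_rank=100)
print(sim.simulate(steps=10))  # → 100
```

Because memory layout and the per-step work mirror the true parallel case, measurements taken on this one process approximate what each rank of the full job would see.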
ATLAS@Home: Harnessing Volunteer Computing for HEP
NASA Astrophysics Data System (ADS)
Adam-Bourdarios, C.; Cameron, D.; Filipčič, A.; Lancon, E.; Wu, W.; ATLAS Collaboration
2015-12-01
A recent common theme in HEP computing is the exploitation of opportunistic resources in order to provide the maximum statistics possible for Monte Carlo simulation. Volunteer computing has been used over the last few years in many other scientific fields and by CERN itself to run simulations of the LHC beams. The ATLAS@Home project was started to allow volunteers to run simulations of collisions in the ATLAS detector. So far many thousands of members of the public have signed up to contribute their spare CPU cycles for ATLAS, and there is potential for volunteer computing to provide a significant fraction of ATLAS computing resources. Here we describe the design of the project, the lessons learned so far and the future plans.
Understanding the Performance and Potential of Cloud Computing for Scientific Applications
Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin; ...
2015-02-19
Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources, yet not all scientists have access to sufficiently high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with reasonable performance per money spent. This work studies the performance of public clouds and places this performance in context to price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network and I/O, as well as the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well and to evaluate the cost of running scientific applications in the cloud. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper will be a recipe cookbook for scientists to help them decide where to deploy and run their scientific applications between public clouds, private clouds, or hybrid clouds.
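A toy version of placing performance "in context to price" is a cost-efficiency ratio such as sustained GFLOPS per dollar-hour. The instance names and numbers below are invented for illustration and are not the paper's AWS measurements.

```python
def cost_efficiency(gflops, price_per_hour):
    """Sustained GFLOPS per dollar-hour: raw performance normalized by
    rental cost (all figures here are illustrative)."""
    return gflops / price_per_hour

instances = {
    "small-instance": cost_efficiency(gflops=50.0, price_per_hour=0.10),
    "hpc-instance": cost_efficiency(gflops=600.0, price_per_hour=2.00),
}
best = max(instances, key=instances.get)
print(best)  # → small-instance (500 vs 300 GFLOPS per dollar-hour)
```

The point the example makes is the paper's underlying one: the fastest instance is not automatically the most cost-effective one for a given workload.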
RSTensorFlow: GPU Enabled TensorFlow for Deep Learning on Commodity Android Devices
Alzantot, Moustafa; Wang, Yingnan; Ren, Zhengshuang; Srivastava, Mani B.
2018-01-01
Mobile devices have become an essential part of our daily lives. By virtue of both their increasing computing power and the recent progress made in AI, mobile devices have evolved to act as intelligent assistants in many tasks rather than a mere way of making phone calls. However, popular and commonly used tools and frameworks for machine intelligence still lack the ability to make proper use of the available heterogeneous computing resources on mobile devices. In this paper, we study the benefits of utilizing the heterogeneous (CPU and GPU) computing resources available on commodity Android devices while running deep learning models. We leveraged the heterogeneous computing framework RenderScript to accelerate the execution of deep learning models on commodity Android devices. Our system is implemented as an extension to the popular open-source framework TensorFlow. By integrating our acceleration framework tightly into TensorFlow, machine learning engineers can now easily benefit from the heterogeneous computing resources on mobile devices without the need for any extra tools. We evaluate our system on different Android phone models to study the trade-offs of running different neural network operations on the GPU. We also compare the performance of running different model architectures, such as convolutional and recurrent neural networks, on CPU only versus using heterogeneous computing resources. Our results show that the GPUs on the phones are capable of offering a substantial performance gain in matrix multiplication, so models that involve multiplication of large matrices can run much faster (approximately 3 times faster in our experiments) with GPU support. PMID:29629431
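The reported ~3x gains come from offloading large matrix multiplications, whose cost grows roughly as n³. The CPU-only timing sketch below illustrates that scaling with NumPy; the paper's actual comparison dispatches the same operation to the phone GPU through RenderScript, which this sketch does not attempt.

```python
import time
import numpy as np

def time_matmul(n, repeats=3):
    """Best wall-clock time of an n-by-n float32 matrix multiplication
    (CPU only; a stand-in for the op being benchmarked on-device)."""
    a = np.random.rand(n, n).astype(np.float32)
    b = np.random.rand(n, n).astype(np.float32)
    best = float("inf")
    for _ in range(repeats):
        t0 = time.perf_counter()
        a @ b
        best = min(best, time.perf_counter() - t0)
    return best

# Cost grows ~n^3, which is why only the large cases are worth
# the overhead of shipping data to the GPU.
for n in (64, 256, 1024):
    print(n, time_matmul(n))
```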
Multidisciplinary Simulation Acceleration using Multiple Shared-Memory Graphical Processing Units
NASA Astrophysics Data System (ADS)
Kemal, Jonathan Yashar
For purposes of optimizing and analyzing turbomachinery and other designs, the unsteady Favre-averaged flow-field differential equations for an ideal compressible gas can be solved in conjunction with the heat conduction equation. We solve all equations using the finite-volume multiple-grid numerical technique, with the dual time-step scheme used for unsteady simulations. Our numerical solver code targets CUDA-capable Graphical Processing Units (GPUs) produced by NVIDIA. Making use of MPI, our solver can run across networked compute nodes, where each MPI process can use either a GPU or a Central Processing Unit (CPU) core for primary solver calculations. We use NVIDIA Tesla C2050/C2070 GPUs based on the Fermi architecture, and compare our resulting performance against Intel Xeon X5690 CPUs. Solver routines converted to CUDA typically run about 10 times faster on a GPU for sufficiently dense computational grids. We used a conjugate cylinder computational grid and ran a turbulent steady flow simulation using 4 increasingly dense computational grids. Our densest computational grid is divided into 13 blocks each containing 1033x1033 grid points, for a total of 13.87 million grid points or 1.07 million grid points per domain block. To obtain overall speedups, we compare the execution time of the solver's iteration loop, including all resource-intensive GPU-related memory copies. Comparing the performance of 8 GPUs to that of 8 CPUs, we obtain an overall speedup of about 6.0 when using our densest computational grid. This amounts to an 8-GPU simulation running about 39.5 times faster than a single-CPU simulation.
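The abstract's grid bookkeeping and speedup figures follow from simple arithmetic, reproduced below as a quick consistency check (the speedup function merely restates the ratio the authors measure over the iteration loop).

```python
# Reproduce the abstract's grid bookkeeping.
blocks = 13
pts_per_block = 1033 * 1033          # grid points in one domain block
total_pts = blocks * pts_per_block

print(pts_per_block)  # → 1067089 (the ~1.07 million per block quoted)
print(total_pts)      # → 13872157 (the ~13.87 million total quoted)

def overall_speedup(reference_loop_seconds, gpu_loop_seconds):
    """Speedup of the solver's iteration loop, memory copies included:
    8 GPUs vs 8 CPUs gave ~6.0x; 8 GPUs vs a single CPU, ~39.5x."""
    return reference_loop_seconds / gpu_loop_seconds
```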
Computing shifts to monitor ATLAS distributed computing infrastructure and operations
NASA Astrophysics Data System (ADS)
Adam, C.; Barberis, D.; Crépé-Renaudin, S.; De, K.; Fassi, F.; Stradling, A.; Svatos, M.; Vartapetian, A.; Wolters, H.
2017-10-01
The ATLAS Distributed Computing (ADC) group established a new Computing Run Coordinator (CRC) shift at the start of LHC Run 2 in 2015. The main goal was to rely on a person with a good overview of the ADC activities to ease the ADC experts' workload. The CRC shifter keeps track of ADC tasks related to their fields of expertise and responsibility. At the same time, the shifter maintains a global view of the day-to-day operations of the ADC system. During Run 1, this task was accomplished by a member of the expert team called the ADC Manager on Duty (AMOD), a position that was removed during the shutdown period due to the reduced number and availability of ADC experts foreseen for Run 2. The CRC position was proposed to cover some of the AMOD's former functions, while allowing more people involved in computing to participate. In this way, CRC shifters help with the training of future ADC experts. The CRC shifters coordinate daily ADC shift operations, including tracking open issues, reporting, and representing ADC in relevant meetings. The CRC also facilitates communication between the ADC expert team and the other ADC shifters. These include the Distributed Analysis Support Team (DAST), which is the first point of contact for addressing all distributed analysis questions, and the ATLAS Distributed Computing Shifters (ADCoS), who check and report problems in central services, sites, Tier-0 export, data transfers and production tasks. Finally, the CRC looks at the level of ADC activities on a weekly or monthly timescale to ensure that ADC resources are used efficiently.
NASA Technical Reports Server (NTRS)
Chawner, David M.; Gomez, Ray J.
2010-01-01
In the Applied Aerosciences and CFD branch at Johnson Space Center, computational simulations are run that face many challenges, two of which are the ability to customize software for specialized needs and the need to run simulations as fast as possible. There are many different tools used for running these simulations, and each one has its own pros and cons. Once these simulations are run, there needs to be software capable of visualizing the results in an appealing manner. Some of this software is open source, meaning that anyone can edit the source code to make modifications and distribute it to all other users in a future release. This is very useful, especially in this branch where many different tools are being used. File readers can be written to load any file format into a program, easing the bridging from one tool to another. Programming such a reader requires knowledge of the file format being read as well as the equations necessary to obtain the derived values after loading. When running these CFD simulations, extremely large files are loaded and values are calculated. These simulations usually take a few hours to complete, even on the fastest machines. Graphics processing units (GPUs) are usually used to handle graphics for computers; however, in recent years, GPUs have been used for more generic applications because of the speed of these processors. Applications run on GPUs have been known to run up to forty times faster than they would on normal central processing units (CPUs). If these CFD programs are extended to run on GPUs, the amount of time they require to complete would be much less. This would allow more simulations to be run in the same amount of time and possibly permit more complex computations.
Soule, Pat LeRoy
1978-01-01
Water-surface profiles of the 25-, 50-, and 100-year recurrence interval discharges have been computed for all streams and reaches of channels in Fairfax County, Virginia, having a drainage area greater than 1 square mile, except for Dogue Creek, Little Hunting Creek, and that portion of Cameron Run above Lake Barcroft. Maps having a 2-foot contour interval and a horizontal scale of 1 inch equals 100 feet were used as the base on which flood boundaries were delineated for the 25-, 50-, and 100-year floods to be expected in each basin under ultimate development conditions. This report is one of a series and presents a discussion of the techniques employed in computing discharges and profiles, as well as the flood profiles and maps on which flood boundaries have been delineated for the Occoquan River and its tributaries within Fairfax County and those streams on Mason Neck within Fairfax County tributary to the Potomac River. (Woodard-USGS)
ACON: a multipurpose production controller for plasma physics codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Snell, C.
1983-01-01
ACON is a BCON controller designed to run large production codes on the CTSS Cray-1 or the LTSS 7600 computers. ACON can also be operated interactively, with input from the user's terminal. The controller can run one code or a sequence of up to ten codes during the same job. Options are available to get and save Mass storage files, to perform Historian file updating operations, to compile and load source files, and to send out print and film files. Special features include the ability to retry after Mass failures, backup options for saving files, startup messages for the various codes, and the ability to reserve specified amounts of computer time after successive code runs. ACON's flexibility and power make it useful for running a number of different production codes.
Experimental Realization of High-Efficiency Counterfactual Computation.
Kong, Fei; Ju, Chenyong; Huang, Pu; Wang, Pengfei; Kong, Xi; Shi, Fazhan; Jiang, Liang; Du, Jiangfeng
2015-08-21
Counterfactual computation (CFC) exemplifies the fascinating quantum process by which the result of a computation may be learned without actually running the computer. In previous experimental studies, the counterfactual efficiency is limited to below 50%. Here we report an experimental realization of the generalized CFC protocol, in which the counterfactual efficiency can break the 50% limit and even approach unity in principle. The experiment is performed with the spins of a negatively charged nitrogen-vacancy color center in diamond. Taking advantage of the quantum Zeno effect, the computer can remain in the not-running subspace due to the frequent projection by the environment, while the computation result can be revealed by final detection. The counterfactual efficiency up to 85% has been demonstrated in our experiment, which opens the possibility of many exciting applications of CFC, such as high-efficiency quantum integration and imaging.
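The quantum Zeno mechanism behind the high counterfactual efficiency can be illustrated numerically: after N small rotations of π/(2N), each followed by a projective measurement, the probability of remaining in the not-running subspace tends to one. This is the textbook Zeno chain, not the NV-center protocol itself.

```python
import math

def zeno_survival(n_steps):
    """Probability of staying in the initial subspace after n_steps small
    rotations of pi/(2*n_steps), each followed by a projective measurement:
    cos(theta)^(2*n_steps), which tends to 1 as n_steps grows."""
    theta = math.pi / (2 * n_steps)
    return math.cos(theta) ** (2 * n_steps)

for n in (1, 10, 100, 1000):
    print(n, zeno_survival(n))
```

A single large rotation (n = 1) leaves the system entirely in the orthogonal subspace, while frequent projections pin it in place, which is the sense in which "the computer can remain in the not-running subspace" while still revealing the result.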
Running of scalar spectral index in multi-field inflation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gong, Jinn-Ouk, E-mail: jinn-ouk.gong@apctp.org
We compute the running of the scalar spectral index in general multi-field slow-roll inflation. By incorporating explicit momentum dependence at the moment of horizon crossing, we can find the running straightforwardly. At the same time, we can distinguish the contributions from the quasi de Sitter background and the super-horizon evolution of the field fluctuations.
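For reference, the quantities involved carry their standard definitions (this is textbook notation, not specific to this paper): the scalar spectral index and its running are

```latex
n_s(k) - 1 \equiv \frac{d\ln \mathcal{P}_\zeta}{d\ln k},
\qquad
\alpha_s(k) \equiv \frac{dn_s}{d\ln k},
```

both evaluated at horizon crossing k = aH; the abstract's point is that α_s then splits into a piece from the quasi de Sitter background and a piece from the super-horizon evolution of the field fluctuations.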
NASA Astrophysics Data System (ADS)
Hill, M. C.; Jakeman, J.; Razavi, S.; Tolson, B.
2015-12-01
For many environmental systems, model runtimes have remained very long as more capable computers have been used to add more processes and finer time and space discretization. Scientists have also added more parameters and kinds of observations, and many model runs are needed to explore the models. Computational demand equals run time multiplied by the number of model runs divided by parallelization opportunities. Model exploration is conducted using sensitivity analysis, optimization, and uncertainty quantification. Sensitivity analysis is used to reveal consequences of what may be very complex simulated relations, optimization is used to identify parameter values that fit the data best, or at least better, and uncertainty quantification is used to evaluate the precision of simulated results. The long execution times make such analyses a challenge. Methods for addressing these challenges include computationally frugal analysis of the demanding original model and a number of ingenious surrogate modeling methods. Both commonly use about 50-100 runs of the demanding original model. In this talk we consider the tradeoffs between (1) original model development decisions, (2) computationally frugal analysis of the original model, and (3) using many model runs of a fast surrogate model. Some questions of interest are as follows. If the added processes and discretization invested in (1) are compared with the restrictions and approximations in model analysis produced by long model execution times, is there a net benefit relative to the goals of the model? Are there changes to the numerical methods that could reduce the computational demands while giving up less fidelity than is compromised by using computationally frugal methods or surrogate models for model analysis? Both the computationally frugal methods and the surrogate models require that the solution of interest be a smooth function of the parameters of interest. How does the information obtained from the local methods typical of (2) compare with the globally averaged methods typical of (3) for typical systems? The discussion will use examples of the response of the Greenland glacier to global warming and surface and groundwater modeling.
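The talk's one-line cost model (run time multiplied by number of model runs divided by parallelization opportunities) is trivially computable:

```python
def computational_demand_hours(run_time_hours, n_model_runs, parallel_ways):
    """Wall-clock demand: run time x number of runs / parallelization
    opportunities, as stated in the abstract."""
    return run_time_hours * n_model_runs / parallel_ways

# A 10-hour model explored with 100 runs on 20 parallel workers:
print(computational_demand_hours(10, 100, 20))  # → 50.0
```

The formula makes the tradeoff explicit: a surrogate cuts run_time_hours by orders of magnitude at the price of the 50-100 original-model runs needed to build it.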
Program Processes Thermocouple Readings
NASA Technical Reports Server (NTRS)
Quave, Christine A.; Nail, William, III
1995-01-01
Digital Signal Processor for Thermocouples (DART) computer program implements precise and fast method of converting voltage to temperature for large-temperature-range thermocouple applications. Written using LabVIEW software. DART available only as object code for use on Macintosh II FX or higher-series computers running System 7.0 or later and IBM PC-series and compatible computers running Microsoft Windows 3.1. Macintosh version of DART (SSC-00032) requires LabVIEW 2.2.1 or 3.0 for execution. IBM PC version (SSC-00031) requires LabVIEW 3.0 for Windows 3.1. LabVIEW software product of National Instruments and not included with program.
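Converters like DART typically evaluate an inverse thermocouple polynomial T = Σ cᵢvⁱ over piecewise voltage ranges, following the standard NIST approach. The sketch below shows only the evaluation mechanics; the coefficients are illustrative placeholders, not NIST calibration data, and the function name is invented.

```python
def voltage_to_temperature(v_mV, coeffs):
    """Inverse-polynomial thermocouple conversion T = sum(c_i * v^i),
    evaluated by Horner's rule. coeffs = [c0, c1, c2, ...].
    Real converters select coeffs per thermocouple type and voltage range."""
    t = 0.0
    for c in reversed(coeffs):  # Horner evaluation, highest order first
        t = t * v_mV + c
    return t

# Fake sensor for illustration: ~25 deg C per mV with a small quadratic term.
demo_coeffs = [0.0, 25.0, -0.05]
print(voltage_to_temperature(4.0, demo_coeffs))  # → 99.2
```

Precomputing the Horner loop per voltage range is what makes this "precise and fast" over a large temperature range: one table lookup plus a handful of multiply-adds per sample.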
NASA Technical Reports Server (NTRS)
Mcenulty, R. E.
1977-01-01
The G189A simulation of the Shuttle Orbiter ECLSS was upgraded. All simulation library versions and simulation models were converted from the EXEC2 to the EXEC8 computer system, and a new program, G189PL, was added to the combination master program library. The program permits the post-plotting of up to 100 frames of plot data over any time interval of a G189 simulation run. The overlay structure of the G189A simulations was restructured to conserve computer core and minimize run time requirements.
INHYD: Computer code for intraply hybrid composite design. A users manual
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.
1983-01-01
A computer program (INHYD) was developed for intraply hybrid composite design. A user's manual for INHYD is presented. INHYD embodies several composite micromechanics theories, intraply hybrid composite theories, and an integrated hygrothermomechanical theory. INHYD can be run in both interactive and batch modes. It has considerable flexibility and capability, which the user can exercise through several options. These options are demonstrated through appropriate INHYD runs in the manual.
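As an illustration of the kind of micromechanics relation such a code embodies, a longitudinal rule-of-mixtures estimate for an intraply hybrid is sketched below. This is a generic textbook relation under iso-strain assumptions, not INHYD's actual equations, and all the numbers are illustrative.

```python
def hybrid_modulus(volume_fractions, moduli_GPa):
    """Longitudinal rule-of-mixtures estimate E = sum(V_i * E_i) for an
    intraply hybrid (iso-strain assumption along the fiber direction)."""
    assert abs(sum(volume_fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    return sum(v * e for v, e in zip(volume_fractions, moduli_GPa))

# Illustrative hybrid: 40% graphite fiber (230 GPa), 20% glass fiber
# (72 GPa), 40% epoxy matrix (3.5 GPa).
print(hybrid_modulus([0.4, 0.2, 0.4], [230.0, 72.0, 3.5]))  # → 107.8
```

In practice a code like INHYD layers many such relations (transverse, shear, hygrothermal) and iterates over the user's option choices rather than applying one closed formula.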
Topology Optimization for Reducing Additive Manufacturing Processing Distortions
2017-12-01
features that curl or warp under thermal load and are subsequently struck by the recoater blade/roller. Support structures act to wick heat away and...was run for 150 iterations. The material properties for all examples were Young's modulus E = 1 GPa, Poisson's ratio ν = 0.25, and thermal expansion...the element-birth model is significantly more computationally expensive for a full optimization run. Consider the computational complexity of a
MindModeling@Home . . . and Anywhere Else You Have Idle Processors
2009-12-01
was SETI@Home. It was established in 1999 for the purpose of demonstrating the utility of “distributed grid computing” by providing a mechanism for...the public imagination, and SETI@Home remains the longest-running and one of the most popular volunteer computing projects in the world. This...pursuits. Most of them, including SETI@Home, run on a software architecture called the Berkeley Open Infrastructure for Network Computing (BOINC). Some of
NASA Astrophysics Data System (ADS)
Zhiying, Chen; Ping, Zhou
2017-11-01
To balance computational precision and efficiency in the robust optimization of complex mechanical assembly relationships such as turbine blade-tip radial running clearance, a hierarchical response surface robust optimization algorithm is proposed. The distributed collaborative response surface method is used to generate an assembly-system-level approximation model of the overall parameters and blade-tip clearance, and then a set of samples of design parameters and objective response mean and/or standard deviation is generated by using the system approximation model and design-of-experiment methods. Finally, a new response surface approximation model is constructed from those samples, and this approximation model is used for the robust optimization process. The analysis results demonstrate that the proposed method can dramatically reduce the computational cost while maintaining computational precision. The presented research offers an effective way to carry out the robust optimization design of turbine blade-tip radial running clearance.
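The two-level idea — sample the expensive assembly model, fit a cheap response surface, then run the robust (mean plus k-sigma) search on the surface — can be sketched in miniature. Everything below is a toy: a hypothetical one-dimensional "expensive" model, a quadratic surrogate, and a brute-force candidate search, not the authors' distributed collaborative method.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_model(x):
    # Stand-in for the assembly-level clearance simulation.
    return (x - 2.0) ** 2 + 0.5 * x

# Step 1: design of experiments on the expensive model.
xs = np.linspace(0.0, 4.0, 9)
ys = np.array([expensive_model(x) for x in xs])

# Step 2: fit a cheap quadratic response surface to the samples.
surface = np.polynomial.Polynomial.fit(xs, ys, deg=2)

# Step 3: robust objective (mean + k*std under input scatter) evaluated
# on the surface, so thousands of calls cost almost nothing.
def robust_objective(x, sigma_x=0.1, k=3.0, n_mc=500):
    samples = surface(x + sigma_x * rng.standard_normal(n_mc))
    return samples.mean() + k * samples.std()

candidates = np.linspace(0.5, 3.5, 61)
best = min(candidates, key=robust_objective)
print(round(best, 2))
```

Only the nine design-of-experiment points touch the expensive model; the Monte Carlo robustness loop runs entirely on the surrogate, which is where the claimed cost reduction comes from.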
Implementing Parquet equations using HPX
NASA Astrophysics Data System (ADS)
Kellar, Samuel; Wagle, Bibek; Yang, Shuxiang; Tam, Ka-Ming; Kaiser, Hartmut; Moreno, Juana; Jarrell, Mark
A new C++ runtime system (HPX) enables simulations of complex systems to run more efficiently on parallel and heterogeneous systems. This increased efficiency allows for solutions to larger simulations of the parquet approximation for a system with impurities. The relevance of the parquet equations depends upon the ability to solve systems that require long runs and large amounts of memory. These limitations, in addition to numerical complications arising from the stability of the solutions, necessitate running on large distributed systems. As computational resources trend towards the exascale and resource-driven limitations vanish, the efficiency of large-scale simulations becomes a focus. HPX facilitates efficient simulations through intelligent overlapping of computation and communication. Simulations such as the parquet equations, which require the transfer of large amounts of data, should benefit from HPX implementations. Supported by the NSF EPSCoR Cooperative Agreement No. EPS-1003897 with additional support from the Louisiana Board of Regents.
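The key HPX idea the abstract leans on, overlapping communication with computation via futures, can be illustrated with Python's standard library. This is a conceptual sketch only: HPX itself is a C++ runtime with a far richer asynchronous model, and the sleeps below merely stand in for real transfer and compute work.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_communication():
    time.sleep(0.2)            # stand-in for transferring remote data
    return "remote block"

def local_computation():
    time.sleep(0.2)            # stand-in for updating the local block
    return "local block"

t0 = time.perf_counter()
with ThreadPoolExecutor() as pool:
    comm = pool.submit(fake_communication)   # launch the transfer...
    local = local_computation()              # ...and compute meanwhile
    remote = comm.result()                   # future resolves when done
elapsed = time.perf_counter() - t0
print(elapsed < 0.35)  # overlapped: ~0.2 s total rather than 0.4 s
```

Run sequentially, the two 0.2 s steps would cost 0.4 s; with the future in flight during the local update, total time stays near the longer of the two, which is the efficiency gain the abstract describes.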
DualSPHysics: A numerical tool to simulate real breakwaters
NASA Astrophysics Data System (ADS)
Zhang, Feng; Crespo, Alejandro; Altomare, Corrado; Domínguez, José; Marzeddu, Andrea; Shang, Shao-ping; Gómez-Gesteira, Moncho
2018-02-01
The open-source code DualSPHysics is used in this work to compute the wave run-up on an existing dike on the Chinese coast using realistic dimensions, bathymetry, and wave conditions. The GPU computing power of DualSPHysics allows simulating real engineering problems that involve complex geometries with high resolution in a reasonable computational time. The code is first validated by comparing the numerical free-surface elevation, the wave orbital velocities, and the time series of the run-up with physical data from a wave flume. Those experiments include a smooth dike and an armored dike with two layers of cubic blocks. After validation, the code is applied to a real case to obtain the wave run-up under different incident wave conditions. In order to simulate the real open sea, spurious reflections from the wavemaker are removed by using an active wave absorption technique.
Prediction of sound radiated from different practical jet engine inlets
NASA Technical Reports Server (NTRS)
Zinn, B. T.; Meyer, W. L.
1980-01-01
Existing computer codes for calculating the far field radiation patterns surrounding various practical jet engine inlet configurations under different excitation conditions were upgraded. The computer codes were refined and expanded so that they are now more efficient computationally by a factor of about three and they are now capable of producing accurate results up to nondimensional wave numbers of twenty. Computer programs were also developed to help generate accurate geometrical representations of the inlets to be investigated. This data is required as input for the computer programs which calculate the sound fields. This new geometry generating computer program considerably reduces the time required to generate the input data which was one of the most time consuming steps in the process. The results of sample runs using the NASA-Lewis QCSEE inlet are presented and comparison of run times and accuracy are made between the old and upgraded computer codes. The overall accuracy of the computations is determined by comparison of the results of the computations with simple source solutions.
NASA Astrophysics Data System (ADS)
Gerjuoy, Edward
2005-06-01
The security of messages encoded via the widely used RSA public key encryption system rests on the enormous computational effort required to find the prime factors of a large number N using classical (conventional) computers. In 1994 Peter Shor showed that for sufficiently large N, a quantum computer could perform the factoring with much less computational effort. This paper endeavors to explain, in a fashion comprehensible to the nonexpert, the RSA encryption protocol; the various quantum computer manipulations constituting the Shor algorithm; how the Shor algorithm performs the factoring; and the precise sense in which a quantum computer employing Shor's algorithm can be said to accomplish the factoring of very large numbers with less computational effort than a classical computer. It is made apparent that factoring N generally requires many successive runs of the algorithm. Our analysis reveals that the probability of achieving a successful factorization on a single run is about twice as large as commonly quoted in the literature.
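The RSA protocol the paper explains can be made concrete with a toy sketch (our illustration, not the paper's; the function names are ours, and the tiny primes are purely didactic, since real RSA uses primes hundreds of digits long, which is what makes classical factoring infeasible):

```python
# Toy RSA: security rests entirely on the difficulty of factoring n = p*q.
# An attacker who factors n can recompute phi and hence the private exponent d.

def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def make_keys(p, q, e=17):
    n = p * q
    phi = (p - 1) * (q - 1)            # Euler's totient of n; requires knowing p and q
    _, d, _ = egcd(e, phi)             # d is the modular inverse of e mod phi
    return (e, n), (d % phi, n)        # (public key, private key)

def crypt(m, key):
    exp, n = key
    return pow(m, exp, n)              # modular exponentiation

public, private = make_keys(61, 53)    # n = 3233; factoring 3233 breaks the key
cipher = crypt(42, public)
plain = crypt(cipher, private)         # round-trips back to 42
```

Shor's algorithm attacks exactly the `make_keys` secret: it finds p and q from n with far less effort than any known classical method.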
Programming the social computer.
Robertson, David; Giunchiglia, Fausto
2013-03-28
The aim of 'programming the global computer' was identified by Milner and others as one of the grand challenges of computing research. At the time this phrase was coined, it was natural to assume that this objective might be achieved primarily through extending programming and specification languages. The Internet, however, has brought with it a different style of computation that (although harnessing variants of traditional programming languages) operates in a style different to those with which we are familiar. The 'computer' on which we are running these computations is a social computer in the sense that many of the elementary functions of the computations it runs are performed by humans, and successful execution of a program often depends on properties of the human society over which the program operates. These sorts of programs are not programmed in a traditional way and may have to be understood in a way that is different from the traditional view of programming. This shift in perspective raises new challenges for the science of the Web and for computing in general.
Myers, E W; Mount, D W
1986-01-01
We describe a program which may be used to find approximate matches to a short predefined DNA sequence in a larger target DNA sequence. The program predicts the usefulness of specific DNA probes and sequencing primers and finds nearly identical sequences that might represent the same regulatory signal. The program is written in the C programming language and will run on virtually any computer system with a C compiler, such as the IBM/PC and other computers running under the MS/DOS and UNIX operating systems. The program has been integrated into an existing software package for the IBM personal computer (see article by Mount and Conrad, this volume). Some examples of its use are given. PMID:3753785
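The kind of computation such a program performs can be sketched with standard semi-global dynamic programming (our generic reconstruction of approximate matching, not the authors' C code; names and the error model are ours):

```python
def approximate_matches(probe, target, max_errors):
    """Return 1-based end positions in `target` where `probe` matches with
    at most `max_errors` edits (substitution, insertion, deletion).
    Semi-global DP: a match may start anywhere in `target`."""
    m = len(probe)
    prev = list(range(m + 1))          # DP column before any target character
    hits = []
    for j, ch in enumerate(target, start=1):
        curr = [0]                     # free start: a match may begin at position j
        for i in range(1, m + 1):
            cost = 0 if probe[i - 1] == ch else 1
            curr.append(min(prev[i - 1] + cost,  # align probe[i-1] with ch
                            prev[i] + 1,         # ch unmatched (gap in probe)
                            curr[i - 1] + 1))    # probe[i-1] unmatched (gap in target)
        if curr[m] <= max_errors:
            hits.append(j)             # probe ends here within the error budget
        prev = curr
    return hits

# Second occurrence differs from the probe by one substitution (T -> A).
hits = approximate_matches("GATTACA", "xxGATTACAxxGATAACAxx", max_errors=1)
```

This is the classic way to predict where a probe or primer would anneal imperfectly: every end position whose best alignment stays within the error budget is reported.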
AGIS: Evolution of Distributed Computing information system for ATLAS
NASA Astrophysics Data System (ADS)
Anisenkov, A.; Di Girolamo, A.; Alandes, M.; Karavakis, E.
2015-12-01
ATLAS, a particle physics experiment at the Large Hadron Collider at CERN, produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm and a high degree of decentralization of computing resources in order to meet the ATLAS requirements of petabyte-scale data operations. It has been evolved after the first period of LHC data taking (Run-1) in order to cope with the new challenges of the upcoming Run-2. In this paper we describe the evolution and recent developments of the ATLAS Grid Information System (AGIS), developed in order to integrate configuration and status information about resources, services and topology of the computing infrastructure used by the ATLAS Distributed Computing applications and services.
Jaschob, Daniel; Riffle, Michael
2012-07-30
Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. JobCenter is a client-server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or "in the cloud") and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/.
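The client-driven, pull-based pattern described above can be sketched in a few lines (a minimal stand-in of our own, with an in-memory queue in place of the real JobCenter server):

```python
# Workers poll for work, run it locally, and report results back. Because the
# worker initiates every exchange, it can sit behind a firewall or in the cloud,
# and load balancing falls out naturally: idle workers simply pull more jobs.
import queue
import threading

job_server = queue.Queue()             # stands in for the JobCenter server
results = {}
results_lock = threading.Lock()

def worker(worker_id):
    """Pull jobs until the server queue is drained."""
    while True:
        try:
            job_id, func, arg = job_server.get_nowait()   # client-driven pull
        except queue.Empty:
            return
        outcome = func(arg)            # execute the job locally
        with results_lock:
            results[job_id] = outcome  # report the result back
        job_server.task_done()

for i in range(10):
    job_server.put((i, lambda x: x * x, i))

# "Adding a worker node" is just starting another client against the same server.
threads = [threading.Thread(target=worker, args=(w,)) for w in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```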
Small Tactical Unmanned Aerial System (STUAS) Rapid Integration and Fielding Process (RAIN)
2013-09-01
121A4, September 2013. Thesis Advisor: Rama Gehris. Co-Advisor: Bonnie Young. Trade-off Analysis Results: The trade-off analysis results will be a summary of the conclusions derived from incorporating and analyzing the...
Topographic map of part of the Kasei Valles and Sacra Fossae regions of Mars - MTM 500k 20/287E OMKT
Rosiek, Mark R.; Redding, Bonnie L.; Galuszca, Donna M.
2005-01-01
This map is part of a series of topographic maps of areas of special scientific interest on Mars. The topography was compiled photogrammetrically using Viking Orbiter stereo image pairs and photoclinometry from a Viking Orbiter image. The contour interval is 250 m. Horizontal and vertical control was established using the USGS Mars Digital Image Model 2.0 (MDIM 2.0) and data from the Mars Orbiter Laser Altimeter (MOLA).
Spectrophotometry of stars 9 - 12m north polar spectrophotometric sequence (NPSS) program.
NASA Astrophysics Data System (ADS)
Sharipova, L. M.; Prokof'eva, V. V.
Spectrophotometric observations of 9-12 mag stars of the NPSS program have been made using the high-sensitivity light-detecting apparatus of the digital television complex of the 0.5-m Maksutov telescope MTM-500 and an original slitless spectrograph. Atmospheric extinction was controlled during the night by means of an energetically calibrated brightness standard. Absolute energy distributions of 12 stars, their synthetic magnitudes in the V band, and B-V color indices were obtained.
CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.
ERIC Educational Resources Information Center
Skrein, Dale
1994-01-01
CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)
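A fetch-decode-execute loop of the kind such a simulator teaches can be sketched as follows (a toy three-instruction ISA of our own devising, not CPU SIM's actual instruction set):

```python
# Minimal accumulator machine: each step fetches the instruction at the program
# counter, decodes the opcode, and executes it against the accumulator.
def run(program):
    acc, pc = 0, 0                     # accumulator and program counter
    while True:
        op, operand = program[pc]      # fetch
        pc += 1
        if op == "LOAD":               # decode + execute
            acc = operand
        elif op == "ADD":
            acc += operand
        elif op == "HALT":
            return acc
        else:
            raise ValueError(f"unknown opcode {op!r}")

# 2 + 3 + 5, then halt.
program = [("LOAD", 2), ("ADD", 3), ("ADD", 5), ("HALT", 0)]
```

Hands-on exercises in such a course typically extend exactly this loop with branches, memory, and flags.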
Flame-Vortex Studies to Quantify Markstein Numbers Needed to Model Flame Extinction Limits
NASA Technical Reports Server (NTRS)
Driscoll, James F.; Feikema, Douglas A.
2003-01-01
This work has quantified a database of Markstein numbers for unsteady flames; future work will quantify a database of flame extinction limits for unsteady conditions. Unsteady extinction limits have not been documented previously; both a stretch rate and a residence time must be measured, since extinction requires that the stretch rate be sufficiently large for a sufficiently long residence time. Ma was measured for an inwardly-propagating flame (IPF) that is negatively stretched under microgravity conditions. Computations were also performed using RUN-1DL to explain the measurements. The Markstein number of an inwardly-propagating flame, for both the microgravity experiment and the computations, is significantly larger than that of an outwardly-propagating flame (OPF). The computed profiles of the various species within the flame suggest reasons: computed hydrogen concentrations build up ahead of the IPF but not the OPF. Understanding was gained by running the computations for both simplified and full-chemistry conditions. Numerical Simulations: To explain the experimental findings, numerical simulations of both inwardly and outwardly propagating spherical flames (with complex chemistry) were generated using the RUN-1DL code, which includes 16 species and 46 reactions.
Design and performance of the virtualization platform for offline computing on the ATLAS TDAQ Farm
NASA Astrophysics Data System (ADS)
Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Twomey, M. S.; Zaytsev, A.
2014-06-01
With the LHC collider at CERN currently going through the period of Long Shutdown 1 there is an opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is suitable for running Monte Carlo (MC) production jobs that are mostly CPU and not I/O bound. This contribution gives a thorough review of the design and deployment of a virtualized platform running on this computing resource and of its use to run large groups of CernVM based virtual machines operating as a single CERN-P1 WLCG site. This platform has been designed to guarantee the security and the usability of the ATLAS private network, and to minimize interference with TDAQ's usage of the farm. Openstack has been chosen to provide a cloud management layer. The experience gained in the last 3.5 months shows that the use of the TDAQ farm for the MC simulation contributes to the ATLAS data processing at the level of a large Tier-1 WLCG site, despite the opportunistic nature of the underlying computing resources being used.
Scalable load balancing for massively parallel distributed Monte Carlo particle transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Brien, M. J.; Brantley, P. S.; Joy, K. I.
2013-07-01
In order to run computer simulations efficiently on massively parallel computers with hundreds of thousands or millions of processors, care must be taken that the calculation is load balanced across the processors. Examining the workload of every processor leads to an unscalable algorithm, with run time at least as large as O(N), where N is the number of processors. We present a scalable load balancing algorithm, with run time O(log(N)), that involves iterated processor-pair-wise balancing steps, ultimately leading to a globally balanced workload. We demonstrate scalability of the algorithm up to 2 million processors on the Sequoia supercomputer at Lawrence Livermore National Laboratory. (authors)
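The iterated processor-pair-wise balancing idea can be sketched as follows (our reconstruction of the general hypercube-pairing scheme, not the authors' exact algorithm; in round k each rank averages its load with the partner at rank XOR 2^k, so after log2(N) rounds every rank holds the global mean):

```python
# Pairwise averaging over hypercube dimensions: O(log N) rounds, and no rank
# ever needs to examine the workload of more than one partner per round.
def balance(loads):
    n = len(loads)                     # assume n is a power of two
    rounds = 0
    k = 1
    while k < n:
        for rank in range(n):
            partner = rank ^ k         # pair ranks that differ in bit k
            if partner > rank:         # each pair balances once per round
                avg = (loads[rank] + loads[partner]) / 2.0
                loads[rank] = loads[partner] = avg
        k *= 2
        rounds += 1
    return loads, rounds

loads, rounds = balance([80.0, 0.0, 40.0, 8.0, 16.0, 0.0, 0.0, 16.0])
```

With 8 ranks, three rounds suffice and every rank ends at the global mean (20.0 here), which is the O(log(N)) behavior the abstract describes.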
Performance of a supercharged direct-injection stratified-charge rotary combustion engine
NASA Technical Reports Server (NTRS)
Bartrand, Timothy A.; Willis, Edward A.
1990-01-01
A zero-dimensional thermodynamic performance computer model for direct-injection stratified-charge rotary combustion engines was modified and run for a single rotor supercharged engine. Operating conditions for the computer runs were a single boost pressure and a matrix of speeds, loads and engine materials. A representative engine map is presented showing the predicted range of efficient operation. After discussion of the engine map, a number of engine features are analyzed individually. These features are: heat transfer and the influence insulating materials have on engine performance and exhaust energy; intake manifold pressure oscillations and interactions with the combustion chamber; and performance losses and seal friction. Finally, code running times and convergence data are presented.
Decrease in Ground-Run Distance of Small Airplanes by Applying Electrically-Driven Wheels
NASA Astrophysics Data System (ADS)
Kobayashi, Hiroshi; Nishizawa, Akira
A new takeoff method for small airplanes was proposed. The ground-roll performance of an airplane driven by electrically-powered wheels was studied experimentally and computationally. The experiments verified that the ground-run distance was halved by combining the powered wheels with the propeller, without an increase in energy consumption during the ground roll. The computational analysis showed that the ground-run distance of the wheel-driven aircraft was independent of the motor power once the motor capability exceeded the friction between the tires and the ground. Furthermore, the distance was minimized when the angle of attack was set so that the wing generated negative lift.
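The traction-limit argument can be illustrated with a back-of-the-envelope ground-roll integration (our simplified model with invented parameter values, not the authors' analysis): wheel thrust is capped by tire-ground friction times the normal force, so raising motor power beyond that cap cannot shorten the run.

```python
# Integrate m*dv/dt = T_prop + T_wheel - D - rolling resistance until liftoff
# speed. All numbers below are illustrative assumptions.
def ground_run(wheel_force, v_lift=25.0, mass=500.0,
               mu_roll=0.02, mu_grip=0.6,
               rho=1.225, S=12.0, CL=0.2, CD=0.05,
               T_prop=800.0, dt=0.01):
    g = 9.81
    v = x = 0.0
    while v < v_lift:
        q = 0.5 * rho * v * v                       # dynamic pressure
        lift, drag = q * S * CL, q * S * CD
        normal = max(mass * g - lift, 0.0)          # weight on the wheels
        drive = min(wheel_force, mu_grip * normal)  # traction-limited wheel thrust
        a = (T_prop + drive - drag - mu_roll * normal) / mass
        v += a * dt
        x += v * dt
    return x

baseline = ground_run(wheel_force=0.0)     # propeller only
boosted = ground_run(wheel_force=800.0)    # propeller + driven wheels
```

In this toy model the driven wheels shorten the run, and any wheel force above the friction cap gives an identical distance, mirroring the paper's qualitative finding.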
NASADIG - NASA DEVICE INDEPENDENT GRAPHICS LIBRARY (AMDAHL VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. E.
1994-01-01
The NASA Device Independent Graphics Library, NASADIG, can be used with many computer-based engineering and management applications. The library gives the user the opportunity to translate data into effective graphic displays for presentation. The software offers many features which allow the user flexibility in creating graphics. These include two-dimensional plots, subplot projections in 3D-space, surface contour line plots, and surface contour color-shaded plots. Routines for three-dimensional plotting, wireframe surface plots, surface plots with hidden line removal, and surface contour line plots are provided. Other features include polar and spherical coordinate plotting, world map plotting utilizing either cylindrical equidistant or Lambert equal area projection, plot translation, plot rotation, plot blowup, splines and polynomial interpolation, area blanking control, multiple log/linear axes, legends and text control, curve thickness control, and multiple text fonts (18 regular, 4 bold). NASADIG contains several groups of subroutines. Included are subroutines for plot area and axis definition; text set-up and display; area blanking; line style set-up, interpolation, and plotting; color shading and pattern control; legend, text block, and character control; device initialization; mixed alphabets setting; and other useful functions. The usefulness of many routines is dependent on the prior definition of basic parameters. The program's control structure uses a serial-level construct with each routine restricted for activation at some prescribed level(s) of problem definition. NASADIG provides the following output device drivers: Selanar 100XL, VECTOR Move/Draw ASCII and PostScript files, Tektronix 40xx, 41xx, and 4510 Rasterizer, DEC VT-240 (4014 mode), IBM AT/PC compatible with SmartTerm 240 emulator, HP Lasergrafix Film Recorder, QMS 800/1200, DEC LN03+ Laserprinters, and HP LaserJet (Series III). 
NASADIG is written in FORTRAN and is available for several platforms. NASADIG 5.7 is available for DEC VAX series computers running VMS 5.0 or later (MSC-21801), Cray X-MP and Y-MP series computers running UNICOS (COS-10049), and Amdahl 5990 mainframe computers running UTS (COS-10050). NASADIG 5.1 is available for UNIX-based operating systems (MSC-22001). The UNIX version has been successfully implemented on Sun4 series computers running SunOS, SGI IRIS computers running IRIX, Hewlett Packard 9000 computers running HP-UX, and Convex computers running Convex OS (MSC-22001). The standard distribution medium for MSC-21801 is a set of two 6250 BPI 9-track magnetic tapes in DEC VAX BACKUP format. It is also available on a set of two TK50 tape cartridges in DEC VAX BACKUP format. The standard distribution medium for COS-10049 and COS-10050 is a 6250 BPI 9-track magnetic tape in UNIX tar format. Other distribution media and formats may be available upon request. The standard distribution medium for MSC-22001 is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. Alternate distribution media and formats are available upon request. With minor modification, the UNIX source code can be ported to other platforms including IBM PC/AT series computers and compatibles. NASADIG is also available bundled with TRASYS, the Thermal Radiation Analysis System (COS-10026, DEC VAX version; COS-10040, CRAY version).
NASADIG - NASA DEVICE INDEPENDENT GRAPHICS LIBRARY (UNIX VERSION)
NASA Technical Reports Server (NTRS)
Rogers, J. E.
1994-01-01
The NASA Device Independent Graphics Library, NASADIG, can be used with many computer-based engineering and management applications. The library gives the user the opportunity to translate data into effective graphic displays for presentation. The software offers many features which allow the user flexibility in creating graphics. These include two-dimensional plots, subplot projections in 3D-space, surface contour line plots, and surface contour color-shaded plots. Routines for three-dimensional plotting, wireframe surface plots, surface plots with hidden line removal, and surface contour line plots are provided. Other features include polar and spherical coordinate plotting, world map plotting utilizing either cylindrical equidistant or Lambert equal area projection, plot translation, plot rotation, plot blowup, splines and polynomial interpolation, area blanking control, multiple log/linear axes, legends and text control, curve thickness control, and multiple text fonts (18 regular, 4 bold). NASADIG contains several groups of subroutines. Included are subroutines for plot area and axis definition; text set-up and display; area blanking; line style set-up, interpolation, and plotting; color shading and pattern control; legend, text block, and character control; device initialization; mixed alphabets setting; and other useful functions. The usefulness of many routines is dependent on the prior definition of basic parameters. The program's control structure uses a serial-level construct with each routine restricted for activation at some prescribed level(s) of problem definition. NASADIG provides the following output device drivers: Selanar 100XL, VECTOR Move/Draw ASCII and PostScript files, Tektronix 40xx, 41xx, and 4510 Rasterizer, DEC VT-240 (4014 mode), IBM AT/PC compatible with SmartTerm 240 emulator, HP Lasergrafix Film Recorder, QMS 800/1200, DEC LN03+ Laserprinters, and HP LaserJet (Series III). 
NASADIG is written in FORTRAN and is available for several platforms. NASADIG 5.7 is available for DEC VAX series computers running VMS 5.0 or later (MSC-21801), Cray X-MP and Y-MP series computers running UNICOS (COS-10049), and Amdahl 5990 mainframe computers running UTS (COS-10050). NASADIG 5.1 is available for UNIX-based operating systems (MSC-22001). The UNIX version has been successfully implemented on Sun4 series computers running SunOS, SGI IRIS computers running IRIX, Hewlett Packard 9000 computers running HP-UX, and Convex computers running Convex OS (MSC-22001). The standard distribution medium for MSC-21801 is a set of two 6250 BPI 9-track magnetic tapes in DEC VAX BACKUP format. It is also available on a set of two TK50 tape cartridges in DEC VAX BACKUP format. The standard distribution medium for COS-10049 and COS-10050 is a 6250 BPI 9-track magnetic tape in UNIX tar format. Other distribution media and formats may be available upon request. The standard distribution medium for MSC-22001 is a .25 inch streaming magnetic tape cartridge (Sun QIC-24) in UNIX tar format. Alternate distribution media and formats are available upon request. With minor modification, the UNIX source code can be ported to other platforms including IBM PC/AT series computers and compatibles. NASADIG is also available bundled with TRASYS, the Thermal Radiation Analysis System (COS-10026, DEC VAX version; COS-10040, CRAY version).
Multiple running speed signals in medial entorhinal cortex
Hinman, James R.; Brandon, Mark P.; Climer, Jason R.; Chapman, G. William; Hasselmo, Michael E.
2016-01-01
Grid cells in medial entorhinal cortex (MEC) can be modeled using oscillatory interference or attractor dynamic mechanisms that perform path integration, a computation requiring information about running direction and speed. The two classes of computational models often use either an oscillatory frequency or a firing rate that increases as a function of running speed. Yet it is currently not known whether these are two manifestations of the same speed signal or dissociable signals with potentially different anatomical substrates. We examined coding of running speed in MEC and identified these two speed signals to be independent of each other within individual neurons. The medial septum (MS) is strongly linked to locomotor behavior and removal of MS input resulted in strengthening of the firing rate speed signal, while decreasing the strength of the oscillatory speed signal. Thus two speed signals are present in MEC that are differentially affected by disrupted MS input. PMID:27427460
Running SINDA '85/FLUINT interactive on the VAX
NASA Technical Reports Server (NTRS)
Simmonds, Boris
1992-01-01
Computer software engineering tools are typically run in three modes: batch, demand, and interactive. The first two are the most popular in the SINDA world. The third is not so popular, probably due to users' lack of access to the command procedure files for running SINDA '85, or lack of familiarity with the SINDA '85 execution process (pre-processor, processor, compilation, linking, execution, and all of the file assignments, creations, deletions, and de-assignments). Interactive is the mode that makes thermal analysis with SINDA '85 a real-time design tool. This paper explains a command procedure sufficient (the minimum modifications required in an existing demand command procedure) to run SINDA '85 on the VAX in an interactive mode. To exercise the procedure, a sample problem is presented exemplifying the mode, plus additional programming capabilities available in SINDA '85. Following the same guidelines, the process can be extended to other SINDA '85 resident computer platforms.
Multi-GPGPU Tsunami simulation at Toyama-bay
NASA Astrophysics Data System (ADS)
Furuyama, Shoichi; Ueda, Yuki
2017-07-01
Accelerated multi-General-Purpose Graphics Processing Unit (GPGPU) calculation of Tsunami run-up was achieved over a wide area (the whole of Toyama-bay in Japan) through faster computation techniques. Toyama-bay has active faults in the sea bed, so there is a high possibility of earthquakes and, in the case of a huge earthquake, Tsunami waves; predicting the area of Tsunami run-up is therefore important for reducing the damage the disaster causes to residents. However, the simulation is a very hard task because of limited computer resources. High resolution on the order of several meters is required for the run-up simulation, because artificial structures on the ground, such as roads, buildings, and houses, are very small; at the same time, a huge area must be simulated. In the Toyama-bay case the area is 42 [km] × 15 [km]. When 5 [m] × 5 [m] computational cells are used, over 26,000,000 cells are generated, and a normal desktop CPU computer took about 10 hours for the calculation. Reducing this calculation time is an important problem for an immediate Tsunami run-up prediction system, which would in turn help protect residents of the coastal region. This study reduced the calculation time by using a multi-GPGPU system equipped with six NVIDIA TESLA K20X cards, with InfiniBand network connections between compute nodes via the MVAPICH library. As a result, the calculation ran 5.16 times faster on six GPUs than in the one-GPU case, an 86% parallel efficiency relative to linear speed-up.
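The reported figures are consistent with the usual definition of parallel efficiency, speedup divided by device count, which a two-line helper makes explicit:

```python
# Parallel efficiency: how close a parallel run comes to ideal linear speedup.
def parallel_efficiency(t_serial, t_parallel, n_devices):
    speedup = t_serial / t_parallel
    return speedup, speedup / n_devices

# Normalizing the one-GPU time to 1.0, a 5.16x speedup on six GPUs:
speedup, eff = parallel_efficiency(t_serial=5.16, t_parallel=1.0, n_devices=6)
```

5.16 / 6 = 0.86, i.e. the 86% efficiency the abstract quotes.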
An analysis of running skyline load path.
Ward W. Carson; Charles N. Mann
1971-01-01
This paper is intended for those who wish to prepare an algorithm to determine the load path of a running skyline. The mathematics of a simplified approach to this running skyline design problem are presented. The approach employs assumptions which reduce the complexity of the problem to the point where it can be solved on desk-top computers of limited capacities. The...
Job Priorities on Peregrine | High-Performance Computing | NREL
... allocation when run with qos=high. Requesting a Node Reservation: If you are doing work that requires real... scheduler more efficiently plan resources for larger jobs. When projects reach their allocation limit, jobs associated with those projects will run at very low priority, which will ensure that these jobs run only when...
Running High-Throughput Jobs on Peregrine | High-Performance Computing |
... unique name (using "name=") and use the task name to create a unique output file name. For runs on... and how many tasks to give to each worker at a time using the NITRO_COORD_OPTIONS environment variable... Finally, you start Nitro by executing launch_nitro.sh. Sample Nitro job script: To run a job using the...
AlgoRun: a Docker-based packaging system for platform-agnostic implemented algorithms.
Hosny, Abdelrahman; Vera-Licona, Paola; Laubenbacher, Reinhard; Favre, Thibauld
2016-08-01
There is a growing need in bioinformatics for easy-to-use software implementations of algorithms that are usable across platforms. At the same time, reproducibility of computational results is critical and often a challenge due to source code changes over time and dependencies. The approach introduced in this paper addresses both of these needs with AlgoRun, a dedicated packaging system for implemented algorithms, using Docker technology. Implemented algorithms, packaged with AlgoRun, can be executed through a user-friendly interface directly from a web browser or via a standardized RESTful web API to allow easy integration into more complex workflows. The packaged algorithm includes the entire software execution environment, thereby eliminating the common problem of software dependencies and the irreproducibility of computations over time. AlgoRun-packaged algorithms can be published on http://algorun.org, a centralized searchable directory to find existing AlgoRun-packaged algorithms. AlgoRun is available at http://algorun.org, and the source code is available under the GPL license at https://github.com/algorun. Contact: laubenbacher@uchc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenough, Jeffrey A.; de Supinski, Bronis R.; Yates, Robert K.
2005-04-25
We describe the performance of the block-structured Adaptive Mesh Refinement (AMR) code Raptor on the 32k-node IBM BlueGene/L computer. This machine represents a significant step forward towards petascale computing. As such, it presents Raptor with many challenges for utilizing the hardware efficiently. In terms of performance, Raptor shows excellent weak and strong scaling when running in single-level mode (no adaptivity). Hardware performance monitors show Raptor achieves an aggregate performance of 3.0 Tflops in the main integration kernel on the 32k system. Results from preliminary AMR runs on a prototype astrophysical problem demonstrate the efficiency of the current software when running at large scale. The BG/L system is enabling a physics problem to be considered that represents a factor of 64 increase in overall size compared to the largest ones of this type computed to date. Finally, we provide a description of the development work currently underway to address our inefficiencies.
ASDIR-II. Volume I. User Manual
1975-12-01
... normally the most significant part of the overall aircraft IR signature. The radiance is directly dependent upon the geometric view factors, a set... factors as punched card output in a view factor computer run. For the view factor computer run, IB49 through 53 and all IDS inputs, from IDS-2 to IDS-6... may be excluded from the input string if the program execution is requested to stop after punching the view factors. Inputs required for punching...
Feasibility of Virtual Machine and Cloud Computing Technologies for High Performance Computing
2014-05-01
... Red Hat Enterprise Linux; SaaS, software as a service; VM, virtual machine; vNUMA, virtual non-uniform memory access; WRF, weather research and forecasting... previously mentioned in Chapter I, Section B1 of this paper, which is used to run the weather research and forecasting (WRF) model in their experiments... against a VMware virtualization solution of WRF. The experiment consisted of running WRF in a standard configuration between the D-VTM and VMware while...
The Air Force Geophysics Laboratory Standalone Data Acquisition System: A Functional Description.
1980-10-09
... the board are a buffer for the RUN/HALT front panel switch and a retriggerable one-shot multivibrator. This latter circuit senses the SRUN pulse train... recording on the data tapes, and providing the master timing source for data acquisition. An Electronic Research Company (ERC) model 2446 digital... the computer is fed to a retriggerable one-shot multivibrator on the board. (SRUN consists of a pulse train that is present when the computer is running...
Improved Algorithms Speed It Up for Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hazi, A
2005-09-20
Huge computers, huge codes, complex problems to solve. The longer it takes to run a code, the more it costs. One way to speed things up and save time and money is through hardware improvements--faster processors, different system designs, bigger computers. But another side of supercomputing can reap savings in time and speed: software improvements to make codes--particularly the mathematical algorithms that form them--run faster and more efficiently. Speed up math? Is that really possible? According to Livermore physicist Eugene Brooks, the answer is a resounding yes. ''Sure, you get great speed-ups by improving hardware,'' says Brooks, the deputy leader for Computational Physics in N Division, which is part of Livermore's Physics and Advanced Technologies (PAT) Directorate. ''But the real bonus comes on the software side, where improvements in software can lead to orders of magnitude improvement in run times.'' Brooks knows whereof he speaks. Working with Laboratory physicist Abraham Szoeke and others, he has been instrumental in devising ways to shrink the running time of what has, historically, been a tough computational nut to crack: radiation transport codes based on the statistical or Monte Carlo method of calculation. And Brooks is not the only one. Others around the Laboratory, including physicists Andrew Williamson, Randolph Hood, and Jeff Grossman, have come up with innovative ways to speed up Monte Carlo calculations using pure mathematics.
Hari, Pradip; Ko, Kevin; Koukoumidis, Emmanouil; Kremer, Ulrich; Martonosi, Margaret; Ottoni, Desiree; Peh, Li-Shiuan; Zhang, Pei
2008-10-28
Increasingly, spatial awareness plays a central role in many distributed and mobile computing applications. Spatially aware applications rely on information about the geographical position of compute devices and their supported services in order to support novel functionality. While many spatial application drivers already exist in mobile and distributed computing, very little systems research has explored how best to program these applications, to express their spatial and temporal constraints, and to allow efficient implementations on highly dynamic real-world platforms. This paper proposes the SARANA system architecture, which includes language and run-time system support for spatially aware and resource-aware applications. SARANA allows users to express spatial regions of interest, as well as trade-offs between quality of result (QoR), latency and cost. The goal is to produce applications that use resources efficiently and that can be run on diverse resource-constrained platforms ranging from laptops to personal digital assistants and to smart phones. SARANA's run-time system manages QoR and cost trade-offs dynamically by tracking resource availability and locations, brokering usage/pricing agreements and migrating programs to nodes accordingly. A resource cost model permeates the SARANA system layers, permitting users to express their resource needs and QoR expectations in units that make sense to them. Although we are still early in the system development, initial versions have been demonstrated on a nine-node system prototype.
Lee, Jae H.; Yao, Yushu; Shrestha, Uttam; Gullberg, Grant T.; Seo, Youngho
2014-01-01
The primary goal of this project is to implement the iterative statistical image reconstruction algorithm, in this case maximum likelihood expectation maximization (MLEM) used for dynamic cardiac single photon emission computed tomography, on Spark/GraphX. This involves porting the algorithm to run on large-scale parallel computing systems. Spark is an easy-to-program software platform that can handle large amounts of data in parallel. GraphX is a graph analytic system running on top of Spark to handle graph and sparse linear algebra operations in parallel. The main advantage of implementing the MLEM algorithm in Spark/GraphX is that it allows users to parallelize such computation without any expertise in parallel computing or prior knowledge in computer science. In this paper we demonstrate a successful implementation of MLEM in Spark/GraphX and present the performance gains with the goal to eventually make it usable in a clinical setting. PMID:27081299
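The MLEM iteration referenced above has a compact multiplicative form. The following is a minimal NumPy sketch of that update rule on a toy system, not the authors' Spark/GraphX implementation; the system matrix, data, and iteration count are illustrative assumptions:

```python
import numpy as np

def mlem(A, y, n_iter=500):
    """Maximum-likelihood expectation-maximization (MLEM) reconstruction.

    A : (m, n) system matrix mapping image voxels to projection bins
    y : (m,) measured projection data
    Returns the (n,) reconstructed image after n_iter multiplicative updates.
    """
    x = np.ones(A.shape[1])            # uniform initial image
    sens = A.T @ np.ones(A.shape[0])   # sensitivity image (column sums of A)
    for _ in range(n_iter):
        proj = A @ x                                   # forward projection
        ratio = y / np.maximum(proj, 1e-12)            # measured / estimated
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)   # multiplicative update
    return x

# Toy example: 3 voxels, 4 projection bins, noiseless data for illustration.
A = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 0.5],
              [0.5, 0.0, 1.0],
              [0.3, 0.3, 0.3]])
x_true = np.array([2.0, 1.0, 3.0])
y = A @ x_true
x_rec = mlem(A, y)
```

The multiplicative form preserves positivity of the image at every iteration; in practice the iteration is stopped early to limit noise amplification, and the forward/back projections are the operations parallelized on platforms such as Spark/GraphX.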
The engineering design integration (EDIN) system. [digital computer program complex
NASA Technical Reports Server (NTRS)
Glatt, C. R.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Reiners, S. J.
1974-01-01
A digital computer program complex for the evaluation of aerospace vehicle preliminary designs is described. The system consists of a Univac 1100 series computer and peripherals using the Exec 8 operating system, a set of demand access terminals of the alphanumeric and graphics types, and a library of independent computer programs. Modification of the partial run streams, data base maintenance and construction, and control of program sequencing are provided by a data manipulation program called the DLG processor. The executive control of library program execution is performed by the Univac Exec 8 operating system through a user established run stream. A combination of demand and batch operations is employed in the evaluation of preliminary designs. Applications accomplished with the EDIN system are described.
NASA Astrophysics Data System (ADS)
Gupta, V.; Gupta, N.; Gupta, S.; Field, E.; Maechling, P.
2003-12-01
Modern laptop and personal computers can provide capabilities that are, in many ways, comparable to workstations or departmental servers. However, this doesn't mean we should run all computations on our local computers. We have identified several situations in which it is preferable to implement our seismological application programs in a distributed, server-based, computing model. In this model, application programs on the user's laptop, or local computer, invoke programs that run on an organizational server, and the results are returned to the invoking system. Situations in which a server-based architecture may be preferred include: (a) a program is written in a language, or written for an operating environment, that is unsupported on the local computer, (b) software libraries or utilities required to execute a program are not available on the user's computer, (c) a computational program is physically too large, or computationally too expensive, to run on a user's computer, (d) a user community wants to enforce a consistent method of performing a computation by standardizing on a single implementation of a program, and (e) the computational program may require current information that is not available to all client computers. Until recently, distributed, server-based, computational capabilities were implemented using client/server architectures. In these architectures, client programs were often written in the same language, and they executed in the same computing environment, as the servers. Recently, a new distributed computational model, called Web Services, has been developed. Web Services are based on Internet standards such as XML, SOAP, WSDL, and UDDI. Web Services offer the promise of platform- and language-independent distributed computing. To investigate this new computational model, and to provide useful services to the SCEC Community, we have implemented several computational and utility programs using a Web Service architecture.
We have hosted these Web Services as a part of the SCEC Community Modeling Environment (SCEC/CME) ITR Project (http://www.scec.org/cme). We have implemented Web Services for several of the reasons cited previously. For example, we implemented a FORTRAN-based Earthquake Rupture Forecast (ERF) as a Web Service for use by client computers that don't support a FORTRAN runtime environment. We implemented a Generic Mapping Tool (GMT) Web Service for use by systems that don't have local access to GMT. We implemented a Hazard Map Calculator Web Service to execute hazard calculations that are too computationally intensive to run on a local system. We implemented a Coordinate Conversion Web Service to enforce a standard and consistent method for converting between UTM and Lat/Lon. Our experience developing these services indicates both strengths and weaknesses in current Web Service technology. Client programs that utilize Web Services typically need network access, a significant disadvantage at times. Programs with simple input and output parameters were the easiest to implement as Web Services, while programs with complex parameter types required a significant amount of additional development. We also noted that Web Services are very data-oriented, and adapting object-oriented software into the Web Service model proved problematic. Also, the Web Service approach of converting data types into XML format for network transmission has significant inefficiencies for some data sets.
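The XML-inefficiency observation above is easy to quantify. This sketch (ours, not part of the SCEC services) compares the size of a binary double-precision array against a naive element-per-value XML encoding, similar in spirit to what SOAP serialization of a numeric array produces, using only the Python standard library:

```python
import struct
import xml.etree.ElementTree as ET

# A small array of double-precision values, as a proxy for seismic data.
values = [0.001 * i for i in range(1000)]

# Binary encoding: 8 bytes per IEEE-754 double.
binary = struct.pack(f"{len(values)}d", *values)

# Naive XML encoding of the same data, one element per value.
root = ET.Element("values")
for v in values:
    ET.SubElement(root, "v").text = repr(v)
xml_bytes = ET.tostring(root)

inflation = len(xml_bytes) / len(binary)
print(f"binary: {len(binary)} bytes, XML: {len(xml_bytes)} bytes, "
      f"inflation: {inflation:.1f}x")
```

The several-fold inflation (tags plus decimal text per value) is the overhead the abstract alludes to; for large waveform or hazard-map data sets it translates directly into network transmission cost.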
Using Avizo Software on the Peregrine System | High-Performance Computing |
Avizo can be run remotely from the Peregrine visualization node. First, launch a TurboVNC remote desktop. Then, from a terminal in that remote desktop, run: % module load avizo % vglrun avizo. Running Locally: Avizo can
A Computing Infrastructure for Supporting Climate Studies
NASA Astrophysics Data System (ADS)
Yang, C.; Bambacus, M.; Freeman, S. M.; Huang, Q.; Li, J.; Sun, M.; Xu, C.; Wojcik, G. S.; Cahalan, R. F.; NASA Climate @ Home Project Team
2011-12-01
Climate change is one of the major challenges facing the planet in the 21st century. Scientists build many models to simulate the past and predict climate change over the next decades or century. Most of the models run at low resolution, with some targeting high resolution in support of practical climate-change preparedness. To calibrate and validate the models, millions of model runs are needed to find the best simulation and configuration. This paper introduces the NASA effort on the Climate@Home project to build a virtual supercomputer based on advanced computing technologies, such as cloud computing, grid computing, and others. The Climate@Home computing infrastructure includes several aspects: 1) a cloud computing platform is utilized to manage potential spikes in access to the centralized components, such as the grid computing server for dispatching model runs and collecting results; 2) a grid computing engine is developed based on MapReduce to dispatch models and model configurations and to collect simulation results and contribution statistics; 3) a portal serves as the entry point for the project, providing management, sharing, and data exploration for end users; 4) scientists can access customized tools to configure model runs and visualize model results; 5) the public can follow the project on Twitter and Facebook for the latest news. This paper will introduce the latest progress of the project and demonstrate the operational system during the AGU fall meeting. It will also discuss how this technology can become a trailblazer for other climate studies and relevant sciences, and share how the challenges in computation and software integration were solved.
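The dispatch-and-collect pattern in aspect (2) can be illustrated with a toy MapReduce; the model runner and statistics below are invented stand-ins for the real climate model and contribution metrics, not Climate@Home code:

```python
from functools import reduce

# Hypothetical stand-in for one climate-model run: maps a configuration
# to a result record (here, a fake "global mean temperature anomaly").
def run_model(config):
    anomaly = 0.1 * config["sensitivity"] * config["co2_scale"]
    return {"config_id": config["id"], "anomaly": anomaly, "runs": 1}

# Reduce step: collect simulation results and contribution statistics.
def combine(acc, result):
    return {
        "runs": acc["runs"] + result["runs"],
        "sum_anomaly": acc["sum_anomaly"] + result["anomaly"],
    }

configs = [{"id": i, "sensitivity": s, "co2_scale": 2.0}
           for i, s in enumerate([2.0, 3.0, 4.5])]

results = map(run_model, configs)   # dispatch phase (map)
summary = reduce(combine, results, {"runs": 0, "sum_anomaly": 0.0})
summary["mean_anomaly"] = summary["sum_anomaly"] / summary["runs"]
```

In the real infrastructure the map phase fans configurations out to volunteer machines and the reduce phase aggregates returned results and per-contributor statistics on the grid server.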
Zhang, Yong; Otani, Akihito; Maginn, Edward J
2015-08-11
Equilibrium molecular dynamics is often used in conjunction with a Green-Kubo integral of the pressure tensor autocorrelation function to compute the shear viscosity of fluids. This approach is computationally expensive and is subject to a large amount of variability because the plateau region of the Green-Kubo integral is difficult to identify unambiguously. Here, we propose a time decomposition approach for computing the shear viscosity using the Green-Kubo formalism. Instead of one long trajectory, multiple independent trajectories are run and the Green-Kubo relation is applied to each trajectory. The averaged running integral as a function of time is fit to a double-exponential function with a weighting function derived from the standard deviation of the running integrals. Such a weighting function minimizes the uncertainty of the estimated shear viscosity and provides an objective means of estimating the viscosity. While the formal Green-Kubo integral requires an integration to infinite time, we suggest an integration cutoff time tcut, which can be determined by the relative values of the running integral and the corresponding standard deviation. This approach for computing the shear viscosity can be easily automated and used in computational screening studies where human judgment and intervention in the data analysis are impractical. The method has been applied to the calculation of the shear viscosity of a relatively low-viscosity liquid, ethanol, and relatively high-viscosity ionic liquid, 1-n-butyl-3-methylimidazolium bis(trifluoromethane-sulfonyl)imide ([BMIM][Tf2N]), over a range of temperatures. These test cases show that the method is robust and yields reproducible and reliable shear viscosity values.
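The time-decomposition procedure above (average the per-trajectory running Green-Kubo integrals, then fit a double exponential weighted by the cross-trajectory standard deviation) can be sketched as follows. This is our illustration on synthetic running integrals with a known plateau, not the authors' code, and the functional form of the weight is an assumption:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
t = np.linspace(0.01, 10.0, 200)

# Synthetic running Green-Kubo integrals from several independent
# trajectories, each plateauing near the "true" viscosity of 1.0, with
# noise that grows with time as in real running integrals.
true_eta = 1.0
trajs = [true_eta * (1 - np.exp(-t))
         + 0.05 * np.sqrt(t) * rng.standard_normal(t.size)
         for _ in range(10)]
mean_integral = np.mean(trajs, axis=0)
std_integral = np.std(trajs, axis=0, ddof=1)

# Double-exponential model for the averaged running integral; the
# prefactor A is the estimated shear viscosity (the t -> infinity limit).
def double_exp(time, A, alpha, tau1, tau2):
    return A * (alpha * (1 - np.exp(-time / tau1))
                + (1 - alpha) * (1 - np.exp(-time / tau2)))

# Weight points by the standard deviation across trajectories so that
# noisy long-time data contribute less to the fit.
popt, _ = curve_fit(double_exp, t, mean_integral,
                    p0=[1.0, 0.5, 0.5, 2.0],
                    sigma=np.maximum(std_integral, 1e-6),
                    maxfev=10000)
eta_estimate = popt[0]
```

Because the weighting de-emphasizes the noisy tail, the fit extracts the plateau value objectively, which is what makes the procedure automatable for screening studies.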
Shared address collectives using counter mechanisms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blocksome, Michael; Dozsa, Gabor; Gooding, Thomas M
A shared address space on a compute node stores data received from a network and data to transmit to the network. The shared address space includes an application buffer that can be directly operated upon by a plurality of processes, for instance, running on different cores on the compute node. A shared counter is used for one or more of signaling arrival of the data across the plurality of processes running on the compute node, signaling completion of an operation performed by one or more of the plurality of processes, obtaining reservation slots by one or more of the plurality of processes, or combinations thereof.
Crew appliance computer program manual, volume 1
NASA Technical Reports Server (NTRS)
Russell, D. J.
1975-01-01
Trade studies of numerous appliance concepts for advanced spacecraft galley, personal hygiene, housekeeping, and other areas were made to determine which best satisfy the space shuttle orbiter and modular space station mission requirements. Analytical models of selected appliance concepts not currently included in the G-189A Generalized Environmental/Thermal Control and Life Support Systems (ETCLSS) Computer Program subroutine library were developed. The new appliance subroutines are given along with complete analytical model descriptions, solution methods, user's input instructions, and validation run results. The appliance components modeled were integrated with G-189A ETCLSS models for shuttle orbiter and modular space station, and results from computer runs of these systems are presented.
NASA Astrophysics Data System (ADS)
Steiger, Damian S.; Haener, Thomas; Troyer, Matthias
Quantum computers promise to transform our notions of computation by offering a completely new paradigm. A high level quantum programming language and optimizing compilers are essential components to achieve scalable quantum computation. In order to address this, we introduce the ProjectQ software framework - an open source effort to support both theorists and experimentalists by providing intuitive tools to implement and run quantum algorithms. Here, we present our ProjectQ quantum compiler, which compiles a quantum algorithm from our high-level Python-embedded language down to low-level quantum gates available on the target system. We demonstrate how this compiler can be used to control actual hardware and to run high-performance simulations.
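The compile-then-simulate pipeline described above can be illustrated in miniature with plain NumPy; this toy sketch is ours and does not use ProjectQ's actual API. It "compiles" a two-qubit Bell-state circuit into a single unitary built from low-level gate matrices and then simulates it:

```python
import numpy as np

# Low-level basis gates (matrices) a compiler might target.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                  # control = qubit 0 (MSB)
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# "Compile" the circuit H(q0); CNOT(q0 -> q1) into one unitary.
U = CNOT @ np.kron(H, I2)

# Simulate: apply the compiled unitary to the |00> state.
state = U @ np.array([1.0, 0.0, 0.0, 0.0])
print(state)   # ~ [0.707, 0, 0, 0.707], the Bell state (|00> + |11>)/sqrt(2)
```

A real compiler such as ProjectQ's works the other way around at scale: it decomposes high-level operations into the gate set available on the target hardware or simulator rather than multiplying matrices explicitly.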
PHREEQCI; a graphical user interface for the geochemical computer program PHREEQC
Charlton, Scott R.; Macklin, Clifford L.; Parkhurst, David L.
1997-01-01
PhreeqcI is a Windows-based graphical user interface for the geochemical computer program PHREEQC. PhreeqcI provides the capability to generate and edit input data files, run simulations, and view text files containing simulation results, all within the framework of a single interface. PHREEQC is a multipurpose geochemical program that can perform speciation, inverse, reaction-path, and 1D advective reaction-transport modeling. Interactive access to all of the capabilities of PHREEQC is available with PhreeqcI. The interface is written in Visual Basic and will run on personal computers under the Windows 3.1, Windows 95, and Windows NT operating systems.
Local rollback for fault-tolerance in parallel computing systems
Blumrich, Matthias A [Yorktown Heights, NY; Chen, Dong [Yorktown Heights, NY; Gara, Alan [Yorktown Heights, NY; Giampapa, Mark E [Yorktown Heights, NY; Heidelberger, Philip [Yorktown Heights, NY; Ohmacht, Martin [Yorktown Heights, NY; Steinmacher-Burow, Burkhard [Boeblingen, DE; Sugavanam, Krishnan [Yorktown Heights, NY
2012-01-24
A control logic device performs a local rollback in a parallel super computing system. The super computing system includes at least one cache memory device. The control logic device determines a local rollback interval. The control logic device runs at least one instruction in the local rollback interval. The control logic device evaluates whether an unrecoverable condition occurs while running the at least one instruction during the local rollback interval. The control logic device checks whether an error occurs during the local rollback. The control logic device restarts the local rollback interval if the error occurs and the unrecoverable condition does not occur during the local rollback interval.
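The interval/rollback control flow described above can be sketched in a few lines; this illustrative Python (not the patented hardware logic) checkpoints state at each interval boundary, injects occasional transient errors, and restarts the interval from the checkpoint when one occurs:

```python
import copy
import random

def run_with_local_rollback(state, instructions, seed=42):
    """Run instructions in rollback intervals (one instruction per interval
    here); on a recoverable error, restore the checkpoint and retry."""
    rng = random.Random(seed)
    checkpoint = copy.deepcopy(state)   # snapshot at interval start
    i = 0
    while i < len(instructions):
        try:
            instructions[i](state)
            if rng.random() < 0.3:      # inject an occasional soft error
                raise RuntimeError("transient error detected")
            checkpoint = copy.deepcopy(state)   # commit the interval
            i += 1
        except RuntimeError:
            state.clear()
            state.update(checkpoint)    # local rollback: restart interval
    return state

ops = [lambda s: s.__setitem__("x", s.get("x", 0) + 1) for _ in range(5)]
final = run_with_local_rollback({}, ops)
print(final)   # {'x': 5}
```

The hardware version additionally distinguishes unrecoverable conditions (which force a full restart rather than a local retry); this sketch models only the recoverable path.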
[Groupamatic 360 C1 and automated blood donor processing in a transfusion center].
Guimbretiere, J; Toscer, M; Harousseau, H
1978-03-01
Automation of the donor management flow path is controlled by: a 3-slip "Port-A-Punch" card; the Groupamatic unit, with results sorted out on punched paper tape; and the management computer, connected off-line to the Groupamatic. Data tracking at blood collection time is done by punching a card with the donor card used as a master card. The Groupamatic performs: a standard blood grouping, with one run for registered donors and two runs for new donors; a phenotyping with two runs; and a screening for irregular antibodies. The management computer checks the correlation between the data of the two runs, or between the data of a single run and that of the previous file. It updates the data resident in the central file and prints out: the controls of the different blood groups for the red cell panel, the listing of error messages, the listing of emergency call-ups, the listing of collected blood units on arrival at the blood center (with quantitative and qualitative information such as number of blood units collected, donor addresses, etc.), statistics, donor cards, and diplomas.
Commercial products to preserve specimens for tuberculosis diagnosis: a systematic review.
Reeve, B W P; McFall, S M; Song, R; Warren, R; Steingart, K R; Theron, G
2018-07-01
Eliminating tuberculosis in high-burden settings requires improved diagnostic capacity. Important tests such as Xpert® MTB/RIF and culture are often performed at centralised laboratories that are geographically distant from the point of specimen collection. Preserving specimen integrity during transportation, which could affect test performance, is challenging. To conduct a systematic review of commercial products for specimen preservation for a World Health Organization technical consultation. Databases were searched up to January 2018. Methodological quality was assessed using Quality Assessment of Technical Studies, a new technical study quality-appraisal tool, and Quality Assessment of Diagnostic Accuracy Studies-2. Studies were analysed descriptively in terms of the different products, study designs and diagnostic strategies used. Four products were identified from 16 studies: PrimeStore-Molecular-Transport-Medium (PS-MTM), FTA card, GENO•CARD (all for nucleic acid amplification tests [NAATs]) and OMNIgene•SPUTUM (OMS; culture, NAATs). PS-MTM, but not FTA card or GENO•CARD, rendered Mycobacterium tuberculosis non-culturable. OMS reduced Löwenstein-Jensen but not MGIT™ 960™ contamination, led to delayed MGIT time-to-positivity, resulted in Xpert performance similar to cold chain-transported untreated specimens, and obviated the need for N-acetyl-L-cysteine-sodium hydroxide decontamination. Data from paucibacillary specimens were limited. Evidence that a cold chain improves culture was mixed and absent for Xpert. The effect of the product alone could be discerned in only four studies. Limited evidence suggests that transport products result in test performance comparable to that seen in cold chain-transported specimens.
Three-year financial analysis of pharmacy services at an independent community pharmacy.
Doucette, William R; McDonough, Randal P; Mormann, Megan M; Vaschevici, Renata; Urmie, Julie M; Patterson, Brandon J
2012-01-01
To assess the financial performance of pharmacy services including vaccinations, cholesterol screenings, medication therapy management (MTM), adherence management services, employee health fairs, and compounding services provided by an independent community pharmacy. Three years (2008-10) of pharmacy records were examined to determine the total revenue and costs of each service. Costs included products, materials, labor, marketing, overhead, equipment, reference materials, and fax/phone usage. Costs were allocated to each service using accepted principles (e.g., time for labor). Depending on the service, the total revenue was calculated by multiplying the frequency of the service by the revenue per patient or by adding the total revenue received. A sensitivity analysis was conducted for the adherence management services to account for average dispensing net profit. 7 of 11 pharmacy services showed a net profit each year. Those services include influenza and herpes zoster immunization services, MTM, two adherence management services, employee health fairs, and prescription compounding services. The services that realized a net loss included the pneumococcal immunization service, cholesterol screenings, and two adherence management services. The sensitivity analysis showed that all adherence services had a net gain when average dispensing net profit was included. Most of the pharmacist services had an annual positive net gain. It seems likely that these services can be sustained. Further cost management, such as reducing labor costs, could improve the viability of services with net losses. However, even with greater efficiency, external factors such as competition and reimbursement challenge the sustainability of these services.
Lee, Seung-Kyun; Mathieu, Jean-Baptiste; Graziani, Dominic; Piel, Joseph; Budesheim, Eric; Fiveland, Eric; Hardy, Christopher J.; Tan, Ek Tsoon; Amm, Bruce; Foo, Thomas K.-F; Bernstein, Matt A.; Huston, John; Shu, Yunhong; Schenck, John F.
2015-01-01
Purpose: To characterize peripheral nerve stimulation (PNS) of an asymmetric head-only gradient coil that is compatible with a commercial high-channel-count receive-only array. Methods: Two prototypes of an asymmetric head-only gradient coil set, with 42-cm inner diameter, were constructed for brain imaging at 3T with maximum performance specifications of up to 85 mT/m and 708 T/m/s. 24 volunteer tests were performed to measure PNS thresholds with the transverse (X, left/right; Y, anterior/posterior) gradient coils of both prototypes. 14 volunteers were also tested for the Z-gradient PNS in the second prototype, and were additionally scanned with high-slew-rate EPI immediately after the PNS tests. Results: For both prototypes, the Y-gradient PNS threshold was markedly higher than the X-gradient. The Z-gradient threshold was intermediate between those for the X- and Y-coils. Out of the 24 volunteer subjects, only two experienced Y-gradient PNS at 80 mT/m, 500 T/m/s. All volunteers underwent the EPI scan without PNS when the readout direction was set to A/P. Conclusion: Measured PNS characteristics of asymmetric head-only gradient coil prototypes indicate that such coils, especially in the A/P direction, can be used for fast EPI readout in high-performance neuroimaging scans with substantially reduced PNS concerns compared to conventional whole-body gradient coils. PMID:26628078
The paradox of pharmacy: A profession's house divided.
Brown, Daniel
2012-01-01
To describe the paradox in pharmacy between the vision of patient care and the reality of community pharmacy practice and to explore how integrated reimbursement for the retail prescription and linking cognitive patient care services directly to prescription processing could benefit the profession. A dichotomy exists between what many pharmacists do and what they've been trained to do. Pharmacy leaders have formulated a vision for pharmacists to become more involved in direct patient care. All graduates now receive PharmD-level training, and some leaders call for requirements of postgraduate residency training and board certification for pharmacists who provide patient care. How such requirements would relate to community pharmacy practice is unclear. The retail prescription remains the primary link between the pharmacist and the health care consumer. Cognitive services, such as medication therapy management (MTM), need to be integrated into the standard workflow of community pharmacies so as to become a natural extension of the professional services rendered in the process of filling a prescription. Current prescription fees are not sufficient to support legitimate professional services. A proposed integrated pricing system for retail prescriptions includes a $15 professional fee that is scaled upward for value-added services, such as MTM. Pharmacy includes a diversity of practice that has historically been a source of division. For pharmacists to reach their potential as patient care providers, the various factions within the profession must forge a unified vision of the future that addresses all realms of practice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Viola, M. E.; Brown, T.; Heitzenroeder, P.
The National Compact Stellarator Experiment (NCSX) is being constructed at the Princeton Plasma Physics Laboratory (PPPL) in conjunction with the Oak Ridge National Laboratory (ORNL). The goal of this experiment is to develop a device which has the steady-state properties of a traditional stellarator along with the high-performance characteristics of a tokamak. A key element of this device is its highly shaped Inconel 625 vacuum vessel. This paper describes the manufacturing of the vessel. The vessel is being fabricated by Major Tool and Machine, Inc. (MTM) in three identical 120° vessel segments, corresponding to the three NCSX field periods, in order to accommodate assembly of the device. The port extensions are welded on, leak checked, cut off within 1" of the vessel surface at MTM, and then reattached at PPPL, to accommodate assembly of the close-fitting modular coils that surround the vessel. The 120° vessel segments are formed by welding two 60° segments together. Each 60° segment is fabricated by welding ten press-formed panels together over a collapsible welding fixture which is needed to precisely position the panels. The vessel is joined at assembly by welding via custom-machined 8" (20.3 cm) wide spacer "spool pieces." The vessel must have a total leak rate less than 5 × 10^-6 t-l/s, magnetic permeability less than 1.02 μ, and its contours must be within 0.188" (4.76 mm). It is scheduled for completion in January 2006.
Abusive User Policy | High-Performance Computing | NREL
below. First Incident: The user's ability to run new jobs or store new data will be suspended temporarily. Once the user has acknowledged and participated in a remedy, the ability to run new jobs or store new data will be restored. Second Incident: Suspend running new jobs or storing new data. Terminate jobs if necessary. The system and
PNNL streamlines energy-guzzling computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckman, Mary T.; Marquez, Andres
In a room the size of a garage, two rows of six-foot-tall racks holding supercomputer hard drives sit back-to-back. Thin tubes and wires snake off the hard drives, slithering into the corners. Stepping between the rows, a rush of heat whips around you -- the air from fans blowing off processing heat. But walk farther in, between the next racks of hard drives, and the temperature drops noticeably. These drives are being cooled by a non-conducting liquid that runs right over the hardworking processors. The liquid carries the heat away in tubes, saving the air a few degrees. This is the Energy Smart Data Center at Pacific Northwest National Laboratory. The bigger, faster, and meatier supercomputers get, the more energy they consume. PNNL's Andres Marquez has developed this test bed to learn how to train the behemoths in energy efficiency. The work will help supercomputers perform better as well. Processors have to keep cool or suffer from "thermal throttling," says Marquez. "That's the performance threshold where the computer is too hot to run well. That threshold is an industry secret." The center at EMSL, DOE's national scientific user facility at PNNL, harbors several ways of experimenting with energy usage. For example, the room's air conditioning is isolated from the rest of EMSL -- pipes running beneath the floor carry temperature-controlled water through heat exchangers to cooling towers outside. "We can test whether it's more energy efficient to cool directly on the processing chips or out in the water tower," says Marquez. The hard drives feed energy and temperature data to a network server running specially designed software that controls and monitors the data center. To test the center's limits, the team runs the processors flat out -- not only on carefully controlled test programs in the Energy Smart computers, but also on real-world software from other EMSL research, such as regional weather forecasting models.
Marquez's group is also developing "power-aware computing," where the computer programs themselves perform calculations more energy efficiently. Maybe once computers get smart about energy, they'll have tips for their users.
A Menu-Driven Interface to Unix-Based Resources
Evans, Elizabeth A.
1989-01-01
Unix has often been overlooked in the past as a viable operating system for anyone other than computer scientists. Its terseness, the non-mnemonic nature of its commands, and the lack of user-friendly software to run under it are but a few of the user-related reasons which have been cited. It is, nevertheless, the operating system of choice in many cases. This paper describes a menu-driven interface to Unix which provides friendlier access to the software resources available on computers running under Unix.
The Impact of Typhoons on the Ocean in the Pacific (ITOP) Field and Data Management Support
2011-12-16
in October of 2009 to develop effective sampling strategies for 2010. EOL/Computing, Data and Software Facility (CDS) supported the ITOP Dry Run...measurement strategies necessitated a dry-run experiment in October of 2009 to develop effective sampling strategies for 2010. EOL/Computing, Data and...contains products from 21 September through 31 October 2009. The catalog remains accessible at EOL at the above-mentioned URL. The products listed by
Building Computer-Based Experiments in Psychology without Programming Skills.
Ruisoto, Pablo; Bellido, Alberto; Ruiz, Javier; Juanes, Juan A
2016-06-01
Research in Psychology usually requires building and running experiments. Although this task has traditionally required scripting skills, recent computer tools based on graphical interfaces offer new opportunities in this field for researchers without programming skills. The purpose of this study is to illustrate and provide a comparative overview of two of the main free, open-source "point and click" software packages for building and running experiments in Psychology: PsychoPy and OpenSesame. Recommendations for their potential use are further discussed.
Interoperability...NMCI and Beyond
2001-05-31
wireless. "On The Road" devices: pagers, cell phones, palm-size PDAs, two-way pagers, hand-held computing devices, laptop computers, two-way radios... "combat capability"... [cost-comparison chart] F/A-18 Flying Hour: 1,134.00; Fed. Civilian Salary (mean): 23.80; Cell Phone Air Time: 11.00; First Run Movie: 4.00; DSN
Barlough, J E; Jacobson, R H; Downing, D R; Lynch, T J; Scott, F W
1987-01-01
The computer-assisted, kinetics-based enzyme-linked immunosorbent assay for coronavirus antibodies in cats was calibrated to the conventional indirect immunofluorescence assay by linear regression analysis and computerized interpolation (generation of "immunofluorescence assay-equivalent" titers). Procedures were developed for normalization and standardization of kinetics-based enzyme-linked immunosorbent assay results through incorporation of five different control sera of predetermined ("expected") titer in daily runs. When used with such sera and with computer assistance, the kinetics-based enzyme-linked immunosorbent assay minimized both within-run and between-run variability while allowing also for efficient data reduction and statistical analysis and reporting of results. PMID:3032390
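The calibration described above (regressing immunofluorescence titers onto kinetics-ELISA readings and interpolating "IFA-equivalent" titers) can be sketched as follows. This is a hypothetical illustration, not the authors' code; the function names and the log-linear model are assumptions.

```python
import numpy as np

def calibrate_titers(kelisa_values, ifa_titers):
    """Fit a log-linear calibration from kinetics-ELISA readings to IFA titers.

    Returns the slope and intercept of the regression of log2(IFA titer)
    on the kinetics-ELISA reading (a hypothetical reaction-rate value).
    """
    x = np.asarray(kelisa_values, dtype=float)
    y = np.log2(np.asarray(ifa_titers, dtype=float))
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept

def ifa_equivalent_titer(kelisa_value, slope, intercept):
    """Interpolate an 'IFA-equivalent' titer for a new ELISA reading."""
    return 2.0 ** (slope * kelisa_value + intercept)
```

In practice the daily control sera of predetermined titer would be used to check that the fitted line stays within acceptance limits from run to run.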
2012-01-01
Background Laboratories engaged in computational biology or bioinformatics frequently need to run lengthy, multistep, and user-driven computational jobs. Each job can tie up a computer for a few minutes to several days, and many laboratories lack the expertise or resources to build and maintain a dedicated computer cluster. Results JobCenter is a client–server application and framework for job management and distributed job execution. The client and server components are both written in Java and are cross-platform and relatively easy to install. All communication with the server is client-driven, which allows worker nodes to run anywhere (even behind external firewalls or “in the cloud”) and provides inherent load balancing. Adding a worker node to the worker pool is as simple as dropping the JobCenter client files onto any computer and performing basic configuration, which provides tremendous ease-of-use, flexibility, and limitless horizontal scalability. Each worker installation may be independently configured, including the types of jobs it is able to run. Executed jobs may be written in any language and may include multistep workflows. Conclusions JobCenter is a versatile and scalable distributed job management system that allows laboratories to very efficiently distribute all computational work among available resources. JobCenter is freely available at http://code.google.com/p/jobcenter/. PMID:22846423
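The client-driven design described above (workers pull jobs from the server, so they can sit behind firewalls and load-balance themselves) can be sketched with a pull-based worker pool. This is a minimal illustration of the pattern, not JobCenter's Java implementation; all names here are invented.

```python
import queue
import threading

def worker(server_queue, results, lock):
    # Client-driven: the worker polls the server for work, so it can run
    # anywhere (even behind a firewall); the server never contacts workers.
    while True:
        try:
            job_id, func, args = server_queue.get_nowait()
        except queue.Empty:
            return
        out = func(*args)            # run the job (any callable stands in here)
        with lock:
            results[job_id] = out    # report the result back
        server_queue.task_done()

# Hypothetical usage: distribute ten squaring jobs over four pull-based workers.
jobs = queue.Queue()
for i in range(10):
    jobs.put((i, lambda x: x * x, (i,)))
results, lock = {}, threading.Lock()
threads = [threading.Thread(target=worker, args=(jobs, results, lock))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

Adding capacity is just starting another worker process pointed at the same queue, which mirrors the "drop the client files onto any computer" scalability claim.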
Changes in running kinematics, kinetics, and spring-mass behavior over a 24-h run.
Morin, Jean-Benoît; Samozino, Pierre; Millet, Guillaume Y
2011-05-01
This study investigated the changes in running mechanics and spring-mass behavior over a 24-h treadmill run (24TR). Kinematics, kinetics, and spring-mass characteristics of the running step were assessed in 10 experienced ultralong-distance runners before, every 2 h, and after a 24TR using an instrumented treadmill dynamometer. These measurements were performed at 10 km·h⁻¹, and mechanical parameters were sampled at 1000 Hz for 10 consecutive steps. Contact and aerial times were determined from ground reaction force (GRF) signals and used to compute step frequency. Maximal GRF, loading rate, downward displacement of the center of mass, and leg length change during the support phase were determined and used to compute both vertical and leg stiffness. Subjects' running pattern and spring-mass behavior significantly changed over the 24TR, with a 4.9% higher step frequency on average (because of a significant 4.5% shorter contact time), a lower maximal GRF (by 4.4% on average), a 13.0% lower leg length change during contact, and an increase in both leg and vertical stiffness (+9.9% and +8.6% on average, respectively). Most of these changes were significant from the early phase of the 24TR (fourth to sixth hour of running) and may be speculated to contribute to an overall limitation of the potentially harmful consequences of such a long-duration run on subjects' musculoskeletal system. During a 24TR, the changes in running mechanics and spring-mass behavior show a clear shift toward a higher oscillating frequency and stiffness, along with lower GRF and leg length change (hence a reduced overall eccentric load) during the support phase of running. © 2011 by the American College of Sports Medicine
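The spring-mass quantities in the abstract above follow standard definitions: step frequency is the reciprocal of the sum of contact and aerial times, and vertical and leg stiffness are peak GRF divided by center-of-mass displacement and leg-length change, respectively. A minimal sketch (the numeric values below are made up for illustration):

```python
def step_frequency(contact_time, aerial_time):
    """Step frequency (Hz) from contact and aerial times (s)."""
    return 1.0 / (contact_time + aerial_time)

def vertical_stiffness(f_max, delta_y):
    """Vertical stiffness: peak GRF over downward COM displacement."""
    return f_max / delta_y

def leg_stiffness(f_max, delta_l):
    """Leg stiffness: peak GRF over leg length change during contact."""
    return f_max / delta_l

# Illustrative values: tc = 0.25 s, ta = 0.10 s, Fmax = 1.6 kN,
# COM drop 0.06 m, leg compression 0.12 m.
f = step_frequency(0.25, 0.10)        # ~2.86 Hz
```

A shorter contact time with roughly unchanged aerial time raises step frequency, and a smaller leg-length change at similar peak force raises leg stiffness, which is exactly the direction of change reported over the 24TR.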
Leonard, Noelle Regina; Silverman, Michelle; Sherpa, Dawa Phuti; Naegle, Madeline A; Kim, Hyorim; Coffman, Donna L; Ferdschneider, Marcy
2017-07-07
An increasing number of mobile app interventions have been developed for problem drinking among college students; however, few studies have examined the integration of a mobile app with continuous physiological monitoring and alerting of affective states related to drinking behaviors. The aim of this paper was to evaluate the acceptability and feasibility of Mind the Moment (MtM), a theoretically based intervention for female college students with problem drinking that combines brief, in-person counseling with ecological momentary intervention (EMI) on a mobile app integrated with a wearable sensorband. We recruited 10 non-treatment-seeking, female undergraduates from a university health clinic who scored a 3 or higher on the Alcohol Use Disorders Identification Test-Consumption (AUDIT-C) to participate in this pilot study. Study activities involved an in-person baseline intake and 1 follow-up assessment, 2 in-person alcohol brief intervention counseling sessions, and use of MtM technology components (sensorband and EMI on a mobile app) for approximately 3-4 weeks. The intervention used motivational interviewing (MI) and cognitive behavioral therapy (CBT) strategies for reducing risks associated with drinking. We used both qualitative and quantitative assessments to measure acceptability of the intervention and feasibility of delivery. Use patterns of the sensorband and mobile app were also collected. Quantitative and qualitative data indicated high levels of acceptability for the MtM intervention. Altogether, participants made reports on the app on 26.7% (78/292) of the days the technology was available to them and completed a total of 325 reports, with wide variation between participants. Qualitative findings indicated that sensorband-elicited alerts promoted an increase in awareness of thoughts, feelings, and behaviors related to current environmental stressors and drinking behaviors in theoretically meaningful ways.
Specific challenges related to functionality and form of the sensorband were identified. Delivering intervention material "just-in-time," at the moment participants need to use behavioral strategies has great potential to individualize behavioral interventions for reducing problem drinking and other health behaviors. These findings provide initial evidence for the promise of wearable sensors for increasing potency of theoretically grounded mobile health interventions and point to directions for future research and uptake of these technologies. ©Noelle Regina Leonard, Michelle Silverman, Dawa Phuti Sherpa, Madeline A Naegle, Hyorim Kim, Donna L Coffman, Marcy Ferdschneider. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 07.07.2017.
Observing Equatorial Thermospheric Winds and Temperatures with a New Mapping Technique
NASA Astrophysics Data System (ADS)
Faivre, M. W.; Meriwether, J. W.; Sherwood, P.; Veliz, O.
2005-12-01
Application of the Fabry-Perot interferometer (FPI) at Arequipa, Peru (16.4°S, 71.4°W) to measure the Doppler shifts and Doppler broadenings in the equatorial O(1D) 630-nm nightglow has resulted in numerous detections of a large-scale thermospheric phenomenon called the Midnight Temperature Maximum (MTM). A recent detector upgrade with a CCD camera has improved the accuracy of these measurements by a factor of 5. Temperature increases of 50 to 150 K have been measured during nights in April and July 2005, with error bars less than 10 K after averaging in all directions. Moreover, the meridional wind measurements show evidence for a flow reversal from equatorward to poleward near local midnight for such events. A new observing strategy based upon the pioneering work of Burnside et al. [1981] maps the equatorial wind and temperature fields by observing in eight equally spaced azimuth directions, each with a zenith angle of 60 degrees. Analysis of the data obtained with this technique gives the mean wind velocities in the meridional and zonal directions as well as the horizontal gradients of the wind field for these directions. Significant horizontal wind gradients are found for the meridional direction but not for the zonal direction. The zonal wind blows eastward throughout the night with a maximum speed of ~150 m/s near the middle of the night and then decreases towards zero just before dawn. In general, the fastest poleward meridional wind is observed near mid-evening. By the end of the night, the meridional flow tends to be more equatorward at speeds of about 50 m/s. Using the assumption that local time and longitude are equivalent over a period of 30 minutes, a map of the horizontal wind vector field is constructed over a range of 12 degrees latitude centered at 16.5°S. Comparison between MTM nights and quiet nights (no MTM) revealed significant differences in the horizontal wind fields.
Using the method of Fourier decomposition of the line-of-sight winds, the vertical wind can be retrieved from the horizontal flow divergence with much better sensitivity than direct zenith measurements provide. The value of the vertical wind speed ranges from -5 to 5 m/s. Some nights show gravity wave activity, with periodic fluctuations of 1-2 hours visible in the vertical winds as well as in the temperature series.
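The multi-azimuth retrieval described above can be sketched as a least-squares fit of the line-of-sight Doppler winds. The forward model below (zonal and meridional components projected through the azimuth and zenith geometry, plus a vertical term) is a simplified assumption after Burnside et al. [1981]; the real analysis also fits horizontal gradients, which are omitted here.

```python
import numpy as np

def fit_winds(azimuths_deg, los_winds, zenith_deg=60.0):
    """Least-squares fit of zonal (u), meridional (v), and vertical (w)
    winds from line-of-sight winds observed at several azimuths.

    Assumed model:  v_los = u*sin(az)*sin(z) + v*cos(az)*sin(z) + w*cos(z)
    """
    az = np.radians(np.asarray(azimuths_deg, dtype=float))
    z = np.radians(zenith_deg)
    A = np.column_stack([np.sin(az) * np.sin(z),
                         np.cos(az) * np.sin(z),
                         np.full_like(az, np.cos(z))])
    (u, v, w), *_ = np.linalg.lstsq(A, np.asarray(los_winds, float), rcond=None)
    return u, v, w
```

With eight equally spaced azimuths the design matrix is well conditioned, which is why the azimuth-mapping strategy recovers the mean horizontal wind robustly.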
NASA Astrophysics Data System (ADS)
Mélice, J. L.; Roucou, P.
The spectral characteristics of the δ18O isotopic ratio time series of the Quelccaya ice cap summit core are investigated with the multi-taper method (MTM), singular spectrum analysis (SSA), and wavelet transform (WT) techniques for the 500-year period 1485-1984. The most significant cycle (at the 99.8% level) according to the MTM F-test has a period centered at 14.4 y, while the oscillation explaining the largest variance according to the SSA technique has a period centered at 12.9 y. The stability over time of these periods is investigated by performing evolutive MTM and SSA on the 500-y-long δ18O series with a 100-y-wide moving window. It is shown that the cycles with the largest amplitude and the oscillations explaining the largest variance have corresponding periods aggregated around 13.5 y that are very stable over the period between 1485 and 1984. The WT of the same isotopic time series reveals the existence of a main oscillation around 12 y, which is also very stable in time. The relation between the isotopic data at Quelccaya and the annual sea surface temperature (SST) field anomalies is then evaluated for the overlapping 1919-1984 period. Significant global correlation and significant coherency at 12.1 y are found between the isotopic series and the annual global sea surface temperature (GSST) series. Moreover, the correlation between the low-frequency (over 8 y) component of the isotopic time series and the annual SST field points to significant values in the tropical North Atlantic. This region is characterized by a main SST variability at 12.8 y. The Quelccaya δ18O isotopic ratio series may therefore be considered a good recorder of tropical North Atlantic SSTs. This may be explained by the following mechanism: the amount of water vapor evaporated by the tropical North Atlantic is a function of the SST, and so is the water vapor δ18O isotopic ratio.
This water vapor is advected during the rainy season by northeast winds and precipitates at the Quelccaya summit with its tropical North Atlantic isotopic signature. The stability of the decadal-scale variability observed in the Quelccaya isotopic series also suggests that the decadal-scale GSST variability was stable during the last five centuries.
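The MTM spectral estimate used above averages periodograms computed with several orthogonal Slepian (DPSS) tapers, trading a little frequency resolution for much lower variance. A generic sketch of that idea (not the authors' analysis code; the taper parameters are illustrative):

```python
import numpy as np
from scipy.signal import windows

def multitaper_psd(x, dt=1.0, nw=4, k=7):
    """Simple multitaper (MTM) power spectrum: average the periodograms
    obtained with k discrete prolate spheroidal (Slepian) tapers."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = x.size
    tapers = windows.dpss(n, nw, k)                 # shape (k, n)
    spectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    psd = spectra.mean(axis=0) * dt / n
    freqs = np.fft.rfftfreq(n, dt)
    return freqs, psd
```

Applied to a 500-point annual series containing a 14.4-y cycle, the averaged spectrum shows a clear peak near 0.07 cycles per year, which is the kind of feature the F-test in the study evaluates for significance.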
ERIC Educational Resources Information Center
Shade, Daniel D.
1994-01-01
Provides advice and suggestions for educators or parents who are trying to decide what type of computer to buy to run the latest computer software for children. Suggests that purchasers buy a computer with as large a hard drive as possible, at least 10 megabytes of RAM, and a CD-ROM drive. (MDM)
Use of UNIX in large online processor farms
NASA Astrophysics Data System (ADS)
Biel, Joseph R.
1990-08-01
There has been a recent rapid increase in the power of RISC computers running the UNIX operating system. Fermilab has begun to make use of these computers in the next generation of offline computer farms. It is also planning to use such computers in online computer farms. Issues involved in constructing online UNIX farms are discussed.
Optimizing a mobile robot control system using GPU acceleration
NASA Astrophysics Data System (ADS)
Tuck, Nat; McGuinness, Michael; Martin, Fred
2012-01-01
This paper describes our attempt to optimize a robot control program for the Intelligent Ground Vehicle Competition (IGVC) by running computationally intensive portions of the system on a commodity graphics processing unit (GPU). The IGVC Autonomous Challenge requires a control program that performs a number of different computationally intensive tasks ranging from computer vision to path planning. For the 2011 competition our Robot Operating System (ROS) based control system would not run comfortably on the multicore CPU on our custom robot platform. The process of profiling the ROS control program and selecting appropriate modules for porting to run on a GPU is described. A GPU-targeting compiler, Bacon, is used to speed up development and help optimize the ported modules. The impact of the ported modules on overall performance is discussed. We conclude that GPU optimization can free a significant amount of CPU resources with minimal effort for expensive user-written code, but that replacing heavily-optimized library functions is more difficult, and a much less efficient use of time.
NASA Technical Reports Server (NTRS)
Meyer, Donald; Uchenik, Igor
2007-01-01
The PPC750 Performance Monitor (Perfmon) is a computer program that helps the user to assess the performance characteristics of application programs running under the Wind River VxWorks real-time operating system on a PPC750 computer. Perfmon generates a user-friendly interface and collects performance data by use of performance registers provided by the PPC750 architecture. It processes and presents run-time statistics on a per-task basis over a repeating time interval (typically, several seconds or minutes) specified by the user. When the Perfmon software module is loaded with the user's software modules, it is available for use through Perfmon commands, without any modification of the user's code and at negligible performance penalty. Per-task run-time performance data made available by Perfmon include percentage time, number of instructions executed per unit time, dispatch ratio, stack high-water mark, and level-1 instruction and data cache miss rates. The performance data are written to a file specified by the user or to the serial port of the computer.
Volunteer Computing Experience with ATLAS@Home
NASA Astrophysics Data System (ADS)
Adam-Bourdarios, C.; Bianchi, R.; Cameron, D.; Filipčič, A.; Isacchini, G.; Lançon, E.; Wu, W.;
2017-10-01
ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers' resources make up a sizeable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease deployment on, for example, university clusters; using multiple cores inside one task to reduce the memory requirements; and running different types of workload such as event generation. In addition to technical details, the success of ATLAS@Home as an outreach tool is evaluated.
MPI_XSTAR: MPI-based Parallelization of the XSTAR Photoionization Program
NASA Astrophysics Data System (ADS)
Danehkar, Ashkbiz; Nowak, Michael A.; Lee, Julia C.; Smith, Randall K.
2018-02-01
We describe a program for the parallel implementation of multiple runs of XSTAR, a photoionization code that is used to predict the physical properties of an ionized gas from its emission and/or absorption lines. The parallelization program, called MPI_XSTAR, has been developed and implemented in the C++ language by using the Message Passing Interface (MPI) protocol, a conventional standard of parallel computing. We have benchmarked parallel multiprocessing executions of XSTAR, using MPI_XSTAR, against a serial execution of XSTAR, in terms of the parallelization speedup and the computing resource efficiency. Our experience indicates that the parallel execution runs significantly faster than the serial execution; however, the efficiency in terms of computing resource usage decreases as the number of processors used in the parallel computing increases.
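The two benchmarking metrics named above have standard definitions: speedup is the serial runtime divided by the parallel runtime, and efficiency is speedup divided by the number of processors. A minimal sketch (the timing numbers in the test are invented for illustration):

```python
def speedup(t_serial, t_parallel):
    """Parallel speedup S = T1 / Tp."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    """Resource efficiency E = S / p; it falls below 1 whenever
    the speedup grows more slowly than the processor count."""
    return speedup(t_serial, t_parallel) / n_procs
```

A run that is 8x faster on 16 processors has efficiency 0.5, which is the kind of diminishing return the abstract reports as processor counts increase.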
ATLAS Distributed Computing Experience and Performance During the LHC Run-2
NASA Astrophysics Data System (ADS)
Filipčič, A.;
2017-10-01
ATLAS Distributed Computing during LHC Run-1 was challenged by steadily increasing computing, storage and network requirements. In addition, the complexity of processing task workflows and their associated data management requirements led to a new paradigm in the ATLAS computing model for Run-2, accompanied by extensive evolution and redesign of the workflow and data management systems. The new systems were put into production at the end of 2014, and gained robustness and maturity during 2015 data taking. ProdSys2, the new request and task interface; JEDI, the dynamic job execution engine developed as an extension to PanDA; and Rucio, the new data management system, form the core of Run-2 ATLAS distributed computing engine. One of the big changes for Run-2 was the adoption of the Derivation Framework, which moves the chaotic CPU and data intensive part of the user analysis into the centrally organized train production, delivering derived AOD datasets to user groups for final analysis. The effectiveness of the new model was demonstrated through the delivery of analysis datasets to users just one week after data taking, by completing the calibration loop, Tier-0 processing and train production steps promptly. The great flexibility of the new system also makes it possible to execute part of the Tier-0 processing on the grid when Tier-0 resources experience a backlog during high data-taking periods. The introduction of the data lifetime model, where each dataset is assigned a finite lifetime (with extensions possible for frequently accessed data), was made possible by Rucio. Thanks to this the storage crises experienced in Run-1 have not reappeared during Run-2. In addition, the distinction between Tier-1 and Tier-2 disk storage, now largely artificial given the quality of Tier-2 resources and their networking, has been removed through the introduction of dynamic ATLAS clouds that group the storage endpoint nucleus and its close-by execution satellite sites. 
All stable ATLAS sites are now able to store unique or primary copies of the datasets. ATLAS Distributed Computing is further evolving to speed up request processing by introducing network awareness, using machine learning and optimisation of the latencies during the execution of the full chain of tasks. The Event Service, a new workflow and job execution engine, is designed around check-pointing at the level of event processing to use opportunistic resources more efficiently. ATLAS has been extensively exploring possibilities of using computing resources extending beyond conventional grid sites in the WLCG fabric to deliver as many computing cycles as possible and thereby enhance the significance of the Monte-Carlo samples to deliver better physics results. The exploitation of opportunistic resources was at an early stage throughout 2015, at the level of 10% of the total ATLAS computing power, but in the next few years it is expected to deliver much more. In addition, demonstrating the ability to use an opportunistic resource can lead to securing ATLAS allocations on the facility, hence the importance of this work goes beyond merely the initial CPU cycles gained. In this paper, we give an overview and compare the performance, development effort, flexibility and robustness of the various approaches.
Power Analysis of an Enterprise Wireless Communication Architecture
2017-09-01
easily plug a satellite-based communication module into the enterprise processor when needed. Once plugged in, it automatically runs the corresponding...reduce the SWaP by using a single processing/computing module to run user applications and to implement waveform algorithms. This approach would...(GPP) technology improved enough to allow a wide variety of waveforms to run in the GPP, thus giving rise to the SDR (Brannon 2004). Today's
Data intensive computing at Sandia.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, Andrew T.
2010-09-01
Data-intensive computing is parallel computing in which algorithms and software are designed around efficient access and traversal of a data set, where hardware requirements are dictated by data size as much as by desired run times, usually distilling compact results from massive data.
SFU Hacking for Non-Hackers v. 1.005
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, David James
The program provides a limited virtual environment for exploring some concepts of computer hacking. It simulates a simple computer system with intentional vulnerabilities, allowing the user to issue commands and observe their results. It does not affect the computer on which it runs.
Virtual network computing: cross-platform remote display and collaboration software.
Konerding, D E
1999-04-01
VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits them back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, each unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.
Injecting Artificial Memory Errors Into a Running Computer Program
NASA Technical Reports Server (NTRS)
Bornstein, Benjamin J.; Granat, Robert A.; Wagstaff, Kiri L.
2008-01-01
Single-event upsets (SEUs) or bitflips are computer memory errors caused by radiation. BITFLIPS (Basic Instrumentation Tool for Fault Localized Injection of Probabilistic SEUs) is a computer program that deliberately injects SEUs into another computer program, while the latter is running, for the purpose of evaluating the fault tolerance of that program. BITFLIPS was written as a plug-in extension of the open-source Valgrind debugging and profiling software. BITFLIPS can inject SEUs into any program that can be run on the Linux operating system, without needing to modify the program's source code. Further, if access to the original program source code is available, BITFLIPS offers fine-grained control over exactly when and which areas of memory (as specified via program variables) will be subjected to SEUs. The rate of injection of SEUs is controlled by specifying either a fault probability or a fault rate based on memory size and radiation exposure time, in units of SEUs per byte per second. BITFLIPS can also log each SEU that it injects and, if program source code is available, report the magnitude of effect of the SEU on a floating-point value or other program variable.
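The core operation behind probabilistic SEU injection is flipping individual memory bits with a specified fault probability. The sketch below illustrates the idea on a plain byte buffer; it is a toy model, not BITFLIPS itself, which instruments a live process through Valgrind.

```python
import random

def inject_seus(memory, fault_probability, rng=None):
    """Flip each bit of a bytearray independently with the given
    probability, mimicking radiation-induced single-event upsets.
    Returns the number of bits flipped."""
    rng = rng or random.Random()
    flips = 0
    for i in range(len(memory)):
        for bit in range(8):
            if rng.random() < fault_probability:
                memory[i] ^= (1 << bit)   # XOR toggles exactly one bit
                flips += 1
    return flips
```

Converting a fault rate in SEUs per byte per second into a per-bit probability for a given exposure window would be one extra multiplication, matching the rate-based mode the abstract describes.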
Recent Performance Results of VPIC on Trinity
NASA Astrophysics Data System (ADS)
Nystrom, W. D.; Bergen, B.; Bird, R. F.; Bowers, K. J.; Daughton, W. S.; Guo, F.; Le, A.; Li, H.; Nam, H.; Pang, X.; Stark, D. J.; Rust, W. N., III; Yin, L.; Albright, B. J.
2017-10-01
Trinity is a new DOE compute resource now in production at Los Alamos National Laboratory. Trinity has several new and unique features, including two compute partitions, one with dual-socket Intel Haswell Xeon compute nodes and one with Intel Knights Landing (KNL) Xeon Phi compute nodes; use of on-package high-bandwidth memory (HBM) for KNL nodes; the ability to configure KNL nodes with respect to HBM model and on-die network topology in a variety of operational modes at run time; and use of solid-state storage via burst buffer technology to reduce the time required to perform I/O. An effort is in progress to optimize VPIC on Trinity by taking advantage of these new architectural features. Results will be presented on the performance of VPIC on the Haswell and KNL partitions for single-node runs and runs at scale. Results include use of burst buffers at scale to optimize I/O, comparison of strategies for using MPI and threads, performance benefits of using HBM, and the effectiveness of using intrinsics for vectorization. Work performed under the auspices of the U.S. Dept. of Energy by Los Alamos National Security, LLC, Los Alamos National Laboratory under contract DE-AC52-06NA25396 and supported by the LANL LDRD program.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kendon, Viv
2014-12-04
Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer.
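The discrete-time quantum walk mentioned above can be simulated classically for small systems, which is how proposals like this are usually explored before hardware experiments. Below is a standard one-dimensional Hadamard-coin walk (a generic textbook sketch, not tied to any specific experimental implementation in the abstract):

```python
import numpy as np

def quantum_walk(steps, n_positions):
    """Discrete-time quantum walk on a line with a Hadamard coin.
    State is a complex array psi[coin, position]; returns the final
    position probability distribution."""
    psi = np.zeros((2, n_positions), dtype=complex)
    mid = n_positions // 2
    # Symmetric initial coin state (|0> + i|1>)/sqrt(2) at the center.
    psi[0, mid] = 1 / np.sqrt(2)
    psi[1, mid] = 1j / np.sqrt(2)
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    for _ in range(steps):
        psi = H @ psi                     # coin toss mixes the two coin states
        psi[0] = np.roll(psi[0], -1)      # coin 0 amplitude steps left
        psi[1] = np.roll(psi[1], +1)      # coin 1 amplitude steps right
    return (np.abs(psi) ** 2).sum(axis=0)
```

Unlike the classical random walk, whose spread grows as the square root of the step count, the quantum walk spreads ballistically (linearly in the step count), which underlies its algorithmic speedups.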
29 CFR 102.111 - Time computation.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 29 Labor 2 2010-07-01 2010-07-01 false Time computation. 102.111 Section 102.111 Labor Regulations... Papers § 102.111 Time computation. (a) In computing any period of time prescribed or allowed by these rules, the day of the act, event, or default after which the designated period of time begins to run is...
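The rule quoted above (the day of the triggering act is excluded, so the period begins running the following day) reduces to simple date arithmetic. A minimal sketch, assuming the common exclude-first-day convention and omitting the regulation's weekend and holiday extensions:

```python
from datetime import date, timedelta

def deadline(act_date, period_days):
    """Last day of a prescribed period when the day of the act is
    excluded: the period runs from the following day, so the deadline
    is simply act_date + period_days. (Weekend/holiday rules omitted.)"""
    return act_date + timedelta(days=period_days)

# e.g. a 14-day period triggered on July 1 runs through July 15
```

Because the day of the act is not counted, a 14-day period triggered on July 1 ends on July 15, not July 14.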
29 CFR 102.111 - Time computation.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 29 Labor 2 2014-07-01 2014-07-01 false Time computation. 102.111 Section 102.111 Labor Regulations... Papers § 102.111 Time computation. (a) In computing any period of time prescribed or allowed by these rules, the day of the act, event, or default after which the designated period of time begins to run is...
29 CFR 102.111 - Time computation.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 29 Labor 2 2012-07-01 2012-07-01 false Time computation. 102.111 Section 102.111 Labor Regulations... Papers § 102.111 Time computation. (a) In computing any period of time prescribed or allowed by these rules, the day of the act, event, or default after which the designated period of time begins to run is...
29 CFR 102.111 - Time computation.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 29 Labor 2 2013-07-01 2013-07-01 false Time computation. 102.111 Section 102.111 Labor Regulations... Papers § 102.111 Time computation. (a) In computing any period of time prescribed or allowed by these rules, the day of the act, event, or default after which the designated period of time begins to run is...
Fast methods to numerically integrate the Reynolds equation for gas fluid films
NASA Technical Reports Server (NTRS)
Dimofte, Florin
1992-01-01
The alternating direction implicit (ADI) method is adopted, modified, and applied to the Reynolds equation for thin, gas fluid films. An efficient code is developed to predict both the steady-state and dynamic performance of an aerodynamic journal bearing. An alternative approach is shown for hybrid journal gas bearings by using Liebmann's iterative solution (LIS) for elliptic partial differential equations. The results are compared with known design criteria from experimental data. The developed methods show good accuracy and very short computer running time in comparison with methods based on inverting a matrix. The computer codes need a small amount of memory and can be run on either personal computers or mainframe systems.
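Liebmann's iterative solution mentioned above is the Gauss-Seidel relaxation scheme for elliptic PDEs: each interior grid point is repeatedly replaced by the average of its neighbors until the updates fall below a tolerance. A generic sketch for Laplace's equation (not the paper's Reynolds-equation code, which adds the lubrication terms):

```python
import numpy as np

def liebmann(p, tol=1e-6, max_iter=10000):
    """Liebmann's (Gauss-Seidel) iteration for Laplace's equation on a
    2-D grid; boundary values of p are held fixed, and interior points
    are relaxed in place using the freshest neighbor values."""
    for _ in range(max_iter):
        max_change = 0.0
        for i in range(1, p.shape[0] - 1):
            for j in range(1, p.shape[1] - 1):
                new = 0.25 * (p[i+1, j] + p[i-1, j] + p[i, j+1] + p[i, j-1])
                max_change = max(max_change, abs(new - p[i, j]))
                p[i, j] = new
        if max_change < tol:
            break
    return p
```

The memory appeal the authors note is visible here: the solver stores only the grid itself, never an assembled matrix, which is why such codes fit on the personal computers of the era.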
New insights into faster computation of uncertainties
NASA Astrophysics Data System (ADS)
Bhattacharya, Atreyee
2012-11-01
Heavy computation power, lengthy simulations, and an exhaustive number of model runs—often these seem like the only statistical tools that scientists have at their disposal when computing uncertainties associated with predictions, particularly in cases of environmental processes such as groundwater movement. However, calculation of uncertainties need not be as lengthy, a new study shows. Comparing two approaches—the classical Bayesian “credible interval” and a less commonly used regression-based “confidence interval” method—Lu et al. show that for many practical purposes both methods provide similar estimates of uncertainties. The advantage of the regression method is that it demands 10-1000 model runs, whereas the classical Bayesian approach requires 10,000 to millions of model runs.
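The regression-based confidence interval highlighted above is cheap precisely because it needs only enough model runs to estimate a parameter covariance, rather than a full Bayesian sampling of the posterior. A simplified sketch for the slope of a linear model (an illustration of the general idea, not the method of Lu et al.; the known noise level and the 1.96 normal quantile are assumptions):

```python
import numpy as np

def regression_ci(x, y, sigma, z=1.96):
    """Regression-based ~95% confidence interval for the slope of
    y = a + b*x, from the least-squares parameter covariance
    (sigma is the assumed observation noise standard deviation)."""
    A = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    cov = sigma ** 2 * np.linalg.inv(A.T @ A)
    b, sb = coef[1], np.sqrt(cov[1, 1])
    return b - z * sb, b + z * sb
```

A Bayesian credible interval for the same slope would instead be read off the quantiles of a posterior sample, which is where the thousands-to-millions of model runs in the comparison come from.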
Non-volatile memory for checkpoint storage
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blumrich, Matthias A.; Chen, Dong; Cipolla, Thomas M.
A system, method, and computer program product for supporting system-initiated checkpoints in high-performance parallel computing systems and for storing checkpoint data to a non-volatile memory storage device. The system and method generate selective control signals to perform checkpointing of system-related data in the presence of messaging activity associated with a user application running at the node. The checkpointing is initiated by the system such that checkpoint data from a plurality of network nodes may be obtained even in the presence of user applications, with ongoing user messaging activity, running on highly parallel computers. In one embodiment, the non-volatile memory is a pluggable flash memory card.
Drug & Gene Interaction Risk Analysis With & Without Genetic Testing Among Patients Undergoing MTM
2017-02-22
Cytochrome P450 CYP2D6 Enzyme Deficiency; Poor Metabolizer Due to Cytochrome P450 CYP2D6 Variant; Ultrarapid Metabolizer Due to Cytochrome P450 CYP2D6 Variant; Extensive Metabolizer Due to Cytochrome P450 CYP2D6 Variant; Cytochrome P450 CYP2C9 Enzyme Deficiency; Cytochrome P450 CYP2C19 Enzyme Deficiency; Drug Metabolism, Poor, CYP2D6-RELATED; Drug Metabolism, Poor, CYP2C19-RELATED; CYP2D6 Polymorphism
One New Method to Generate 3-Dimensional Virtual Mannequin
NASA Astrophysics Data System (ADS)
Xiu-jin, Shi; Zhi-jun, Wang; Jia-jin, Le
A personal virtual mannequin is essential in an electronic made-to-measure (eMTM) system. This paper presents a new, simple method for generating one. First, the customer's body-characteristic information is extracted from two photographs. Second, body-part templates matching the customer are selected from a template library. Third, these templates are modified and assembled according to defined rules to generate a personalized 3-dimensional human model, realizing the virtual mannequin. Experimental results show that the method is simple and feasible.
Patterns of Food Utilization in the DOD. Volume 2
1975-05-01
Medical Surveillance Monthly Report (MSMR). Volume 19, Number 2, February 2012
2012-02-01
Health Surveillance Center. Editor: CAPT Kevin L. Russell, MD, MTM&H, FIDSA (USN); Contributing Former Editor: Francis L. O'Donnell, MD, MPH. Summary tables and figures cover deployment-related conditions of special surveillance interest (Nancy A. Skopp, PhD). Many individuals had encounters with health care providers in proximity to their suicides; an estimated 45 percent of individuals with completed suicides had encounters with health care providers within one
Integration of High-Performance Computing into Cloud Computing Services
NASA Astrophysics Data System (ADS)
Vouk, Mladen A.; Sills, Eric; Dreher, Patrick
High-Performance Computing (HPC) projects span a spectrum of computer hardware implementations ranging from peta-flop supercomputers, high-end tera-flop facilities running a variety of operating systems and applications, to mid-range and smaller computational clusters used for HPC application development, pilot runs and prototype staging clusters. What they all have in common is that they operate as a stand-alone system rather than a scalable and shared user re-configurable resource. The advent of cloud computing has changed the traditional HPC implementation. In this article, we will discuss a very successful production-level architecture and policy framework for supporting HPC services within a more general cloud computing infrastructure. This integrated environment, called Virtual Computing Lab (VCL), has been operating at NC State since fall 2004. Nearly 8,500,000 HPC CPU-Hrs were delivered by this environment to NC State faculty and students during 2009. In addition, we present and discuss operational data that show that integration of HPC and non-HPC (or general VCL) services in a cloud can substantially reduce the cost of delivering cloud services (down to cents per CPU hour).
Control of the TSU 2-m automatic telescope
NASA Astrophysics Data System (ADS)
Eaton, Joel A.; Williamson, Michael H.
2004-09-01
Tennessee State University is operating a 2-m automatic telescope for high-dispersion spectroscopy. The alt-azimuth telescope is fiber-coupled to a conventional echelle spectrograph with two resolutions (R=30,000 and 70,000). We control this instrument with four computers running Linux and communicating over Ethernet through the UDP protocol. A computer physically located on the telescope handles the acquisition and tracking of stars. We avoid the need for real-time programming in this application by periodically latching the positions of the axes in a commercial motion controller and the time in a GPS receiver. A second (spectrograph) computer sets up the spectrograph and runs its CCD, a third (roof) computer controls the roll-off roof and front flap of the telescope enclosure, and the fourth (executive) computer makes decisions about which stars to observe and when to close the observatory for bad weather. The only human intervention in the telescope's operation involves changing the observing program, copying data back to TSU, and running quality-control checks on the data. It has been running reliably in this completely automatic, unattended mode for more than a year, with all day-to-day administration carried out over the Internet. To support automatic operation, we have written a number of useful tools to predict and analyze what the telescope does. These include a simulator that predicts roughly how the telescope will operate on a given night, a quality-control program to parse logfiles from the telescope and identify problems, and a rescheduling program that calculates new priorities to keep the frequency of observation for the various stars roughly as desired. We have also set up a database to keep track of the tens of thousands of spectra we expect to get each year.
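The four-computer UDP scheme described above can be illustrated with a small status-datagram sketch. The field names (`axes`, `t`) and the JSON encoding are hypothetical illustrations, not the TSU protocol.

```python
import json
import socket

def send_status(sock, addr, axis_positions, gps_time):
    """Telescope computer: send latched axis positions plus a GPS timestamp.
    Periodic latching avoids hard real-time requirements on the receiver."""
    msg = json.dumps({"axes": axis_positions, "t": gps_time}).encode()
    sock.sendto(msg, addr)

def recv_status(sock):
    """Executive computer: receive and decode one status datagram."""
    data, _ = sock.recvfrom(4096)
    return json.loads(data)
```

UDP suits this design because a lost status datagram is harmless: the next periodic latch supersedes it, so no retransmission logic is needed.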
Implementation of an Intelligent Control System
1992-05-01
therefore implemented in a portable equipment rack. The controls computer consists of a microcomputer running a real-time operating system; interface circuit boards are mounted in an industry-standard Multibus I chassis. The microcomputer runs the iRMX real-time operating system.
10 CFR 205.5 - Computation of time.
Code of Federal Regulations, 2010 CFR
2010-01-01
..., event, or default from which the designated period of time begins to run is not to be included. The last... holiday in which event the period runs until the end of the next day that is neither a Saturday, Sunday... be added to the prescribed period. ...
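The computation rule in this regulation (exclude the day of the act; if the last day falls on a Saturday, Sunday, or holiday, the period runs to the next day that is none of those) can be sketched as follows. The function name and holiday-set handling are illustrative, not regulatory text.

```python
from datetime import date, timedelta

def deadline(event_day, period_days, holidays=frozenset()):
    """Compute a deadline under the common federal rule: the day of the act,
    event, or default is excluded, and a last day landing on a Saturday,
    Sunday, or holiday rolls forward to the next qualifying day."""
    # day 1 of the period is the day after the event, so adding period_days
    # to the event date already excludes the event day itself
    d = event_day + timedelta(days=period_days)
    while d.weekday() >= 5 or d in holidays:  # 5 = Saturday, 6 = Sunday
        d += timedelta(days=1)
    return d
```

For example, a 10-day period starting from Friday 2010-01-01 ends on Monday 2010-01-11, while an 8-day period lands on Saturday 2010-01-09 and rolls to the same Monday.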
DOT National Transportation Integrated Search
1995-09-05
The Run-Off-Road Collision Avoidance Using IVHS Countermeasures program addresses the single-vehicle crash problem through the application of technology to prevent and/or reduce the severity of these crashes. This report documents the RORSIM comput...
Investigation of Storage Options for Scientific Computing on Grid and Cloud Facilities
NASA Astrophysics Data System (ADS)
Garzoglio, Gabriele
2012-12-01
In recent years, several new storage technologies, such as Lustre, Hadoop, OrangeFS, and BlueArc, have emerged. While several groups have run benchmarks to characterize them under a variety of configurations, more work is needed to evaluate these technologies for the use cases of scientific computing on Grid clusters and Cloud facilities. This paper discusses our evaluation of the technologies as deployed on a test bed at FermiCloud, one of the Fermilab infrastructure-as-a-service Cloud facilities. The test bed consists of 4 server-class nodes with 40 TB of disk space and up to 50 virtual machine clients, some running on the storage server nodes themselves. With this configuration, the evaluation compares the performance of some of these technologies when deployed on virtual machines and on “bare metal” nodes. In addition to running standard benchmarks such as IOZone to check the sanity of our installation, we have run I/O intensive tests using physics-analysis applications. This paper presents how the storage solutions perform in a variety of realistic use cases of scientific computing. One interesting difference among the storage systems tested is found in a decrease in total read throughput with increasing number of client processes, which occurs in some implementations but not others.
Pedretti, Kevin
2008-11-18
A compute processor allocator architecture for allocating compute processors to run applications in a multiple processor computing apparatus is distributed among a subset of processors within the computing apparatus. Each processor of the subset includes a compute processor allocator. The compute processor allocators can share a common database of information pertinent to compute processor allocation. A communication path permits retrieval of information from the database independently of the compute processor allocators.
BelleII@home: Integrate volunteer computing resources into DIRAC in a secure way
NASA Astrophysics Data System (ADS)
Wu, Wenjing; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo; Kan, Wenxiao; Urquijo, Phillip
2017-10-01
The exploitation of volunteer computing resources has become a popular practice in the HEP computing community because of the huge amount of potential computing power it provides. Recent HEP experiments have used grid middleware to organize services and resources; however, the middleware relies heavily on X.509 authentication, which is at odds with the untrusted nature of volunteer computing resources. One big challenge in utilizing volunteer computing resources is therefore how to integrate them into the grid middleware in a secure way. The DIRAC interware, commonly used as the major component of the grid computing infrastructure for several HEP experiments, poses an even bigger challenge to this paradox, as its pilot is more closely coupled with operations requiring X.509 authentication than the pilot implementations of its peer grid interware. The Belle II experiment is a B-factory experiment at KEK, and it uses DIRAC for its distributed computing. In the BelleII@home project, in order to integrate volunteer computing resources into the Belle II distributed computing platform in a secure way, we adopted a new approach which detaches the payload running from the Belle II DIRAC pilot (a customized pilot that pulls and processes jobs from the Belle II distributed computing platform), so that the payload can run on volunteer computers without requiring any X.509 authentication. In this approach we developed a gateway service, running on a trusted server, which handles all the operations requiring X.509 authentication. So far, we have developed and deployed the prototype of BelleII@home and tested its full workflow, which proves the feasibility of this approach. This approach can also be applied to HPC systems whose worker nodes lack the outbound connectivity needed to interact with the DIRAC system directly.
Rotary Kiln Gasification of Solid Waste for Base Camps
2017-10-02
... Garbage bags containing waste feedstock are placed into feed bin FB-101 and fed by ram feeder RF-102. ... The system is monitored using the Factory Talk SCADA software running on a laptop computer; a wireless Ethernet router is located within the unit. ... The pyrolysis oil produced required consistent draining from the system during operation and became a liquid waste disposal problem. A 5-hour test run could
Integration of Titan supercomputer at OLCF with ATLAS Production System
NASA Astrophysics Data System (ADS)
Barreiro Megino, F.; De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Nilsson, P.; Oleynik, D.; Padolski, S.; Panitkin, S.; Wells, J.; Wenaus, T.; ATLAS Collaboration
2017-10-01
The PanDA (Production and Distributed Analysis) workload management system was developed to meet the scale and complexity of distributed computing for the ATLAS experiment. PanDA-managed resources are distributed worldwide over hundreds of computing sites, with thousands of physicists accessing hundreds of petabytes of data, and the rate of data processing already exceeds an exabyte per year. While PanDA currently uses more than 200,000 cores at well over 100 Grid sites, future LHC data-taking runs will require more resources than Grid computing can possibly provide. Additional computing and storage resources are required. Therefore ATLAS is engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. In this paper we describe a project aimed at integrating the ATLAS Production System with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF). The current approach utilizes a modified PanDA Pilot framework for job submission to Titan's batch queues and local data management, with lightweight MPI wrappers to run single-node workloads in parallel on Titan's multi-core worker nodes. It provides for running standard ATLAS production jobs on unused resources (backfill) on Titan. The system has already allowed ATLAS to collect millions of core-hours per month on Titan and execute hundreds of thousands of jobs, while simultaneously improving Titan's utilization efficiency. We discuss the details of the implementation, current experience with running the system, and future plans aimed at improvements in scalability and efficiency. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy.
The publisher by accepting the manuscript for publication acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
Simulation Framework for Intelligent Transportation Systems
DOT National Transportation Integrated Search
1996-10-01
A simulation framework has been developed for a large-scale, comprehensive, scaleable simulation of an Intelligent Transportation System. The simulator is designed for running on parallel computers and distributed (networked) computer systems, but ca...
Prediction of sound radiation from different practical jet engine inlets
NASA Technical Reports Server (NTRS)
Zinn, B. T.; Meyer, W. L.
1982-01-01
The computer codes necessary for this study were developed and checked against exact solutions generated by the point source method using the NASA Lewis QCSEE inlet geometry. These computer codes were used to predict the acoustic properties of the following five inlet configurations: the NASA Langley Bellmouth, the NASA Lewis JT15D-1 Ground Test Nacelle, and three finite hyperbolic inlets of 50, 70 and 90 degrees. Thirty-five computer runs were performed for the NASA Langley Bellmouth. For each of these computer runs, the reflection coefficient at the duct exit plane was calculated, as was the far-field radiation pattern. These results are presented in both graphical and tabular form, with many of the results cross-plotted so that trends in the results versus cut-off ratio (wave number) and tangential mode number may be easily identified.
ERIC Educational Resources Information Center
Navarro, Aaron B.
1981-01-01
Presents a program in Level II BASIC for a TRS-80 computer that simulates a Turing machine and discusses the nature of the device. The program is run interactively and is designed to be used as an educational tool by computer science or mathematics students studying computational or automata theory. (MP)
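A comparable simulator is straightforward in a modern language. The sketch below (in Python rather than Level II BASIC, with a hypothetical rule encoding) shows the core fetch-write-move loop of a one-tape Turing machine.

```python
def run_turing(rules, tape, state="q0", blank="_", halt="halt", max_steps=10_000):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (new_state, written_symbol, move), where
    move is -1 (left) or +1 (right). The tape is stored as a dict keyed by
    integer position so it can grow unboundedly in either direction."""
    cells = dict(enumerate(tape))
    pos = 0
    for _ in range(max_steps):
        if state == halt:
            break
        sym = cells.get(pos, blank)
        state, cells[pos], move = rules[(state, sym)]
        pos += move
    # return the written portion of the tape as a string, plus the final state
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)), state

# example: unary increment -- scan right over 1s, append a 1, then halt
INCREMENT = {
    ("q0", "1"): ("q0", "1", 1),
    ("q0", "_"): ("halt", "1", 1),
}
```

Running `run_turing(INCREMENT, "111")` appends one symbol to the unary string, which makes a compact classroom demonstration of how a transition table drives the machine.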
10 CFR 2.1003 - Availability of material.
Code of Federal Regulations, 2013 CFR
2013-01-01
... its license application for a geologic repository, the NRC shall make available no later than thirty... privilege in § 2.1006, graphic-oriented documentary material that includes raw data, computer runs, computer... discrepancies; (ii) Gauge, meter and computer settings; (iii) Probe locations; (iv) Logging intervals and rates...
ABSENTEE COMPUTATIONS IN A MULTIPLE-ACCESS COMPUTER SYSTEM.
Some computations do not require user interaction, and the user may therefore want to run these computations 'absentee' (or, user not present). A mechanism is presented which provides for the handling of absentee computations in a multiple-access computer system. The design is intended to be implementation-independent. Some novel features of the system's design are: a user can switch computations from interactive to absentee (and vice versa), the system can
Exploiting CMS data popularity to model the evolution of data management for Run-2 and beyond
NASA Astrophysics Data System (ADS)
Bonacorsi, D.; Boccali, T.; Giordano, D.; Girone, M.; Neri, M.; Magini, N.; Kuznetsov, V.; Wildish, T.
2015-12-01
During the LHC Run-1 data taking, all experiments collected large data volumes from proton-proton and heavy-ion collisions. The collision data, together with massive volumes of simulated data, were replicated in multiple copies, transferred among various Tier levels, and transformed/slimmed in format/content. These data were then accessed (both locally and remotely) by large groups of distributed analysis communities exploiting the Worldwide LHC Computing Grid infrastructure and services. While efficient data placement strategies - together with optimal data redistribution and deletions on demand - have become the core of static versus dynamic data management projects, little effort has so far been invested in understanding the detailed data-access patterns which surfaced in Run-1. These patterns, if understood, can be used as input to simulation of computing models at the LHC, to optimise existing systems by tuning their behaviour, and to explore next-generation CPU/storage/network co-scheduling solutions. This is of great importance, given that the scale of the computing problem will increase far faster than the resources available to the experiments, for Run-2 and beyond. Studying data-access patterns involves validating the quality of the monitoring data collected on the “popularity” of each dataset, analysing the frequency and pattern of accesses to different datasets by analysis end-users, exploring different views of the popularity data (by physics activity, by region, by data type), studying the evolution of Run-1 data exploitation over time, and evaluating the impact of different data placement and distribution choices on the available network and storage resources and on computing operations. This work presents some insights from studies on the popularity data from the CMS experiment.
We present the properties of a range of physics analysis activities as seen by the data popularity, and make recommendations for how to tune the initial distribution of data in anticipation of how it will be used in Run-2 and beyond.
Federated data storage system prototype for LHC experiments and data intensive science
NASA Astrophysics Data System (ADS)
Kiryanov, A.; Klimentov, A.; Krasnopevtsev, D.; Ryabinkin, E.; Zarochentsev, A.
2017-10-01
The rapid increase of data volume from the experiments running at the Large Hadron Collider (LHC) prompted the physics computing community to evaluate new data handling and processing solutions. Russian grid sites and university clusters scattered over a large area aim at the task of uniting their resources for future productive work, at the same time giving an opportunity to support large physics collaborations. In our project we address the fundamental problem of designing a computing architecture to integrate distributed storage resources for LHC experiments and other data-intensive science applications, and to provide access to data from heterogeneous computing facilities. Studies include development and implementation of a federated data storage prototype for Worldwide LHC Computing Grid (WLCG) centres of different levels and university clusters within one National Cloud. The prototype is based on computing resources located in Moscow, Dubna, Saint Petersburg, Gatchina and Geneva. This project intends to implement a federated distributed storage for all kinds of operations, such as read/write/transfer, and access via WAN from Grid centres, university clusters, supercomputers, academic and commercial clouds. The efficiency and performance of the system are demonstrated using synthetic and experiment-specific tests, including real data processing and analysis workflows from the ATLAS and ALICE experiments, as well as compute-intensive bioinformatics applications (PALEOMIX) running on supercomputers. We present the topology and architecture of the designed system, report performance and statistics for different access patterns, and show how federated data storage can be used efficiently by physicists and biologists. We also describe how sharing data on a widely distributed storage system can lead to a new computing model and reformations of computing style, for instance how a bioinformatics program running on supercomputers can read/write data from the federated storage.
Vectorization of transport and diffusion computations on the CDC Cyber 205
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abu-Shumays, I.K.
1986-01-01
The development and testing of alternative numerical methods and computational algorithms specifically designed for the vectorization of transport and diffusion computations on a Control Data Corporation (CDC) Cyber 205 vector computer are described. Two solution methods for the discrete ordinates approximation to the transport equation are summarized and compared. Factors of 4 to 7 reduction in run times for certain large transport problems were achieved on a Cyber 205 as compared with run times on a CDC-7600. The solution of tridiagonal systems of linear equations, central to several efficient numerical methods for multidimensional diffusion computations and essential for fluid flow and other physics and engineering problems, is also dealt with. Among the methods tested, a combined odd-even cyclic reduction and modified Cholesky factorization algorithm for solving linear symmetric positive definite tridiagonal systems is found to be the most effective for these systems on a Cyber 205. For large tridiagonal systems, computation with this algorithm is an order of magnitude faster on a Cyber 205 than computation with the best algorithm for tridiagonal systems on a CDC-7600.
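The odd-even cyclic reduction scheme mentioned in the abstract can be sketched as follows. This is an illustrative serial implementation (on a vector machine such as the Cyber 205, each reduction level would be computed in parallel across the eliminated unknowns), assuming n = 2**k - 1 unknowns.

```python
def cyclic_reduction(a, b, c, d):
    """Solve a tridiagonal system a[i]*x[i-1] + b[i]*x[i] + c[i]*x[i+1] = d[i]
    by odd-even cyclic reduction. Assumes n = 2**k - 1 unknowns; the edge
    entries a[0] and c[-1] are treated as zero."""
    a, b, c, d = list(a), list(b), list(c), list(d)
    a[0] = 0.0
    c[-1] = 0.0
    n = len(b)
    if n == 1:
        return [d[0] / b[0]]
    # one reduction level: combine each odd-position equation with its two
    # neighbours, leaving a half-size tridiagonal system in x[1], x[3], ...
    na, nb, nc, nd = [], [], [], []
    for i in range(1, n, 2):
        alpha = -a[i] / b[i - 1]
        beta = -c[i] / b[i + 1]
        na.append(alpha * a[i - 1])
        nb.append(b[i] + alpha * c[i - 1] + beta * a[i + 1])
        nc.append(beta * c[i + 1])
        nd.append(d[i] + alpha * d[i - 1] + beta * d[i + 1])
    odd = cyclic_reduction(na, nb, nc, nd)  # solutions for x[1], x[3], ...
    # back-substitute the even-position unknowns from their own equations
    x = [0.0] * n
    for j, i in enumerate(range(1, n, 2)):
        x[i] = odd[j]
    for i in range(0, n, 2):
        rhs = d[i]
        if i > 0:
            rhs -= a[i] * x[i - 1]
        if i < n - 1:
            rhs -= c[i] * x[i + 1]
        x[i] = rhs / b[i]
    return x
```

Each level of the recursion is independent elimination work across all odd rows, which is exactly the structure a vector pipeline exploits, in contrast to the inherently sequential forward sweep of Gaussian elimination.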
Simulation framework for intelligent transportation systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ewing, T.; Doss, E.; Hanebutte, U.
1996-10-01
A simulation framework has been developed for a large-scale, comprehensive, scaleable simulation of an Intelligent Transportation System (ITS). The simulator is designed for running on parallel computers and distributed (networked) computer systems, but can run on standalone workstations for smaller simulations. The simulator currently models instrumented smart vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. Realistic modeling of variations of the posted driving speed are based on human factors studies that take into consideration weather, road conditions, driver personality and behavior, and vehicle type. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on parallel computers, such as ANL's IBM SP-2, for large-scale problems. A novel feature of the approach is that vehicles are represented by autonomous computer processes which exchange messages with other processes. The vehicles have a behavior model which governs route selection and driving behavior, and can react to external traffic events much like real vehicles. With this approach, the simulation is scaleable to take advantage of emerging massively parallel processor (MPP) systems.
Lanier, T.H.
1996-01-01
The 100-year flood plain was determined for Upper Three Runs, its tributaries, and the part of the Savannah River that borders the Savannah River Site. The results are provided in tabular and graphical formats. The 100-year flood-plain maps and flood profiles provide water-resource managers of the Savannah River Site with a technical basis for making flood-plain management decisions that could minimize future flood problems and provide a basis for designing and constructing drainage structures along roadways. A hydrologic analysis was made to estimate the 100-year recurrence-interval flow for Upper Three Runs and its tributaries. The analysis showed that the well-drained, sandy soils in the headwaters of Upper Three Runs reduce the high flows in the stream; therefore, the South Carolina upper Coastal Plain regional-rural-regression equation does not apply for Upper Three Runs. Consequently, a relation was established between 100-year recurrence-interval flow and drainage area using streamflow data from U.S. Geological Survey gaging stations on Upper Three Runs. This relation was used to compute 100-year recurrence-interval flows at selected points along the stream. The regional regression equations were applicable for the tributaries to Upper Three Runs, because the soil types in the drainage basins of the tributaries resemble those normally occurring in upper Coastal Plain basins. This was verified by analysis of the flood-frequency data collected from U.S. Geological Survey gaging station 02197342 on Fourmile Branch. Cross sections were surveyed throughout each reach, and other pertinent data such as flow resistance and land use were collected. The surveyed cross sections and computed 100-year recurrence-interval flows were used in a step-backwater model to compute the 100-year flood profile for Upper Three Runs and its tributaries. The profiles were used to delineate the 100-year flood plain on topographic maps.
The Savannah River forms the southwestern border of the Savannah River Site. Data from previously published reports were used to delineate the 100-year flood plain for the Savannah River from the downstream site boundary at the mouth of Lower Three Runs at river mile 125 to the upstream site boundary at river mile 163.
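A flow-versus-drainage-area relation of the kind described above is commonly a power law, Q = c * A**b, fitted by least squares in log space. The sketch below illustrates that generic technique with made-up coefficients; it is not the report's actual equation.

```python
import math

def fit_power_law(areas, flows):
    """Fit Q = c * A**b by ordinary least squares on (log A, log Q).
    Returns (c, b). Inputs must be positive."""
    xs = [math.log(a) for a in areas]
    ys = [math.log(q) for q in flows]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope of the log-log regression line is the exponent b
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    c = math.exp(my - b * mx)
    return c, b
```

Fitting in log space keeps the relative (percentage) errors balanced across small and large basins, which is why hydrologic regional regressions are typically expressed this way.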
Preventing Run-Time Bugs at Compile-Time Using Advanced C++
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neswold, Richard
When writing software, we develop algorithms that tell the computer what to do at run-time. Our solutions are easier to understand and debug when they are properly modeled using class hierarchies, enumerations, and a well-factored API. Unfortunately, even with these design tools, we end up having to debug our programs at run-time. Worse still, debugging an embedded system changes its dynamics, making it tough to find and fix concurrency issues. This paper describes techniques using C++ to detect run-time bugs *at compile time*. A concurrency library, developed at Fermilab, is used for examples in illustrating these techniques.
Scopolamine Administration Modulates Muscarinic, Nicotinic and NMDA Receptor Systems
Höger, Harald; Pollak, Arnold; Lubec, Gert
2012-01-01
Studies on the effect of scopolamine on memory are abundant, but so far only regulation of the muscarinic receptor (M1) has been reported. We hypothesized that levels of other cholinergic brain receptors, such as the nicotinic receptors and the N-methyl-D-aspartate (NMDA) receptor, known to be involved in memory formation, would be modified by scopolamine administration. C57BL/6J mice were used for the experiments and divided into four groups. Two groups were given scopolamine 1 mg/kg i.p. (the first group was trained and the second group untrained) in the multiple T-maze (MTM), a paradigm for evaluation of spatial memory. Likewise, vehicle-treated mice were trained or untrained, thus serving as controls. Hippocampal levels of M1, nicotinic receptor alpha 4 (Nic4) and 7 (Nic7), and subunit NR1-containing complexes were determined by immunoblotting on blue native gel electrophoresis. Vehicle-treated trained mice learned the task and showed memory retrieval on day 8, while scopolamine treatment led to significant impairment of performance in the MTM. At the day of retrieval, hippocampal levels of M1, Nic7 and NR1 were higher in the scopolamine-treated groups than in vehicle-treated groups. The concerted action, i.e. the pattern of four brain receptor complexes regulated by the anticholinergic compound scopolamine, is shown. Insight into probable action mechanisms of scopolamine at the brain receptor complex level in the hippocampus is provided. Scopolamine treatment is a standard approach to test cognitive enhancers and other psychoactive compounds in pharmacological studies, and therefore knowledge of mechanisms is of pivotal interest. PMID:22384146
Q-Space Truncation and Sampling in Diffusion Spectrum Imaging
Tian, Qiyuan; Rokem, Ariel; Folkerth, Rebecca D.; Nummenmaa, Aapo; Fan, Qiuyun; Edlow, Brian L.; McNab, Jennifer A.
2015-01-01
Purpose: To characterize the effects of q-space truncation and sampling on the spin-displacement probability density function (PDF) in diffusion spectrum imaging (DSI). Methods: DSI data were acquired using the MGH-USC connectome scanner (Gmax = 300 mT/m) with bmax = 30,000 s/mm² and 17×17×17, 15×15×15, and 11×11×11 grids in ex vivo human brains, and with bmax = 10,000 s/mm² and an 11×11×11 grid in vivo. An additional in vivo scan using bmax = 7,000 s/mm² and an 11×11×11 grid was performed with a derated gradient strength of 40 mT/m. PDFs and orientation distribution functions (ODFs) were reconstructed with different q-space filtering and PDF integration lengths, and from data down-sampled by factors of two and three. Results: Both ex vivo and in vivo data showed Gibbs ringing in PDFs, which becomes the main source of artifact in the subsequently reconstructed ODFs. For down-sampled data, PDFs interfere with their first replicas or their ringing, leading to obscured orientations in ODFs. Conclusion: The minimum required q-space sampling density corresponds to a field of view approximately equal to twice the mean displacement distance (MDD) of the tissue. The 11×11×11 grid is suitable for both ex vivo and in vivo DSI experiments. To minimize the effects of Gibbs ringing, ODFs should be reconstructed from unfiltered q-space data with the integration length over the PDF constrained to around the MDD. PMID:26762670
Kibicho, Jennifer; Owczarzak, Jill
2012-01-01
Reflecting trends in health care delivery, pharmacy practice has shifted from a drug-specific to a patient-centered model of care, aimed at improving the quality of patient care and reducing health care costs. In this article, we outline a theoretical model of patient-centered pharmacy services (PCPS), based on in-depth, qualitative interviews with a purposive sample of 28 pharmacists providing care to HIV-infected patients in specialty, semispecialty, and nonspecialty pharmacy settings. Data analysis was an interactive process informed by pharmacists' interviews and a review of the general literature on patient-centered care, including Medication Therapy Management (MTM) services. Our main finding was that current models of pharmacy services, including MTM, do not capture the range of services pharmacists provide beyond mandated drug-dispensing services. In this article, we propose a theoretical PCPS model that reflects the actual services pharmacists provide. The model includes five elements: (1) addressing patients as whole, contextualized persons; (2) customizing interventions to unique patient circumstances; (3) empowering patients to take responsibility for their own health care; (4) collaborating with clinical and nonclinical providers to address patient needs; and (5) developing sustained relationships with patients. The overarching goal of PCPS is to empower patients to take responsibility for their own health care and self-manage their HIV infection. Our findings provide the foundation for future studies regarding how widespread these practices are in diverse community settings, the validity of the proposed PCPS model, the potential for standardizing pharmacist practices, and the feasibility of a PCPS framework for reimbursing pharmacists' services.
Auscultation in flight: comparison of conventional and electronic stethoscopes.
Tourtier, J P; Libert, N; Clapson, P; Tazarourte, K; Borne, M; Grasser, L; Debien, B; Auroy, Y
2011-01-01
The ability to auscultate during air medical transport is compromised by high ambient-noise levels. The aim of this study was to assess the capabilities of a traditional and an electronic stethoscope (which is expected to amplify sounds and reduce ambient noise) to assess heart and breath sounds during medical transport in a Boeing C135. We tested one model of a traditional stethoscope (3M™ Littmann Cardiology III™) and one model of an electronic stethoscope (3M™ Littmann Stethoscope Model 3000). We studied heart and lung auscultation during real medical evacuations aboard a medically configured C135. For each device, the quality of auscultation was described using a visual rating scale (ranging from 0 to 100 mm, 0 corresponding to "I hear nothing," 100 to "I hear perfectly"). Comparisons were accomplished using a t-test for paired values. A total of 36 comparative evaluations were performed. For cardiac auscultation, the value of the visual rating scale was 53 ± 24 and 85 ± 11 mm, respectively, for the traditional and electronic stethoscope (paired t-test: P = .0024). For lung sounds, quality of auscultation was estimated at 27 ± 17 mm for the traditional stethoscope and 68 ± 13 for the electronic stethoscope (paired t-test: P = .0003). The electronic stethoscope was considered to be better than the standard model for hearing heart and lung sounds. Flight practitioners involved in air medical evacuation in the C135 aircraft are better able to practice auscultation with this electronic stethoscope than with a traditional one. Copyright © 2011 Air Medical Journal Associates. Published by Elsevier Inc. All rights reserved.
Friedrich, Anke; Thomas, Ulf; Müller, Uli
2004-05-05
Learning and memory formation in intact animals are generally studied under defined parameters, including the control of feeding. We used associative olfactory conditioning of the proboscis extension response in honeybees to address effects of feeding status on processes of learning and memory formation. Comparing groups of animals with different but defined feeding status at the time of conditioning reveals new and characteristic features in memory formation. In animals fed 18 hr earlier, three-trial conditioning induces a stable memory that consists of different phases: a mid-term memory (MTM), a translation-dependent early long-term memory (eLTM; 1-2 d), and a transcription-dependent late LTM (lLTM; ≥3 d). Additional feeding of a small amount of sucrose 4 hr before conditioning leads to a loss of all of these memory phases. Interestingly, the basal activity of the cAMP-dependent protein kinase A (PKA), a key player in LTM formation, differs in animals with different satiation levels. Pharmacological rescue of the low basal PKA activity in animals fed 4 hr before conditioning points to a specific function of the cAMP-PKA cascade in mediating satiation-dependent memory formation. An increase in PKA activity during conditioning rescues only the transcription-dependent lLTM; acquisition, MTM, and eLTM are still impaired. Thus, during conditioning, the cAMP-PKA cascade mediates the induction of the transcription-dependent lLTM, depending on the satiation level. This result provides the first evidence for a central and distinct function of the cAMP-PKA cascade connecting satiation level with learning.
Law, Alice Y S; Yeung, B H Y; Ching, L Y; Wong, Chris K C
2011-08-01
Our previous study demonstrated that stanniocalcin-1 (STC1) was a target of histone deacetylase (HDAC) inhibitors and was involved in trichostatin A (TSA)-induced apoptosis in the human colon cancer cell line HT29. In this study, we report that the transcription factor specificity protein 1 (Sp1), in association with retinoblastoma protein (Rb), repressed STC1 gene transcription in TSA-treated HT29 cells. Our data demonstrated that co-treatment of the cells with TSA and the Sp1 inhibitor mithramycin A (MTM) led to a marked synergistic induction of STC1 transcript levels and STC1 promoter (1 kb)-driven luciferase activity, and an increase in the apoptotic cell population. Knockdown of Sp1 gene expression in TSA-treated cells revealed the repressor role of Sp1 in STC1 transcription. Using the protein phosphatase inhibitor okadaic acid (OKA), an increase in Sp1 hyperphosphorylation, and thus a reduction of its transcriptional activity, led to a significant induction of STC1 gene expression. Chromatin immunoprecipitation (ChIP) assay revealed Sp1 binding on the STC1 proximal promoter in TSA-treated cells. The binding of Sp1 to the STC1 promoter was abolished by co-treatment with MTM or OKA in TSA-treated cells. Re-ChIP assay illustrated that Sp1-mediated inhibition of STC1 transcription was associated with the recruitment of another repressor molecule, Rb. Collectively, our findings identify STC1 as a downstream target of Sp1. Copyright © 2011 Wiley-Liss, Inc.
Ma, Dongfeng; Zhang, Zhijun; Zhang, Xiangrong; Li, Lingjiang
2014-06-01
New-generation antidepressant therapies, including serotonin-norepinephrine reuptake inhibitors (SNRIs) and selective serotonin reuptake inhibitors (SSRIs), were introduced in the late 1980s; however, few comprehensive studies have compared the benefits and risks of various contemporary treatments for major depressive disorder (MDD) in pediatric patients. A multiple-treatments meta-analysis (MTM) was conducted to assess the efficacy, acceptability, and safety of contemporary interventions in children and adolescents with MDD. Cochrane Library, AMED, CINAHL, EMBASE, LiLACS, MEDLINE, PSYCINFO, PSYNDEX, and Journal of Medicine and Pharmacy databases were searched for randomized controlled trials (RCTs) comparing medicinal interventions (citalopram, escitalopram, fluoxetine, mirtazapine, paroxetine, sertraline, venlafaxine), cognitive behavioral therapy (CBT), fluoxetine combined with CBT, and placebo treatment for acute MDD from January 1988 to March 2013. Treatment success, dropout rate, and suicidal ideation/attempt outcomes were measured. Bayesian methods were used to conduct an MTM including age and funding subgroups. A total of 21 RCTs (4969 participants) were identified. Combined fluoxetine/CBT exhibited the highest efficacy, with fluoxetine alone superior to CBT, paroxetine, sertraline, citalopram, escitalopram, and placebo treatment. Sertraline, paroxetine, escitalopram, and venlafaxine showed superior acceptability to fluoxetine and combined fluoxetine/CBT. The fluoxetine/CBT combination was less safe, though CBT was safer than fluoxetine alone. Combined fluoxetine/CBT, fluoxetine, and mirtazapine exhibited the highest efficacy; sertraline, escitalopram, venlafaxine, and paroxetine were the best tolerated; and mirtazapine and venlafaxine were the safest. Sertraline and mirtazapine exhibited optimally balanced efficacy, acceptability, and safety for first-line acute treatment of child and adolescent MDD.
Navigating the Challenges of the Cloud
ERIC Educational Resources Information Center
Ovadia, Steven
2010-01-01
Cloud computing is increasingly popular in education. Cloud computing is "the delivery of computer services from vast warehouses of shared machines that enables companies and individuals to cut costs by handing over the running of their email, customer databases or accounting software to someone else, and then accessing it over the internet."…
Onboard Flow Sensing For Downwash Detection and Avoidance On Small Quadrotor Helicopters
2015-01-01
onboard computers, one for flight stabilization and a Linux computer for sensor integration and control calculations. The Linux computer runs Robot… Hirokawa, D. Kubo, S. Suzuki, J. Meguro, and T. Suzuki. Small UAV for immediate hazard map generation. In AIAA Infotech@Aerospace Conf, May 2007.
5 CFR 841.109 - Computation of time.
Code of Federal Regulations, 2011 CFR
2011-01-01
§ 841.109 Administrative Personnel; OFFICE OF PERSONNEL MANAGEMENT (CONTINUED); CIVIL SERVICE REGULATIONS… Computation of time. In computing a period of time for filing documents, the day of the action or event after… included unless it is a Saturday, a Sunday, or a legal holiday; in this event, the period runs until the…
Implications of Windowing Techniques for CAI.
ERIC Educational Resources Information Center
Heines, Jesse M.; Grinstein, Georges G.
This paper discusses the use of a technique called windowing in computer-assisted instruction (CAI) to allow independent control of functional areas in complex CAI displays and simultaneous display of output from a running computer program and coordinated instructional material. Two obstacles to widespread use of CAI in computer science courses are…
More Colleges Eye outside Companies to Run Their Computer Operations.
ERIC Educational Resources Information Center
DeLoughry, Thomas J.
1993-01-01
Increasingly, budget pressures and rapid technological change are causing colleges to consider "outsourcing" for computer operations management, particularly for administrative purposes. Supporters see the trend as similar to hiring experts for other, ancillary services. Critics fear loss of control of the institution's vital computer systems.…
Representing, Running, and Revising Mental Models: A Computational Model
ERIC Educational Resources Information Center
Friedman, Scott; Forbus, Kenneth; Sherin, Bruce
2018-01-01
People use commonsense science knowledge to flexibly explain, predict, and manipulate the world around them, yet we lack computational models of how this commonsense science knowledge is represented, acquired, utilized, and revised. This is an important challenge for cognitive science: Building higher order computational models in this area will…
Computational Participation: Understanding Coding as an Extension of Literacy Instruction
ERIC Educational Resources Information Center
Burke, Quinn; O'Byrne, W. Ian; Kafai, Yasmin B.
2016-01-01
Understanding the computational concepts on which countless digital applications run offers learners the opportunity to no longer simply read such media but also become more discerning end users and potentially innovative "writers" of new media themselves. To think computationally--to solve problems, to design systems, and to process and…
Mobile Computer-Assisted-Instruction in Rural New Mexico.
ERIC Educational Resources Information Center
Gittinger, Jack D., Jr.
The University of New Mexico's three-year Computer Assisted Instruction Project established one mobile and five permanent laboratories offering remedial and vocational instruction in winter, 1984-85. Each laboratory has a Degem learning system with minicomputer, teacher terminal, and 32 student terminals. A Digital PDP-11 host computer runs the…