Science.gov

Sample records for big 1-98 participants

  1. Molecular risk assessment of BIG 1-98 participants by expression profiling using RNA from archival tissue

    PubMed Central

    2010-01-01

    Background The purpose of the work reported here is to test reliable molecular profiles using routinely processed formalin-fixed paraffin-embedded (FFPE) tissues from participants of the clinical trial BIG 1-98 with a median follow-up of 60 months. Methods RNA from fresh frozen (FF) and FFPE tumor samples of 82 patients was used for quality control, and independent FFPE tissues of 342 postmenopausal participants of BIG 1-98 with ER-positive cancer were analyzed by measuring prospectively selected genes and computing scores representing the functions of the estrogen receptor (eight genes, ER_8), the progesterone receptor (five genes, PGR_5), Her2 (two genes, HER2_2), and proliferation (ten genes, PRO_10) by quantitative reverse transcription PCR (qRT-PCR) on TaqMan Low Density Arrays. Molecular scores were computed for each category, and the ER_8, PGR_5, HER2_2, and PRO_10 scores were combined into a RISK_25 score. Results Pearson correlation coefficients between FF- and FFPE-derived scores were at least 0.94, and high concordance was observed between molecular scores and immunohistochemical data. The HER2_2, PGR_5, PRO_10 and RISK_25 scores were significant predictors of disease-free survival (DFS) in univariate Cox proportional hazard regression. PRO_10 and RISK_25 scores predicted DFS in patients with histological grade II breast cancer and in lymph node positive disease. The PRO_10 and PGR_5 scores were independent predictors of DFS in multivariate Cox regression models incorporating clinical risk indicators; PRO_10 outperformed Ki-67 labeling index in multivariate Cox proportional hazard analyses. Conclusions Scores representing the endocrine responsiveness and proliferation status of breast cancers were developed from gene expression analyses based on RNA derived from FFPE tissues. The validation of the molecular scores with tumor samples of participants of the BIG 1-98 trial demonstrates that such scores can serve as independent prognostic factors to estimate
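
The scoring approach described in this record (averaging qRT-PCR expression over a prospectively selected gene set, then checking FF vs. FFPE agreement with a Pearson correlation) can be sketched as follows. This is an illustrative sketch only: the gene names, expression values, and equal weighting are hypothetical, not the trial's actual scoring algorithm.

```python
import math

def composite_score(expression, genes):
    """Average expression over a prospectively selected gene set."""
    return sum(expression[g] for g in genes) / len(genes)

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical two-gene "proliferation" set (the trial's PRO_10 uses ten genes).
pro_genes = ["MKI67", "AURKA"]
ff = [{"MKI67": 2.1, "AURKA": 1.9}, {"MKI67": 4.0, "AURKA": 3.6}, {"MKI67": 3.2, "AURKA": 2.8}]
ffpe = [{"MKI67": 2.0, "AURKA": 2.0}, {"MKI67": 3.9, "AURKA": 3.5}, {"MKI67": 3.1, "AURKA": 2.9}]

ff_scores = [composite_score(e, pro_genes) for e in ff]
ffpe_scores = [composite_score(e, pro_genes) for e in ffpe]
r = pearson_r(ff_scores, ffpe_scores)  # agreement between FF- and FFPE-derived scores
```

With well-matched paired samples, r approaches 1; the abstract reports observed correlations of at least 0.94.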

  2. The Breast International Group 1-98 trial: big results for women with hormone-sensitive early breast cancer.

    PubMed

    Monnier, Alain M

    2007-05-01

    Because there is a risk of relapse in early breast cancer, especially at 1-3 years after surgery, the need for adjuvant therapy is clear. In terms of disease-free survival, aromatase inhibitors have emerged as superior to tamoxifen for the adjuvant treatment of hormone-sensitive breast cancer in several Phase III clinical trials. Of these trials, the Breast International Group (BIG) 1-98 trial stands out as unique, both in design, as it is the only trial to address whether an aromatase inhibitor is more effective as initial adjuvant therapy or as sequential therapy with an aromatase inhibitor and tamoxifen in either order, and in the rigor of its end points and safety evaluations. When compared with tamoxifen, letrozole significantly reduced recurrence risk in the overall population by 19% and also significantly reduced recurrence risk in the patient subgroups at increased risk: node-positive and previously chemotherapy-treated patients. Letrozole is the only aromatase inhibitor to demonstrate a significant 27% reduction in the risk of distant metastases (p = 0.001) in the clinically relevant, hormone receptor-positive population in the initial adjuvant setting. Recent results also suggest that letrozole in particular reduces the risk of distant metastases early after initial surgery for breast cancer. This is important, as early distant metastatic events constitute the majority of early recurrences and are a well-recognized predictor of breast cancer death. Letrozole has been found to be well tolerated in the initial adjuvant treatment setting, and these data have been confirmed by long-term safety data from the monotherapy analysis in the BIG 1-98 study. Thus far, the results from the BIG 1-98 trial provide clear support for the use of letrozole in the initial adjuvant treatment of breast cancer. Future studies will provide the definitive answer to questions of which initial adjuvant therapy is superior (i.e., anastrozole or letrozole) and information as to the

  3. Letrozole as upfront endocrine therapy for postmenopausal women with hormone-sensitive breast cancer: BIG 1-98

    PubMed Central

    Thuerlimann, Beat

    2007-01-01

    The BIG 1-98 trial is a large, randomized, independently conducted clinical trial designed to compare the efficacy of upfront letrozole versus tamoxifen monotherapy and to compare sequential or up-front use of letrozole and/or tamoxifen as an early adjuvant therapy for patients with early breast cancer. We report on the results from the primary core analysis of the BIG 1-98 trial of 8,010 patients, which compares monotherapy with letrozole versus tamoxifen. This pre-planned core analysis allowed the use of patient data from the monotherapy arms of letrozole and tamoxifen and from the sequential arms prior to the drug switch point. Patients randomized to letrozole had a 19% improved disease-free survival (hazard ratio [HR] = 0.81; P = 0.003), due especially to reduced distant metastases (HR = 0.73; P = 0.001). A 14% risk reduction of fatal events in favor of letrozole was also observed (P = NS). The results from the monotherapy arms alone confirmed the findings from the primary core analysis. Based on the results from this trial, the aromatase inhibitor letrozole (Femara®) is currently recommended as a part of standard adjuvant therapy for postmenopausal women with endocrine-responsive breast cancer and has recently been approved in the early adjuvant setting in both Europe and the United States. A subsequent analysis after additional follow-up will address the question of monotherapy versus sequential therapy. PMID:17912636

  4. Relative Effectiveness of Letrozole Compared With Tamoxifen for Patients With Lobular Carcinoma in the BIG 1-98 Trial

    PubMed Central

    Metzger Filho, Otto; Giobbie-Hurder, Anita; Mallon, Elizabeth; Gusterson, Barry; Viale, Giuseppe; Winer, Eric P.; Thürlimann, Beat; Gelber, Richard D.; Colleoni, Marco; Ejlertsen, Bent; Debled, Marc; Price, Karen N.; Regan, Meredith M.; Coates, Alan S.; Goldhirsch, Aron

    2015-01-01

    Purpose To evaluate the relative effectiveness of letrozole compared with tamoxifen for patients with invasive ductal or lobular carcinoma. Patients and Methods Patients diagnosed with early-stage invasive ductal carcinoma (IDC) or classic invasive lobular carcinoma (ILC) who were randomly assigned onto the Breast International Group (BIG) 1-98 trial and who had centrally reviewed pathology data were included (N = 2,923). HER2-negative IDC and ILC were additionally classified as hormone receptor–positive with high (luminal B [LB] –like) or low (luminal A [LA] –like) proliferative activity by Ki-67 labeling index. Survival analyses were performed with weighted Cox models that used inverse probability of censoring weighted modeling. Results The median follow-up time was 8.1 years. In multivariable models for disease-free survival (DFS), significant interactions between treatment and histology (ILC or IDC; P = .006) and treatment and subgroup (LB like or LA like; P = .01) were observed. In the ILC subset, there was a 66% reduction in the hazard of a DFS event with letrozole for LB (hazard ratio [HR], 0.34; 95% CI, 0.21 to 0.55) and a 50% reduction for LA subtypes (HR, 0.50; 95% CI, 0.32 to 0.78). In the IDC subset, there was a significant 35% reduction in the hazard of a DFS event with letrozole for the LB subtype (HR, 0.65; 95% CI, 0.53 to 0.79), but no difference between treatments was noted for IDC and the LA subtype (HR, 0.95; 95% CI, 0.76 to 1.20). Conclusion The magnitude of benefit of adjuvant letrozole is greater for patients diagnosed with lobular carcinoma versus ductal carcinoma. PMID:26215945
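
The percent reductions quoted in this record are simply 1 minus the hazard ratio, expressed as a percentage. A one-line helper makes the arithmetic explicit (HR values taken from the abstract above):

```python
def pct_reduction(hr):
    """Percent reduction in hazard implied by a hazard ratio: 100 * (1 - HR)."""
    return round((1 - hr) * 100)

ilc_lb = pct_reduction(0.34)  # ILC, luminal B-like: 66% reduction
ilc_la = pct_reduction(0.50)  # ILC, luminal A-like: 50% reduction
idc_lb = pct_reduction(0.65)  # IDC, luminal B-like: 35% reduction
```

Note that this converts a hazard ratio into a relative reduction in the event hazard, not an absolute risk difference.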

  5. Bone fractures among postmenopausal patients with endocrine-responsive early breast cancer treated with 5 years of letrozole or tamoxifen in the BIG 1-98 trial

    PubMed Central

    Rabaglio, M.; Sun, Z.; Castiglione-Gertsch, M.; Hawle, H.; Thürlimann, B.; Mouridsen, H.; Campone, M.; Forbes, J. F.; Paridaens, R. J.; Colleoni, M.; Pienkowski, T.; Nogaret, J.-M.; Láng, I.; Smith, I.; Gelber, R. D.; Goldhirsch, A.; Coates, A. S.

    2009-01-01

    Background: To compare the incidence and timing of bone fractures in postmenopausal women treated with 5 years of adjuvant tamoxifen or letrozole for endocrine-responsive early breast cancer in the Breast International Group (BIG) 1-98 trial. Methods: We evaluated 4895 patients allocated to 5 years of letrozole or tamoxifen in the BIG 1-98 trial who received at least some study medication (median follow-up 60.3 months). Bone fracture information (grade, cause, site) was collected every 6 months during trial treatment. Results: The incidence of bone fractures was higher among patients treated with letrozole [228 of 2448 women (9.3%)] than with tamoxifen [160 of 2447 women (6.5%)]. The wrist was the most common site of fracture in both treatment groups. Statistically significant risk factors for bone fractures during treatment included age, smoking history, osteoporosis at baseline, previous bone fracture, and previous hormone replacement therapy. Conclusions: Consistent with other trials comparing aromatase inhibitors with tamoxifen, letrozole was associated with an increase in bone fractures. The benefit of superior disease control with letrozole and the lower incidence of fracture with tamoxifen should be weighed against the risk profile of each individual patient. PMID:19474112

  6. Cognitive function in postmenopausal women receiving adjuvant letrozole or tamoxifen for breast cancer in the BIG 1-98 randomized trial

    PubMed Central

    Phillips, Kelly Anne; Ribi, Karin; Sun, Zhuoxin; Stephens, Alisa; Thompson, Alastair; Harvey, Vernon; Thürlimann, Beat; Cardoso, Fatima; Pagani, Olivia; Coates, Alan S.; Goldhirsch, Aron; Price, Karen N.; Gelber, Richard D.; Bernhard, Jürg

    2010-01-01

    Summary Cognitive function in postmenopausal women receiving letrozole or tamoxifen as adjuvant endocrine treatment was compared during the fifth year of treatment in a substudy of the BIG 1-98 trial. In BIG 1-98, patients were randomized to receive adjuvant (A) 5 years of tamoxifen, (B) 5 years of letrozole, (C) 2 years of tamoxifen followed by 3 years of letrozole, or (D) 2 years of letrozole followed by 3 years of tamoxifen. The primary comparison was the difference in composite score for patients taking letrozole (B+C; N=65) versus tamoxifen (A+D; N=55). The patients taking letrozole had better overall cognitive function than those taking tamoxifen (difference in mean composite z-scores = 0.28; 95% CI, 0.02 to 0.54; p = 0.04; Cohen's d = 0.40, indicating a small to moderate effect). In this substudy, breast cancer patients taking adjuvant letrozole during the fifth year of treatment had better cognitive function than those taking tamoxifen, suggesting that aromatase inhibitors do not adversely impact cognition compared with tamoxifen. PMID:20385495
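
The substudy's primary comparison (difference in mean composite cognitive z-scores, with Cohen's d as the effect size) can be sketched in a few lines. The z-scores below are invented; only the computation mirrors what the abstract describes.

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def pooled_sd(a, b):
    """Pooled standard deviation of two independent samples (Cohen's d denominator)."""
    va = sum((x - mean(a)) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mean(b)) ** 2 for x in b) / (len(b) - 1)
    return math.sqrt(((len(a) - 1) * va + (len(b) - 1) * vb) / (len(a) + len(b) - 2))

# Hypothetical composite cognitive z-scores per patient.
letrozole = [0.3, 0.5, 0.1, 0.4]
tamoxifen = [0.0, 0.2, -0.1, 0.1]

diff = mean(letrozole) - mean(tamoxifen)      # difference in mean composite z-scores
d = diff / pooled_sd(letrozole, tamoxifen)    # Cohen's d effect size
```

In the trial the observed difference was 0.28 with d = 0.40; by convention d near 0.2 is "small" and near 0.5 "moderate".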

  7. The advantage of letrozole over tamoxifen in the BIG 1-98 trial is consistent in younger postmenopausal women and in those with chemotherapy-induced menopause.

    PubMed

    Chirgwin, Jacquie; Sun, Zhuoxin; Smith, Ian; Price, Karen N; Thürlimann, Beat; Ejlertsen, Bent; Bonnefoi, Hervé; Regan, Meredith M; Goldhirsch, Aron; Coates, Alan S

    2012-01-01

    Letrozole, an aromatase inhibitor, is ineffective in the presence of ovarian estrogen production. Two subpopulations of apparently postmenopausal women might derive reduced benefit from letrozole due to residual or returning ovarian activity: younger women (who have the potential for residual subclinical ovarian estrogen production), and those with chemotherapy-induced menopause who may experience return of ovarian function. In these situations tamoxifen may be preferable to an aromatase inhibitor. Among 4,922 patients allocated to the monotherapy arms (5 years of letrozole or tamoxifen) in the BIG 1-98 trial we identified two relevant subpopulations: patients with potential residual ovarian function, defined as having natural menopause, treated without adjuvant or neoadjuvant chemotherapy and age ≤ 55 years (n = 641); and those with chemotherapy-induced menopause (n = 105). Neither of the subpopulations examined showed treatment effects differing from the trial population as a whole (interaction P values are 0.23 and 0.62, respectively). Indeed, both among the 641 patients aged ≤ 55 years with natural menopause and no chemotherapy (HR 0.77 [0.51, 1.16]) and among the 105 patients with chemotherapy-induced menopause (HR 0.51 [0.19, 1.39]), the disease-free survival (DFS) point estimate favoring letrozole was marginally more beneficial than in the trial as a whole (HR 0.84 [0.74, 0.95]). Contrary to our initial concern, DFS results for young postmenopausal patients who did not receive chemotherapy and patients with chemotherapy-induced menopause parallel the letrozole benefit seen in the BIG 1-98 population as a whole. These data support the use of letrozole even in such patients.

  8. The advantage of letrozole over tamoxifen in the BIG 1-98 trial is consistent in younger postmenopausal women and in those with chemotherapy-induced menopause

    PubMed Central

    Sun, Zhuoxin; Smith, Ian; Price, Karen N.; Thürlimann, Beat; Ejlertsen, Bent; Bonnefoi, Hervé; Regan, Meredith M.; Goldhirsch, Aron; Coates, Alan S.

    2016-01-01

    Letrozole, an aromatase inhibitor, is ineffective in the presence of ovarian estrogen production. Two subpopulations of apparently postmenopausal women might derive reduced benefit from letrozole due to residual or returning ovarian activity: younger women (who have the potential for residual subclinical ovarian estrogen production), and those with chemotherapy-induced menopause who may experience return of ovarian function. In these situations tamoxifen may be preferable to an aromatase inhibitor. Among 4,922 patients allocated to the monotherapy arms (5 years of letrozole or tamoxifen) in the BIG 1-98 trial we identified two relevant subpopulations: patients with potential residual ovarian function, defined as having natural menopause, treated without adjuvant or neoadjuvant chemotherapy and age ≤55 years (n = 641); and those with chemotherapy-induced menopause (n = 105). Neither of the subpopulations examined showed treatment effects differing from the trial population as a whole (interaction P values are 0.23 and 0.62, respectively). Indeed, both among the 641 patients aged ≤55 years with natural menopause and no chemotherapy (HR 0.77 [0.51, 1.16]) and among the 105 patients with chemotherapy-induced menopause (HR 0.51 [0.19, 1.39]), the disease-free survival (DFS) point estimate favoring letrozole was marginally more beneficial than in the trial as a whole (HR 0.84 [0.74, 0.95]). Contrary to our initial concern, DFS results for young postmenopausal patients who did not receive chemotherapy and patients with chemotherapy-induced menopause parallel the letrozole benefit seen in the BIG 1-98 population as a whole. These data support the use of letrozole even in such patients. PMID:21892704

  9. Cognitive function in postmenopausal breast cancer patients one year after completing adjuvant endocrine therapy with letrozole and/or tamoxifen in the BIG 1-98 trial

    PubMed Central

    Phillips, Kelly-Anne; Aldridge, Julie; Ribi, Karin; Sun, Zhuoxin; Thompson, Alastair; Harvey, Vernon; Thürlimann, Beat; Cardoso, Fatima; Pagani, Olivia; Coates, Alan S.; Goldhirsch, Aron; Price, Karen N.; Gelber, Richard D.

    2011-01-01

    Endocrine therapy for breast cancer may affect cognition. The purpose of this study was to examine whether cognitive function improves after cessation of adjuvant endocrine therapy. Change in cognitive function was assessed in 100 postmenopausal breast cancer patients in the BIG 1-98 trial, who were randomized to receive 5 years of adjuvant tamoxifen or letrozole alone or in sequence. Cognitive function was evaluated by computerized tests during the fifth year of trial treatment (Y5) and 1 year after treatment completion (Y6). Cognitive test scores were standardized according to age-specific norms and the change assessed using the Wilcoxon signed-rank test. There was significant improvement in the composite cognitive function score from Y5 to Y6 (median of change = 0.22, effect size = 0.53, P < 0.0001). This improvement was consistent in women taking either tamoxifen or letrozole at Y5 (P = 0.0006 and P = 0.0002, respectively). For postmenopausal patients who received either adjuvant letrozole or tamoxifen alone or in sequence, cognitive function improved after cessation of treatment. PMID:21046229
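
The Wilcoxon signed-rank test used in this record works on paired Y5 to Y6 changes. A pure-Python sketch of its core statistic (the sum of ranks of positive differences) is below; the patient scores are invented integers, and in practice one would use scipy.stats.wilcoxon, which also returns a p-value.

```python
def signed_rank_statistic(pairs):
    """W+ = sum of ranks of positive differences; zero differences are dropped."""
    diffs = [y6 - y5 for y5, y6 in pairs if y6 != y5]
    ordered = sorted(diffs, key=abs)
    rank_of = {}  # average rank assigned to each |difference| (handles ties)
    i = 0
    while i < len(ordered):
        j = i
        while j < len(ordered) and abs(ordered[j]) == abs(ordered[i]):
            j += 1
        rank_of[abs(ordered[i])] = (i + 1 + j) / 2  # average of ranks i+1..j
        i = j
    return sum(rank_of[abs(d)] for d in diffs if d > 0)

# Hypothetical (Y5, Y6) scores: four patients improve by 3 points,
# one declines by 1, one is unchanged.
pairs = [(10, 13), (12, 12), (8, 11), (15, 14), (9, 12), (11, 14)]
w_plus = signed_rank_statistic(pairs)
```

A large W+ relative to its null distribution indicates a consistent improvement, which is the pattern the substudy reports after treatment cessation.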

  10. Interpreting breast international group (BIG) 1-98: a randomized, double-blind, phase III trial comparing letrozole and tamoxifen as adjuvant endocrine therapy for postmenopausal women with hormone receptor-positive, early breast cancer

    PubMed Central

    2011-01-01

    The Breast International Group (BIG) 1-98 study is a four-arm trial comparing 5 years of monotherapy with tamoxifen or with letrozole or with sequences of 2 years of one followed by 3 years of the other for postmenopausal women with endocrine-responsive early invasive breast cancer. From 1998 to 2003, BIG 1-98 enrolled 8,010 women. The enhanced design of the trial enabled two complementary analyses of efficacy and safety. Collection of tumor specimens further enabled treatment comparisons based on tumor biology. Reports of BIG 1-98 should be interpreted in relation to each individual patient as she weighs the costs and benefits of available treatments. Clinicaltrials.gov ID: NCT00004205. PMID:21635709

  11. Design, conduct, and analyses of Breast International Group (BIG) 1-98: A randomized, double-blind, phase-III study comparing letrozole and tamoxifen as adjuvant endocrine therapy for postmenopausal women with receptor-positive, early breast cancer

    PubMed Central

    Giobbie-Hurder, Anita; Price, Karen N; Gelber, Richard D

    2010-01-01

    Background Aromatase inhibitors provide superior disease control when compared with tamoxifen as adjuvant therapy for postmenopausal women with endocrine-responsive early breast cancer. Purpose To present the design, history, and analytic challenges of the Breast International Group (BIG) 1-98 trial: an international, multicenter, randomized, double-blind, phase-III study comparing the aromatase inhibitor letrozole with tamoxifen in this clinical setting. Methods From 1998–2003, BIG 1-98 enrolled 8028 women to receive monotherapy with either tamoxifen or letrozole for 5 years, or sequential therapy of 2 years of one agent followed by 3 years of the other. Randomization to one of four treatment groups permitted two complementary analyses to be conducted several years apart. The first, reported in 2005, provided a head-to-head comparison of letrozole versus tamoxifen. Statistical power was increased by an enriched design, which included patients who were assigned sequential treatments until the time of the treatment switch. The second, reported in late 2008, used a conditional landmark approach to test the hypothesis that switching endocrine agents at approximately 2 years from randomization for patients who are disease-free is superior to continuing with the original agent. Results The 2005 analysis showed the superiority of letrozole compared with tamoxifen. The patients who were assigned tamoxifen alone were unblinded and offered the opportunity to switch to letrozole. Results from other trials increased the clinical relevance about whether or not to start treatment with letrozole or tamoxifen, and analysis plans were expanded to evaluate sequential versus single-agent strategies from randomization. Limitations Due to the unblinding of patients assigned tamoxifen alone, analysis of updated data will require ascertainment of the influence of selective crossover from tamoxifen to letrozole. Conclusions BIG 1-98 is an example of an enriched design, involving

  12. Participation in global value chain and green technology progress: evidence from big data of Chinese enterprises.

    PubMed

    Song, Malin; Wang, Shuhong

    2017-01-01

    This study examined the stimulative effects of Chinese enterprises' participation in the global value chain (GVC) on the progress of their green technologies. Using difference-in-difference panel models with big data of Chinese enterprises, we measured influencing factors such as enterprise participation degree, enterprise scale, corporate ownership, and research and development (R&D) investment. The results revealed that participation in the GVC can considerably improve the green technology levels in all enterprises, except state-owned ones. However, the older an enterprise, the higher the sluggishness is likely to be in its R&D activities; this is particularly true for state-owned enterprises. The findings provide insights into the strategy of actively addressing Chinese enterprises' predicament of being restricted to the lower end of the GVC.
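
The difference-in-difference design this record relies on reduces, in its simplest form, to comparing the before/after change in an outcome for participants against the same change for non-participants. A minimal sketch with hypothetical numbers (the study itself fits full panel models with controls):

```python
def did(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Difference-in-differences estimate: change for treated minus change for controls."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Hypothetical green-technology index before/after GVC participation.
effect = did(treat_pre=2.0, treat_post=3.1, ctrl_pre=2.1, ctrl_post=2.4)
```

Here the estimate attributes to participation only the portion of the improvement that exceeds the control group's trend, which is the identifying assumption of the method.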

  13. A Framework for Learning about Big Data with Mobile Technologies for Democratic Participation: Possibilities, Limitations, and Unanticipated Obstacles

    ERIC Educational Resources Information Center

    Philip, Thomas M.; Schuler-Brown, Sarah; Way, Winmar

    2013-01-01

    As Big Data becomes increasingly important in policy-making, research, marketing, and commercial applications, we argue that literacy in this domain is critical for engaged democratic participation and that peer-generated data from mobile technologies offer rich possibilities for students to learn about this new genre of data. Through the lens of…

  14. When the Big Fish Turns Small: Effects of Participating in Gifted Summer Programs on Academic Self-Concepts

    ERIC Educational Resources Information Center

    Dai, David Yun; Rinn, Anne N.; Tan, Xiaoyuan

    2013-01-01

    The purposes of this study were to (a) examine the presence and prevalence of the big-fish-little-pond effect (BFLPE) in summer programs for the gifted, (b) identify group and individual difference variables that help predict those who are more susceptible to the BFLPE, and (c) put the possible BFLPE on academic self-concept in a larger context of…

  15. 5. VIEW LOOKING NORTHWEST OF BUILDING 444. (1/1/98) Rocky ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VIEW LOOKING NORTHWEST OF BUILDING 444. (1/1/98) - Rocky Flats Plant, Non-Nuclear Production Facility, South of Cottonwood Avenue, west of Seventh Avenue & east of Building 460, Golden, Jefferson County, CO

  16. 4. VIEW LOOKING SOUTHEAST AT BUILDING 444. (1/1/98) Rocky ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. VIEW LOOKING SOUTHEAST AT BUILDING 444. (1/1/98) - Rocky Flats Plant, Non-Nuclear Production Facility, South of Cottonwood Avenue, west of Seventh Avenue & east of Building 460, Golden, Jefferson County, CO

  17. Big Data: Big Confusion? Big Challenges?

    DTIC Science & Technology

    2015-05-01

    12th Annual Acquisition Research Symposium. Big Data: Big Confusion? Big Challenges? Mary Maureen...

  18. 49 CFR 1.98 - The Research and Innovative Technology Administration.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    Title 49 (Transportation), revised as of 2012-10-01. Delegation of Powers and Duties, Operating Administrations, § 1.98 The Research and Innovative Technology Administration. Is responsible for: (a) Coordinating, facilitating, and reviewing the Department's research...

  19. 49 CFR 1.98 - The Research and Innovative Technology Administration.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    Title 49 (Transportation), revised as of 2013-10-01. Delegation of Powers and Duties, Operating Administrations, § 1.98 The Research and Innovative Technology Administration. Is responsible for: (a) Coordinating, facilitating, and reviewing the Department's research...

  20. 49 CFR 1.98 - The Research and Innovative Technology Administration.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    Title 49 (Transportation), revised as of 2014-10-01. Delegation of Powers and Duties, Operating Administrations, § 1.98 The Research and Innovative Technology Administration. Is responsible for: (a) Coordinating, facilitating, and reviewing the Department's research...

  1. Big Society, Big Deal?

    ERIC Educational Resources Information Center

    Thomson, Alastair

    2011-01-01

    Political leaders like to put forward guiding ideas or themes which pull their individual decisions into a broader narrative. For John Major it was Back to Basics, for Tony Blair it was the Third Way and for David Cameron it is the Big Society. While Mr. Blair relied on Lord Giddens to add intellectual weight to his idea, Mr. Cameron's legacy idea…

  2. Obesity and Risk of Recurrence or Death After Adjuvant Endocrine Therapy With Letrozole or Tamoxifen in the Breast International Group 1-98 Trial

    PubMed Central

    Ewertz, Marianne; Gray, Kathryn P.; Regan, Meredith M.; Ejlertsen, Bent; Price, Karen N.; Thürlimann, Beat; Bonnefoi, Hervé; Forbes, John F.; Paridaens, Robert J.; Rabaglio, Manuela; Gelber, Richard D.; Colleoni, Marco; Láng, István; Smith, Ian E.; Coates, Alan S.; Goldhirsch, Aron; Mouridsen, Henning T.

    2012-01-01

    Purpose To examine the association of baseline body mass index (BMI) with the risk of recurrence or death in postmenopausal women with early-stage breast cancer receiving adjuvant tamoxifen or letrozole in the Breast International Group (BIG) 1-98 trial at 8.7 years of median follow-up. Patients and Methods This report analyzes 4,760 patients with breast cancer randomly assigned to 5 years of monotherapy with letrozole or tamoxifen in the BIG 1-98 trial with available information on BMI at randomization. Multivariable Cox modeling assessed the association of BMI with disease-free survival, overall survival (OS), breast cancer–free interval, and distant recurrence-free interval and tested for treatment-by-BMI interaction. Median follow-up was 8.7 years. Results Seventeen percent of patients have died. Obese patients (BMI ≥ 30 kg/m2) had slightly poorer OS (hazard ratio [HR] = 1.19; 95% CI, 0.99 to 1.44) than patients with normal BMI (< 25 kg/m2), whereas no trend in OS was observed in overweight (BMI 25 to < 30 kg/m2) versus normal-weight patients (HR = 1.02; 95% CI, 0.86 to 1.20). Treatment-by-BMI interactions were not statistically significant. The HRs for OS comparing obese versus normal BMI were HR = 1.22 (95% CI, 0.93 to 1.60) and HR = 1.18 (95% CI, 0.91 to 1.52) in the letrozole and tamoxifen groups, respectively. Conclusion There was no evidence that the benefit of letrozole over tamoxifen differed according to patients' BMI. PMID:23045588

  3. Big Science! Big Problems?

    ERIC Educational Resources Information Center

    Beigel, Allan

    1991-01-01

    Lessons learned by the University of Arizona through participation in two major scientific projects, construction of an astronomical observatory and a super cyclotron, are discussed. Four criteria for institutional participation in such projects are outlined, including consistency with institutional mission, adequate resources, leadership, and…

  4. How Big Is Too Big?

    ERIC Educational Resources Information Center

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  5. Plasma proatrial natriuretic factor (1-98) concentration after myocardial infarction: relation to indices of cardiac and renal function.

    PubMed Central

    Bonarjee, V. V.; Omland, T.; Nilsen, D. W.; Caidahl, K.; Sundsfjord, J. A.; Dickstein, K.

    1995-01-01

    OBJECTIVES--(a) To assess the relation between plasma concentrations of proatrial natriuretic factor (1-98) and non-invasively derived indices of left ventricular systolic and diastolic performance and (b) to assess the potential confounding effect of renal function and age on this relation in patients with acute myocardial infarction. DESIGN--Cross sectional comparison of biochemical and echocardiographic indices of cardiac function. SETTING--Norwegian central hospital. PATIENTS--Sixty four patients with acute myocardial infarction. MAIN OUTCOME MEASURES--Relation between plasma proatrial natriuretic factor (1-98) concentrations and echocardiographic indices of left ventricular systolic function as assessed by univariate and multivariate linear regression analysis. Sensitivity and specificity of plasma proatrial natriuretic factor (1-98) concentration as a measure of left ventricular systolic and diastolic dysfunction. RESULTS--Plasma proatrial natriuretic factor (1-98) concentrations were significantly related to left ventricular ejection fraction (r = -0.33; P = 0.008), age (r = 0.43; P < 0.001), and creatinine clearance (r = - 0.53; P < 0.001). In a multivariate model left ventricular ejection fraction and creatinine clearance were both independently related to plasma values. The mean concentration of proatrial natriuretic factor (1-98) was significantly higher in patients with an ejection fraction of < 40% than in those with an ejection fraction of > or = 40% (1876 (1151) v 1174 (530) pmol/l; P = 0.03) and in patients with an abnormal transmitral E/A ratio ( < 0.65 or > 1.65, where E/A is ratio of peak early filling velocity to peak atrial component) compared with those with a normal ratio (1572 (895) v 1137 (523) pmol/l, respectively; P = 0.02). 
When patients were subdivided according to the median concentration of proatrial natriuretic factor (1192 pmol/l) the sensitivity and specificity were 89% and 56% respectively for detecting a left ventricular ejection
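
The final computation in this record, sensitivity and specificity of the biomarker dichotomized at its median, is straightforward to make explicit. The patient data below are invented; only the cutoff (1192 pmol/l, the study's median) and the form of the calculation come from the abstract, and the 89%/56% figures are the study's own results.

```python
def sens_spec(values, diseased, cutoff):
    """Sensitivity and specificity of the rule 'value > cutoff predicts disease'."""
    tp = sum(1 for v, d in zip(values, diseased) if d and v > cutoff)
    fn = sum(1 for v, d in zip(values, diseased) if d and v <= cutoff)
    tn = sum(1 for v, d in zip(values, diseased) if not d and v <= cutoff)
    fp = sum(1 for v, d in zip(values, diseased) if not d and v > cutoff)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical proANF (1-98) concentrations (pmol/l) and dysfunction labels.
proanf = [800, 1500, 2000, 900, 1300, 1100]
dysfunction = [False, True, True, True, False, False]

sens, spec = sens_spec(proanf, dysfunction, cutoff=1192)
```

A median split guarantees that half the cohort is test-positive, which is why the study's specificity (56%) is modest even though the sensitivity (89%) is high.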

  6. Big Surveys, Big Data Centres

    NASA Astrophysics Data System (ADS)

    Schade, D.

    2016-06-01

    Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys using a Schmidt telescope with an objective prism produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management, and big data places additional challenging requirements on it. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  7. Big Opportunities and Big Concerns of Big Data in Education

    ERIC Educational Resources Information Center

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  8. Big Dreams

    ERIC Educational Resources Information Center

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  9. Big bluestem

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Big Bluestem (Andropogon gerardii) is a warm season grass native to North America, accounting for 40% of the herbaceous biomass of the tall grass prairie, and a candidate for bioenergy feedstock production. The goal of this study was to measure among and within population genetic variation of natura...

  10. NMR study of Cu2Se and Cu1.98Ag0.2Se superionic conductors

    NASA Astrophysics Data System (ADS)

    Sirusi Arvij, Ali; Ross, Joseph H., Jr.; Ballikaya, Sedat; Uher, Ctirad

    2015-03-01

Cu2Se and Cu1.98Ag0.2Se are well known as superionic conductors and recently as thermoelectric materials due to the observation of high ZT. We will report NMR results for these compounds. Our results include indications of glassy anharmonic behavior at low temperatures, Cu ionic motion that sets in near 90 K, and motional narrowing near the phase transition at high temperatures, as well as modified dynamics observed in the Ag-doped sample. NMR is particularly well suited to probe low-frequency dynamics, and at low temperatures the relaxation rate indicates anharmonic rattling behavior similar to what has been observed in other thermoelectric materials. A change in the NMR spectra at 90 K corresponds to the recently observed transport anomaly and indicates that the slow motion of Cu ions is initiated at this temperature and eventually becomes liquid-like at higher temperatures. We detect fast ionic motion in Cu2Se starting at 140 K, whereas in the Ag-doped compound this onset shifts to a higher temperature, around 300 K. At high temperatures the spectra become motionally narrowed, and we will discuss the narrowing and shifts in terms of activated carrier density and ionic motion. This work was supported by the Robert A. Welch Foundation.
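The motional-narrowing and activated-dynamics picture invoked above is conventionally described by the BPP relaxation expression with a thermally activated correlation time. These are standard textbook formulas, not results quoted from this abstract:

```latex
% BPP spin-lattice relaxation rate for a fluctuating local field,
% with a thermally activated correlation time \tau_c:
\frac{1}{T_1} \propto \frac{\tau_c}{1 + \omega_0^2 \tau_c^2}
             + \frac{4\tau_c}{1 + 4\omega_0^2 \tau_c^2},
\qquad
\tau_c = \tau_0 \exp\!\left(\frac{E_a}{k_B T}\right).
% Motional narrowing of the resonance line sets in once the hopping
% rate 1/\tau_c exceeds the rigid-lattice linewidth \Delta\omega.
```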

  11. Big data for bipolar disorder.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  12. Five Big Ideas

    ERIC Educational Resources Information Center

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  13. Improvement of thermoelectric properties and their correlations with electron effective mass in Cu1.98SxSe1‑x

    NASA Astrophysics Data System (ADS)

    Zhao, Lanling; Fei, Frank Yun; Wang, Jun; Wang, Funing; Wang, Chunlei; Li, Jichao; Wang, Jiyang; Cheng, Zhenxiang; Dou, Shixue; Wang, Xiaolin

    2017-01-01

Sulphur doping effects on the crystal structures, thermoelectric properties, density-of-states, and effective mass in Cu1.98SxSe1‑x were studied based on electrical and thermal transport property measurements and first-principles calculations. The X-ray diffraction patterns and Rietveld refinements indicate that room temperature Cu1.98SxSe1‑x (x = 0, 0.02, 0.08, 0.16) and Cu1.98SxSe1‑x (x = 0.8, 0.9, 1.0) have the same crystal structure as monoclinic-Cu2Se and orthorhombic-Cu2S, respectively. Sulphur doping can greatly enhance zT values when x is in the range of 0.8 ≤ x ≤ 1.0. Furthermore, all doped samples show stable thermoelectric compatibility factors over a broad temperature range from 700 to 1000 K, which could greatly benefit their practical applications. First-principles calculations indicate that both the electron density-of-states and the effective mass for all the compounds exhibit non-monotonic sulphur doping dependence. It is concluded that the overall thermoelectric performance of the Cu1.98SxSe1‑x system is mainly correlated with the electron effective mass and the density-of-states.
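The "thermoelectric compatibility factor" mentioned above has a standard definition, s = (√(1 + zT) − 1)/(αT), where α is the Seebeck coefficient and T the absolute temperature (Snyder's formulation). The sketch below evaluates it for made-up, Cu2Se-like numbers, not values reported in the paper:

```python
import math

def compatibility_factor(zT, seebeck_V_per_K, T_K):
    """Standard thermoelectric compatibility factor s = (sqrt(1+zT) - 1)/(alpha*T),
    returned in 1/V. Inputs here are illustrative, not from this study."""
    return (math.sqrt(1.0 + zT) - 1.0) / (seebeck_V_per_K * T_K)

# Hypothetical numbers for a Cu2Se-like material at 800 K:
s = compatibility_factor(zT=1.5, seebeck_V_per_K=200e-6, T_K=800.0)
```

Segments with similar s can be joined in a functionally graded leg without large efficiency loss, which is why a compatibility factor that is stable from 700 to 1000 K is practically useful.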

  14. Improvement of thermoelectric properties and their correlations with electron effective mass in Cu1.98SxSe1-x.

    PubMed

    Zhao, Lanling; Fei, Frank Yun; Wang, Jun; Wang, Funing; Wang, Chunlei; Li, Jichao; Wang, Jiyang; Cheng, Zhenxiang; Dou, Shixue; Wang, Xiaolin

    2017-01-16

Sulphur doping effects on the crystal structures, thermoelectric properties, density-of-states, and effective mass in Cu1.98SxSe1-x were studied based on electrical and thermal transport property measurements and first-principles calculations. The X-ray diffraction patterns and Rietveld refinements indicate that room temperature Cu1.98SxSe1-x (x = 0, 0.02, 0.08, 0.16) and Cu1.98SxSe1-x (x = 0.8, 0.9, 1.0) have the same crystal structure as monoclinic-Cu2Se and orthorhombic-Cu2S, respectively. Sulphur doping can greatly enhance zT values when x is in the range of 0.8 ≤ x ≤ 1.0. Furthermore, all doped samples show stable thermoelectric compatibility factors over a broad temperature range from 700 to 1000 K, which could greatly benefit their practical applications. First-principles calculations indicate that both the electron density-of-states and the effective mass for all the compounds exhibit non-monotonic sulphur doping dependence. It is concluded that the overall thermoelectric performance of the Cu1.98SxSe1-x system is mainly correlated with the electron effective mass and the density-of-states.

  15. Improvement of thermoelectric properties and their correlations with electron effective mass in Cu1.98SxSe1−x

    PubMed Central

    Zhao, Lanling; Fei, Frank Yun; Wang, Jun; Wang, Funing; Wang, Chunlei; Li, Jichao; Wang, Jiyang; Cheng, Zhenxiang; Dou, Shixue; Wang, Xiaolin

    2017-01-01

Sulphur doping effects on the crystal structures, thermoelectric properties, density-of-states, and effective mass in Cu1.98SxSe1−x were studied based on electrical and thermal transport property measurements and first-principles calculations. The X-ray diffraction patterns and Rietveld refinements indicate that room temperature Cu1.98SxSe1−x (x = 0, 0.02, 0.08, 0.16) and Cu1.98SxSe1−x (x = 0.8, 0.9, 1.0) have the same crystal structure as monoclinic-Cu2Se and orthorhombic-Cu2S, respectively. Sulphur doping can greatly enhance zT values when x is in the range of 0.8 ≤ x ≤ 1.0. Furthermore, all doped samples show stable thermoelectric compatibility factors over a broad temperature range from 700 to 1000 K, which could greatly benefit their practical applications. First-principles calculations indicate that both the electron density-of-states and the effective mass for all the compounds exhibit non-monotonic sulphur doping dependence. It is concluded that the overall thermoelectric performance of the Cu1.98SxSe1−x system is mainly correlated with the electron effective mass and the density-of-states. PMID:28091545

  16. The effects of surface spin on magnetic properties of weak magnetic ZnLa0.02Fe1.98O4 nanoparticles

    PubMed Central

    2014-01-01

To clearly investigate the effects of surface spin on magnetic properties, weakly magnetic ZnLa0.02Fe1.98O4 nanoparticles were chosen as the objects of study, since they help to minimize the effects of interparticle dipolar interaction and crystalline anisotropy energies. By annealing the undiluted and diluted ZnLa0.02Fe1.98O4 nanoparticles at different temperatures, we observed rich variations of magnetic ordering states (superparamagnetism, weak ferromagnetism, and paramagnetism). The magnetic properties can be well understood by considering the effects of the surface spin of the magnetic nanoparticles. Our results indicate that in nano-sized magnets with weak magnetism, the surface spin plays a crucial role in the magnetic properties. PMID:25294976

  17. Diode-pumped Tm : Sc{sub 2}SiO{sub 5} laser ({lambda} = 1.98 {mu}m)

    SciTech Connect

    Zavartsev, Yu D; Zagumennyi, A I; Kalachev, Yu L; Kutovoi, S A; Mikhailov, Viktor A; Podreshetnikov, V V; Shcherbakov, Ivan A

    2011-05-31

Lasing at a wavelength of 1.98 {mu}m is obtained for the first time in a diode-pumped ({lambda} = 792 nm) active element made of a Tm{sup 3+}: Sc{sub 2}SiO{sub 5} crystal grown by the Czochralski method. The laser slope efficiency reached 18.7% at output powers up to 520 mW. (lasers)

  18. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    PubMed

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.
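The core comparison logic above, correlating inventory scores with a life outcome and comparing predictive validity across measures, can be sketched in pure Python. All names and data below are invented for illustration; the actual study used regression models on real transcripts and conduct records:

```python
# Sketch: correlate scores from two hypothetical inventories with an
# outcome (e.g. GPA) and compare predictive validity. Data are invented.
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

gpa         = [2.8, 3.1, 3.5, 3.9, 2.5, 3.3]
big5_scores = [40, 45, 50, 58, 38, 47]   # hypothetical Big Five scale scores
big6_scores = [42, 44, 53, 60, 36, 49]   # hypothetical Big Six scale scores
r5 = pearson_r(gpa, big5_scores)
r6 = pearson_r(gpa, big6_scores)
```

Comparing |r5| and |r6| (or, better, increments in regression R²) is one simple operationalization of "which inventory best predicts the outcome".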

  19. The big deal about big data.

    PubMed

    Moore, Keith D; Eyestone, Katherine; Coddington, Dean C

    2013-08-01

    Big data is a concept that is being widely applied in the retail industries as a means to understand customers' purchasing habits and preferences for followup promotional activity. It is characterized by vast amounts of diverse and rapidly multiplying data that are available at or near real-time. Conversations with executives of leading healthcare organizations provide a barometer for understanding where the industry stands in its adoption of big data as a means to meet the critical information requirements of value-based health care.

  20. Dual of big bang and big crunch

    SciTech Connect

    Bak, Dongsu

    2007-01-15

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory.

  1. Implementing Big History.

    ERIC Educational Resources Information Center

    Welter, Mark

    2000-01-01

    Contends that world history should be taught as "Big History," a view that includes all space and time beginning with the Big Bang. Discusses five "Cardinal Questions" that serve as a course structure and address the following concepts: perspectives, diversity, change and continuity, interdependence, and causes. (CMK)

  2. Big data for health.

    PubMed

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  3. Prognostic and Predictive Value of Centrally Reviewed Ki-67 Labeling Index in Postmenopausal Women With Endocrine-Responsive Breast Cancer: Results From Breast International Group Trial 1-98 Comparing Adjuvant Tamoxifen With Letrozole

    PubMed Central

    Viale, Giuseppe; Giobbie-Hurder, Anita; Regan, Meredith M.; Coates, Alan S.; Mastropasqua, Mauro G.; Dell'Orto, Patrizia; Maiorano, Eugenio; MacGrogan, Gaëtan; Braye, Stephen G.; Öhlschlegel, Christian; Neven, Patrick; Orosz, Zsolt; Olszewski, Wojciech P.; Knox, Fiona; Thürlimann, Beat; Price, Karen N.; Castiglione-Gertsch, Monica; Gelber, Richard D.; Gusterson, Barry A.; Goldhirsch, Aron

    2008-01-01

Purpose To evaluate the prognostic and predictive value of Ki-67 labeling index (LI) in a trial comparing letrozole (Let) with tamoxifen (Tam) as adjuvant therapy in postmenopausal women with early breast cancer. Patients and Methods Breast International Group (BIG) trial 1-98 randomly assigned 8,010 patients to four treatment arms comparing Let and Tam with sequences of each agent. Of 4,922 patients randomly assigned to receive 5 years of monotherapy with either agent, 2,685 had primary tumor material available for central pathology assessment of Ki-67 LI by immunohistochemistry and had tumors confirmed to express estrogen receptors after central review. The prognostic and predictive value of centrally measured Ki-67 LI on disease-free survival (DFS) was assessed among these patients using proportional hazards modeling, with Ki-67 LI values dichotomized at the median value of 11%. Results Higher values of Ki-67 LI were associated with adverse prognostic factors and with worse DFS (hazard ratio [HR; high:low] = 1.8; 95% CI, 1.4 to 2.3). The magnitude of the treatment benefit for Let versus Tam was greater among patients with high tumor Ki-67 LI (HR [Let:Tam] = 0.53; 95% CI, 0.39 to 0.72) than among patients with low tumor Ki-67 LI (HR [Let:Tam] = 0.81; 95% CI, 0.57 to 1.15; interaction P = .09). Conclusion Ki-67 LI is confirmed as a prognostic factor in this study. High Ki-67 LI levels may identify a patient group that particularly benefits from initial Let adjuvant therapy. PMID:18981464
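The dichotomize-at-the-median step used in this analysis can be illustrated with a toy calculation. A crude incidence-rate ratio stands in for the Cox proportional-hazards HR here; the Ki-67 values, events, and follow-up times are all invented:

```python
# Toy illustration: dichotomize Ki-67 LI at the median and compare event
# rates between groups. The incidence-rate ratio is a crude stand-in for
# a Cox hazard ratio; all data are invented, not trial data.
from statistics import median

ki67   = [5, 8, 10, 12, 20, 30]           # labeling index, %
events = [0, 0, 1, 1, 1, 1]               # 1 = DFS event observed
years  = [5.0, 5.0, 4.0, 3.0, 2.0, 1.5]   # follow-up time, years

cut = median(ki67)                        # 11.0 here; the trial used 11%
hi = [i for i, k in enumerate(ki67) if k > cut]
lo = [i for i, k in enumerate(ki67) if k <= cut]

def rate(idx):
    """Events per person-year within the indexed subgroup."""
    return sum(events[i] for i in idx) / sum(years[i] for i in idx)

rate_ratio = rate(hi) / rate(lo)          # >1: higher event rate in high-Ki-67 group
```

A real Cox model additionally handles censoring and covariates; the point of the sketch is only the high/low split and the direction of the comparison.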

  4. Big data, big knowledge: big data for personalized healthcare.

    PubMed

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority.

  5. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    PubMed

    Walker, Mirella; Vetter, Thomas

    2016-04-01

General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived; notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. (PsycINFO Database Record

  6. Big Data in industry

    NASA Astrophysics Data System (ADS)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

The amount of data at the global level has grown exponentially. Along with this phenomenon comes a need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data creates a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT: cameras, satellites, cars, GPS navigation, etc.). Our challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. However, Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence as well as intelligent agents.

  7. Optimizing Clinical Research Participant Selection with Informatics.

    PubMed

    Weng, Chunhua

    2015-11-01

    Clinical research participants are often not reflective of real-world patients due to overly restrictive eligibility criteria. Meanwhile, unselected participants introduce confounding factors and reduce research efficiency. Biomedical informatics, especially Big Data increasingly made available from electronic health records, offers promising aids to optimize research participant selection through data-driven transparency.

  8. The Big Bang Theory

    ScienceCinema

    Lincoln, Don

    2016-07-12

The Big Bang is the name of the most respected theory of the creation of the universe. Basically, the theory says that the universe was once smaller and denser and has been expanding for eons. One common misconception is that the Big Bang theory says something about the instant that set the expansion into motion; however, this isn’t true. In this video, Fermilab’s Dr. Don Lincoln tells about the Big Bang theory and sketches some speculative ideas about what caused the universe to come into existence.

  9. The Big Bang Theory

    SciTech Connect

    Lincoln, Don

    2014-09-30

The Big Bang is the name of the most respected theory of the creation of the universe. Basically, the theory says that the universe was once smaller and denser and has been expanding for eons. One common misconception is that the Big Bang theory says something about the instant that set the expansion into motion; however, this isn’t true. In this video, Fermilab’s Dr. Don Lincoln tells about the Big Bang theory and sketches some speculative ideas about what caused the universe to come into existence.

  10. Thinking big thoughts

    NASA Astrophysics Data System (ADS)

    Vedral, Vlatko

    2016-08-01

    The short synopsis of The Big Picture by Sean Carroll is that it explores the question of whether science can explain everything in the world, and analyses the emerging reality that such an explanation entails.

  11. The Big Bang Singularity

    NASA Astrophysics Data System (ADS)

    Ling, Eric

The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models, which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore we compare and contrast the two geometries throughout.
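The FRW models referred to above predict a singularity through the Friedmann equation. As a standard reminder (textbook form, not a derivation specific to this abstract):

```latex
% Friedmann equation for the FRW models, with scale factor a(t),
% energy density \rho, and spatial-curvature constant k:
\left(\frac{\dot a}{a}\right)^{2} = \frac{8\pi G}{3}\,\rho - \frac{k}{a^{2}}.
% For ordinary matter, \rho \to \infty as a \to 0, so tracing the
% expansion backward, a reaches 0 (the big bang singularity) at a
% finite time in the past.
```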

  12. Big data need big theory too.

    PubMed

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'.
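The "merely fit curves to existing data" point can be made concrete with a tiny synthetic example: a straight line fit to data from a quadratic process looks acceptable in-sample but fails badly outside the training range. The generating process and numbers below are invented purely for illustration:

```python
# Sketch of curve fitting without theory: an ordinary least-squares line
# fit to quadratic data is fine in-sample but fails on extrapolation.
def linear_fit(xs, ys):
    """Closed-form OLS slope and intercept for y = slope*x + intercept."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

truth = lambda x: x * x                 # the (unknown) generating process
xs = [0, 1, 2, 3]                       # training range
slope, b = linear_fit(xs, [truth(x) for x in xs])

in_sample_err = max(abs(slope * x + b - truth(x)) for x in xs)
out_of_range_err = abs(slope * 10 + b - truth(10))   # extrapolate to x = 10
```

The fitted line tracks the training points closely, yet its error grows without bound away from them, exactly because the model does not encode the structure of the underlying process.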

  13. Big data need big theory too

    PubMed Central

    Dougherty, Edward R.; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their ‘depth’ and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote ‘blind’ big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698035

  14. Effects of Yaw on the Heat Transfer to a Blunt Cone-Cylinder Configuration at a Mach Number of 1.98

    NASA Technical Reports Server (NTRS)

    English, Roland D.

    1958-01-01

    A heat-transfer investigation has been made on a blunt cone-cylinder model at a Mach number of 1.98 at yaw angles from 0 deg to 9 deg. The results indicate that, except for the hemispherical nose, the heat-transfer coefficient increased on the windward side and decreased on the leeward side as yaw angle was increased. In general, the increase in heat transfer on the windward side was higher than the corresponding decrease on the leeward side. A comparison with theory (NACA Technical Note 4208) yielded agreement which was, in general, within 10 percent on the cone at all test conditions and on the cylinder at an angle of yaw of 0 deg.

  15. Cruise report, RV ocean alert cruise A1-98-HW; January 30 through February 23, 1998, Honolulu to Honolulu, Hawaii

    USGS Publications Warehouse

    Gardner, James V.; Hughes-Clarke, John E.

    1998-01-01

The major objective of cruise A1-98 was to map portions of the insular slopes of Oahu, Kauai, Maui, Molokai, and Hawaii and to survey in detail US Environmental Protection Agency (USEPA) ocean dumping sites using a Simrad EM300 high-resolution multibeam mapping system. The cruise was a jointly funded project between the US Army Corps of Engineers (USACOE), USEPA, and the US Geological Survey (USGS). The USACOE and USEPA are interested in these areas because of a series of ocean dump sites off Oahu, Kauai, Maui, and Hawaii (Fig. 1) that require high-resolution base maps for site monitoring purposes. The USGS Coastal and Marine Geology Program has several ongoing projects off Oahu and Maui that lack high-precision base maps for a variety of ongoing geological studies. The cruise was conducted under a Cooperative Agreement between the USGS and the Ocean Mapping Group, University of New Brunswick, Canada.

  16. Big Bend sees big environmental push

    SciTech Connect

    Blankinship, S.

    2007-10-15

The 1,800 MW Big Bend Power Station is a coal-fired facility in Tampa Bay, Florida, USA, owned by Tampa Electric. It has four pulverized-coal-fired steam units equipped with FGD scrubbers and electrostatic precipitators. Selective catalytic reduction (SCR) systems are being added: the Unit 4 SCR retrofit was completed in June 2007, and the remaining three systems are scheduled for completion by 2010. Boiler draft systems will be modified to a balanced-draft design to accommodate the increased pressure drop of the new systems. 3-D computer models were developed to assess constructability given the tight clearances at the site. 1 photo.

  17. Big data bioinformatics.

    PubMed

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

Recent technological advances allow for high-throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us into the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including "machine learning" algorithms and both "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming-based solutions, we review web servers that allow users with limited or no programming background to perform these analyses on large data compendia.
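The "supervised" versus "unsupervised" distinction the review draws can be illustrated with a tiny, stdlib-only sketch. This is not code from the review (which points readers to R packages and web servers); the data values and function names here are invented for illustration:

```python
# Toy illustration of unsupervised vs. supervised learning, stdlib only.
from statistics import mean

def kmeans_1d(xs, iters=10):
    """Unsupervised: 1-D k-means with k=2; no labels are used."""
    c1, c2 = min(xs), max(xs)  # simple initialization at the extremes
    for _ in range(iters):
        a = [x for x in xs if abs(x - c1) <= abs(x - c2)]
        b = [x for x in xs if abs(x - c1) > abs(x - c2)]
        c1, c2 = mean(a), mean(b)  # move centers to cluster means
    return sorted([c1, c2])

def nearest_centroid(train, labels, query):
    """Supervised: labeled examples define per-class centroids."""
    centroids = {lbl: mean(x for x, l in zip(train, labels) if l == lbl)
                 for lbl in set(labels)}
    return min(centroids, key=lambda lbl: abs(centroids[lbl] - query))

print(kmeans_1d([1.0, 1.2, 0.9, 5.0, 5.1, 4.8]))  # two cluster centers, near 1.03 and 4.97
print(nearest_centroid([1.0, 1.1, 5.0, 5.2], ["low", "low", "high", "high"], 4.7))
```

The design point is that both functions see the same kind of numeric data; only the supervised one is given labels, which is exactly the split the review's taxonomy rests on.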

  18. Big data in biomedicine.

    PubMed

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science.

  19. Big Questions: Missing Antimatter

    ScienceCinema

    Lincoln, Don

    2016-07-12

Einstein's equation E = mc² is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter, and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and remains one of the leading research topics of contemporary science.

  20. Big Questions: Missing Antimatter

    SciTech Connect

    Lincoln, Don

    2013-08-27

Einstein's equation E = mc² is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter, and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and remains one of the leading research topics of contemporary science.

  1. A Big Bang Lab

    ERIC Educational Resources Information Center

    Scheider, Walter

    2005-01-01

The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in a lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  2. The Big Sky inside

    ERIC Educational Resources Information Center

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  3. Big Enough for Everyone?

    ERIC Educational Resources Information Center

    Coote, Anna

    2010-01-01

    The UK's coalition government wants to build a "Big Society." The Prime Minister says "we are all in this together" and building it is the responsibility of every citizen as well as every government department. The broad vision is welcome, but everything depends on how the vision is translated into policy and practice. The…

  4. The big bang

    NASA Astrophysics Data System (ADS)

    Silk, Joseph

Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The Big Bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: photos and measurements from the Hubble Space Telescope, the Cosmic Background Explorer (COBE) satellite, and the Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; and the latest ideas about black holes, wormholes, quantum foam, and multiple universes.

  5. A Sobering Big Idea

    ERIC Educational Resources Information Center

    Wineburg, Sam

    2006-01-01

Since Susan Adler, Alberta Dougan, and Jesus Garcia like "big ideas," the author offers one to ponder: young people in this country cannot read with comprehension. The saddest thing about this crisis is that it is no secret. The 2001 results of the National Assessment of Educational Progress (NAEP) for reading, published in every major…

  6. The Big Fish

    ERIC Educational Resources Information Center

    DeLisle, Rebecca; Hargis, Jace

    2005-01-01

    The Killer Whale, Shamu jumps through hoops and splashes tourists in hopes for the big fish, not because of passion, desire or simply the enjoyment of doing so. What would happen if those fish were obsolete? Would this killer whale be able to find the passion to continue to entertain people? Or would Shamu find other exciting activities to do…

  7. Big Data and Chemical Education

    ERIC Educational Resources Information Center

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  8. Business and Science - Big Data, Big Picture

    NASA Astrophysics Data System (ADS)

    Rosati, A.

    2013-12-01

Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, they define what it means. But business is very different from science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in the business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  9. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    ERIC Educational Resources Information Center

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  10. DARPA's Big Mechanism program

    NASA Astrophysics Data System (ADS)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  11. A holographic big bang?

    NASA Astrophysics Data System (ADS)

    Afshordi, N.; Mann, R. B.; Pourhasan, R.

    2015-11-01

    We present a cosmological model in which the Universe emerges out of the collapse of a five-dimensional (5D) star as a spherical three-brane. The initial singularity of the big bang becomes hidden behind a causal horizon. Near scale-invariant primordial curvature perturbations can be induced on the brane via a thermal atmosphere that is in equilibrium with the brane, circumventing the need for a separate inflationary process and providing an important test of the model.

  12. The Next Big Idea

    PubMed Central

    2013-01-01

    Abstract George S. Eisenbarth will remain in our memories as a brilliant scientist and great collaborator. His quest to discover the cause and prevention of type 1 (autoimmune) diabetes started from building predictive models based on immunogenetic markers. Despite his tremendous contributions to our understanding of the natural history of pre-type 1 diabetes and potential mechanisms, George left us with several big questions to answer before his quest is completed. PMID:23786296

  13. DARPA's Big Mechanism program.

    PubMed

    Cohen, Paul R

    2015-07-16

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  14. Big3. Editorial

    PubMed Central

    Lehmann, Christoph U.; Séroussi, Brigitte; Jaulent, Marie-Christine

    2014-01-01

Summary Objectives To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. Methods A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. Results ‘Big Data’ has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that ‘Big Data’ will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics – some to a higher degree than others. It was our goal to provide a comprehensive view of the state of ‘Big Data’ today, explore its strengths, weaknesses, and risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions to the topic. Conclusions For the first time in its history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016. PMID:24853037

  15. Big Sky Carbon Atlas

    DOE Data Explorer

The Big Sky Carbon Atlas is an online geoportal designed for you to discover, interpret, and access geospatial data and maps relevant to decision support and education on carbon sequestration in the Big Sky Region. In serving as the public face of the Partnership's spatial Data Libraries, the Atlas provides a gateway to geographic information characterizing CO2 sources, potential geologic sinks, terrestrial carbon fluxes, civil and energy infrastructure, energy use, and related themes. In addition to directly serving the BSCSP and its stakeholders, the Atlas feeds regional data to the NatCarb Portal, contributing to a national perspective on carbon sequestration. Established components of the Atlas include a gallery of thematic maps and an interactive map that allows you to:
• Navigate and explore regional characterization data through a user-friendly interface
• Print your map views or publish them as PDFs
• Identify technical references relevant to specific areas of interest
• Calculate straight-line or pipeline-constrained distances from point sources of CO2 to potential geologic sink features
• Download regional data layers (feature under development)
(Acknowledgment to the Big Sky Carbon Sequestration Partnership (BSCSP); see home page at http://www.bigskyco2.org/)

  16. Disaggregating asthma: Big investigation versus big data.

    PubMed

    Belgrave, Danielle; Henderson, John; Simpson, Angela; Buchan, Iain; Bishop, Christopher; Custovic, Adnan

    2017-02-01

    We are facing a major challenge in bridging the gap between identifying subtypes of asthma to understand causal mechanisms and translating this knowledge into personalized prevention and management strategies. In recent years, "big data" has been sold as a panacea for generating hypotheses and driving new frontiers of health care; the idea that the data must and will speak for themselves is fast becoming a new dogma. One of the dangers of ready accessibility of health care data and computational tools for data analysis is that the process of data mining can become uncoupled from the scientific process of clinical interpretation, understanding the provenance of the data, and external validation. Although advances in computational methods can be valuable for using unexpected structure in data to generate hypotheses, there remains a need for testing hypotheses and interpreting results with scientific rigor. We argue for combining data- and hypothesis-driven methods in a careful synergy, and the importance of carefully characterized birth and patient cohorts with genetic, phenotypic, biological, and molecular data in this process cannot be overemphasized. The main challenge on the road ahead is to harness bigger health care data in ways that produce meaningful clinical interpretation and to translate this into better diagnoses and properly personalized prevention and treatment plans. There is a pressing need for cross-disciplinary research with an integrative approach to data science, whereby basic scientists, clinicians, data analysts, and epidemiologists work together to understand the heterogeneity of asthma.

  17. HARNESSING BIG DATA FOR PRECISION MEDICINE: INFRASTRUCTURES AND APPLICATIONS.

    PubMed

    Yu, Kun-Hsing; Hart, Steven N; Goldfeder, Rachel; Zhang, Qiangfeng Cliff; Parker, Stephen C J; Snyder, Michael

    2016-01-01

Precision medicine is a health management approach that accounts for individual differences in genetic backgrounds and environmental exposures. With the recent advancement of high-throughput omics profiling technologies, the collection of large study cohorts, and the development of data mining algorithms, big data in biomedicine is expected to provide novel insights into health and disease states, which can be translated into personalized disease prevention and treatment plans. However, the petabytes of biomedical data generated by multiple measurement modalities pose significant challenges for data analysis, integration, storage, and result interpretation. In addition, patient privacy preservation, coordination between participating medical centers and data analysis working groups, and discrepancies in data sharing policies remain important topics of discussion. In this workshop, we invite experts in omics integration, biobank research, and data management to share their perspectives on leveraging big data to enable precision medicine. Workshop website: http://tinyurl.com/PSB17BigData; HashTag: #PSB17BigData.

  18. Age and Gender Differences in Motivational Manifestations of the Big Five from Age 16 to 60

    ERIC Educational Resources Information Center

    Lehmann, Regula; Denissen, Jaap J. A.; Allemand, Mathias; Penke, Lars

    2013-01-01

    The present cross-sectional study investigated age and gender differences in motivational manifestations of the Big Five in a large German-speaking Internet sample (N = 19,022). Participants ranging in age from 16 to 60 years completed the Five Individual Reaction Norms Inventory (FIRNI; Denissen & Penke, 2008a), and two traditional Big Five…

  19. How Big is Earth?

    NASA Astrophysics Data System (ADS)

    Thurber, Bonnie B.

    2015-08-01

How Big is Earth? celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth, as Eratosthenes did over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth? provides an online learning environment where students do science the same way Eratosthenes did. A notable precedent is The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big is Earth? expands on that project through the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information, discuss their ideas, and brainstorm solutions in a discussion forum. There is an ongoing database of student measurements and another database collecting data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
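Eratosthenes' calculation that the record describes reduces to a single proportion. A minimal Python sketch, assuming the classical textbook values (a roughly 7.2-degree noon shadow and roughly 800 km between sites), neither of which comes from this program's data:

```python
def circumference_from_shadow(shadow_angle_deg, distance_km):
    """Eratosthenes' proportion: the noon shadow angle at one site (with the
    Sun directly overhead at the other) equals the central angle between the
    two sites, so circumference = distance * 360 / shadow_angle."""
    return distance_km * 360.0 / shadow_angle_deg

# Illustrative classical values: ~7.2 degree shadow at Alexandria,
# ~800 km from Syene.
print(circumference_from_shadow(7.2, 800))  # roughly 40,000 km
```

In the classroom exercise, students at two collaborating schools substitute their own measured shadow angles and the distance between their sites into the same proportion.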

  20. "Big Events" and Networks.

    PubMed

    Friedman, Samuel; Rossi, Diana; Flom, Peter L

    2006-01-01

Some, but not all, "big events" such as wars, revolutions, socioeconomic transitions, economic collapses, and ecological disasters in recent years seem to lead to large-scale HIV outbreaks (Friedman et al., in press; Hankins et al., 2002). This was true of transitions in the USSR, South Africa, and Indonesia, for example, but not of those in the Philippines or (so far) in Argentina. It has been hypothesized that whether or not HIV outbreaks occur is shaped in part by the nature and extent of changes in the numbers of voluntary or involuntary risk-takers, which may itself be related to the growth of roles such as sex sellers or drug sellers; the riskiness of the behaviors engaged in by risk-takers; and changes in sexual and injection networks and other "mixing patterns" variables. Each of these potential causal processes, in turn, is shaped by the nature of pre-existing social networks, by the patterns and content of normative regulation and communication within those networks, and by how those networks and their characteristics are changed by the "big event" in question. We will present ideas about what research is needed to help understand these events, to help guide indigenous community-based efforts to prevent HIV outbreaks, and to guide those who organize external intervention efforts and aid.

  1. The Big Read: Case Studies

    ERIC Educational Resources Information Center

    National Endowment for the Arts, 2009

    2009-01-01

    The Big Read evaluation included a series of 35 case studies designed to gather more in-depth information on the program's implementation and impact. The case studies gave readers a valuable first-hand look at The Big Read in context. Both formal and informal interviews, focus groups, attendance at a wide range of events--all showed how…

  2. The Rise of Big Data in Neurorehabilitation.

    PubMed

    Faroqi-Shah, Yasmeen

    2016-02-01

    In some fields, Big Data has been instrumental in analyzing, predicting, and influencing human behavior. However, Big Data approaches have so far been less central in speech-language pathology. This article introduces the concept of Big Data and provides examples of Big Data initiatives pertaining to adult neurorehabilitation. It also discusses the potential theoretical and clinical contributions that Big Data can make. The article also recognizes some impediments in building and using Big Data for scientific and clinical inquiry.

  3. Earth Science Big Data Activities at Research Data Alliance

    NASA Astrophysics Data System (ADS)

    Kuo, Kwo-Sen; Baumann, Peter; Evans, Ben; Riedel, Morris

    2016-04-01

In this presentation we introduce Earth-science-related activities of the Big Data Interest Group (BDIG) in the Research Data Alliance (RDA). "RDA is an international organization focused on the development of infrastructure and community activities that reduce barriers to data sharing and exchange, and the acceleration of data driven innovation worldwide." The participation of researchers in RDA is voluntary. As the name implies, an Interest Group is a collection of participants sharing the same interest, and the BDIG seeks to address community needs on all things having to do with Big Data. The ultimate goal of the RDA Big Data Interest Group is to produce a set of recommendation documents to advise diverse research communities with respect to:
• How to select an appropriate Big Data solution for a particular science application to realize optimal value?
• What are the best practices in dealing with various data and computing issues associated with such a solution?
The primary means of reaching such recommendations is through the establishment and work of Working Groups, each of which focuses on a specific issue. Although the BDIG is not specific to Earth science, its recent activities revolve mostly around it. We introduce some of these activities, which are designed to advance our knowledge and to characterize Big Data in Earth science.

  4. Beware Participation

    ERIC Educational Resources Information Center

    Jones, Arfon

    1978-01-01

    In 1972 Sidney Stringer Community School and College was established in the inner city of Coventry. Its aims directed attention to community participation and the enlargement of the decision making process. Discusses the problems with delegating educational responsibility to the community. (Author/RK)

  5. BIG DATA AND STATISTICS

    PubMed Central

    Rossell, David

    2016-01-01

    Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies. PMID:27722040

  6. Big cat genomics.

    PubMed

    O'Brien, Stephen J; Johnson, Warren E

    2005-01-01

    Advances in population and quantitative genomics, aided by the computational algorithms that employ genetic theory and practice, are now being applied to biological questions that surround free-ranging species not traditionally suitable for genetic enquiry. Here we review how applications of molecular genetic tools have been used to describe the natural history, present status, and future disposition of wild cat species. Insight into phylogenetic hierarchy, demographic contractions, geographic population substructure, behavioral ecology, and infectious diseases have revealed strategies for survival and adaptation of these fascinating predators. Conservation, stabilization, and management of the big cats are important areas that derive benefit from the genome resources expanded and applied to highly successful species, imperiled by an expanding human population.

  7. The Last Big Bang

    SciTech Connect

    McGuire, Austin D.; Meade, Roger Allen

    2016-09-13

    As one of the very few people in the world to give the “go/no go” decision to detonate a nuclear device, Austin “Mac” McGuire holds a very special place in the history of both the Los Alamos National Laboratory and the world. As Commander of Joint Task Force Unit 8.1.1, on Christmas Island in the spring and summer of 1962, Mac directed the Los Alamos data collection efforts for twelve of the last atmospheric nuclear detonations conducted by the United States. Since data collection was at the heart of nuclear weapon testing, it fell to Mac to make the ultimate decision to detonate each test device. He calls his experience THE LAST BIG BANG, since these tests, part of Operation Dominic, were characterized by the dramatic displays of the heat, light, and sounds unique to atmospheric nuclear detonations – never, perhaps, to be witnessed again.

  8. Big bang and big crunch in matrix string theory

    SciTech Connect

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-04-15

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe.

  9. Participative Design for Participative Democracy.

    ERIC Educational Resources Information Center

    Emery, Merrelyn, Ed.

    This four-part volume addresses design principles for introducing democratic forms in workplaces, educational institutions, and social institutions, based on a trend toward participative democracy in Australia. Following an introduction, part I sets the context with two papers: "The Agenda for the Next Wave" and "Educational…

  10. Big Bang of Massenergy and Negative Big Bang of Spacetime

    NASA Astrophysics Data System (ADS)

    Cao, Dayong

    2017-01-01

There is a balance between the Big Bang of massenergy and a negative Big Bang of spacetime in the universe. Some scientists have also considered an anti-Big Bang that could produce the antimatter. The paper supposes a structural balance between the Einstein field equation and a negative Einstein field equation: a balance between massenergy structure and spacetime structure, a balance between the energy of the nucleus of stellar matter and the dark energy of the nucleus of dark matter-dark energy, and a balance between the particle and the wave, that is, a balance system between massenergy (particle) and spacetime (wave). This may explain some of the problems of the Big Bang. http://meetings.aps.org/Meeting/APR16/Session/M13.8

  11. Acquisition Reform: Three Big Ideas

    DTIC Science & Technology

    2015-05-19

[Fragmentary extracted slide text] Three Acquisition Reform Big Ideas (5/19/2015): (1) Competing Capability Needs Among Services...to sponsor and users; not required to be an acquisition expert; tenure not as important. Summary, 3 Big Ideas: (1) Competing... Acquisition Reform: Three Big Ideas. The provocative views expressed here are not those of the Department of Defense, DAU, or perhaps even the…

  12. Big Data and Ambulatory Care

    PubMed Central

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2015-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an overview of the legal framework governing health information, dispels misconceptions about privacy regulations, and highlights how ambulatory care providers in particular can maximize the utility of big data to improve care. PMID:25401945

  13. The challenges of big data

    PubMed Central

    2016-01-01

    ABSTRACT The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  14. Homogeneous and isotropic big rips?

    SciTech Connect

    Giovannini, Massimo

    2005-10-15

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behavior is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  15. Big Data and Perioperative Nursing.

    PubMed

    Westra, Bonnie L; Peterson, Jessica J

    2016-10-01

Big data are large volumes of digital data that can be collected from disparate sources and are challenging to analyze. These data are often described with the five "Vs": volume, velocity, variety, veracity, and value. Perioperative nurses contribute to big data through documentation in the electronic health record during routine surgical care, and these data have implications for clinical decision making, administrative decisions, quality improvement, and big data science. This article explores methods to improve the quality of perioperative nursing data and provides examples of how these data can be combined with broader nursing data for quality improvement. We also discuss a national action plan for nursing knowledge and big data science and how perioperative nurses can engage in collaborative actions to transform health care. Standardized perioperative nursing data have the potential to affect care far beyond the original patient.

  16. The BigBOSS Experiment

    SciTech Connect

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter? What are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  17. Speciation of uranium and doping induced defects in Gd1.98U0.02Zr2O7: Photoluminescence, X-ray photoelectron and positron annihilation lifetime spectroscopy

    NASA Astrophysics Data System (ADS)

    Gupta, Santosh K.; Reghukumar, C.; Pathak, Nimai; Sudarshan, K.; Tyagi, D.; Mohapatra, M.; Pujari, P. K.; Kadam, R. M.

    2017-02-01

Based on photoluminescence spectroscopy, it was inferred that uranium stabilizes as both U(IV) and U(VI) in Gd2Zr2O7, which was also corroborated by X-ray photoelectron spectroscopy (XPS). The absence of an equidistant vibronic structure in the emission spectrum of Gd1.98U0.02Zr2O7 confirmed that U(VI) stabilizes in the form of UO6^6-. Based on luminescence lifetimes, it was inferred that the majority of UO6^6- stabilizes at both Gd3+ and Zr4+ sites, whereas U4+ stabilizes only at Zr4+ sites. The positron lifetime does not change on uranium doping, indicating the formation of antisite defects. In fact, it is this antisite defect in Gd1.98U0.02Zr2O7 that favours the stabilization of its fluorite phase.

  18. Big Spherules near 'Victoria'

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This frame from the microscopic imager on NASA's Mars Exploration Rover Opportunity shows spherules up to about 5 millimeters (one-fifth of an inch) in diameter. The camera took this image during the 924th Martian day, or sol, of Opportunity's Mars-surface mission (Aug. 30, 2006), when the rover was about 200 meters (650 feet) north of 'Victoria Crater.'

    Opportunity discovered spherules like these, nicknamed 'blueberries,' at its landing site in 'Eagle Crater,' and investigations determined them to be iron-rich concretions that formed inside deposits soaked with groundwater. However, such concretions were much smaller or absent at the ground surface along much of the rover's trek of more than 5 kilometers (3 miles) southward to Victoria. The big ones showed up again when Opportunity got to the ring, or annulus, of material excavated and thrown outward by the impact that created Victoria Crater. Researchers hypothesize that some layer beneath the surface in Victoria's vicinity was once soaked with water long enough to form the concretions, that the crater-forming impact dispersed some material from that layer, and that Opportunity might encounter that layer in place if the rover drives down into the crater.

  19. Challenges of Big Data Analysis.

    PubMed

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and of how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions.
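The spurious-correlation phenomenon named in this abstract is easy to see in a small simulation. The sketch below is illustrative only (not code from the paper): a response and an increasing number of features are all drawn mutually independent, yet the largest sample correlation with the response grows with the dimension.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50  # sample size, deliberately small relative to dimension

def max_abs_correlation(p):
    """Max |sample correlation| between y and p features,
    where y and all features are independent N(0, 1)."""
    y = rng.standard_normal(n)
    X = rng.standard_normal((n, p))
    # standardize, then correlations are inner products / n
    yc = (y - y.mean()) / y.std()
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    corrs = yc @ Xc / n
    return np.abs(corrs).max()

for p in (10, 1000, 100000):
    print(p, round(max_abs_correlation(p), 2))
```

Although every feature is pure noise, classical theory suggests the maximum spurious correlation scales roughly like sqrt(2 log p / n), rising from about 0.3 at p = 10 to about 0.7 at p = 100,000 for n = 50, which is why feature screening on raw correlations becomes unreliable in high dimensions.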

  20. Challenges of Big Data Analysis

    PubMed Central

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-01-01

Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and of how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and consequently wrong scientific conclusions. PMID:25419469

  1. Powering Big Data for Nursing Through Partnership.

    PubMed

    Harper, Ellen M; Parkerson, Sara

    2015-01-01

    The Big Data Principles Workgroup (Workgroup) was established with support of the Healthcare Information and Management Systems Society. Building on the Triple Aim challenge, the Workgroup sought to identify Big Data principles, barriers, and challenges to nurse-sensitive data inclusion into Big Data sets. The product of this pioneering partnership Workgroup was the "Guiding Principles for Big Data in Nursing-Using Big Data to Improve the Quality of Care and Outcomes."

  2. Considerations on Geospatial Big Data

    NASA Astrophysics Data System (ADS)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’, and that it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future, and some challenges and open questions are raised for future discussion.

  3. GEOSS: Addressing Big Data Challenges

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  4. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have agreed on a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  5. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  6. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Navidi, Fatemeh; Beard, Daniel A.; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  7. Big Data: Astronomical or Genomical?

    PubMed

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.
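The projection in this record is, at its core, compound-growth arithmetic. A minimal sketch follows; the base figure and doubling time are illustrative assumptions, not numbers taken from the paper.

```python
def project(initial_pb, doubling_time_months, months):
    """Cumulative data volume after `months`,
    under exponential growth with the given doubling time."""
    return initial_pb * 2 ** (months / doubling_time_months)

# Hypothetical: 1 PB today, doubling every 12 months, projected 10 years out.
print(project(1.0, 12, 120))  # 1024.0 PB, i.e. about 1 exabyte
```

Because the growth is exponential, the answer is dominated by the assumed doubling time rather than the starting volume, which is why the paper's comparison across domains hinges on growth rates.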

  8. Multiwavelength astronomy and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.

    2016-09-01

Two major characteristics of modern astronomy are multiwavelength (MW) studies (from γ-ray to radio) and big data (data acquisition, storage and analysis). Present astronomical databases and archives contain billions of objects observed at various wavelengths, both galactic and extragalactic, and the vast amount of data on them allows new studies and discoveries. Astronomers deal with big numbers. Surveys are the main source for discovery of astronomical objects and accumulation of observational data for further analysis, interpretation, and achieving scientific results. We review the main characteristics of astronomical surveys, compare the photographic and digital eras of astronomical studies (including the development of wide-field observations), describe the present state of MW surveys, and discuss Big Data in astronomy and related topics of Virtual Observatories and Computational Astrophysics. The review includes many numbers and data that can be compared to gain an overall understanding of the Universe, cosmic numbers, and their relationship to modern computational facilities.

  9. Big Data: Astronomical or Genomical?

    PubMed Central

    Stephens, Zachary D.; Lee, Skylar Y.; Faghri, Faraz; Campbell, Roy H.; Zhai, Chengxiang; Efron, Miles J.; Iyer, Ravishankar; Schatz, Michael C.; Sinha, Saurabh; Robinson, Gene E.

    2015-01-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a “four-headed beast”—it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the “genomical” challenges of the next decade. PMID:26151137

  10. [Big Data- challenges and risks].

    PubMed

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by the connection, processing and analysis of these information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data, and present examples from health and other areas. However, there are several preconditions of the effective use of the opportunities: proper infrastructure, well defined regulatory environment with particular emphasis on data protection and privacy. These issues and the current actions for solution are also presented.

  11. Sr{sub 1.98}Eu{sub 0.02}SiO{sub 4} luminescence whisker based on vapor-phase deposition: Facile synthesis, uniform morphology and enhanced luminescence properties

    SciTech Connect

    Xu, Jian; Hassan, Dhia A.; Zeng, Renjie; Peng, Dongliang

    2015-11-15

Highlights: • For the first time, it is possible to obtain Sr{sub 1.98}Eu{sub 0.02}SiO{sub 4} whiskers. • The whiskers are smooth and uniform with an L/D ratio over 50. • Durability and thermal stability of the whiskers are enhanced. - Abstract: A high-performance strontium silicate phosphor has been successfully synthesized through a facile vapor-phase deposition method. The product consists of single-crystal whiskers which are smooth and uniform, with a sectional equivalent diameter of around 5 μm; the aspect ratio is over 50 and no agglomeration can be observed. X-ray diffraction results confirmed that the crystal structure of the whiskers was α’-Sr{sub 2}SiO{sub 4}. The exact chemical composition, Sr{sub 1.98}Eu{sub 0.02}SiO{sub 4}, was determined by energy dispersive spectrometry and inductively coupled plasma-mass spectrometry. The whiskers show broad green emission peaking at 523 nm and ranging from 470 to 600 nm (excited at 370 nm). Compared with traditional Sr{sub 2}SiO{sub 4}:Eu phosphor, the durability (at 85% humidity and 85 °C) and thermal stability of the whiskers are markedly improved. Moreover, the growth mechanism of the Sr{sub 1.98}Eu{sub 0.02}SiO{sub 4} whiskers is vapor-liquid-solid. On a macro scale, the product is still a powder, which makes it suitable for the current packaging process of WLEDs.

  12. Circadian preference and the big five: the role of impulsivity and sensation seeking.

    PubMed

    Russo, Paolo Maria; Leone, Luigi; Penolazzi, Barbara; Natale, Vincenzo

    2012-10-01

    In the present study, the relationship between personality dimensions and Circadian Preference was evaluated using a structural equation modeling approach. Participants (N=390; 53.8% female, mean age: 26.8 ± 8.1 yrs) completed measures of Circadian Preference, Impulsivity, Sensation Seeking, and the Big Five factors. A mediation structural equation model assessed the direct and indirect effects of the Big Five factors on Circadian Preference. The results showed that Impulsivity and Sensation Seeking were significantly associated with Eveningness, whereas no significant direct effects of the Big Five traits were detected once the effects of Impulsivity and Sensation Seeking were taken into account.
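The mediation logic behind this design can be sketched with two ordinary regressions on synthetic data. This is an illustrative toy, not the study's structural equation model or its data: if Impulsivity fully mediates a trait's effect on Eveningness, the trait's direct effect should vanish once the mediator is controlled for.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 390  # same sample size as the study, data entirely synthetic

# X: a hypothetical Big Five trait; M: mediator (e.g. Impulsivity);
# Y: outcome (e.g. Eveningness). Y depends on X only *through* M.
X = rng.standard_normal(n)
M = 0.6 * X + rng.standard_normal(n)
Y = 0.7 * M + 0.0 * X + rng.standard_normal(n)

def slope(y, x):
    """OLS slope of y on x (with intercept)."""
    A = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(A, y, rcond=None)[0][1]

total = slope(Y, X)  # total effect of X on Y (expected ~ 0.6 * 0.7)
# direct effect of X after controlling for the mediator M:
A = np.column_stack([np.ones(n), X, M])
direct = np.linalg.lstsq(A, Y, rcond=None)[0][1]
print(round(total, 2), round(direct, 2))
```

In this full-mediation setup the total effect is clearly nonzero while the direct effect is statistically indistinguishable from zero, mirroring the study's finding that the Big Five effects disappear once Impulsivity and Sensation Seeking are accounted for.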

  13. Internet-based brain training games, citizen scientists, and big data: ethical issues in unprecedented virtual territories.

    PubMed

    Purcell, Ryan H; Rommelfanger, Karen S

    2015-04-22

    Internet brain training programs, where consumers serve as both subjects and funders of the research, represent the closest engagement many individuals have with neuroscience. Safeguards are needed to protect participants' privacy and the evolving scientific enterprise of big data.

  14. The "big win" and resistance to extinction when gambling.

    PubMed

    Weatherly, Jeffrey N; Sauter, John M; King, Brent M

    2004-11-01

    One hypothesis for the reason a person might become a pathological gambler is that the individual initially experiences a big win, which creates a fallacious expectation of winning, which may then lead to persistent gambling despite suffering large losses. Although this hypothesis has been around for several decades, only one controlled empirical study has addressed it, and that study reported null results. In the present experiment, the authors tested the "big win" hypothesis by having 4 groups of participants with little to no experience gambling play a computer-simulated slot machine for credits that were exchangeable for cash. One group experienced a large win on the very 1st play. Another experienced a large win on the 5th play. A 3rd group experienced 2 small wins on the 2nd and 5th plays. No other winning outcomes were programmed. The 4th group never experienced a win. The authors observed a significant effect of group. Participants who experienced a large win on the 1st play quit playing the simulation earlier than participants who experienced a large win on the 5th play. These results appear to question the "big win" as an explanation for pathological gambling. They are more consistent with a behavioral theory of gambling behavior. The present study should also promote the use of laboratory-based research to test long-standing hypotheses in the gambling literature.

  15. True Randomness from Big Data

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as those in astronomy and genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
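As background for the extraction problem this record studies, the classical von Neumann trick shows the basic idea of randomness extraction in a few lines. This is a textbook illustration, not the construction from the paper: pairs of independent but possibly biased bits are mapped to unbiased output bits.

```python
def von_neumann_extract(bits):
    """Von Neumann extractor: given independent but possibly biased
    coin flips, map 01 -> 0 and 10 -> 1, discarding 00/11 pairs.
    For any fixed bias p, P(01) = P(10), so outputs are unbiased."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

# Biased sample stream (mostly 1s):
print(von_neumann_extract([1, 1, 1, 0, 0, 1, 1, 1, 1, 0]))  # -> [1, 0, 1]
```

The catch is that this trick needs independent flips; the big sources in the paper need a more general method precisely because such independence assumptions about the samples fail.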

  16. Big Opportunities in Small Science

    ERIC Educational Resources Information Center

    Dewey, T. Gregory

    2007-01-01

    A transformation is occurring that will have a major impact on how academic science is done and how scientists are trained. That transformation--driven by declining federal funds, as well as by the rising cost of technology and the need for costly, labor-intensive interdisciplinary approaches--is from small science to big science. It is…

  17. Big6 Turbotools and Synthesis

    ERIC Educational Resources Information Center

    Tooley, Melinda

    2005-01-01

The different tools that are helpful during the Synthesis stage, their role in boosting students' abilities in Synthesis, and the ways in which they can be customized to meet the needs of each group of students are discussed. Big6 TurboTools offers several tools to help complete the task. In the Synthesis stage, these same tools along with Turbo Report and…

  18. Big Explosives Experimental Facility - BEEF

    SciTech Connect

    2014-10-31

    The Big Explosives Experimental Facility or BEEF is a ten acre fenced high explosive testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF conventional high explosives experiments are safely conducted providing sophisticated diagnostics such as high speed optics and x-ray radiography.

  19. Big Explosives Experimental Facility - BEEF

    ScienceCinema

    None

    2016-07-12

    The Big Explosives Experimental Facility or BEEF is a ten acre fenced high explosive testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF conventional high explosives experiments are safely conducted providing sophisticated diagnostics such as high speed optics and x-ray radiography.

  20. Chinchilla "big" and "little" gastrins.

    PubMed

    Shinomura, Y; Eng, J; Yalow, R S

    1987-02-27

    Gastrin heptadecapeptides (gastrins I and II which differ in the presence of sulfate on the tyrosine of the latter) have been purified and sequenced from several mammalian species including pig, dog, cat, sheep, cow, human and rat. A 34 amino acid precursor ("big" gastrin), generally accounting for only 5% of total gastrin immunoreactivity, has been purified and sequenced only from the pig, human, dog and goat. Recently we have demonstrated that guinea pig (GP) "little" gastrin is a hexadecapeptide due to a deletion of a glutamic acid in the region 6-9 from its NH2-terminus and that GP "big" gastrin is a 33 amino acid peptide. The chinchilla, like the GP, is a New World hystricomorph. This report describes the extraction and purification of "little" and "big" gastrins from 31 chinchilla antra. Chinchilla "little" gastrin is a hexadecapeptide with a sequence identical to that of the GP and its "big" gastrin is a 33 amino acid peptide with the following sequence: (See text)

  1. 1976 Big Thompson flood, Colorado

    USGS Publications Warehouse

    Jarrett, R. D.; Vandas, S.J.

    2006-01-01

    In the early evening of July 31, 1976, a large stationary thunderstorm released as much as 7.5 inches of rainfall in about an hour (about 12 inches in a few hours) in the upper reaches of the Big Thompson River drainage. This large amount of rainfall in such a short period of time produced a flash flood that caught residents and tourists by surprise. The immense volume of water that churned down the narrow Big Thompson Canyon scoured the river channel and destroyed everything in its path, including 418 homes, 52 businesses, numerous bridges, paved and unpaved roads, power and telephone lines, and many other structures. The tragedy claimed the lives of 144 people. Scores of other people narrowly escaped with their lives. The Big Thompson flood ranks among the deadliest of Colorado's recorded floods. It is one of several destructive floods in the United States that has shown the necessity of conducting research to determine the causes and effects of floods. The U.S. Geological Survey (USGS) conducts research and operates a Nationwide streamgage network to help understand and predict the magnitude and likelihood of large streamflow events such as the Big Thompson Flood. Such research and streamgage information are part of an ongoing USGS effort to reduce flood hazards and to increase public awareness.

  2. The Case for "Big History."

    ERIC Educational Resources Information Center

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  3. The International Big History Association

    ERIC Educational Resources Information Center

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big…

  4. China: Big Changes Coming Soon

    ERIC Educational Resources Information Center

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  5. True Randomness from Big Data

    PubMed Central

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as those in astronomy and genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle very large sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514
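
    The paper's extractor itself is not reproduced in the abstract, but the basic idea of turning a biased recorded source into unbiased bits can be illustrated with the classical von Neumann extractor. Note that this baseline assumes independent, identically distributed input bits, exactly the kind of statistical assumption the paper says its method avoids; the sketch below is illustrative only, and all names in it are made up for the example.

```python
import random

def von_neumann_extract(bits):
    """Map bit pairs 01 -> 0 and 10 -> 1; discard 00 and 11.
    The output is unbiased when the input bits are i.i.d.,
    regardless of how biased the individual bits are."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

# A heavily biased source: each bit is 1 with probability 0.8.
rng = random.Random(42)
biased = [1 if rng.random() < 0.8 else 0 for _ in range(100_000)]
extracted = von_neumann_extract(biased)

# The extracted stream behaves like a fair coin.
frac_ones = sum(extracted) / len(extracted)
```

    The price is rate: only pairs that disagree (probability 2p(1-p) per pair) yield an output bit, which is one reason practical extractors for gigabyte-scale sources need different constructions.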

  6. True Randomness from Big Data.

    PubMed

    Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang

    2016-09-26

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as those in astronomy and genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle very large sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  7. The BigBoss Experiment

    SciTech Connect

    Schlegel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervantes-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = λ/Δλ = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k_max = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k_max = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.
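
    The quoted resolving power R = λ/Δλ directly fixes the smallest wavelength interval the spectrograph can separate. A quick sanity check at the ends of the BigBOSS band (function name and values below are illustrative, derived only from the numbers in the abstract):

```python
def resolution_element(wavelength_nm, R):
    """Smallest resolvable wavelength interval for resolving power R = lambda/dlambda."""
    return wavelength_nm / R

# At the blue and red limits of the 340-1060 nm range, with R = 3000-4800:
blue = resolution_element(340, 3000)   # roughly 0.11 nm
red = resolution_element(1060, 4800)   # roughly 0.22 nm
```

    So the instrument resolves features of order a tenth of a nanometer across its band, which is what makes redshift measurement from narrow emission lines like [OII] feasible.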

  8. A SWOT Analysis of Big Data

    ERIC Educational Resources Information Center

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  9. Big Data: Implications for Health System Pharmacy.

    PubMed

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  10. Big sagebrush seed bank densities following wildfires

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Big sagebrush (Artemisia spp.) is a critical shrub to many wildlife species including sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush is killed by wildfires and big sagebrush seed is generally short-lived and do not s...

  11. Big Sagebrush Seed Bank Densities Following Wildfires

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Big sagebrush (Artemisia sp.) is a critical shrub to such sagebrush obligate species as sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush does not sprout after wildfires, and big sagebrush seed is generally sho...

  12. Judging Big Deals: Challenges, Outcomes, and Advice

    ERIC Educational Resources Information Center

    Glasser, Sarah

    2013-01-01

    This article reports the results of an analysis of five Big Deal electronic journal packages to which Hofstra University's Axinn Library subscribes. COUNTER usage reports were used to judge the value of each Big Deal. Limitations of usage statistics are also discussed. In the end, the author concludes that four of the five Big Deals are good deals…

  13. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  14. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    ScienceCinema

    None

    2016-07-12

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe as it is now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.

  15. Antigravity and the big crunch/big bang transition

    NASA Astrophysics Data System (ADS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  16. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    SciTech Connect

    2009-10-13

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe as it is now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.

  17. Solution of a braneworld big crunch/big bang cosmology

    SciTech Connect

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-11-15

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)². At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  18. Big data and ophthalmic research.

    PubMed

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research, including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research.

  19. Statistical Inference: The Big Picture.

    PubMed

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mischaracterize the process of statistical inference and I propose an alternative "big picture" depiction.

  20. District Bets Big on Standards

    ERIC Educational Resources Information Center

    Gewertz, Catherine

    2013-01-01

    The big clock in Dowan McNair-Lee's 8th grade classroom in the Stuart-Hobson Middle School is silent, but she can hear the minutes ticking away nonetheless. On this day, like any other, the clock is a constant reminder of how little time she has to prepare her students--for spring tests, and for high school and all that lies beyond it. The…

  1. Genesis of the big bang

    NASA Astrophysics Data System (ADS)

    Alpher, Ralph A.; Herman, Robert

    The authors of this volume have been intimately connected with the conception of the big bang model since 1947. Following the late George Gamow's ideas in 1942 and more particularly in 1946 that the early universe was an appropriate site for the synthesis of the elements, they became deeply involved in the question of cosmic nucleosynthesis and particularly the synthesis of the light elements. In the course of this work they developed a general relativistic model of the expanding universe with physics folded in, which led in a progressive, logical sequence to their prediction of the existence of a present cosmic background radiation some seventeen years before the observation of such radiation was reported by Penzias and Wilson. In addition, they carried out with James W. Follin, Jr., a detailed study of the physics of what was then considered to be the very early universe, starting a few seconds after the big bang, which still provides a methodology for studies of light element nucleosynthesis. Because of their involvement, they bring a personal perspective to the subject. They present a picture of what is now believed to be the state of knowledge about the evolution of the expanding universe and delineate the story of the development of the big bang model as they have seen and lived it from their own unique vantage point.

  2. Big data: the management revolution.

    PubMed

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  3. EHR Big Data Deep Phenotyping

    PubMed Central

    Lenert, L.; Lopez-Campos, G.

    2014-01-01

    Summary Objectives Given the quickening speed of discovery of variant disease drivers from combined patient genotype and phenotype data, the objective is to provide methodology using big data technology to support the definition of deep phenotypes in medical records. Methods As the vast stores of genomic information increase with next generation sequencing, the importance of deep phenotyping increases. The growth of genomic data and adoption of Electronic Health Records (EHR) in medicine provides a unique opportunity to integrate phenotype and genotype data into medical records. The method by which collections of clinical findings and other health related data are leveraged to form meaningful phenotypes is an active area of research. Longitudinal data stored in EHRs provide a wealth of information that can be used to construct phenotypes of patients. We focus on a practical problem around data integration for deep phenotype identification within EHR data. Big data approaches are described that enable scalable markup of EHR events that can be used for semantic and temporal similarity analysis to support the identification of phenotype and genotype relationships. Conclusions Stead and colleagues’ 2005 concept of using light standards to increase the productivity of software systems by riding on the wave of hardware/processing power is described as a harbinger for designing future healthcare systems. The big data solution, using flexible markup, provides a route to improved utilization of processing power for organizing patient records in genotype and phenotype research. PMID:25123744

  4. Relationship between circadian typology and big five personality domains.

    PubMed

    Tonetti, Lorenzo; Fabbri, Marco; Natale, Vincenzo

    2009-02-01

    We explored the relationship between personality, based on the five-factor model, and circadian preference. To this end, 503 participants (280 females, 223 males) were administered the Morningness-Eveningness Questionnaire (MEQ) and the self-report version of the Big Five Observer (BFO) to determine circadian preference and personality features, respectively. Morning types scored significantly higher than evening and intermediate types on the conscientiousness factor. Evening types were found to be more neurotic than morning types. With reference to the big five personality model, our data, together with those of all the previous studies, indicate that the conscientiousness domain is the one that best discriminates among the three circadian types. Results are discussed with reference to neurobiological models of personality.

  5. Dynamic Investigation of Release Characteristics of a Streamlined Internal Store from a Simulated Bomb Bay of the Republic F-105 Airplane at Mach Numbers of 0.8, 1.4, and 1.98, Coord. No. AF-222

    NASA Technical Reports Server (NTRS)

    Lee, John B.

    1956-01-01

    An investigation has been conducted in the 27- by 27-inch preflight jet of the Langley Pilotless Aircraft Research Station at Wallops Island, Va., of the release characteristics of a dynamically scaled streamlined-type internally carried store from a simulated bomb bay at Mach numbers M_o of 0.8, 1.4, and 1.98. A 1/17-scale model of the Republic F-105 half-fuselage and bomb-bay configuration was used with a streamlined store shape of a fineness ratio of 6.00. Simulated altitudes were 3,400 feet at M_o = 0.8; 3,400 and 29,000 feet at M_o = 1.4; and 29,000 feet at M_o = 1.98. At supersonic speeds, high pitching moments are induced on the store in the vicinity of the bomb bay at high dynamic pressures. Successful ejections could not be made with the original configuration at supersonic speeds at near sea-level conditions. The pitching moments caused by unsymmetrical pressures on the store in a disturbed flow field were overcome by replacing the high-aspect-ratio fin with a low-aspect-ratio fin that had a 30-percent area increase and was less subject to aeroelastic effects. Release characteristics of the store were improved by orienting the fins so that they were in a more uniform flow field at the point of store release. The store pitching moments were shown to be reduced by increasing the simulated altitude. Favorable ejections were made at subsonic speeds at near sea-level conditions.

  6. Turning big bang into big bounce. I. Classical dynamics

    SciTech Connect

    Dzierzak, Piotr; Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz

    2009-11-15

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of loop geometry underlying the loop cosmology. We solve the constraint of the theory at the classical level to identify physical phase space and find the Lie algebra of the Dirac observables. We express energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for BB. The classical energy scale specific to BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at quantum level, otherwise the energy scale stays unknown.

  7. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    SciTech Connect

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  8. Big data: an introduction for librarians.

    PubMed

    Hoy, Matthew B

    2014-01-01

    Modern life produces data at an astounding rate and shows no signs of slowing. This has led to new advances in data storage and analysis and the concept of "big data," that is, massive data sets that can yield surprising insights when analyzed. This column will briefly describe what big data is and why it is important. It will also briefly explore the possibilities and problems of big data and the implications it has for librarians. A list of big data projects and resources is also included.

  9. Medical big data: promise and challenges.

    PubMed

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology.
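
    The propensity-score idea mentioned above can be sketched concretely. The toy example below builds a synthetic confounded data set in which sicker patients are both more likely to be treated and have worse outcomes, then compares a naive treated-vs-control difference with an inverse-probability-weighted estimate based on a fitted propensity model. Every detail here (the variable names, the hand-rolled gradient-descent logistic fit, the data-generating process) is an illustrative assumption, not the method of any study cited in this record:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(x, t, lr=0.5, epochs=1000):
    """Plain full-batch gradient-descent logistic regression for P(treated | x)."""
    w = b = 0.0
    n = len(x)
    for _ in range(epochs):
        gw = gb = 0.0
        for xi, ti in zip(x, t):
            err = sigmoid(w * xi + b) - ti
            gw += err * xi
            gb += err
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

def ipw_difference(x, t, y, w, b):
    """Self-normalized inverse-probability-weighted treated-vs-control difference."""
    num_t = den_t = num_c = den_c = 0.0
    for xi, ti, yi in zip(x, t, y):
        p = sigmoid(w * xi + b)  # estimated propensity score
        if ti == 1:
            num_t += yi / p
            den_t += 1.0 / p
        else:
            num_c += yi / (1.0 - p)
            den_c += 1.0 / (1.0 - p)
    return num_t / den_t - num_c / den_c

# Synthetic confounded data: larger x (sicker) raises treatment probability
# and lowers the outcome; the true treatment effect is 2.
rng = random.Random(0)
x = [rng.gauss(0.0, 1.0) for _ in range(2000)]
t = [1 if rng.random() < sigmoid(xi) else 0 for xi in x]
y = [2.0 * ti - 1.5 * xi + rng.gauss(0.0, 0.5) for xi, ti in zip(x, t)]

n_treated = sum(t)
naive = (sum(yi for yi, ti in zip(y, t) if ti) / n_treated
         - sum(yi for yi, ti in zip(y, t) if not ti) / (len(t) - n_treated))
w, b = fit_logistic(x, t)
adjusted = ipw_difference(x, t, y, w, b)
# naive is pulled well below 2 by confounding; adjusted recovers it closely.
```

    Real analyses would of course use an established statistical package and address the residual confounding and reverse causation that the abstract notes no propensity method can fully remove.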

  10. Traffic information computing platform for big data

    SciTech Connect

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic-information computing platform architecture is proposed. Under the big data environment, this architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  11. Medical big data: promise and challenges

    PubMed Central

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-01-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis-generating, rather than hypothesis-testing. Big data focuses on temporal stability of the association rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcome and reduce waste in areas including nephrology. PMID:28392994

  12. The LHC's Next Big Mystery

    NASA Astrophysics Data System (ADS)

    Lincoln, Don

    2015-03-01

    When the sun rose over America on July 4, 2012, the world of science had radically changed. The Higgs boson had been discovered. Mind you, the press releases were more cautious than that, with "a new particle consistent with being the Higgs boson" being the carefully constructed phrase of the day. But, make no mistake, champagne corks were popped and backs were slapped. The data had spoken and a party was in order. Even if the observation turned out to be something other than the Higgs boson, the first big discovery from data taken at the Large Hadron Collider had been made.

  13. The faces of Big Science.

    PubMed

    Schatz, Gottfried

    2014-06-01

    Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.

  14. Big bang nucleosynthesis: An update

    SciTech Connect

    Olive, Keith A.

    2013-07-23

    An update on the standard model of big bang nucleosynthesis (BBN) is presented. With the value of the baryon-to-photon ratio determined to high precision by WMAP, standard BBN is a parameter-free theory. In this context, the theoretical prediction for the abundances of D, ⁴He, and ⁷Li is discussed and compared to their observational determination. While concordance for D and ⁴He is satisfactory, the prediction for ⁷Li exceeds the observational determination by a factor of about four. Possible solutions to this problem are discussed.

  15. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    ERIC Educational Resources Information Center

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  16. Untapped Potential: Fulfilling the Promise of Big Brothers Big Sisters and the Bigs and Littles They Represent

    ERIC Educational Resources Information Center

    Bridgeland, John M.; Moore, Laura A.

    2010-01-01

    American children represent a great untapped potential in our country. For many young people, choices are limited and the goal of a productive adulthood is a remote one. This report paints a picture of who these children are, shares their insights and reflections about the barriers they face, and offers ways forward for Big Brothers Big Sisters as…

  17. Big Challenges and Big Opportunities: The Power of "Big Ideas" to Change Curriculum and the Culture of Teacher Planning

    ERIC Educational Resources Information Center

    Hurst, Chris

    2014-01-01

    Mathematical knowledge of pre-service teachers is currently "under the microscope" and the subject of research. This paper proposes a different approach to teacher content knowledge based on the "big ideas" of mathematics and the connections that exist within and between them. It is suggested that these "big ideas"…

  18. Baryon symmetric big bang cosmology

    NASA Technical Reports Server (NTRS)

    Stecker, F. W.

    1978-01-01

Both quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess of one or the other type of matter as an initial condition; or (3) an extremely dense, high-temperature state with zero net baryon number, i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as it pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or might have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  19. Astronomical surveys and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, Areg M.

Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in γ-rays, ROSAT, XMM and Chandra in X-rays, GALEX in the UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in the NIR, WISE and AKARI IRC in the MIR, IRAS and AKARI FIS in the FIR, NVSS and FIRST in the radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  20. The BigBOSS spectrograph

    NASA Astrophysics Data System (ADS)

    Jelinsky, Patrick; Bebek, Chris; Besuner, Robert; Carton, Pierre-Henri; Edelstein, Jerry; Lampton, Michael; Levi, Michael E.; Poppett, Claire; Prieto, Eric; Schlegel, David; Sholl, Michael

    2012-09-01

BigBOSS is a proposed ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a 14,000 square degree galaxy and quasi-stellar object redshift survey. It consists of a 5,000-fiber-positioner focal plane feeding the spectrographs. The optical fibers are separated into ten 500-fiber slit heads at the entrance of ten identical spectrographs in a thermally insulated room. Each of the ten spectrographs has a spectral resolution (λ/Δλ) between 1500 and 4000 over a wavelength range from 360 to 980 nm. Each spectrograph uses two dichroic beam splitters to separate the spectrograph into three arms. It uses volume phase holographic (VPH) gratings for high efficiency and compactness. Each arm uses a 4096x4096 15 μm pixel charge-coupled device (CCD) as the detector. We describe the requirements and current design of the BigBOSS spectrograph. Design trades (e.g. refractive versus reflective) and manufacturability are also discussed.
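The quoted resolving power translates directly into the size of a resolution element, Δλ = λ/R. A minimal sketch of that arithmetic, using only the range endpoints quoted in the abstract (the function name is illustrative, not part of the BigBOSS design documents):

```python
def resolution_element_nm(wavelength_nm: float, resolving_power: float) -> float:
    """Return the spectral resolution element dlambda = lambda / R, in nm."""
    return wavelength_nm / resolving_power

# Endpoints of the quoted range: 360-980 nm at R = 1500-4000.
blue = resolution_element_nm(360.0, 1500.0)   # 0.24 nm
red = resolution_element_nm(980.0, 4000.0)    # 0.245 nm

print(f"blue end: {blue:.3f} nm per resolution element")
print(f"red end:  {red:.3f} nm per resolution element")
```

The two elements come out nearly equal, which is consistent with the resolving power rising roughly in step with wavelength across the arms.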

  1. Big data in oncologic imaging.

    PubMed

    Regge, Daniele; Mazzetti, Simone; Giannini, Valentina; Bracco, Christian; Stasi, Michele

    2016-09-13

Cancer is a complex disease, and unfortunately understanding how the components of the cancer system work does not explain the behavior of the system as a whole. In the words of the Greek philosopher Aristotle, "the whole is greater than the sum of its parts." Today, thanks to improved information technology infrastructures, it is possible to store data from each single cancer patient, including clinical data, medical images, laboratory tests, and pathological and genomic information. Indeed, medical archive storage constitutes approximately one-third of total global storage demand, and a large part of these data are in the form of medical images. The opportunity now is to draw insight from the whole to the benefit of each individual patient. In the oncologic patient, big data analysis is in its early stages, but several useful applications can be envisaged, including the development of imaging biomarkers to predict disease outcome, assessment of the risk of X-ray dose exposure or of renal damage following the administration of contrast agents, and tracking and optimization of patient workflow. The aim of this review is to present current evidence of how big data derived from medical images may impact the diagnostic pathway of the oncologic patient.

  2. Application of Generalizability Theory to the Big Five Inventory.

    PubMed

    Arterberry, Brooke J; Martens, Matthew P; Cadigan, Jennifer M; Rohrer, David

    2014-10-01

The purpose of the present study was to examine Big Five Inventory (BFI; John, Donahue, & Kentle, 1991) score reliability utilizing Generalizability Theory analyses. Participants were recruited from a large public Midwestern university and provided complete data for the BFI on three measurement occasions (n = 264). Results suggested that score reliability was adequate for scales with 7-10 items; however, score reliability for two-item scales did not reach the .80 threshold. These findings indicate that BFI score reliability was, in general, acceptable and demonstrate the advantages of using Generalizability Theory analyses to examine score reliability.

  3. Application of Generalizability Theory to the Big Five Inventory

    PubMed Central

    Arterberry, Brooke J.; Martens, Matthew P.; Cadigan, Jennifer M.; Rohrer, David

    2014-01-01

The purpose of the present study was to examine Big Five Inventory (BFI; John, Donahue, & Kentle, 1991) score reliability utilizing Generalizability Theory analyses. Participants were recruited from a large public Midwestern university and provided complete data for the BFI on three measurement occasions (n = 264). Results suggested that score reliability was adequate for scales with 7-10 items; however, score reliability for two-item scales did not reach the .80 threshold. These findings indicate that BFI score reliability was, in general, acceptable and demonstrate the advantages of using Generalizability Theory analyses to examine score reliability. PMID:25419025
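The Generalizability Theory analysis described in the two records above partitions score variance into person, occasion, and residual components and combines them into a generalizability (G) coefficient. A hedged sketch of a one-facet person x occasion design on simulated data (the simulation parameters are illustrative; this is not the authors' analysis code):

```python
import numpy as np

rng = np.random.default_rng(0)
n_p, n_o = 264, 3                           # persons, occasions (as in the study design)
X = (rng.normal(0.0, 1.0, (n_p, 1))         # true person effects (dominant, by assumption)
     + rng.normal(0.0, 0.1, (1, n_o))       # occasion effects
     + rng.normal(0.0, 0.5, (n_p, n_o)))    # residual error

# ANOVA sums of squares for the crossed p x o design.
grand = X.mean()
ss_p = n_o * ((X.mean(axis=1) - grand) ** 2).sum()
ss_o = n_p * ((X.mean(axis=0) - grand) ** 2).sum()
ss_res = ((X - grand) ** 2).sum() - ss_p - ss_o

ms_p = ss_p / (n_p - 1)
ms_res = ss_res / ((n_p - 1) * (n_o - 1))

# Variance components from the expected mean squares.
var_res = ms_res
var_p = (ms_p - ms_res) / n_o

# Relative G coefficient for the mean over n_o occasions.
g = var_p / (var_p + var_res / n_o)
print(f"G coefficient: {g:.3f}")
```

With person variance dominating, the G coefficient lands above the .80 threshold discussed in the abstracts; shrinking the person variance or the number of occasions pushes it down.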

  4. 2. Big Creek Road, worm fence and road at trailhead. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Big Creek Road, worm fence and road at trailhead. - Great Smoky Mountains National Park Roads & Bridges, Big Creek Road, Between State Route 284 & Big Creek Campground, Gatlinburg, Sevier County, TN

  5. 5. Big Creek Road, old bridge on Walnut Bottom Road, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Big Creek Road, old bridge on Walnut Bottom Road, deck view. - Great Smoky Mountains National Park Roads & Bridges, Big Creek Road, Between State Route 284 & Big Creek Campground, Gatlinburg, Sevier County, TN

  6. 4. Big Creek Road, old bridge on Walnut Bottom Road, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Big Creek Road, old bridge on Walnut Bottom Road, elevation view. - Great Smoky Mountains National Park Roads & Bridges, Big Creek Road, Between State Route 284 & Big Creek Campground, Gatlinburg, Sevier County, TN

  7. Big sagebrush transplanting success in crested wheatgrass stands

    Technology Transfer Automated Retrieval System (TEKTRAN)

The conversion of formerly big sagebrush (Artemisia tridentata ssp. wyomingensis)/bunchgrass communities to annual grass dominance, primarily cheatgrass (Bromus tectorum), in Wyoming big sagebrush ecosystems has sparked increasing demand to establish big sagebrush on disturbed rangelands. The e...

  8. Old Big Oak Flat Road at intersection with New Tioga ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Old Big Oak Flat Road at intersection with New Tioga Road. Note gate for road to Tamarack Campground - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  9. View of Old Big Oak Flat Road in Talus Slope. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of Old Big Oak Flat Road in Talus Slope. Bridal Veil Falls at center distance. Looking east - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  10. Hardiness and the Big Five Personality Traits among Chinese University Students

    ERIC Educational Resources Information Center

    Zhang, Li-fang

    2011-01-01

    This study examines the construct of hardiness with the Big Five personality traits among 362 Chinese university students. Participants in the study responded to the Dispositional Hardiness Scale (Bartone, Ursano, Wright, & Ingraham, 1989) and the Revised NEO Personality Inventory (Costa & McCrae, 1992). Results indicate that personality…

  11. Perceptions of "Big Sisters" and Their "Little Sisters" Regarding Mentoring Relationships

    ERIC Educational Resources Information Center

    Quarles, Alice; Maldonado, Nancy L.; Lacey, Candace H.; Thompson, Steve D.

    2008-01-01

    This qualitative study explored the relationships between six Little Sisters (mentees) and their Big Sisters (mentors) to develop an understanding of the perceptions of high-risk adolescent female mentees and their mentors regarding their mentoring relationships. Participants were purposefully selected--those actively involved in a formal…

  12. Learning English through Social Interaction: The Case of "Big Brother 2006," Finland

    ERIC Educational Resources Information Center

    Kaanta, Leila; Jauni, Heidi; Leppanen, Sirpa; Peuronen, Saija; Paakkinen, Terhi

    2013-01-01

    In line with recent Conversation Analytic work on language learning as situated practice, this article investigates how interactants can create language learning opportunities for themselves and others in and through social interaction. The study shows how the participants of "Big Brother Finland," a reality TV show, whose main…

  13. An embedding for the big bang

    NASA Technical Reports Server (NTRS)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  14. In Search of the Big Bubble

    ERIC Educational Resources Information Center

    Simoson, Andrew; Wentzky, Bethany

    2011-01-01

    Freely rising air bubbles in water sometimes assume the shape of a spherical cap, a shape also known as the "big bubble". Is it possible to find some objective function involving a combination of a bubble's attributes for which the big bubble is the optimal shape? Following the basic idea of the definite integral, we define a bubble's surface as…

  15. Structuring the Curriculum around Big Ideas

    ERIC Educational Resources Information Center

    Alleman, Janet; Knighton, Barbara; Brophy, Jere

    2010-01-01

    This article provides an inside look at Barbara Knighton's classroom teaching. She uses big ideas to guide her planning and instruction and gives other teachers suggestions for adopting the big idea approach and ways for making the approach easier. This article also represents a "small slice" of a dozen years of collaborative research,…

  16. Hom-Big Brackets: Theory and Applications

    NASA Astrophysics Data System (ADS)

    Cai, Liqiang; Sheng, Yunhe

    2016-02-01

    In this paper, we introduce the notion of hom-big brackets, which is a generalization of Kosmann-Schwarzbach's big brackets. We show that it gives rise to a graded hom-Lie algebra. Thus, it is a useful tool to study hom-structures. In particular, we use it to describe hom-Lie bialgebras and hom-Nijenhuis operators.

  17. Big system: Interactive graphics for the engineer

    NASA Technical Reports Server (NTRS)

    Quenneville, C. E.

    1975-01-01

    The BCS Interactive Graphics System (BIG System) approach to graphics was presented, along with several significant engineering applications. The BIG System precompiler, the graphics support library, and the function requirements of graphics applications are discussed. It was concluded that graphics standardization and a device independent code can be developed to assure maximum graphic terminal transferability.

  18. A New Look at Big History

    ERIC Educational Resources Information Center

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spacial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  19. Ethics and Epistemology in Big Data Research.

    PubMed

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-03-20

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  20. Development and Validation of Big Four Personality Scales for the Schedule for Nonadaptive and Adaptive Personality-2nd Edition (SNAP-2)

    PubMed Central

    Calabrese, William R.; Rudick, Monica M.; Simms, Leonard J.; Clark, Lee Anna

    2012-01-01

Recently, integrative, hierarchical models of personality and personality disorder (PD)—such as the Big Three, Big Four, and Big Five trait models—have gained support as a unifying dimensional framework for describing PD. However, no measures to date can simultaneously represent each of these potentially interesting levels of the personality hierarchy. To unify these measurement models psychometrically, we sought to develop Big Five trait scales within the Schedule for Nonadaptive and Adaptive Personality–2nd Edition (SNAP-2). Through structural and content analyses, we examined relations between the SNAP-2, Big Five Inventory (BFI), and NEO Five-Factor Inventory (NEO-FFI) ratings in a large data set (N = 8,690), including clinical, military, college, and community participants. Results yielded scales consistent with the Big Four model of personality (i.e., Neuroticism, Conscientiousness, Introversion, and Antagonism) and not the Big Five, as there were insufficient items related to Openness. Resulting scale scores demonstrated strong internal consistency and temporal stability. Structural and external validity were supported by strong convergent and discriminant validity patterns between Big Four scale scores and other personality trait scores and expectable patterns of self-peer agreement. Descriptive statistics and community-based norms are provided. The SNAP-2 Big Four Scales enable researchers and clinicians to assess personality at multiple levels of the trait hierarchy and facilitate comparisons among competing "Big Trait" models. PMID:22250598

  1. Development and validation of Big Four personality scales for the Schedule for Nonadaptive and Adaptive Personality--Second Edition (SNAP-2).

    PubMed

    Calabrese, William R; Rudick, Monica M; Simms, Leonard J; Clark, Lee Anna

    2012-09-01

    Recently, integrative, hierarchical models of personality and personality disorder (PD)--such as the Big Three, Big Four, and Big Five trait models--have gained support as a unifying dimensional framework for describing PD. However, no measures to date can simultaneously represent each of these potentially interesting levels of the personality hierarchy. To unify these measurement models psychometrically, we sought to develop Big Five trait scales within the Schedule for Nonadaptive and Adaptive Personality--Second Edition (SNAP-2). Through structural and content analyses, we examined relations between the SNAP-2, the Big Five Inventory (BFI), and the NEO Five-Factor Inventory (NEO-FFI) ratings in a large data set (N = 8,690), including clinical, military, college, and community participants. Results yielded scales consistent with the Big Four model of personality (i.e., Neuroticism, Conscientiousness, Introversion, and Antagonism) and not the Big Five, as there were insufficient items related to Openness. Resulting scale scores demonstrated strong internal consistency and temporal stability. Structural validity and external validity were supported by strong convergent and discriminant validity patterns between Big Four scale scores and other personality trait scores and expectable patterns of self-peer agreement. Descriptive statistics and community-based norms are provided. The SNAP-2 Big Four Scales enable researchers and clinicians to assess personality at multiple levels of the trait hierarchy and facilitate comparisons among competing big-trait models.

  2. Was the Big Bang hot?

    NASA Technical Reports Server (NTRS)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.
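The "frequency dependence of the dipole anisotropy" invoked above follows from the first-order Doppler shift of a blackbody spectrum; as a standard sketch (not the paper's own formulae), with T_0 the mean background temperature, v the observer's velocity, and θ the angle from the direction of motion:

```latex
T(\theta) \simeq T_0\left(1 + \frac{v}{c}\cos\theta\right),
\qquad
\delta I_\nu(\theta) \simeq
\left.\frac{\partial B_\nu}{\partial T}\right|_{T_0} T_0\,\frac{v}{c}\cos\theta ,
```

where B_ν(T) is the Planck function. Measuring the dipole amplitude δI_ν at several frequencies therefore constrains the absolute spectrum B_ν(T_0), which is the determination method alluded to in the abstract.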

  3. Big Mysteries: The Higgs Mass

    ScienceCinema

    Lincoln, Don

    2016-07-12

With the discovery of what looks to be the Higgs boson, LHC researchers are turning their attention to the next big question: the predicted mass of the newly discovered particle. When the effects of quantum mechanics are taken into account, the mass of the Higgs boson should be incredibly high, perhaps upwards of a quadrillion times higher than what was observed. In this video, Fermilab's Dr. Don Lincoln explains why the theory predicts such a large mass and gives at least one possible theoretical idea that might solve the problem. Whether the proposed idea is the answer or not, this question must be answered by experiments at the LHC, or today's entire theoretical paradigm could be in jeopardy.

  4. Big Mysteries: The Higgs Mass

    SciTech Connect

    Lincoln, Don

    2014-04-28

With the discovery of what looks to be the Higgs boson, LHC researchers are turning their attention to the next big question: the predicted mass of the newly discovered particle. When the effects of quantum mechanics are taken into account, the mass of the Higgs boson should be incredibly high, perhaps upwards of a quadrillion times higher than what was observed. In this video, Fermilab's Dr. Don Lincoln explains why the theory predicts such a large mass and gives at least one possible theoretical idea that might solve the problem. Whether the proposed idea is the answer or not, this question must be answered by experiments at the LHC, or today's entire theoretical paradigm could be in jeopardy.

  5. Evidence of the big fix

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2014-06-01

We give evidence for the Big Fix. The theory of wormholes and the multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how this principle can be confirmed by experimental data, and we show that it indeed holds for the Higgs vacuum expectation value vh. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings, and the Higgs self-coupling are fixed when we vary vh. It turns out that the existence of atomic nuclei plays a crucial role in maximizing the entropy. This is reminiscent of the anthropic principle; in our case, however, it is required by the fundamental law.

  6. Microsystems - The next big thing

    SciTech Connect

    STINNETT,REGAN W.

    2000-05-11

    Micro-Electro-Mechanical Systems (MEMS) is a big name for tiny devices that will soon make big changes in everyday life and the workplace. These and other types of Microsystems range in size from a few millimeters to a few microns, much smaller than a human hair. These Microsystems have the capability to enable new ways to solve problems in commercial applications ranging from automotive, aerospace, telecommunications, manufacturing equipment, medical diagnostics to robotics, and in national security applications such as nuclear weapons safety and security, battlefield intelligence, and protection against chemical and biological weapons. This broad range of applications of Microsystems reflects the broad capabilities of future Microsystems to provide the ability to sense, think, act, and communicate, all in a single integrated package. Microsystems have been called the next silicon revolution, but like many revolutions, they incorporate more elements than their predecessors. Microsystems do include MEMS components fabricated from polycrystalline silicon processed using techniques similar to those used in the manufacture of integrated electrical circuits. They also include optoelectronic components made from gallium arsenide and other semiconducting compounds from the III-V groups of the periodic table. Microsystems components are also being made from pure metals and metal alloys using the LIGA process, which utilizes lithography, etching, and casting at the micron scale. Generically, Microsystems are micron scale, integrated systems that have the potential to combine the ability to sense light, heat, pressure, acceleration, vibration, and chemicals with the ability to process the collected data using CMOS circuitry, execute an electrical, mechanical, or photonic response, and communicate either optically or with microwaves.

  7. [Big data in medicine and healthcare].

    PubMed

    Rüping, Stefan

    2015-08-01

Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today are often too large and heterogeneous, and change too quickly, to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are more and more executed electronically, consumers produce more and more data themselves - e.g. in social networks - and digitalization is ever increasing. Currently, several new trends towards new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data, and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a great deal of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and to organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions are coming. This paper gives an overview of the potential of Big Data in medicine and healthcare.

  8. Processing Solutions for Big Data in Astronomy

    NASA Astrophysics Data System (ADS)

    Fillatre, L.; Lepiller, D.

    2016-09-01

This paper gives a simple introduction to processing solutions for massive amounts of data. It offers a general presentation of the Big Data paradigm. The Hadoop framework, considered the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. The paper also presents the main tools for managing both the storage (NoSQL solutions) and the computing capacity (the MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions such as Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving computing speed.
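The MapReduce schema mentioned above can be illustrated without a cluster: map each input record to (key, value) pairs, shuffle the pairs into groups by key, then reduce each group. A minimal word-count sketch in plain Python, standing in for a Hadoop/Spark job (all names are illustrative, not the Hadoop API):

```python
from collections import defaultdict
from itertools import chain

def map_phase(record: str):
    """Map step: emit a (word, 1) pair for every word in the record."""
    for word in record.split():
        yield word.lower(), 1

def shuffle(pairs):
    """Shuffle step: group all emitted values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce step: sum the values of each group."""
    return {key: sum(values) for key, values in groups.items()}

records = ["big data big compute", "data pipelines"]
pairs = chain.from_iterable(map_phase(r) for r in records)
counts = reduce_phase(shuffle(pairs))
print(counts)  # {'big': 2, 'data': 2, 'compute': 1, 'pipelines': 1}
```

In a real framework, the map and reduce calls run in parallel across machines and the shuffle moves data over the network; the structure of the computation is the same.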

  9. Big questions, big science: meeting the challenges of global ecology.

    PubMed

    Schimel, David; Keller, Michael

    2015-04-01

Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment and larger data sets than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science forgone because resources were expended on a large project rather than on a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in this context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies experience many external and internal pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.

  10. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-01-04

The Big Sky Partnership, led by Montana State University, comprises research institutions, public entities, private-sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the first performance period fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first Partnership meeting, the groundwork was put in place to provide an assessment of capture and storage capabilities for CO₂ utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Complementary to the efforts on evaluation of sources and sinks is the development of the Big Sky Partnership Carbon Cyberinfrastructure (BSP-CC) and a GIS Road Map for the Partnership. These efforts will put in place a map-based integrated information management system for our Partnership, with transferability to the national carbon sequestration effort. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but other policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best

  11. NOAA Big Data Partnership RFI

    NASA Astrophysics Data System (ADS)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  12. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; design of an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiation of a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. 
Research is

  13. Quantum nature of the big bang.

    PubMed

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.

  14. Cosmic relics from the big bang

    SciTech Connect

    Hall, L.J.

    1988-12-01

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed; particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab.

  15. Big Data and Analytics in Healthcare.

    PubMed

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique processing and analysis challenges. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  16. Federal participation in LEED

    SciTech Connect

    Payne, Christopher; Dyer, Beverly

    2004-11-10

    The federal government has been an active participant in the development and use of USGBC's Leadership in Energy & Environmental Design Green Building Rating System (LEED). This paper presents a review of this participation and some expectations for ongoing partnership.

  17. "Big data" and "open data": What kind of access should researchers enjoy?

    PubMed

    Chatellier, Gilles; Varlet, Vincent; Blachier-Poisson, Corinne

    2016-02-01

    The healthcare sector is currently facing a new paradigm, the explosion of "big data". Coupled with advances in computer technology, the field of "big data" appears promising, allowing us to better understand the natural history of diseases, to follow up on the implementation of new technologies (devices, drugs), and to participate in precision medicine. Data sources are multiple (medical and administrative data, electronic medical records, data from rapidly developing technologies such as DNA sequencing, connected devices, etc.) and heterogeneous, while their use requires complex methods for accurate analysis. Moreover, faced with this new paradigm, we must determine who could (or should) have access to which data, how to combine collective interest and protection of personal data, and how to finance, over the long term, both operating costs and database interrogation. This article analyses the opportunities and challenges related to the use of open and/or "big data", from the viewpoint of pharmacologists and representatives of the pharmaceutical and medical device industry.

  18. Big five personality and adolescent Internet addiction: The mediating role of coping style.

    PubMed

    Zhou, Yueyue; Li, Dongping; Li, Xian; Wang, Yanhui; Zhao, Liyan

    2017-01-01

    This study examined the unique associations between big five personality traits and adolescent Internet addiction (IA), as well as the mediating role of coping style underlying these relations. Our theoretical model was tested with 998 adolescents. Participants provided self-report data on demographic variables, big five personality traits, coping style, and IA. After controlling for demographic variables, it was found that agreeableness and conscientiousness were negatively associated with IA, whereas extraversion, neuroticism, and openness to experience were positively associated with IA. Mediation analyses further indicated that conscientiousness had an indirect impact on adolescent IA through decreased emotion-focused coping, whereas extraversion, neuroticism, and openness to experience had indirect impacts on adolescent IA through increased emotion-focused coping. In contrast, problem-focused coping had no mediating role. These findings suggest that emotion-focused coping may, in part, account for the association between big five personality and adolescent IA.

  19. Big-bang nucleosynthesis revisited

    NASA Technical Reports Server (NTRS)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean-life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Y_p, is less than or equal to 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7, we limit the baryon-to-photon ratio (eta_10, eta in units of 10^-10) to 2.6 <= eta_10 <= 4.3, which we use to argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Y_p of 0.24 constrains the number of light neutrinos to N_nu <= 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 <= Y_p <= 0.245.
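
    The quoted eta bounds can be turned into a rough baryon-density estimate. A minimal sketch, assuming the standard conversion eta_10 ≈ 273.9 × Omega_b h² (a present-day value, not necessarily the one used in the paper) and an illustrative Hubble-parameter range:

    ```python
    # Hedged sketch: converting baryon-to-photon ratio bounds into a
    # baryon density parameter. The conversion constant and the h range
    # are assumptions for illustration, not values from the paper.
    ETA_TO_OMEGA = 1.0 / 273.9   # Omega_b h^2 per unit eta_10 (assumed)

    def omega_b(eta_10, h):
        """Baryon density parameter for a given eta_10 and Hubble h."""
        return eta_10 * ETA_TO_OMEGA / h**2

    # eta_10 bounds from the abstract; h range typical of the era.
    lo = omega_b(2.6, h=1.0)   # smallest eta, largest h -> lower bound
    hi = omega_b(4.3, h=0.4)   # largest eta, smallest h -> upper bound
    print(f"Omega_b roughly between {lo:.3f} and {hi:.3f}")
    ```

    With these assumed inputs the range comes out close to, but not identical with, the paper's 0.02-0.11, as expected given the approximate conversion constant and h range.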

  20. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-06-01

    The Big Sky Partnership, led by Montana State University, comprises research institutions, public entities, private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the second performance period fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for

  1. Big Sky Carbon Sequestration Partnership

    SciTech Connect

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) development of a GIS-based reporting framework that links with national networks; (3) design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}. 
Overall every sedimentary formation investigated

  2. The NOAA Big Data Project

    NASA Astrophysics Data System (ADS)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  3. COBE looks back to the Big Bang

    NASA Technical Reports Server (NTRS)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.

  4. Data Confidentiality Challenges in Big Data Applications

    SciTech Connect

    Yin, Jian; Zhao, Dongfang

    2015-12-15

    In this paper, we address the problem of data confidentiality in big data analytics. In many fields, many useful patterns can be extracted by applying machine learning techniques to big data. However, data confidentiality must be protected. In many scenarios, data confidentiality could well be a prerequisite for data to be shared. We present a scheme that provides provably secure data confidentiality and discuss various techniques to optimize the performance of such a system.

  5. Quality of Big Data in Healthcare

    SciTech Connect

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    2015-01-01

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  6. Dark energy, wormholes, and the big rip

    SciTech Connect

    Faraoni, V.; Israel, W.

    2005-03-15

    The time evolution of a wormhole in a Friedmann universe approaching the big rip is studied. The wormhole is modeled by a thin spherical shell accreting the superquintessence fluid--two different models are presented. Contrary to recent claims that the wormhole overtakes the expansion of the universe and engulfs it before the big rip is reached, it is found that the wormhole becomes asymptotically comoving with the cosmic fluid and the future evolution of the universe is fully causal.

  7. Cincinnati Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Love, Lonnie J.

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  8. School Lunch Program Participation.

    ERIC Educational Resources Information Center

    Zucchino, Lori; Ranney, Christine K.

    1990-01-01

    Reductions in participation in National School Lunch Program in 1981-82 are of concern to hunger groups and legislators. Extent to which Omnibus Budget Reconciliation Acts (OBRA) of 1980-81 contributes to participation decline was measured by simulation model in New York State. Results suggest that OBRA increased participation; declining…

  9. Participative Training Skills.

    ERIC Educational Resources Information Center

    Rodwell, John

    Based on extensive field experience, this two-part book is intended to be a practical guide for maximizing participative training methods. The first part of the book looks at the principles and the core skills involved in participative training. It shows how trainee participation corresponds to the processes of adult learning and describes each…

  10. Big data: survey, technologies, opportunities, and challenges.

    PubMed

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  11. Big Data: Survey, Technologies, Opportunities, and Challenges

    PubMed Central

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  12. 78 FR 3911 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-17

    ... Fish and Wildlife Service Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN... (CCP) and finding of no significant impact (FONSI) for the environmental assessment (EA) for Big Stone.../FONSI on the planning Web site at http://www.fws.gov/midwest/planning/BigStoneNWR/index.html . A...

  13. What can zookeepers tell us about interacting with big cats in captivity?

    PubMed

    Szokalski, Monika S; Litchfield, Carla A; Foster, Wendy K

    2013-03-01

    Despite the potential dangers involved, interactions between zookeepers and captive big cats are increasing. Research with other animals, particularly nonhuman primates, suggests that closer interactions can be beneficial not only for the animals and their keepers, but also for zoo visitors. This study sought to determine whether the same benefits may apply to keeper-big cat interactions. An online questionnaire was completed by 86 keepers worldwide, assessing which types of handling (hands-on, protected, hands-off) they practice with their big cats, whether they practice training, and what their opinions of these methods are (through a series of rating scales and open-ended questions). Protected contact was the most frequently used handling method among this sample, particularly with lions, tigers, and cheetahs, and training was practiced by the majority of participants with all big cat species. Participants perceived protected contact as the most beneficial handling practice for big cats, keepers, and visitors, noting how it can allow a close bond between keeper and cat, as well as its educational value for zoo visitors. Contrastingly, concerns were raised about the use of hands-on approaches, particularly with regard to the safety of all parties involved and the potential for wrong messages to be sent to visitors. Further, training was reported to be more beneficial for each group than any handling practice, yielding similar potential benefits as protected contact. Consistent with existing information with other species, these findings will be useful in directing objective research examining the use of different handling and training methods with big cats.

  14. Native perennial forb variation between mountain big sagebrush and Wyoming big sagebrush plant communities.

    PubMed

    Davies, Kirk W; Bates, Jon D

    2010-09-01

    Big sagebrush (Artemisia tridentata Nutt.) occupies large portions of the western United States and provides valuable wildlife habitat. However, information is lacking quantifying differences in native perennial forb characteristics between mountain big sagebrush [A. tridentata spp. vaseyana (Rydb.) Beetle] and Wyoming big sagebrush [A. tridentata spp. wyomingensis (Beetle & A. Young) S.L. Welsh] plant communities. This information is critical to accurately evaluate the quality of habitat and forage that these communities can produce because many wildlife species consume large quantities of native perennial forbs and depend on them for hiding cover. To compare native perennial forb characteristics on sites dominated by these two subspecies of big sagebrush, we sampled 106 intact big sagebrush plant communities. Mountain big sagebrush plant communities produced almost 4.5-fold more native perennial forb biomass and had greater native perennial forb species richness and diversity compared to Wyoming big sagebrush plant communities (P < 0.001). Nonmetric multidimensional scaling (NMS) and the multiple-response permutation procedure (MRPP) demonstrated that native perennial forb composition varied between these plant communities (P < 0.001). Native perennial forb composition was more similar within plant communities grouped by big sagebrush subspecies than expected by chance (A = 0.112) and composition varied between community groups (P < 0.001). Indicator analysis did not identify any perennial forbs that were completely exclusive and faithful, but did identify several perennial forbs that were relatively good indicators of either mountain big sagebrush or Wyoming big sagebrush plant communities. Our results suggest that management plans and habitat guidelines should recognize differences in native perennial forb characteristics between mountain and Wyoming big sagebrush plant communities.
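
    The MRPP used above is, at heart, a permutation procedure: it asks whether the observed grouping yields smaller within-group distances than random relabelings would. A minimal one-dimensional sketch with hypothetical biomass values (not data from the study):

    ```python
    import random

    # Minimal MRPP-style permutation test (illustrative sketch, not the
    # exact multivariate procedure used in the study): compare the mean
    # within-group distance of the observed grouping against random
    # relabelings of the same samples.
    def mean_within_group_distance(data, labels):
        total, count = 0.0, 0
        for i in range(len(data)):
            for j in range(i + 1, len(data)):
                if labels[i] == labels[j]:
                    total += abs(data[i] - data[j])
                    count += 1
        return total / count

    def mrpp_p_value(data, labels, n_perm=2000, seed=0):
        rng = random.Random(seed)
        observed = mean_within_group_distance(data, labels)
        hits = 0
        for _ in range(n_perm):
            shuffled = labels[:]
            rng.shuffle(shuffled)
            if mean_within_group_distance(data, shuffled) <= observed:
                hits += 1
        return hits / n_perm

    # Toy forb-biomass values (hypothetical) for two community types.
    biomass = [9.1, 8.7, 9.5, 8.9, 2.1, 2.4, 1.9, 2.6]
    groups = ["mountain"] * 4 + ["wyoming"] * 4
    p = mrpp_p_value(biomass, groups)
    print(p)  # small p: grouping is tighter than chance relabelings
    ```

    Because the two toy groups are well separated, almost no random relabeling produces within-group distances as small as the observed ones, so the estimated p-value is small.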

  15. Big Jobs: Planning for Competence

    ERIC Educational Resources Information Center

    Jones, Nancy P.

    2005-01-01

    Three- to five-year-olds grow emotionally by participating in meaningful and challenging physical, social, and problem-solving activities outdoors in an early childhood program on a farm. By caring for animals, planting, raking, shoveling, and engaging in meaningful indoor activities under adult supervision, children learn to work collaboratively,…

  16. Big Books and Small Marvels

    ERIC Educational Resources Information Center

    Stanistreet, Paul

    2012-01-01

    The Reader Organisation's Get into Reading programme is all about getting people together in groups to engage with serious books. The groups are mixed and the participants sometimes challenging, but the outcomes are often remarkable. Jane Davis, who founded the Reader Organisation and continues to oversee Get into Reading, has witnessed a massive…

  17. Boosting Big National Lab Data

    SciTech Connect

    Kleese van Dam, Kerstin

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world’s most intractable problems requires the ability to make sense of huge and complex sets of data and do it quickly. Speeding up the process – from hours to minutes or from weeks to days – is key to our success. One major source of such big data is physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to investigate, for example, the validity of new drugs, the root causes of diseases, more efficient energy sources, new materials for everyday goods, effective methods for environmental cleanup, the optimal ingredient composition for chocolate, or how best to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influences the rate of scientific progress, industrial innovation and competitiveness. And gaining new groundbreaking insights, faster, is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-sized detector systems in large experiments such as the Large Hadron Collider and the ‘Eye of Gaia’ billion-pixel camera detector to high-throughput genome sequencing. These developments have led to an exponential increase in the data volumes, rates and variety produced by instruments used for experimental work. This increase coincides with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  18. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-06-30

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop (see attached agenda). The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement

  19. Small Molecules-Big Data.

    PubMed

    Császár, Attila G; Furtenbacher, Tibor; Árendás, Péter

    2016-11-17

    Quantum mechanics builds large-scale graphs (networks): the vertices are the discrete energy levels the quantum system possesses, and the edges are the (quantum-mechanically allowed) transitions. Parts of the complete quantum mechanical networks can be probed experimentally via high-resolution, energy-resolved spectroscopic techniques. The complete rovibronic line list information for a given molecule can only be obtained through sophisticated quantum-chemical computations. Experiments as well as computations yield what we call spectroscopic networks (SN). First-principles SNs of even small, three- to five-atom molecules can be huge, qualifying for the big data description. Besides helping to interpret high-resolution spectra, the network-theoretical view offers several ideas for improving the accuracy and robustness of the increasingly important information systems containing line-by-line spectroscopic data. For example, the smallest number of measurements needed to obtain the complete list of energy levels is given by the minimum-weight spanning tree of the SN, and network clustering studies may call attention to the "weakest links" of a spectroscopic database. A present-day application of spectroscopic networks is the MARVEL (Measured Active Rotational-Vibrational Energy Levels) approach, whereby the transition information on a measured SN is turned into experimental energy levels via a weighted linear least-squares refinement. MARVEL has been used successfully for 15 molecules, validating most of the measured transitions and yielding energy levels with well-defined and realistic uncertainties. Accurate knowledge of the energy levels, together with computed transition intensities, allows the realistic prediction of spectra under many different circumstances, e.g., for widely different temperatures. 
Detailed knowledge of the energy level structure of a molecule coming from a MARVEL analysis is important for a considerable number of modeling
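
    The spanning-tree idea in this record can be sketched in a few lines. The levels, transitions, and uncertainties below are invented for illustration, and a plain Kruskal implementation stands in for whatever weighting scheme MARVEL actually applies; the point is only that a minimum-weight spanning tree connects all energy levels with the fewest (and most reliable) transitions.

```python
# Minimal sketch: minimum-weight spanning tree of a toy spectroscopic network.
# Vertices are hypothetical energy levels; edge weights are measurement
# uncertainties of the corresponding transitions. Stdlib only.

def kruskal_mst(n_vertices, edges):
    """edges: (weight, u, v) tuples; returns the list of MST edges."""
    parent = list(range(n_vertices))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):  # greedily take the lowest-uncertainty edges
        ru, rv = find(u), find(v)
        if ru != rv:               # skip edges that would close a cycle
            parent[ru] = rv
            mst.append((w, u, v))
    return mst

# Five hypothetical rovibrational levels linked by six measured transitions.
transitions = [(0.002, 0, 1), (0.001, 1, 2), (0.005, 0, 2),
               (0.003, 2, 3), (0.004, 3, 4), (0.010, 1, 4)]
tree = kruskal_mst(5, transitions)
# A spanning tree over 5 levels always has 4 edges: 4 measurements suffice
# to fix all relative level energies; the 2 highest-uncertainty edges drop out.
```

The redundant transitions excluded from the tree are not wasted: in a MARVEL-style least-squares refinement they over-determine the levels and tighten the uncertainties.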

  20. Pockmarks off Big Sur, California

    USGS Publications Warehouse

    Paull, C.; Ussler, W.; Maher, N.; Greene, H. Gary; Rehder, G.; Lorenson, T.; Lee, H.

    2002-01-01

    A pockmark field was discovered during EM-300 multi-beam bathymetric surveys on the lower continental slope off the Big Sur coast of California. The field contains ≈1500 pockmarks, between 130 and 260 m in diameter and typically 8-12 m deep, located within a 560 km² area. To investigate the origin of these features, piston cores were collected from both the interior and the flanks of the pockmarks, and remotely operated vehicle (ROV) video and sampling transects were conducted that passed through 19 of the pockmarks. The water column within and above the pockmarks was sampled for methane concentration. Piston cores and ROV-collected push cores show that the pockmark field is composed of monotonous fine silts and clays, and the cores within the pockmarks are indistinguishable from those outside the pockmarks. No evidence for either sediment winnowing or diagenetic alteration suggestive of fluid venting was obtained. 14C measurements of the organic carbon in the sediments indicate continuous sedimentation throughout the time resolution of the radiocarbon technique (≈45000 yr BP), with a sedimentation rate of ≈10 cm per 1000 yr both within and between the pockmarks. Concentrations of methane, dissolved inorganic carbon, sulfate, chloride, and ammonium in pore water extracted from within the cores are generally similar in composition to seawater and show little change with depth, suggesting low biogeochemical activity. These pore water chemical gradients indicate that neither significant accumulations of gas are likely to exist in the shallow subsurface (≈100 m) nor is active fluid advection occurring within the sampled sediments. Taken together the data indicate that these pockmarks are more than 45000 yr old, are presently inactive, and contain no indications of earlier fluid or gas venting events. © 2002 Elsevier Science B.V. All rights reserved.

  1. Big bang nucleosynthesis: Present status

    NASA Astrophysics Data System (ADS)

    Cyburt, Richard H.; Fields, Brian D.; Olive, Keith A.; Yeh, Tsung-Han

    2016-01-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. A brief overview of the essentials of this physics is given, and new calculations of light-element abundances through 6Li and 7Li are presented, with updated nuclear reactions and uncertainties including those in the neutron lifetime. Fits are provided for these results as a function of baryon density and of the number of neutrino flavors Nν. Recent developments are reviewed in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom Neff. These measurements allow for a tight test of BBN and cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. A new determination of the primordial helium abundance is included in our likelihood analysis. New D/H observations are now more precise than the corresponding theoretical predictions and are consistent with the standard model and the Planck baryon density. Moreover, D/H now provides a tight measurement of Nν when combined with the CMB baryon density and provides a 2σ upper limit Nν < 3.2. The new precision of the CMB and D/H observations together leaves D/H predictions as the largest source of uncertainties. Future improvement in BBN calculations will therefore rely on improved nuclear cross-section data. In contrast with D/H and 4He, 7Li predictions continue to disagree with observations, perhaps pointing to new physics. This paper concludes with a look at future directions including key nuclear reactions, astronomical observations, and theoretical issues.

  2. Big Sky Carbon Sequestration Partnership

    SciTech Connect

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research agenda in Carbon Sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  3. Benchmarking Big Data Systems and the BigData Top100 List.

    PubMed

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TPC), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  4. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    ERIC Educational Resources Information Center

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  5. Enhancement of β-catenin activity by BIG1 plus BIG2 via Arf activation and cAMP signals

    PubMed Central

    Li, Chun-Chun; Le, Kang; Kato, Jiro; Moss, Joel; Vaughan, Martha

    2016-01-01

    Multifunctional β-catenin, with critical roles in both cell–cell adhesion and Wnt-signaling pathways, was among HeLa cell proteins coimmunoprecipitated by antibodies against brefeldin A-inhibited guanine nucleotide-exchange factors 1 and 2 (BIG1 or BIG2) that activate ADP-ribosylation factors (Arfs) by accelerating the replacement of bound GDP with GTP. BIG proteins also contain A-kinase anchoring protein (AKAP) sequences that can act as scaffolds for multimolecular assemblies that facilitate and limit cAMP signaling temporally and spatially. Direct interaction of BIG1 N-terminal sequence with β-catenin was confirmed using yeast two-hybrid assays and in vitro synthesized proteins. Depletion of BIG1 and/or BIG2 or overexpression of guanine nucleotide-exchange factor inactive mutant, but not wild-type, proteins interfered with β-catenin trafficking, leading to accumulation at perinuclear Golgi structures. Both phospholipase D activity and vesicular trafficking were required for effects of BIG1 and BIG2 on β-catenin activation. Levels of PKA-phosphorylated β-catenin S675 and β-catenin association with PKA, BIG1, and BIG2 were also diminished after BIG1/BIG2 depletion. Inferring a requirement for BIG1 and/or BIG2 AKAP sequence in PKA modification of β-catenin and its effect on transcription activation, we confirmed dependence of S675 phosphorylation and transcription coactivator function on BIG2 AKAP-C sequence. PMID:27162341

  6. Big Data in Caenorhabditis elegans: quo vadis?

    PubMed

    Hutter, Harald; Moerman, Donald

    2015-11-05

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell.

  7. Big Data in Caenorhabditis elegans: quo vadis?

    PubMed Central

    Hutter, Harald; Moerman, Donald

    2015-01-01

    A clear definition of what constitutes “Big Data” is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of “complete” data sets for this organism is actually rather small—not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein–protein interaction—important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. PMID:26543198

  8. Resources available for autism research in the big data era: a systematic review.

    PubMed

    Al-Jawahiri, Reem; Milne, Elizabeth

    2017-01-01

    Recently, there has been a move encouraged by many stakeholders towards generating big, open data in many areas of research. One area where big, open data is particularly valuable is in research relating to complex heterogeneous disorders such as Autism Spectrum Disorder (ASD). The inconsistencies of findings and the great heterogeneity of ASD necessitate the use of big and open data to tackle important challenges such as understanding and defining the heterogeneity and potential subtypes of ASD. To this end, a number of initiatives have been established that aim to develop big and/or open data resources for autism research. In order to provide a useful data reference for autism researchers, a systematic search for ASD data resources was conducted using the Scopus database, the Google search engine, and the pages on 'recommended repositories' by key journals, and the findings were translated into a comprehensive list focused on ASD data. The aim of this review is to systematically search for all available ASD data resources providing the following data types: phenotypic, neuroimaging, human brain connectivity matrices, human brain statistical maps, biospecimens, and ASD participant recruitment. A total of 33 resources were found containing different types of data from varying numbers of participants. Description of the data available from each data resource, and links to each resource is provided. Moreover, key implications are addressed and underrepresented areas of data are identified.

  9. Resources available for autism research in the big data era: a systematic review

    PubMed Central

    Milne, Elizabeth

    2017-01-01

    Recently, there has been a move encouraged by many stakeholders towards generating big, open data in many areas of research. One area where big, open data is particularly valuable is in research relating to complex heterogeneous disorders such as Autism Spectrum Disorder (ASD). The inconsistencies of findings and the great heterogeneity of ASD necessitate the use of big and open data to tackle important challenges such as understanding and defining the heterogeneity and potential subtypes of ASD. To this end, a number of initiatives have been established that aim to develop big and/or open data resources for autism research. In order to provide a useful data reference for autism researchers, a systematic search for ASD data resources was conducted using the Scopus database, the Google search engine, and the pages on ‘recommended repositories’ by key journals, and the findings were translated into a comprehensive list focused on ASD data. The aim of this review is to systematically search for all available ASD data resources providing the following data types: phenotypic, neuroimaging, human brain connectivity matrices, human brain statistical maps, biospecimens, and ASD participant recruitment. A total of 33 resources were found containing different types of data from varying numbers of participants. Description of the data available from each data resource, and links to each resource is provided. Moreover, key implications are addressed and underrepresented areas of data are identified. PMID:28097074

  10. The Big Five personality dimensions and mental health: The mediating role of alexithymia.

    PubMed

    Atari, Mohammad; Yaghoubirad, Mahsa

    2016-12-01

    The role of personality constructs on mental health has attracted research attention in the last few decades. The Big Five personality traits have been introduced as parsimonious dimensions of non-pathological traits. The five-factor model of personality includes neuroticism, agreeableness, conscientiousness, extraversion, and openness to experience. The present study aimed to examine the relationship between the Big Five dimensions and mental health considering the mediating role of alexithymia as an important emotional-processing construct. A total of 257 participants were recruited from non-clinical settings in the general population. All participants completed the Ten-Item Personality Inventory (TIPI), 20-item Toronto Alexithymia Scale (TAS-20), and General Health Questionnaire-28 (GHQ-28). Structural equation modeling was utilized to examine the hypothesized mediated model. Findings indicated that the Big Five personality dimensions could significantly predict scores of alexithymia. Moreover, alexithymia could predict mental health scores as measured by indices of depression, anxiety, social functioning, and somatic symptoms. The fit indices (GFI=0.94; CFI=0.91; TLI=0.90; RMSEA=0.071; CMIN/df=2.29) indicated that the model fits the data. Therefore, the relationship between the Big Five personality dimensions and mental health is mediated by alexithymia.
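
    The mediated path in this record (trait → alexithymia → mental health) can be illustrated with the classic product-of-coefficients estimate. The study itself fit a structural equation model on TIPI, TAS-20 and GHQ-28 scores; the simulation below is only a toy sketch with invented coefficients and plain least squares, not the authors' analysis.

```python
import numpy as np

# Toy simulation of simple mediation X -> M -> Y, echoing the paper's path:
# a Big Five trait (X) influences alexithymia (M), which influences distress (Y).
# All population coefficients here are invented for illustration.
rng = np.random.default_rng(42)
n = 5000
x = rng.normal(size=n)                      # predictor, e.g. a trait score
m = 0.5 * x + rng.normal(size=n)            # mediator: alexithymia (a = 0.5)
y = 0.4 * m + 0.1 * x + rng.normal(size=n)  # outcome (b = 0.4, c' = 0.1)

def ols_slopes(cols, resp):
    """Least-squares slopes (intercept dropped) for resp ~ cols."""
    design = np.column_stack([np.ones(len(resp))] + cols)
    coef, *_ = np.linalg.lstsq(design, resp, rcond=None)
    return coef[1:]

a = ols_slopes([x], m)[0]           # X -> M path
b, c_prime = ols_slopes([m, x], y)  # M -> Y path, and direct effect of X
indirect = a * b                    # mediated (indirect) effect, approx. 0.2
```

In a full analysis one would attach a confidence interval to the indirect effect (e.g. by bootstrap), which is essentially what the fitted SEM in the paper provides.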

  11. Patterns of public participation.

    PubMed

    Slutsky, Jean; Tumilty, Emma; Max, Catherine; Lu, Lanting; Tantivess, Sripen; Hauegen, Renata Curi; Whitty, Jennifer A; Weale, Albert; Pearson, Steven D; Tugendhaft, Aviva; Wang, Hufeng; Staniszewska, Sophie; Weerasuriya, Krisantha; Ahn, Jeonghoon; Cubillos, Leonardo

    2016-08-15

    Purpose - The paper summarizes data from 12 countries, chosen to exhibit wide variation, on the role and place of public participation in the setting of priorities. The purpose of this paper is to exhibit cross-national patterns in respect of public participation, linking those differences to institutional features of the countries concerned. Design/methodology/approach - The approach is an example of case-orientated qualitative assessment of participation practices. It derives its data from the presentation of country case studies by experts on each system. The country cases are located within the historical development of democracy in each country. Findings - Patterns of participation are widely variable. Participation that is effective through routinized institutional processes appears to be inversely related to contestatory participation that uses political mobilization to challenge the legitimacy of the priority setting process. No system has resolved the conceptual ambiguities that are implicit in the idea of public participation. Originality/value - The paper draws on a unique collection of country case studies in participatory practice in prioritization, supplementing existing published sources. In showing that contestatory participation plays an important role in a sub-set of these countries it makes an important contribution to the field because it broadens the debate about public participation in priority setting beyond the use of minipublics and the observation of public representatives on decision-making bodies.

  12. Cosmic inflation and big bang interpreted as explosions

    NASA Astrophysics Data System (ADS)

    Rebhan, E.

    2012-12-01

    It has become common understanding that the recession of galaxies and the corresponding redshift of light received from them can only be explained by an expansion of the space between them and us. In this paper, for the presently favored case of a universe without spatial curvature, it is shown that this interpretation is restricted to comoving coordinates. It is proven by construction that within the framework of general relativity other coordinates exist in relation to which these phenomena can be explained by a motion of the cosmic substrate across space, caused by an explosionlike big bang or by inflation preceding an almost big bang. At the place of an observer, this motion occurs without any spatial expansion. It is shown that in these “explosion coordinates” the usual redshift comes about by a Doppler shift and a subsequent gravitational shift. Making use of this interpretation, it can easily be understood why in comoving coordinates light rays of short spatial extension expand and thus constitute an exception to the rule that small objects up to the size of the solar system or even galaxies do not participate in the expansion of the universe. It is also discussed how the two interpretations can be reconciled with each other.
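
    The two-step shift described in this record composes multiplicatively. In our own notation (a sketch of the standard convention, not taken from the paper), with z_D the Doppler part and z_g the gravitational part:

```latex
1 + z_{\mathrm{obs}} \;=\; \bigl(1 + z_{\mathrm{D}}\bigr)\,\bigl(1 + z_{\mathrm{g}}\bigr)
```

Neither factor alone reproduces the observed shift; their product is what must agree with the comoving-coordinate expansion redshift, which is how the two interpretations can be reconciled.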

  13. Transcriptome marker diagnostics using big data.

    PubMed

    Han, Henry; Liu, Ying

    2016-02-01

    The big omics data are challenging translational bioinformatics in an unprecedented way for their complexities and volumes. How to employ big omics data to achieve a rivalling-clinical, reproducible disease diagnosis from a systems approach is an urgent problem to be solved in translational bioinformatics and machine learning. In this study, the authors propose a novel transcriptome marker diagnosis to tackle this problem using big RNA-seq data by viewing the whole transcriptome systematically as a profile marker. The systems diagnosis not only avoids the reproducibility issue of the existing gene-/network-marker-based diagnostic methods, but also achieves rivalling-clinical diagnostic results by extracting true signals from big RNA-seq data. By using systems information, their method attains exceptional diagnostic performance compared with competing methods, demonstrates a better fit for personalised diagnostics, and positions itself as a good candidate for clinical usage. To the best of their knowledge, it is the first study on this topic and will inspire more investigations into big omics data diagnostics.

  14. A Review of Big Graph Mining Research

    NASA Astrophysics Data System (ADS)

    Atastina, I.; Sitohang, B.; Saptawati, G. A. P.; Moertini, V. S.

    2017-03-01

    “Big Graph Mining” is a continuously developing research area that began in 2009 and remains active. After 7 years, many researches have taken this topic as their main concern. However, there is no mapping or summary of the important issues and solutions in this topic. This paper contains a summary of the research that has been conducted since 2009. The results are grouped by the algorithms, the systems built, and the preprocessing techniques that have been developed. Based on the survey, 11 algorithms and 6 distributed systems for analysing Big Graphs have been improved, while improved preprocessing algorithms cover only sampling and compression techniques. The improved algorithms are usually aimed at frequent subgraph discovery; only a few are aimed at clustering Big Graphs, and there is no algorithm to classify Big Graphs. This survey concludes that more research needs to be conducted to build a comprehensive Graph Mining system, especially for very big graphs.

  15. Volume and Value of Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions.

  16. Volume and Value of Big Healthcare Data

    PubMed Central

    Dinov, Ivo D.

    2016-01-01

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309

  17. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks) that would complement the ongoing DOE research. During the third quarter, planning efforts are underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. 
The Partnership recognizes the critical importance of measurement, monitoring, and verification

  18. Big Science and the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Giudice, Gian Francesco

    2012-03-01

    The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.

  19. Big Data Analytics for Genomic Medicine.

    PubMed

    He, Karen Y; Ge, Dongliang; He, Max M

    2017-02-15

    Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients' genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining various large-scale data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure exhibit challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we also present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs.

  20. Unsupervised Tensor Mining for Big Data Practitioners.

    PubMed

    Papalexakis, Evangelos E; Faloutsos, Christos

    2016-09-01

    Multiaspect data are ubiquitous in modern Big Data applications. For instance, different aspects of a social network are the different types of communication between people, the time stamp of each interaction, and the location associated with each individual. How can we jointly model all those aspects and leverage the additional information that they introduce to our analysis? Tensors, which are multidimensional extensions of matrices, are a principled and mathematically sound way of modeling such multiaspect data. In this article, our goal is to popularize tensors and tensor decompositions to Big Data practitioners by demonstrating their effectiveness, outlining challenges that pertain to their application in Big Data scenarios, and presenting our recent work that tackles those challenges. We view this work as a step toward a fully automated, unsupervised tensor mining tool that can be easily and broadly adopted by practitioners in academia and industry.
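
    The tensor modeling this record advocates can be made concrete with a small sketch. The dimensions and factors below are invented; the code builds a 3-way tensor from CP (canonical polyadic) factors, one of the standard decompositions the tensor-mining literature uses, and checks the textbook mode-1 unfolding identity. It is an illustration of the structure, not the authors' tool.

```python
import numpy as np

# Hypothetical rank-2 CP model of a 3-way tensor, e.g. (person, person, time):
# X[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r].
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 2
A, B, C = rng.random((I, R)), rng.random((J, R)), rng.random((K, R))

# Assemble the tensor from its CP factors via einsum.
X = np.einsum('ir,jr,kr->ijk', A, B, C)

# Mode-1 unfolding identity: X_(1) = A (C (.) B)^T, with (.) the Khatri-Rao
# (column-wise Kronecker) product, ordered to match C-contiguous reshaping.
khatri_rao = np.einsum('jr,kr->jkr', B, C).reshape(J * K, R)
X1 = X.reshape(I, J * K)
assert np.allclose(X1, A @ khatri_rao.T)
```

Fitting algorithms such as alternating least squares exploit exactly this identity: each factor matrix solves a linear least-squares problem against the corresponding unfolding.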

  1. Little Big Horn River Water Quality Project

    SciTech Connect

    Bad Bear, D.J.; Hooker, D.

    1995-10-01

    This report summarizes the accomplishments of the Water Quality Project on the Little Big Horn River during the summer of 1995. Most of the summer was spent collecting data on the Little Big Horn River and then running a number of different tests on the water samples at Little Big Horn College in Crow Agency, Montana. The intention of this study is to perform stream quality analysis to gain an understanding of the quality of a selected portion of the river, to assess any impact that existing developments may be causing to the environment, and to gather baseline data that will provide information concerning the proposed development. Citizens of the reservation have expressed concern about the quality of water on the reservation: surface waters, ground water, and well waters.

  2. Big data in food safety; an overview.

    PubMed

    Marvin, Hans J P; Janssen, Esmée M; Bouzembrak, Yamine; Hendriksen, Peter J M; Staats, Martijn

    2016-11-07

    Technology is now being developed that is able to handle vast amounts of structured and unstructured data from diverse sources and origins. These technologies are often referred to as big data, and they open new areas of research and applications that will have an increasing impact in all sectors of our society. In this paper we assessed to what extent big data is being applied in the food safety domain and identified several promising trends. In several parts of the world, governments stimulate the publication on the internet of all data generated in publicly funded research projects. This policy opens new opportunities for stakeholders dealing with food safety to address issues that were not addressable before. The application of mobile phones as detection devices for food safety and the use of social media as an early warning of food safety problems are a few examples of the new developments that are possible due to big data.

  3. Big Data Analytics for Genomic Medicine

    PubMed Central

    He, Karen Y.; Ge, Dongliang; He, Max M.

    2017-01-01

    Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients’ genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights by examining various large-scale data sets. While the integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure present challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identifying clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs. PMID:28212287

  4. The dominance of big pharma: power.

    PubMed

    Edgar, Andrew

    2013-05-01

    The purpose of this paper is to provide a normative model for the assessment of the exercise of power by Big Pharma. By drawing on the work of Steven Lukes, it will be argued that while Big Pharma is overtly highly regulated, so that its power is indeed restricted in the interests of patients and the general public, the industry is still able to exercise what Lukes describes as a third dimension of power. This entails concealing the conflicts of interest and grievances that Big Pharma may have with the health care system, physicians and patients, crucially through rhetorical engagements with Patient Advocacy Groups that seek to shape public opinion, and also by marginalising certain groups, excluding them from debates over health care resource allocation. Three issues will be examined: the construction of a conception of the patient as expert patient or consumer; the phenomenon of disease mongering; the suppression or distortion of debates over resource allocation.

  5. BigMouth: a multi-institutional dental data repository.

    PubMed

    Walji, Muhammad F; Kalenderian, Elsbeth; Stark, Paul C; White, Joel M; Kookal, Krishna K; Phan, Dat; Tran, Duong; Bernstam, Elmer V; Ramoni, Rachel

    2014-01-01

    Few oral health databases are available for research and the advancement of evidence-based dentistry. In this work we developed a centralized data repository derived from electronic health records (EHRs) at four dental schools participating in the Consortium of Oral Health Research and Informatics. A multi-stakeholder committee developed a data governance framework that encouraged data sharing while allowing control of contributed data. We adopted the i2b2 data warehousing platform and mapped data from each institution to a common reference terminology. We realized that dental EHRs urgently need to adopt common terminologies. While all used the same treatment code set, only three of the four sites used a common diagnostic terminology, and there were wide discrepancies in how medical and dental histories were documented. BigMouth was successfully launched in August 2012 with data on 1.1 million patients, and made available to users at the contributing institutions.

  6. BigMouth: a multi-institutional dental data repository

    PubMed Central

    Walji, Muhammad F; Kalenderian, Elsbeth; Stark, Paul C; White, Joel M; Kookal, Krishna K; Phan, Dat; Tran, Duong; Bernstam, Elmer V; Ramoni, Rachel

    2014-01-01

    Few oral health databases are available for research and the advancement of evidence-based dentistry. In this work we developed a centralized data repository derived from electronic health records (EHRs) at four dental schools participating in the Consortium of Oral Health Research and Informatics. A multi-stakeholder committee developed a data governance framework that encouraged data sharing while allowing control of contributed data. We adopted the i2b2 data warehousing platform and mapped data from each institution to a common reference terminology. We realized that dental EHRs urgently need to adopt common terminologies. While all used the same treatment code set, only three of the four sites used a common diagnostic terminology, and there were wide discrepancies in how medical and dental histories were documented. BigMouth was successfully launched in August 2012 with data on 1.1 million patients, and made available to users at the contributing institutions. PMID:24993547

  7. How do we identify big rivers? And how big is big?

    NASA Astrophysics Data System (ADS)

    Miall, Andrew D.

    2006-04-01

    "Big rivers" are the trunk rivers that carry the water and sediment load from major orogens, or that drain large areas of a continent. Identifying such rivers in the ancient record is a challenge. Some guidance may be provided by tectonic setting and sedimentological evidence, including the scale of architectural elements, and clues from provenance studies, but such data are not infallible guides to river magnitude. The scale of depositional elements is the most obvious clue to channel size, but evidence is typically sparse and inadequate, and may be misleading. For example, thick fining-upward successions may be tectonic cyclothems. Two examples of the analysis of large ancient river systems are discussed here in order to highlight problems of methodology and interpretation. The Hawkesbury Sandstone (Triassic) of the Sydney Basin, Australia, is commonly cited as the deposit of a large river, on the basis of abundant very large-scale crossbedding. An examination of very large outcrops of this unit, including a coastal cliff section 6 km long near Sydney, showed that even with 100% exposure there are ambiguities in the determination of channel scale. It was concluded in this case that the channel dimensions of the Hawkesbury rivers were about half the size of the modern Brahmaputra River. The tectonic setting of a major ancient fluvial system is commonly not a useful clue to river scale. The Hawkesbury Sandstone is a system draining transversely from a cratonic source into a foreland basin, whereas most large rivers in foreland basins flow axially and are derived mainly from the orogenic uplifts (e.g., the large tidally influenced rivers of the Athabasca Oil Sands, Alberta). Epeirogenic tilting of a continent by the dynamic topography process may generate drainages in unexpected directions. For example, analyses of detrital zircons in Upper Paleozoic-Mesozoic nonmarine successions in the SW United States suggest significant derivation from the Appalachian orogen

  8. How quantum is the big bang?

    PubMed

    Bojowald, Martin

    2008-06-06

    When quantum gravity is used to discuss the big bang singularity, the most important, though rarely addressed, question is what role genuine quantum degrees of freedom play. Here, complete effective equations are derived for isotropic models with an interacting scalar to all orders in the expansions involved. The resulting coupling terms show that quantum fluctuations do not affect the bounce much. Quantum correlations, however, do have an important role and could even eliminate the bounce. How quantum gravity regularizes the big bang depends crucially on properties of the quantum state.

  9. Livermore Big Trees Park: 1998 Results

    SciTech Connect

    Mac Queen, D; Gallegos, G; Surano, K

    2002-04-18

    This report is an in-depth study of results from environmental sampling conducted in 1998 by the Lawrence Livermore National Laboratory (LLNL) at Big Trees Park in the city of Livermore. The purpose of the sampling was to determine the extent and origin of plutonium found in soil at concentrations above fallout-background levels in the park. This report describes the sampling that was conducted, the chemical and radio-chemical analyses of the samples, the quality control assessments and statistical analyses of the analytical results, and LLNL's interpretations of the results. It includes a number of data analyses not presented in LLNL's previous reports on Big Trees Park.

  10. Harnessing the Heart of Big Data

    PubMed Central

    Scruggs, Sarah B.; Watson, Karol; Su, Andrew I.; Hermjakob, Henning; Yates, John R.; Lindsey, Merry L.; Ping, Peipei

    2015-01-01

    The exponential increase in Big Data generation, combined with limited capitalization on the wealth of information embedded within Big Data, has prompted us to revisit our scientific discovery paradigms. A successful transition into this digital era of medicine holds great promise for advancing fundamental knowledge in biology, innovating human health, and driving personalized medicine; however, this will require a drastic shift of research culture in how we conceptualize science and use data. An e-transformation will require global adoption and synergism among the computational science, biomedical research, and clinical domains. PMID:25814682

  11. The origin of the big-bang

    NASA Astrophysics Data System (ADS)

    Thakur, R. K.

    1992-04-01

    A singularity-free model of the universe is developed within the framework of the Friedmann-Lemaitre-Robertson-Walker cosmology, which gives a physical explanation for the origin of the big bang and for the preponderance of matter over antimatter. It is shown that the model retains all the useful features of the standard-cosmology (Weinberg, 1972; Sandage, 1988) hot big-bang (HBB) model and resolves, in a very natural way, all the difficulties of the HBB model, such as the occurrence of a space-time singularity. The new model also resolves the problem of the baryon asymmetry and can account for the currently observed value of nu.

  12. Effective dynamics of the matrix big bang

    SciTech Connect

    Craps, Ben; Rajaraman, Arvind; Sethi, Savdeep

    2006-05-15

    We study the leading quantum effects in the recently introduced matrix big bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the big bang. Surprisingly, the potential decays very rapidly at late times where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form of our one-loop potential seems to violate these constraints in a manner that suggests a connection between the cosmological singularity and long wavelength, late time physics.

  13. Use of Big-Screen Films in Multiple Childbirth Education Classroom Settings

    PubMed Central

    Kaufman, Tamara

    2010-01-01

    Although two recent films, Orgasmic Birth and Pregnant in America, were intended for the big screen, they can also serve as valuable teaching resources in multiple childbirth education settings. Each film conveys powerful messages about birth and today's birthing culture. Depending on a childbirth educator's classroom setting (hospital, birthing center, or home birth environment), particular portions in each film, along with extra clips featured on the films' DVDs, can enhance an educator's curriculum and spark compelling discussions with class participants. PMID:21358831

  14. The Mixed Plate: A Field Experience on the Cultural and Environmental Diversity of the Big Island of Hawai'i

    ERIC Educational Resources Information Center

    Strait, John B.; Fujimoto-Strait, Ava R.

    2017-01-01

    The intent of this paper was to outline a field endeavor that encourages increased insight into important geographic themes pertaining to the Big Island of Hawai'i. Student participants in this field course come away with an enhanced comprehension and appreciation of the benefits associated with learning to incorporate geographical perspectives as…

  15. A Comparative Investigation of the BigCAT and Erickson S-24 Measures of Speech-Associated Attitude

    ERIC Educational Resources Information Center

    Vanryckeghem, Martine; Brutten, Gene J.

    2012-01-01

    The BigCAT and the Erickson S-24, self-report measures of communication attitude, were administered in a randomly determined order to 72 adults who stuttered (PWS) and 72 who did not (PWNS). The two groups of participants differed from each other to a statistically significant extent on both of these measures of speech-associated attitude,…

  16. AirMSPI PODEX BigSur Terrain Images

    Atmospheric Science Data Center

    2013-12-13

    Browse Images from the PODEX 2013 Campaign: Big Sur target (Big Sur, California), 02/03/2013, terrain-projected. For more information, see the Data Product Specifications (DPS).

  17. Big Creek Hydroelectric System, East & West Transmission Line, 241-mile ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Big Creek Hydroelectric System, East & West Transmission Line, 241-mile transmission corridor extending between the Big Creek Hydroelectric System in the Sierra National Forest in Fresno County and the Eagle Rock Substation in Los Angeles, California, Visalia, Tulare County, CA

  18. Research recruitment using Facebook advertising: big potential, big challenges.

    PubMed

    Kapp, Julie M; Peters, Colleen; Oliver, Debra Parker

    2013-03-01

    To our knowledge, ours is the first study to report on Facebook advertising as an exclusive mechanism for recruiting women ages 35-49 years residing in the USA into a health-related research study. We directed our survey to women ages 35-49 years who resided in the USA, exclusively using three Facebook advertisements. Women were then redirected to our survey site. There were 20,568,960 women on Facebook who met the eligibility criteria. The three ads resulted in 899,998 impressions with a reach of 374,225 women. Of the women reached, 280 women (0.075 %) clicked the ad. Of the women who clicked the ad, nine women (3.2 %) proceeded past the introductory page. Social networking, and in particular Facebook, is an innovative venue for recruiting participants for research studies. Challenges include developing an ad to foster interest without biasing the sample, and motivating women who click the ad to complete the survey. There is still much to learn about this potential method of recruitment.
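The funnel percentages reported in this abstract follow from simple arithmetic on the quoted figures; a short sketch (using only numbers taken from the abstract itself) reproduces them:

```python
# Recruitment funnel figures reported in the abstract.
eligible = 20_568_960   # women on Facebook meeting the eligibility criteria
impressions = 899_998   # total ad impressions across the three ads
reach = 374_225         # distinct women shown an ad
clicks = 280            # women who clicked an ad
proceeded = 9           # women who went past the introductory page

click_rate = clicks / reach          # fraction of reached women who clicked
proceed_rate = proceeded / clicks    # fraction of clickers who continued

print(f"{click_rate:.3%}")    # 0.075%
print(f"{proceed_rate:.1%}")  # 3.2%
```

The striking drop at each stage (reach to click, click to completion) is the quantitative core of the "big challenges" in the title.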

  19. Participative Management at Work

    ERIC Educational Resources Information Center

    Harvard Business Review, 1977

    1977-01-01

    This interview with the chief executive of Donnelly Mirrors, Inc. explains the basis of the company's leadership in participative management and discusses why it is more successful than traditional authority-based management styles. (Author/JG)

  20. Treatment Integrity: Revisiting Some Big Ideas

    ERIC Educational Resources Information Center

    Greenwood, Charles R.

    2009-01-01

    The contributors to this special issue have helped everyone consider the next steps in building a research and practice agenda regarding the use of treatment integrity. Such an agenda must converge with the big ideas that link treatment integrity to the effectiveness of evidence-based practices (EBPs), and ultimately that of the profession. In…

  1. The Big Ideas behind Whole System Reform

    ERIC Educational Resources Information Center

    Fullan, Michael

    2010-01-01

    Whole system reform means that every vital part of the system--school, community, district, and government--contributes individually and in concert to forward movement and success, using practice, not research, as the driver of reform. With this in mind, several "big ideas", based on successful implementation, informed Ontario's reform…

  2. Marketing Your Library with the Big Read

    ERIC Educational Resources Information Center

    Johnson, Wendell G.

    2012-01-01

    The Big Read was developed by the National Endowment for the Arts to revitalize the role of culture in American society and encourage the reading of landmark literature. Each year since 2007, the DeKalb Public Library, Northern Illinois University, and Kishwaukee Community College have partnered to foster literacy in the community. This article…

  3. Big-Time Fundraising for Today's Schools

    ERIC Educational Resources Information Center

    Levenson, Stanley

    2006-01-01

    In this enlightening book, nationally recognized author and fundraising consultant Stanley Levenson shows school leaders how to move away from labor-intensive, nickel-and-dime bake sales and car washes, and into the world of big-time fundraising. Following the model used by colleges and universities, the author presents a wealth of practical…

  4. Challenges of Big Data in Educational Assessment

    ERIC Educational Resources Information Center

    Gibson, David C.; Webb, Mary; Ifenthaler, Dirk

    2015-01-01

    This paper briefly discusses four measurement challenges of data science or "big data" in educational assessments that are enabled by technology: 1. Dealing with change over time via time-based data. 2. How a digital performance space's relationships interact with learner actions, communications and products. 3. How layers of…

  5. Functional connectomics from a "big data" perspective.

    PubMed

    Xia, Mingrui; He, Yong

    2017-02-14

    In the last decade, explosive growth in functional connectome studies has been observed. Accumulating knowledge has significantly contributed to our understanding of the brain's functional network architectures in health and disease. With the development of innovative neuroimaging techniques, the establishment of large brain datasets and the increasing accumulation of published findings, functional connectomic research has begun to move into the era of "big data", which generates unprecedented opportunities for discovery in brain science and simultaneously encounters various challenging issues, such as data acquisition, management and analyses. Big data on the functional connectome exhibits several critical features: high spatial and/or temporal precision, large sample sizes, long-term recording of brain activity, multidimensional biological variables (e.g., imaging, genetic, demographic, cognitive and clinical) and/or vast quantities of existing findings. We review studies regarding functional connectomics from a big data perspective, with a focus on recent methodological advances in state-of-the-art image acquisition (e.g., multiband imaging), analysis approaches and statistical strategies (e.g., graph theoretical analysis, dynamic network analysis, independent component analysis, multivariate pattern analysis and machine learning), as well as reliability and reproducibility validations. We highlight the novel findings in the application of functional connectomic big data to the exploration of the biological mechanisms of cognitive functions, normal development and aging and of neurological and psychiatric disorders. We advocate the urgent need to expand efforts directed at the methodological challenges and discuss the direction of applications in this field.
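As a toy illustration of one technique named in this abstract, graph theoretical analysis (not the authors' pipeline), functional connectivity is often turned into a graph by thresholding a correlation matrix of regional time series; node degree is then the simplest network metric. All sizes and the threshold below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "functional connectivity": correlations among 5 regional time series.
ts = rng.standard_normal((5, 100))   # 5 brain regions x 100 time points
corr = np.corrcoef(ts)               # 5x5 symmetric correlation matrix

# Threshold into a binary adjacency matrix, ignoring self-connections.
adj = (np.abs(corr) > 0.2) & ~np.eye(5, dtype=bool)

# Node degree: number of edges incident to each region.
degree = adj.sum(axis=1)
print(degree)
```

Real connectome studies operate on hundreds of regions and many subjects, and use richer metrics (clustering, path length, modularity), but the matrix-to-graph step has this shape.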

  6. Big-Time Sports in American Universities

    ERIC Educational Resources Information Center

    Clotfelter, Charles T.

    2011-01-01

    For almost a century, big-time college sports has been a wildly popular but consistently problematic part of American higher education. The challenges it poses to traditional academic values have been recognized from the start, but they have grown more ominous in recent decades, as cable television has become ubiquitous, commercial opportunities…

  7. Big Data Cognition for City Emergency Rescue

    NASA Astrophysics Data System (ADS)

    Zhang, Xin; Chen, Yongxin; Wang, Weisheng

    2016-11-01

    Many kinds of data are produced in daily city life, and they operate as an elementary component of the citizen life-support system. Unexpected incidents in a city occur in seemingly unpredictable patterns. With Big Data analysis, emergency rescue can be carried out efficiently. In this paper, Big Data cognition for city emergency rescue is studied from four perspectives. From the data volume perspective, spatial data analysis is divided into two parts: indoor data and outdoor data. From the data velocity perspective, big data are collected from "eyes in the sky" and on-the-ground object networks, together with demographic data. From the data variety perspective, population distribution data, socio-economic data and model estimates are included. From the data value mining perspective, crime model estimates are studied. In the end, an application to emergency rescue at big public venues in Urumqi, Xinjiang, China is introduced.

  8. Big Gods: Extended prosociality or group binding?

    PubMed

    Galen, Luke W

    2016-01-01

    Big Gods are described as having a "prosocial" effect. However, this conflates parochialism (group cohesion) with cooperation extended to strangers or out-group members. An examination of the cited experimental studies indicates that religion is actually associated with increased within-group parochialism, rather than extended or universal prosociality, and that the same general mechanisms underlie both religious and secular effects.

  9. The Lure of the Big Time.

    ERIC Educational Resources Information Center

    Krinsky, Ira W.; Rudiger, Charles W.

    1991-01-01

    Despite all the horror stories about big-city politics, diminishing resources, and pressure-cooker workloads, urban superintendencies continue to attract a certain breed of men and women. Frequently cited reasons include the challenge, sophistication, complexity, resources, diversity, people, visibility, and compensation associated with the job.…

  10. Big Bubbles in Boiling Liquids: Students' Views

    ERIC Educational Resources Information Center

    Costu, Bayram

    2008-01-01

    The aim of this study was to elicit students' conceptions about big bubbles in boiling liquids (water, ethanol and aqueous CuSO[subscript 4] solution). The study is based on twenty-four students at different ages and grades. The clinical interviews technique was conducted to solicit students' conceptions and the interviews were analyzed to…

  11. A Big Problem for Magellan: Food Preservation

    ERIC Educational Resources Information Center

    Galvao, Cecilia; Reis, Pedro; Freire, Sofia

    2008-01-01

    In this paper, we present data related to how a Portuguese teacher developed the module "A big problem for Magellan: Food preservation." Students were asked to plan an investigation in order to identify which were the best food preservation methods in the XV and XVI centuries of Portuguese overseas navigation, and then establish a…

  12. Big Broadband Connectivity in the United States

    ERIC Educational Resources Information Center

    Windhausen, John, Jr.

    2008-01-01

    The economic and social future of the United States depends on answering the growing demand for very high-speed broadband connectivity, a capability termed "big broadband." Failure to take on the challenge could lead to a decline in global competitiveness and an inability to educate students. (Contains 20 notes.)

  13. Integrating "big data" into surgical practice.

    PubMed

    Mathias, Brittany; Lipori, Gigi; Moldawer, Lyle L; Efron, Philip A

    2016-02-01

    'Big data' is the next frontier of medicine. We now have the ability to generate and analyze large quantities of healthcare data. Although interpreting and integrating this information into clinical practice poses many challenges, the potential benefits of personalized medicine are seemingly without limit.

  14. Financing Big City Schools: Some Possible Breakthroughs.

    ERIC Educational Resources Information Center

    Marland, S.P., Jr.

    Among the many factors contributing to the crisis in big-city school finance are (1) the in-migration of the poor to the cities accompanied by the out-migration of the higher-income people; (2) higher teacher salaries; (3) the new mandates placed on schools such as cradle-to-grave accommodation in educational opportunities, manpower retraining,…

  15. Big physics quartet win government backing

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2014-09-01

    Four major physics-based projects are among 10 to have been selected by Japan’s Ministry of Education, Culture, Sports, Science and Technology for funding in the coming decade as part of its “roadmap” of big-science projects.

  16. Big Crater as Viewed by Pathfinder Lander

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.

    The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft.

    This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be.

    Mars Pathfinder is the second in NASA

  17. Black Hole Blows Big Bubble

    NASA Astrophysics Data System (ADS)

    2010-07-01

    astronomers understand the similarity between small black holes formed from exploded stars and the supermassive black holes at the centres of galaxies. Very powerful jets have been seen from supermassive black holes, but are thought to be less frequent in the smaller microquasar variety. The new discovery suggests that many of them may simply have gone unnoticed so far. The gas-blowing black hole is located 12 million light-years away, in the outskirts of the spiral galaxy NGC 7793 (eso0914b). From the size and expansion velocity of the bubble the astronomers have found that the jet activity must have been ongoing for at least 200 000 years. Notes [1] Astronomers do not yet have any means of measuring the size of the black hole itself. The smallest stellar black hole discovered so far has a radius of about 15 km. An average stellar black hole of about 10 solar masses has a radius of about 30 km, while a "big" stellar black hole may have a radius of up to 300 km. This is still much smaller than the jets, which extend out to several hundred light-years on each side of the black hole, or about several thousand million million km! More information This result appears in a paper published in this week's issue of the journal Nature (A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793, by Manfred W. Pakull, Roberto Soria and Christian Motch). ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries.
ESO also plays a leading role in promoting and organising

  18. Black Hole Blows Big Bubble

    NASA Astrophysics Data System (ADS)

    2010-07-01

    astronomers understand the similarity between small black holes formed from exploded stars and the supermassive black holes at the centres of galaxies. Very powerful jets have been seen from supermassive black holes, but are thought to be less frequent in the smaller microquasar variety. The new discovery suggests that many of them may simply have gone unnoticed so far. The gas-blowing black hole is located 12 million light-years away, in the outskirts of the spiral galaxy NGC 7793 (eso0914b). From the size and expansion velocity of the bubble the astronomers have found that the jet activity must have been ongoing for at least 200 000 years. Note: [1] Astronomers do not yet have any means of measuring the size of the black hole itself. The smallest stellar black hole discovered so far has a radius of about 15 km. An average stellar black hole of about 10 solar masses has a radius of about 30 km, while a "big" stellar black hole may have a radius of up to 300 km. This is still much smaller than the jets, which extend out to 1000 light-years, or about 9000 million million km! More Information: This result appears in a paper published in this week's issue of the journal Nature (A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793, by Manfred W. Pakull, Roberto Soria and Christian Motch). ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research.
ESO operates
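The radius figures quoted in the note above follow from the standard Schwarzschild radius for a non-rotating black hole, r_s = 2GM/c². A minimal Python check, using rounded textbook values for the physical constants (this is an illustration, not part of the original release):

```python
# Schwarzschild radius r_s = 2 G M / c^2 of a non-rotating black hole.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def schwarzschild_radius_km(solar_masses: float) -> float:
    """Event-horizon radius in kilometres for a given mass in solar masses."""
    m = solar_masses * M_SUN
    return 2 * G * m / C**2 / 1000.0

# ~30 km for a 10-solar-mass stellar black hole, matching the note above.
print(round(schwarzschild_radius_km(10)))
```

Scaling is linear in mass, so the quoted 300 km "big" stellar black hole corresponds to roughly 100 solar masses under the same formula.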

  19. 75 FR 71069 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming... December 1, 2010, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big Horn County...

  20. 76 FR 59394 - Big Eddy-Knight Transmission Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-26

    ... Bonneville Power Administration Big Eddy-Knight Transmission Project AGENCY: Bonneville Power Administration...: This notice announces the availability of the ROD to implement the Big Eddy-Knight Transmission Project in Wasco County, Oregon and Klickitat County, Washington. Construction of the Big...

  1. 7. SOUTHEAST VIEW OF BIG DALTON DAM SHOWING THE MULTIPLE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. SOUTHEAST VIEW OF BIG DALTON DAM SHOWING THE MULTIPLE ARCHES, AN UPSTREAM VIEW OF THE PARAPET WALL ALONG THE CREST OF THE DAM, AND THE SHELTER HOUSE AT THE EAST END OF THE DAM. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  2. Big Ideas in Primary Mathematics: Issues and Directions

    ERIC Educational Resources Information Center

    Askew, Mike

    2013-01-01

    This article is located within the literature arguing for attention to Big Ideas in teaching and learning mathematics for understanding. The focus is on surveying the literature of Big Ideas and clarifying what might constitute Big Ideas in the primary Mathematics Curriculum based on both theoretical and pragmatic considerations. This is…

  3. ["Big data" - large data, a lot of knowledge?].

    PubMed

    Hothorn, Torsten

    2015-01-28

    For the past few years, the term Big Data has described technologies for extracting knowledge from data. Applications of Big Data and their consequences are also increasingly discussed in the mass media. Because medicine is an empirical science, we discuss the meaning of Big Data and its potential for future medical research.

  4. New Evidence on the Development of the Word "Big."

    ERIC Educational Resources Information Center

    Sena, Rhonda; Smith, Linda B.

    1990-01-01

    Results indicate that curvilinear trend in children's understanding of word "big" is not obtained in all stimulus contexts. This suggests that meaning and use of "big" is complex, and may not refer simply to larger objects in a set. Proposes that meaning of "big" constitutes a dynamic system driven by many perceptual,…

  5. 9. View from middle adit Wawona Tunnel of Big Oak ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. View from middle adit Wawona Tunnel of Big Oak Flat Road with retaining walls at lower left and center left with east portal of tunnel #1. - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  6. 16. AERIAL VIEW OF BIG DALTON DAM TAKEN ON 2161962 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. AERIAL VIEW OF BIG DALTON DAM TAKEN ON 2-16-1962 BY L.A. COUNTY PUBLIC WORKS PHOTOGRAPHER SINGER. PHOTO SHOWS THE RESERVOIR NEAR FULL CAPACITY AND WATER BEING RELEASED ON THE DOWNSTREAM SIDE. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  7. 15. UPSTREAM VIEW (PHOTOGRAPHER UNKNOWN) SHOWING BIG DALTON DAM NEAR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. UPSTREAM VIEW (PHOTOGRAPHER UNKNOWN) SHOWING BIG DALTON DAM NEAR FULL CAPACITY AFTER CONSTRUCTION. PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON 2-15-1973 BY PHOTOGRAPHER D. MEIER OF L.A. COUNTY PUBLIC WORKS. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  8. 15. AERIAL VIEW OF BIG TUJUNGA DAM TAKEN ON FEBRUARY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. AERIAL VIEW OF BIG TUJUNGA DAM TAKEN ON FEBRUARY 17, 1962, BY L.A. COUNTY PUBLIC WORKS PHOTOGRAPHER WEBB. PHOTO SHOWS THE RESERVOIR NEAR FULL CAPACITY AND WATER BEING RELEASED ON THE DOWNSTREAM SIDE. - Big Tujunga Dam, 809 West Big Tujunga Road, Sunland, Los Angeles County, CA

  9. View of New Big Oak Flat Road seen from Old ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of New Big Oak Flat Road seen from Old Wawona Road near location of photograph HAER CA-148-17. Note road cuts, alignment, and tunnels. Devils Dance Floor at left distance. Looking northwest - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  10. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its...

  11. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    PubMed

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or will solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet that it has been touted to be. Here our main concern is the overall impact of big data; the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is not "lots of data" as a phenomenon anymore; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT): applying no-boundary thinking to problem definition in order to address science challenges.

  12. The "BIG BIRD" of the "YELLOW YOUNG" man: do nontarget properties cascade?

    PubMed

    Roux, Sébastien; Bonin, Patrick; Kandel, Sonia

    2014-01-01

    This study investigated whether in speech production object properties flow in a cascaded manner or whether cascaded processing is restricted to the object's identity. In Experiments 1 and 2, participants saw pictured objects and had to state either their size (GRAND or PETIT-meaning big and small) or their name. The size of the objects varied as a function of the way they were presented on the computer screen (Experiment 1) or their real size in the world (Experiment 2). In Experiment 3, faces of young and old men were coloured in yellow or in green. The task was to name either the colour (JAUNE or VERT, meaning yellow and green, respectively) or the age (JEUNE or VIEUX, meaning young and old, respectively) of the face. In Experiments 1 and 2, no reliable effects of phonological relatedness ("GORILLE-grand"-a big gorilla) were found on the object-naming latencies. However, size-naming latencies were shorter when the adjective shared the initial phoneme of the picture name (i.e., "GRAND-gorille") than when it did not (i.e., "GRAND-dinosaure"-saying "big" in response to a big dinosaur). In Experiment 3, phonological overlap did not affect colour naming latencies, or age naming latencies. Overall, these findings strongly suggest that cascaded processing is restricted to the object's identity in conceptually driven naming tasks.

  13. Reducing waste in Big D

    SciTech Connect

    Woods, R.

    1994-03-01

    The city of Dallas is in a unique position among major metropolitan areas of the US. Despite boasting a population of more than 1 million, which grows by 1.3% each year, and a municipal solid waste (MSW) stream averaging 1.2 million tons per year, the city -- the eighth-largest in the country -- has another 50 years of landfill disposal capacity left. This fact is even more impressive when one considers that Dallas' only municipal facility, known as the McCommas Bluff Landfill, accepted 33% more MSW in 1993 than in 1992. The positive landfill situation, however, does not mean Dallas lacks ambition for recycling and waste reduction. Last April, the city set an aggressive goal of diverting 40% of its MSW from the landfill by 1997. To help reach this mark over the next three years, the city has banned lead-acid batteries, grass clippings, and leaves from landfills, mandated news and office paper collection, and set up a composting operation at the landfill. At present, Dallas recycles about 13% of its waste. Surprisingly, most of the residential recycling effort has been based on voluntary participation through dozens of drop-off sites scattered around the city, rather than on curbside service. The only curbside program the city is conducting is a once-per-week, blue bag pilot program involving 2,000 Dallas homes.

  14. Becoming a Runner: Big, Middle and Small Stories about Physical Activity Participation in Later Life

    ERIC Educational Resources Information Center

    Griffin, Meridith; Phoenix, Cassandra

    2016-01-01

    How do older adults learn to tell a "new" story about, through, and with the body? We know that narratives are embodied, lived and central to the process of meaning-making--and as such, they do not lie in the waiting for telling, but are an active part of everyday interaction. Telling stories about ourselves to others is one way in which…

  15. High School Students as Mentors: Findings from the Big Brothers Big Sisters School-Based Mentoring Impact Study

    ERIC Educational Resources Information Center

    Herrera, Carla; Kauh, Tina J.; Cooney, Siobhan M.; Grossman, Jean Baldwin; McMaken, Jennifer

    2008-01-01

    High schools have recently become a popular source of mentors for school-based mentoring (SBM) programs. The high school Bigs program of Big Brothers Big Sisters of America, for example, currently involves close to 50,000 high-school-aged mentors across the country. While the use of these young mentors has several potential advantages, their age…

  16. SETI as a part of Big History

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2014-08-01

    Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach based on combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to Humans; see, in particular, [17]) and Human History (Aztecs to USA; see [16]), and then we extrapolated even that into the future up to ten million years (see [18]), the minimum time requested for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model into the past so as to let it start at the Big Bang (13.8 billion years ago), thus merging Big History, Evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (the Black-Scholes model and the related 1997 Nobel Prize in Economics!), may be successfully applied to the whole of Big History. In particular, in this paper we derive Big History Theory based on GBMs: just as the GBM is the “movie” unfolding in time, so the Statistical Drake Equation is its “still picture”, static in time, and the GBM is the time-extension of the Drake Equation. Darwinian Evolution on Earth may be easily described as an increasing GBM in the number of living species on Earth over the last 3.5 billion years. The first of them was RNA 3.5 billion years ago, and now 50
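The Geometric Brownian Motion invoked in the abstract above is a standard stochastic process with the exact log-normal update S(t+dt) = S(t)·exp((μ − σ²/2)dt + σ√dt·Z). A minimal Python sketch of one simulated path; the parameter values are illustrative, not taken from the cited papers:

```python
import math
import random

def simulate_gbm(s0, mu, sigma, dt, steps, seed=0):
    """Simulate one Geometric Brownian Motion path with the exact update
    S_{t+dt} = S_t * exp((mu - sigma^2/2) * dt + sigma * sqrt(dt) * Z)."""
    rng = random.Random(seed)  # fixed seed for a reproducible path
    path = [s0]
    for _ in range(steps):
        z = rng.gauss(0.0, 1.0)
        path.append(path[-1] * math.exp((mu - 0.5 * sigma**2) * dt
                                        + sigma * math.sqrt(dt) * z))
    return path

# Illustrative increasing GBM (mu > 0), loosely in the spirit of the
# abstract's "increasing GBM in the number of living species".
path = simulate_gbm(s0=1.0, mu=0.05, sigma=0.1, dt=1.0, steps=100)
print(len(path), path[0], path[-1] > 0)
```

A GBM path is always positive, which is one reason the process is a natural candidate for counts of species or civilizations that cannot go negative.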

  17. Fixing the Big Bang Theory's Lithium Problem

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2017-02-01

    How did our universe come into being? The Big Bang theory is a widely accepted and highly successful cosmological model of the universe, but it does introduce one puzzle: the cosmological lithium problem. Have scientists now found a solution? Too Much Lithium. In the Big Bang theory, the universe expanded rapidly from a very high-density and high-temperature state dominated by radiation. This theory has been validated again and again: the discovery of the cosmic microwave background radiation and observations of the large-scale structure of the universe both beautifully support the Big Bang theory, for instance. But one pesky trouble-spot remains: the abundance of lithium. The arrows show the primary reactions involved in Big Bang nucleosynthesis, and their flux ratios, as predicted by the authors' model, are given on the right. Synthesizing primordial elements is complicated! [Hou et al. 2017] According to Big Bang nucleosynthesis theory, primordial nucleosynthesis ran wild during the first half hour of the universe's existence. This produced most of the universe's helium and small amounts of other light nuclides, including deuterium and lithium. But while predictions match the observed primordial deuterium and helium abundances, Big Bang nucleosynthesis theory overpredicts the abundance of primordial lithium by about a factor of three. This inconsistency is known as the cosmological lithium problem, and attempts to resolve it using conventional astrophysics and nuclear physics over the past few decades have not been successful. In a recent publication led by Suqing Hou (Institute of Modern Physics, Chinese Academy of Sciences) and advisor Jianjun He (Institute of Modern Physics / National Astronomical Observatories, Chinese Academy of Sciences), however, a team of scientists has proposed an elegant solution to this problem. Time and temperature evolution of the abundances of primordial light elements during the beginning of the universe. The authors' model (dotted lines

  18. Big data are coming to psychiatry: a general introduction.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Bauer, Michael

    2015-12-01

    Big data are coming to the study of bipolar disorder and all of psychiatry. Data are coming from providers and payers (including EMR, imaging, insurance claims and pharmacy data), from omics (genomic, proteomic, and metabolomic data), and from patients and non-providers (data from smart phone and Internet activities, sensors and monitoring tools). Analysis of the big data will provide unprecedented opportunities for exploration, descriptive observation, hypothesis generation, and prediction, and the results of big data studies will be incorporated into clinical practice. Technical challenges remain in the quality, analysis and management of big data. This paper discusses some of the fundamental opportunities and challenges of big data for psychiatry.

  19. BigNeuron dataset V.0.0

    DOE Data Explorer

    Ramanathan, Arvind

    2016-01-01

    The cleaned bench testing reconstructions for the gold166 datasets have been put online at github https://github.com/BigNeuron/Events-and-News/wiki/BigNeuron-Events-and-News https://github.com/BigNeuron/Data/releases/tag/gold166_bt_v1.0 The respective image datasets were released a while ago from other sites (a major pointer is available at github as well: https://github.com/BigNeuron/Data/releases/tag/Gold166_v1 but since the files were big, the actual downloading was distributed across three continents separately)

  20. Big Data - What is it and why it matters.

    PubMed

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time, yet despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets and the potential for new roles for library and information workers.

  1. Analysis of operator participation

    NASA Technical Reports Server (NTRS)

    Zarakovskiy, G. M.; Zinchenko, V. P.

    1973-01-01

    The problem of providing a psychological conception of the analysis of operator participation in a form that will allow the qualitative approach to be combined with the quantitative approach is examined. This conception is based on an understanding of the essence of human endeavor in automated control systems that now determine the development of society's productive forces and that are the main object of ergonomic research. Two main types of operator participation were examined: information retrieval with immediate service and information retrieval with delayed service.

  2. WE-H-BRB-00: Big Data in Radiation Oncology.

    PubMed

    Benedict, Stanley

    2016-06-01

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop, (2) Where do we stand in the applications of big data in radiation oncology?, and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio of panel presentations is to improve awareness of the wide-ranging opportunities for big data to impact patient care quality and to enhance the potential for research and collaboration opportunities with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13-14, 2015, and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis.

  3. Consumer participation in power market balancing. A real-life step towards smart grids

    NASA Astrophysics Data System (ADS)

    Per, Lund

    2014-09-01

    With the increasing role of wind and solar power, the power balance authorities are facing a big challenge: How to manage the increasing need for fast balancing power brought on by increased penetration of variable and difficult-to-forecast renewable generation? Could more active participation by the residential customers in managing electricity demand be a smart way to go?

  4. Widening Participation; Widening Capability

    ERIC Educational Resources Information Center

    Walker, Melanie

    2008-01-01

    This paper proposes that widening participation in higher education might distinctively be conceptualised beyond economically driven human capital outcomes, as a matter of widening capability. Specifically, the paper proposes forming the capability of students to become and to be "strong evaluators", able to make reflexive and informed…

  5. Participative Decision-Making.

    ERIC Educational Resources Information Center

    Lindelow, John; And Others

    Chapter 6 in a volume on school leadership, this chapter makes a case for the use of participative decision-making (PDM) at the school-site level, outlines guidelines for its implementation, and describes the experiences of some schools with PDM systems. It begins by citing research indicating the advantages of PDM, including better decisions,…

  6. Narrowing Participation Gaps

    ERIC Educational Resources Information Center

    Hand, Victoria; Kirtley, Karmen; Matassa, Michael

    2015-01-01

    Shrinking the achievement gap in mathematics is a tall order. One way to approach this challenge is to think about how the achievement gap manifests itself in the classroom and take concrete action. For example, opportunities to participate in activities that involve mathematical reasoning and argumentation in a safe and supportive manner are…

  7. Family Participation in Policymaking.

    ERIC Educational Resources Information Center

    Caplan, Elizabeth, Ed.; Blankenship, Kelly, Ed.; McManus, Marilyn, Ed.

    1998-01-01

    This bulletin focuses on family participation in mental health policymaking and highlights state efforts to increase family involvement. Articles include: (1) "Promoting Family Member Involvement in Children's Mental Health Policy Making Bodies," which describes how different states are promoting family member involvement in various statutory and…

  8. Increasing Participation through Differentiation

    ERIC Educational Resources Information Center

    Christenson, Bridget; Wager, Anita A.

    2012-01-01

    One of the many challenges teachers face is trying to differentiate instruction so all students have equal opportunities to participate, learn, and engage. To provide guidelines for differentiated instruction in mathematics, staff from the Madison Metropolitan School District in Wisconsin created a pedagogical framework for teaching called…

  9. Asking Questions about Participation

    ERIC Educational Resources Information Center

    Davies, Ian; Flanagan, Bernie; Hogarth, Sylvia; Mountford, Paula; Philpott, Jenny

    2009-01-01

    We raise questions about young people's participation in light of findings from a project ("Democracy through Citizenship") funded by the Joseph Rowntree Reform Trust Limited, and managed by the Institute for Citizenship. Following a six-month feasibility study the project took place over a three-year period in one local authority in the…

  10. Participative AIDS Education Methods.

    ERIC Educational Resources Information Center

    Chambliss, Catherine; And Others

    Since assuring quality health care delivery to patients suffering from Acquired Immunodeficiency Syndrome (AIDS) and those who test positive for Human Immunodeficiency Virus (HIV) is a priority, development of effective staff training methods is imperative. This pilot study assessed the effect on staff attitudes of a participative AIDS/HIV staff…

  11. Putting the five-factor model into context: evidence linking big five traits to narrative identity.

    PubMed

    Raggatt, Peter

    2006-10-01

    The study examined relationships between the Big Five personality traits and thematic content extracted from self-reports of life history data. One hundred and five "mature age" university students (M=30.1 years) completed the NEO PI-R trait measure, and the Personality Web Protocol. The protocol examines constituents of identity by asking participants to describe 24 key "attachments" from their life histories (significant events, people, places, objects, and possessions). Participants sorted these attachments into clusters and provided a self-descriptive label for each cluster (e.g., "adventurous self"). It was predicted that the thematic content of these cluster labels would be systematically related to Big Five trait scores (e.g., that labels referring to strength or positive emotions would be linked to Extraversion). The hypothesized links were obtained for each of the Big Five trait domains except Conscientiousness. Results are discussed with a view to broadening our understanding of the Five-Factor Model in relation to units of personality other than traits.

  12. 'Big Crater' in 360-degree panorama

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The crater dubbed 'Big Crater', approximately 2200 meters (7200 feet) away, was imaged by the Imager for Mars Pathfinder (IMP) as part of a 360-degree color panorama taken over sols 8, 9 and 10. 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona.

    Mars Pathfinder is the second in NASA's Discovery program of low-cost spacecraft with highly focused science goals. The Jet Propulsion Laboratory, Pasadena, CA, developed and manages the Mars Pathfinder mission for NASA's Office of Space Science, Washington, D.C. JPL is an operating division of the California Institute of Technology (Caltech). The Imager for Mars Pathfinder (IMP) was developed by the University of Arizona Lunar and Planetary Laboratory under contract to JPL. Peter Smith is the Principal Investigator.

  13. Big Data Analytics in Chemical Engineering.

    PubMed

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-02-27

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation. Expected final online publication date for the Annual Review of Chemical and Biomolecular Engineering Volume 8 is June 7, 2017. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.

  14. The discovery value of "Big Science".

    PubMed

    Esparza, José; Yamada, Tadataka

    2007-04-16

    The increasing complexity of biomedical research is leading to the exploration of new models for large-scale collaborative research. This Big Science approach, however, has created anxieties and potential tensions between investigator-driven research, and research guided by a more organized, collaborative effort. Another potential tension exists between research conducted purely in search of new knowledge and research aimed at finding solutions. We argue that big biomedicine--the work of coordinated multidisciplinary groups that use the latest technologies to solve complex problems--can be an important way to harness the creativity of individual investigators, stimulate innovation, and supply the infrastructure, experimental systems, and resources needed to solve the urgent health problems confronted by our global society. We discuss this using the example of the Global HIV Vaccine Enterprise.

  15. Singularities in big-bang cosmology

    NASA Astrophysics Data System (ADS)

    Penrose, R.

    1988-03-01

    A review of the history of the development of the big bang theory is presented, including the nature of singularities in black holes and their contribution to the study of the origin of the universe. Various models of the origin of the universe, the question of cosmic censorship, and the possible effects of gravitational collapse are examined. The relationship between considerations of quantum gravity and the structure of quantum theory is discussed.

  16. Funding big research with small money.

    PubMed

    Hickey, Joanne V; Koithan, Mary; Unruh, Lynn; Lundmark, Vicki

    2014-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the guest authors introduce crowdsourcing as a strategy for funding big research with small money.

  17. Big Data Challenges for Large Radio Arrays

    NASA Technical Reports Server (NTRS)

    Jones, Dayton L.; Wagstaff, Kiri; Thompson, David; D'Addario, Larry; Navarro, Robert; Mattmann, Chris; Majid, Walid; Lazio, Joseph; Preston, Robert; Rebbapragada, Umaa

    2012-01-01

    Future large radio astronomy arrays, particularly the Square Kilometre Array (SKA), will be able to generate data at rates far higher than can be analyzed or stored affordably with current practices. This is, by definition, a "big data" problem, and requires an end-to-end solution if future radio arrays are to reach their full scientific potential. Similar data processing, transport, storage, and management challenges face next-generation facilities in many other fields.

  18. Meaningfully Integrating Big Earth Science Data

    NASA Astrophysics Data System (ADS)

    Pebesma, E. J.; Stasch, C.

    2014-12-01

    After the technical hurdles of dealing with big earth observation data have been taken, large challenges remain to avoid carrying out operations that are not meaningful. Examples of this are summing things that should not be summed, or interpolating phenomena that should not be interpolated. We propose a description of data at the level of their meaning, to allow for notifying data users when meaningless operations are being executed. We present a prototypical implementation in R.
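The idea of flagging meaningless operations can be made concrete with a tiny sketch: tag each variable with a minimal description of its meaning, and refuse aggregations that the meaning does not support. The categories and the `summable` rule below are illustrative assumptions in Python, not the authors' actual R prototype:

```python
from dataclasses import dataclass

@dataclass
class Layer:
    """An observed variable plus a minimal description of its meaning."""
    name: str
    values: list
    summable: bool  # e.g. rainfall totals are summable; temperatures are not

def aggregate_sum(layer: Layer) -> float:
    """Sum the values, but only if summation is meaningful for this layer."""
    if not layer.summable:
        raise ValueError(f"summing {layer.name!r} is not meaningful")
    return sum(layer.values)

rain = Layer("rainfall_mm", [2.0, 0.5, 1.5], summable=True)
temp = Layer("temperature_C", [14.0, 15.5, 13.0], summable=False)

print(aggregate_sum(rain))  # extensive quantity: summing is fine
try:
    aggregate_sum(temp)     # intensive quantity: flagged as meaningless
except ValueError as e:
    print(e)
```

The same pattern extends to interpolation: a categorical land-cover layer, say, would carry an `interpolable=False` flag and trigger the same kind of notification.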

  19. Livermore Big Trees Park: 1998 summary results

    SciTech Connect

    Gallegos, G; MacQueen, D; Surano, K

    1999-08-13

    This report summarizes work conducted in 1998 by the Lawrence Livermore National Laboratory (LLNL) to determine the extent and origin of plutonium at concentrations above background levels at Big Trees Park in the city of Livermore. This summary includes the project background and sections that explain the sampling, radiochemical and data analysis, and data interpretation. This report is a summary report only and is not intended as a rigorous technical or statistical analysis of the data.

  20. The Big Idea. Dynamic Stakeholder Management

    DTIC Science & Technology

    2014-12-01

    Defense AT&L: November–December 2014, p. 8. The Big IDEA: Dynamic Stakeholder Management. Lt. Col. Franklin D. Gaillard II, USAF. Frank Gaillard, Ph.D. ... information systems at Global Campus, Troy University.

  1. Can big business save health care?

    PubMed

    Dunn, Philip

    2007-01-01

    Corporate America has decided to stop bellyaching about the cost and quality of the health care it helps buy for its employees. Now it's taking concrete action. Large employers such as Wal-Mart, Oracle, Cisco, BP America and many, many others are pressuring providers to meet performance standards, adopt information technology and transform the efficiency of their operations. Big Business wants value for its buck, and it's now putting money where its mouth is.

  2. Big Bend National Park, TX, USA, Mexico

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Sierra del Carmen of Mexico, across the Rio Grande River from Big Bend National Park, TX, (28.5N, 104.0W) is centered in this photo. The Rio Grande River bisects the scene: Mexico to the east, USA to the west. The thousand-foot Boquillas limestone cliff on the Mexican side of the river changes colors from white to pink to lavender at sunset. This severely eroded sedimentary landscape was once an ancient seabed, later overlaid with volcanic activity.

  3. Dark radiation emerging after big bang nucleosynthesis?

    SciTech Connect

    Fischler, Willy; Meyers, Joel

    2011-03-15

    We show how recent data from observations of the cosmic microwave background may suggest the presence of additional radiation density which appeared after big bang nucleosynthesis. We propose a general scheme by which this radiation could be produced from the decay of nonrelativistic matter, we place constraints on the properties of such matter, and we give specific examples of scenarios in which this general scheme may be realized.

  4. Temperament and Character Inventory-R (TCI-R) and Big Five Questionnaire (BFQ): convergence and divergence.

    PubMed

    Capanna, Cristina; Struglia, Francesca; Riccardi, Ilaria; Daneluzzo, Enrico; Stratta, Paolo; Rossi, Alessandro

    2012-06-01

    This study evaluated the correspondence between measures of two competing theories of personality, the five-factor model as measured by the Big Five Questionnaire (BFQ), and Cloninger's psychobiological theory measured by the Temperament and Character Inventory-Revised (TCI-R). A sample of 900 Italian participants, balanced with respect to sex (393 men and 507 women), and representative of the adult population with respect to age (range 18 to 70 years; M = 39.6, SD = 15.7) completed the TCI-R and the Big Five Questionnaire. All TCI-R personality dimensions except Self-Transcendence were moderately correlated with one or more of the Big Five dimensions (from r = .40 to .61), and the two instruments showed areas of convergence. However, the differences outweighed the similarities, indicating that these current conceptualizations and measures of personality are somewhat inconsistent with each other.

  5. Body image and personality among British men: associations between the Big Five personality domains, drive for muscularity, and body appreciation.

    PubMed

    Benford, Karis; Swami, Viren

    2014-09-01

    The present study examined associations between the Big Five personality domains and measures of men's body image. A total of 509 men from the community in London, UK, completed measures of drive for muscularity, body appreciation, the Big Five domains, and subjective social status, and provided their demographic details. The results of a hierarchical regression showed that, once the effects of participant body mass index (BMI) and subjective social status had been accounted for, men's drive for muscularity was significantly predicted by Neuroticism (β=.29). In addition, taking into account the effects of BMI and subjective social status, men's body appreciation was significantly predicted by Neuroticism (β=-.35) and Extraversion (β=.12). These findings highlight potential avenues for the development of intervention approaches based on the relationship between the Big Five personality traits and body image.

  6. Envisioning the Future of 'Big Data' Biomedicine.

    PubMed

    Bui, Alex A T; Van Horn, John Darrell

    2017-03-30

    In our era of digital biomedicine, data take many forms, from "omics" to imaging, mobile health (mHealth), and electronic health records (EHRs). With the availability of more efficient digital collection methods, scientists in many domains now find themselves confronting ever larger sets of data and trying to make sense of it all (1-4). Indeed, data which used to be considered large now seems small as the amount of data now being collected in a single day by an investigator can surpass what might have been generated over his/her career even a decade ago (e.g., (5)). This deluge of biomedical information requires new thinking about how data are generated, managed, and ultimately leveraged to further scientific understanding and for improving healthcare. Responding to this challenge, the National Institutes of Health (NIH) has spearheaded the "Big Data to Knowledge" (BD2K) program (6). Data scientists are being engaged through BD2K to guide biomedical researchers through the thickets of data they are producing. NIH Director, Francis Collins, has noted, "Indeed, we are at a point in history where Big Data should not intimidate, but inspire us. We are in the midst of a revolution that is transforming the way we do biomedical research…we just have to devise creative ways to sift through this mountain of data and make sense of it" (7). The NIH is now taking its first major steps toward realizing biomedical science as an interdisciplinary "big data" science.

  7. Partnership between small biotech and big pharma.

    PubMed

    Wiederrecht, Gregory J; Hill, Raymond G; Beer, Margaret S

    2006-08-01

    The process involved in the identification and development of novel breakthrough medicines at big pharma has recently undergone significant changes, in part because of the extraordinary complexity that is associated with tackling diseases of high unmet need, and also because of the increasingly demanding requirements that have been placed on the pharmaceutical industry by investors and regulatory authorities. In addition, big pharma no longer has a monopoly on the tools and enabling technologies that are required to identify and discover new drugs, as many biotech companies now also have these capabilities. As a result, researchers at biotech companies are able to identify credible drug leads, as well as compounds that have the potential to become marketed medicinal products. This diversification of companies that are involved in drug discovery and development has in turn led to increased partnering interactions between the biotech sector and big pharma. This article examines how Merck and Co Inc, which has historically relied on a combination of internal scientific research and licensed products, has poised itself to become further engaged in partnering with biotech companies, as well as academic institutions, to increase the probability of success associated with identifying novel medicines to treat unmet medical needs--particularly in areas such as central nervous system disorders, obesity/metabolic diseases, atheroma and cancer, and also to cultivate its cardiovascular, respiratory, arthritis, bone, ophthalmology and infectious disease franchises.

  8. Navigating a Sea of Big Data

    NASA Astrophysics Data System (ADS)

    Kinkade, D.; Chandler, C. L.; Groman, R. C.; Shepherd, A.; Allison, M. D.; Rauch, S.; Wiebe, P. H.; Glover, D. M.

    2014-12-01

    Oceanographic research is evolving rapidly. New technologies, strategies, and related infrastructures have catalyzed a change in the nature of oceanographic data. Heterogeneous and complex data types can be produced and transferred at great speeds. This shift in volume, variety, and velocity of data produced has led to increased challenges in managing these Big Data. In addition, distributed research communities have greater needs for data quality control, discovery and public accessibility, and seamless integration for interdisciplinary study. Organizations charged with curating oceanographic data must also evolve to meet these needs and challenges, by employing new technologies and strategies. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in 2006, to fulfill the data management needs of investigators funded by the NSF Ocean Sciences Biological and Chemical Sections and Polar Programs Antarctic Organisms and Ecosystems Program. Since its inception, the Office has had to modify internal systems and operations to address Big Data challenges to meet the needs of the ever-evolving oceanographic research community. Some enhancements include automated procedures replacing labor-intensive manual tasks, adoption of metadata standards facilitating machine client access, a geospatial interface and the use of Semantic Web technologies to increase data discovery and interoperability. This presentation will highlight some of the BCO-DMO advances that enable us to successfully fulfill our mission in a Big Data world.

  9. Big bounce from spin and torsion

    NASA Astrophysics Data System (ADS)

    Popławski, Nikodem J.

    2012-04-01

    The Einstein-Cartan-Sciama-Kibble theory of gravity naturally extends general relativity to account for the intrinsic spin of matter. Spacetime torsion, generated by spin of Dirac fields, induces gravitational repulsion in fermionic matter at extremely high densities and prevents the formation of singularities. Accordingly, the big bang is replaced by a bounce that occurred when the energy density ε ∝ gT^4 was on the order of n^2/m_Pl^2 (in natural units), where n ∝ gT^3 is the fermion number density and g is the number of thermal degrees of freedom. If the early Universe contained only the known standard-model particles (g ≈ 100), then the energy density at the big bounce was about 15 times larger than the Planck energy. The minimum scale factor of the Universe (at the bounce) was about 10^32 times smaller than its present value, giving ≈ 50 μm. If more fermions existed in the early Universe, then the spin-torsion coupling causes a bounce at a lower energy and larger scale factor. Recent observations of high-energy photons from gamma-ray bursts indicate that spacetime may behave classically even at scales below the Planck length, supporting the classical spin-torsion mechanism of the big bounce. Such a classical bounce prevents the matter in the contracting Universe from reaching the conditions at which a quantum bounce could possibly occur.

  10. Adapting bioinformatics curricula for big data

    PubMed Central

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  11. Human neuroimaging as a "Big Data" science.

    PubMed

    Van Horn, John Darrell; Toga, Arthur W

    2014-06-01

    The maturation of in vivo neuroimaging has led to incredible quantities of digital information about the human brain. While much is made of the data deluge in science, neuroimaging represents the leading edge of this onslaught of "big data". A range of neuroimaging databasing approaches has streamlined the transmission, storage, and dissemination of data from such brain imaging studies. Yet few, if any, common solutions exist to support the science of neuroimaging. In this article, we discuss how modern neuroimaging research represents a multifactorial and broad-ranging data challenge, involving the growing size of the data being acquired; sociological and logistical sharing issues; infrastructural challenges for multi-site, multi-datatype archiving; and the means by which to explore and mine these data. As neuroimaging advances further, e.g. into aging, genetics, and age-related disease, new vision is needed to manage and process this information while marshalling these resources into novel results. Thus, "big data" can become "big" brain science.

  12. Statistical methods and computing for big data

    PubMed Central

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide and conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with a focus on open source R and R packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay. PMID:27695593
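    The divide-and-conquer class surveyed here can be illustrated with a minimal sketch (this is not the authors' code; the data, model, and block count are invented for illustration): fit the same model on each block of a large data set, then average the block estimates.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "big" data: y = 2*x1 - 1*x2 + noise
    n, p = 100_000, 2
    X = rng.normal(size=(n, p))
    beta_true = np.array([2.0, -1.0])
    y = X @ beta_true + rng.normal(scale=0.5, size=n)

    def dc_ols(X, y, n_blocks):
        """Divide-and-conquer OLS: fit each block separately, average the estimates."""
        betas = []
        for Xb, yb in zip(np.array_split(X, n_blocks), np.array_split(y, n_blocks)):
            betas.append(np.linalg.lstsq(Xb, yb, rcond=None)[0])
        return np.mean(betas, axis=0)

    beta_full = np.linalg.lstsq(X, y, rcond=None)[0]
    beta_dc = dc_ols(X, y, n_blocks=20)

    # The averaged block estimate is close to the full-data estimate.
    print(np.allclose(beta_dc, beta_full, atol=1e-2))
    ```

    In a real big data setting each block would be loaded and fitted separately, so no more than one block need be held in memory at a time.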

  13. Big Data Analytics for Prostate Radiotherapy.

    PubMed

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose-volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the "RadoncSpace") in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches to a cohort of hypofractionated prostate cancer patients, taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and for checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches.
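    As a generic illustration of the cross-validation step mentioned above (not the paper's framework; the features, outcome rule, and classifier here are invented stand-ins), a k-fold loop can be sketched as:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy stand-in for an outcome dataset: two features (e.g. a dose metric
    # and a clinical covariate -- names are illustrative), binary outcome
    # generated from a known rule plus noise.
    n = 600
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

    def kfold_accuracy(X, y, k=5):
        """k-fold cross-validation of a simple nearest-class-mean classifier."""
        idx = rng.permutation(len(y))
        folds = np.array_split(idx, k)
        accs = []
        for i in range(k):
            test = folds[i]
            train = np.concatenate([folds[j] for j in range(k) if j != i])
            mu0 = X[train][y[train] == 0].mean(axis=0)  # class-0 centroid
            mu1 = X[train][y[train] == 1].mean(axis=0)  # class-1 centroid
            d0 = np.linalg.norm(X[test] - mu0, axis=1)
            d1 = np.linalg.norm(X[test] - mu1, axis=1)
            pred = (d1 < d0).astype(int)
            accs.append((pred == y[test]).mean())
        return float(np.mean(accs))

    acc = kfold_accuracy(X, y, k=5)
    print(round(acc, 2))  # well above the 0.5 chance level for this toy data
    ```

    Each fold is held out once for testing while the remaining folds train the model, and the fold accuracies are averaged into a single performance estimate.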

  14. Vertical landscraping, a big regionalism for Dubai.

    PubMed

    Wilson, Matthew

    2010-01-01

    Dubai's ecologic and economic complications are exacerbated by six years of accelerated expansion, a fixed top-down approach to urbanism and the construction of iconic single-phase mega-projects. With recent construction delays, project cancellations and growing landscape issues, Dubai's tower typologies have been unresponsive to changing environmental, socio-cultural and economic patterns (BBC, 2009; Gillet, 2009; Lewis, 2009). In this essay, a theory of "Big Regionalism" guides an argument for an economically and ecologically linked tower typology called the Condenser. This phased "box-to-tower" typology is part of a greater Landscape Urbanist strategy called Vertical Landscraping. Within this strategy, the Condenser's role is to densify the city, facilitating the creation of ecologic voids that order the urban region. Delineating "Big Regional" principles, the Condenser provides a time-based, global-local urban growth approach that weaves Bigness into a series of urban-regional, economic and ecological relationships, builds upon the environmental performance of the city's regional architecture and planning, promotes a continuity of Dubai's urban history, and responds to its landscape issues while condensing development. These speculations permit consideration of the overlooked opportunities embedded within Dubai's mega-projects and their long-term impact on the urban morphology.

  15. Abraham Pais Prize for History of Physics Lecture: Big, Bigger, Too Big? From Los Alamos to Fermilab and the SSC

    NASA Astrophysics Data System (ADS)

    Hoddeson, Lillian

    2012-03-01

    The modern era of big science emerged during World War II. Oppenheimer's Los Alamos laboratory offered the quintessential model of a government-funded, mission-oriented facility directed by a strong charismatic leader. The postwar beneficiaries of this model included the increasingly ambitious large laboratories that participated in particle physics--in particular, Brookhaven, SLAC, and Fermilab. They carried the big science they practiced into a new realm where experiments eventually became as large and costly as entire laboratories had been. Meanwhile the available funding grew more limited, causing the physics research to be concentrated into fewer and bigger experiments that appeared never to end. The next phase in American high-energy physics was the Superconducting Super Collider, the most costly pure physics project ever attempted. The SSC's termination was a tragedy for American science, but for historians it offers an opportunity to understand what made the success of earlier large high-energy physics laboratories possible, and what made the continuation of the SSC impossible. The most obvious reason for the SSC's failure was its enormous and escalating budget, which Congress would no longer support. Other factors need to be recognized, however: no leader could be found with directing skills as strong as those of Wilson, Panofsky, Lederman, or Richter; the scale of the project subjected it to uncomfortable public and Congressional scrutiny; and the DOE's enforcement of management procedures of the military-industrial complex that clashed with those typical of the scientific community led to the alienation and withdrawal of many of the most creative scientists, and to the perception and the reality of poor management. These factors, exacerbated by negative pressure from scientists in other fields and a post-Cold War climate in which physicists had little of their earlier cultural prestige, discouraged efforts to gain international support. They made the SSC…

  16. Big questions come in bundles, hence they should be tackled systemically.

    PubMed

    Bunge, Mario

    2014-01-01

    A big problem is, by definition, one involving either multiple traits of a thing, or a collection of things. Because such problems are systemic or global rather than local or sectoral, they call for a systemic approach, and often a multidisciplinary one as well. Just think of individual and public health problems, or of income inequality, gender discrimination, housing, environmental, or political participation issues. All of them are inter-related, so that focusing on one of them at a time is bound to produce either short-term solutions or utter failure. This is also why single-issue political movements are bound to fail. In other words, big problems are systemic and must, therefore be approached systemically--though with the provisos that systemism must not be confused with holism and that synthesis complements analysis instead of replacing it.

  17. Big Five personality traits: are they really important for the subjective well-being of Indians?

    PubMed

    Tanksale, Deepa

    2015-02-01

    This study empirically examined the relationship between the Big Five personality traits and subjective well-being (SWB) in India. SWB variables used were life satisfaction, positive affect and negative affect. A total of 183 participants in the age range 30-40 years from Pune, India, completed the personality and SWB measures. Backward stepwise regression analysis showed that the Big Five traits accounted for 17% of the variance in life satisfaction, 35% variance in positive affect and 28% variance in negative affect. Conscientiousness emerged as the strongest predictor of life satisfaction. In line with the earlier research findings, neuroticism and extraversion were found to predict negative affect and positive affect, respectively. Neither openness to experience nor agreeableness contributed to SWB. The research emphasises the need to revisit the association between personality and SWB across different cultures, especially non-western cultures.

  18. Poverty, health and participation.

    PubMed

    Cosgrove, S

    2007-09-01

    Poverty is an important influence on health, and despite continuing economic growth, poverty and health inequalities persist. Current public policy aims to reduce inequalities in health by focusing on the social factors influencing health, improving access to health and personal social services for those who are poor or socially excluded, and improving the information and research base on the health status of, and service access for, poor and socially excluded groups. It is important that processes for target setting and evaluation involve people experiencing poverty at all levels, through consultative and participative structures and processes and in the roll-out of primary care teams. A number of projects throughout the country aim to address health inequalities using community development. These are essentially about widening participation in the development, planning and delivery of health services and ensuring that the community is actively involved in the decision-making process about health services in their area.

  19. Biobanking with Big Data: A Need for Developing "Big Data Metrics".

    PubMed

    Kozlakidis, Zisis

    2016-10-01

    The term "big data" has often been used as an all-encompassing phrase for research that involves the use of large-scale data sets. However, the use of the term does little to signify the underlying complexity of definitions, of data sets, and of the requirements that need to be taken into consideration for sustainable research and the estimation of downstream impact. In particular, "big data" is frequently connected with biobanks and biobank networks as the institutions involved in tissue preservation are increasingly and perhaps unavoidably linked to the de facto preservation of information.

  20. Big Data, Big Problems: Incorporating Mission, Values, and Culture in Provider Affiliations.

    PubMed

    Shaha, Steven H; Sayeed, Zain; Anoushiravani, Afshin A; El-Othmani, Mouhanad M; Saleh, Khaled J

    2016-10-01

    This article explores how integration of data from clinical registries and electronic health records produces a quality impact within orthopedic practices. Data are differentiated from information, and several types of data that are collected and used in orthopedic outcome measurement are defined. Furthermore, the concept of comparative effectiveness and its impact on orthopedic clinical research are assessed. This article places emphasis on how the concept of big data produces health care challenges balanced with benefits that may be faced by patients and orthopedic surgeons. Finally, essential characteristics of an electronic health record that interlinks musculoskeletal care and big data initiatives are reviewed.

  1. [Big data and their perspectives in radiation therapy].

    PubMed

    Guihard, Sébastien; Thariat, Juliette; Clavier, Jean-Baptiste

    2017-02-01

    The concept of big data indicates a change of scale in the use of data and data aggregation into large databases through improved computer technology. One of the current challenges in the creation of big data in the context of radiation therapy is the capture of routine care items that remain dark data, i.e. data not yet collected, and the fusion of databases collecting different types of information (dose-volume histograms and toxicity data, for example). Processes and infrastructures devoted to big data collection should not impact negatively on the doctor-patient relationship, the general process of care or the quality of the data collected. The use of big data requires a collective effort of physicians, physicists, software manufacturers and health authorities to create, organize and exploit big data in radiotherapy and, beyond, oncology. Big data involve a new culture and require an infrastructure built on a sound legal and ethical basis. Processes and issues are discussed in this article.

  2. Data management by using R: big data clinical research series.

    PubMed

    Zhang, Zhongheng

    2015-11-01

    Electronic medical record (EMR) systems have been widely used in clinical practice. Compared with traditional handwritten records, the EMR makes big data clinical research feasible. The most important feature of big data research is its real-world setting. Furthermore, big data research can provide all aspects of information related to healthcare. However, big data research requires data management skills, which are often lacking in the curriculum of medical education. This greatly hinders doctors from testing their clinical hypotheses by using the EMR. To bridge this gap, a series of articles introducing data management techniques is put forward to guide clinicians into big data clinical research. The present educational article first introduces some basic knowledge of the R language, followed by data management skills for creating new variables, recoding variables and renaming variables. These are very basic skills and may be used in every big data research project.
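    The article's own examples are written in R; as a rough analogue, the same three skills (creating, recoding, and renaming variables) look like this in Python/pandas (the column names and values below are invented for illustration):

    ```python
    import pandas as pd

    # Toy EMR extract (columns are illustrative)
    df = pd.DataFrame({
        "age": [34, 71, 58],
        "sbp": [118, 165, 142],   # systolic blood pressure, mmHg
        "sex": [1, 2, 2],
    })

    # Creating a new variable: a derived indicator column
    df["elderly"] = (df["age"] >= 65).astype(int)

    # Recoding a variable: numeric codes -> labels
    df["sex"] = df["sex"].map({1: "male", 2: "female"})

    # Recoding a continuous variable into categories
    df["bp_cat"] = pd.cut(df["sbp"], bins=[0, 120, 140, 300],
                          labels=["normal", "elevated", "high"])

    # Renaming a variable
    df = df.rename(columns={"sbp": "systolic_bp"})

    print(df.columns.tolist())
    # ['age', 'systolic_bp', 'sex', 'elderly', 'bp_cat']
    ```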

  3. Floods in the Big Creek basin, Linn County, Iowa

    USGS Publications Warehouse

    Heinitz, Albert J.

    1977-01-01

    Flood information for the Big Creek basin in Linn County, Iowa, should be of use to those concerned with the design of bridges and other structures on the flood plains of the streams. Water-surface profiles for the flood of May 1974 are given for Big Creek and its major tributaries, East Big, Crabapple, Elbow, and Abbe Creeks. The May 1974 flood was at least a 50-year flood on East Big Creek and along certain reaches of Big and Abbe Creeks. Also included for Big Creek are a profile of the December 1971 medium-stage flow and a partial profile for the minor flood of July 1971. Profiles for the low-water condition of October 26, 1972, are shown for all reaches. Water-surface profiles for the 25- and 50-year floods are estimated in relation to the May 1974 flood.

  4. Age-related trends of inhibitory control in Stroop-like big-small task in 3 to 12-year-old children and young adults.

    PubMed

    Ikeda, Yoshifumi; Okuzumi, Hideyuki; Kokubun, Mitsuru

    2014-01-01

    Inhibitory control is the ability to suppress competing, dominant, automatic, or prepotent cognitive processing at perceptual, intermediate, and output stages. Inhibitory control is a key cognitive function of typical and atypical child development. This study examined age-related trends of Stroop-like interference in 3 to 12-year-old children and young adults by administration of a computerized Stroop-like big-small task with reduced working memory demand. This task used a set of pictures displaying a big and small circle in black and included the same condition and the opposite condition. In the same condition, each participant was instructed to say "big" when viewing the big circle and to say "small" when viewing the small circle. In the opposite condition, each participant was instructed to say "small" when viewing the big circle and to say "big" when viewing the small circle. The opposite condition required participants to inhibit the prepotent response of saying the same, a familiar response to a perceptual stimulus. The results of this study showed that Stroop-like interference decreased markedly in children in terms of error rates and correct response time. There was no deterioration of performance occurring between the early trials and the late trials in the sessions of the day-night task. Moreover, pretest failure rate was relatively low in this study. The Stroop-like big-small task is a useful tool to assess the development of inhibitory control in young children in that the task is easy to understand and has small working memory demand.

  5. Community-Academic Partnership Participation.

    PubMed

    Meza, Rosemary; Drahota, Amy; Spurgeon, Emily

    2016-10-01

    Community-academic partnerships (CAPs) improve the research process, outcomes, and yield benefits for the community and researchers. This exploratory study examined factors important in community stakeholders' decision to participate in CAPs. Autism spectrum disorder (ASD) community stakeholders, previously contacted to participate in a CAP (n = 18), completed the 15-item Decision to Participate Questionnaire (DPQ). The DPQ assessed reasons for participating or declining participation in the ASD CAP. CAP participants rated networking with other providers, fit of collaboration with agency philosophy, and opportunity for future training/consultations as factors more important in their decision to participate in the ASD CAP than nonparticipants. Nonparticipants reported the number of requests to participate in research as more important in their decision to decline participation than participants. Findings reveal important factors in community stakeholders' decision to participate in CAPs that may provide guidance on increasing community engagement in CAPs and help close the science-to-service gap.

  6. [Women's participation in science].

    PubMed

    Sánchez-Guzmán, María Alejandra; Corona-Vázquez, Teresa

    2009-01-01

    The participation of women in higher education in Mexico took place in the late 19th and early 20th century. The rise of women's enrollment in universities, known as the "feminization of enrollment", occurred in the last thirty years. In this review we analyze how the new conditions that facilitated better access to higher education are reflected in the inclusion of women in science. We include an overview of the issues associated with a change in the demographics of enrollment, segregation of academic areas between men and women, and participation in postgraduate degrees. We also review the proportion of women in science. While in higher education the ratio between men and women is almost 50-50, and in some areas the presence of women is even higher, in the field of scientific research women account for barely 30% of professionals. This is largely due to structural conditions that limit the access of women to higher positions of power that have been predominantly taken by men.

  7. 'Big data' in pharmaceutical science: challenges and opportunities.

    PubMed

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.

  8. [Contemplation on the application of big data in clinical medicine].

    PubMed

    Lian, Lei

    2015-01-01

    Medicine is another area where big data is being used. The link between clinical treatment and outcome is the key step when applying big data in medicine. In the era of big data, it is critical to collect complete outcome data. Patient follow-up, comprehensive integration of data resources, quality control and standardized data management are the predominant approaches to avoiding missing data and data islands. Therefore, the establishment of systematic patient follow-up protocols and a prospective data management strategy are important aspects of big data in medicine.

  9. Differential Privacy Preserving in Big Data Analytics for Connected Health.

    PubMed

    Lin, Chi; Song, Zihao; Song, Houbing; Zhou, Yanhong; Wang, Yi; Wu, Guowei

    2016-04-01

    In Body Area Networks (BANs), the big data collected by wearable sensors usually contain sensitive information, which must be appropriately protected. Previous methods neglected the privacy protection issue, leading to privacy exposure. In this paper, a differential privacy protection scheme for big data in body sensor networks is developed. Compared with previous methods, this scheme provides privacy protection with higher availability and reliability. We introduce the concept of dynamic noise thresholds, which makes our scheme more suitable for processing big data. Experimental results demonstrate that, even when the attacker has full background knowledge, the proposed scheme can still add enough interference to sensitive big data to preserve privacy.
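    The paper's dynamic-threshold scheme is not specified in the abstract, but the core mechanism behind differential privacy can be sketched in a few lines: clip each reading to a known range (which bounds the sensitivity of an aggregate) and add calibrated Laplace noise before release. The function names, bounds, and the use of a simple clipped mean below are illustrative assumptions, not the authors' method.

    ```python
    import math
    import random

    def laplace_noise(scale: float) -> float:
        """Sample Laplace(0, scale) noise via the inverse-CDF transform."""
        u = random.random() - 0.5  # uniform on [-0.5, 0.5)
        return -scale * math.copysign(math.log(1.0 - 2.0 * abs(u)), u)

    def private_mean(readings, epsilon, lo, hi):
        """Release an epsilon-differentially-private mean of sensor readings.

        Clipping to [lo, hi] bounds the sensitivity of the mean at
        (hi - lo) / n, which sets the Laplace noise scale.
        """
        clipped = [min(max(r, lo), hi) for r in readings]
        n = len(clipped)
        sensitivity = (hi - lo) / n
        return sum(clipped) / n + laplace_noise(sensitivity / epsilon)
    ```

    With a large epsilon the noise is negligible and the released mean is close to the true mean; small epsilon trades accuracy for stronger privacy.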

  10. Big Impacts and Transient Oceans on Titan

    NASA Technical Reports Server (NTRS)

    Zahnle, K. J.; Korycansky, D. G.; Nixon, C. A.

    2014-01-01

    We have studied the thermal consequences of very big impacts on Titan [1]. Titan's thick atmosphere and volatile-rich surface cause it to respond to big impacts in a somewhat Earth-like manner. Here we construct a simple globally-averaged model that tracks the flow of energy through the environment in the weeks, years, and millennia after a big comet strikes Titan. The model Titan is endowed with 1.4 bars of N2 and 0.07 bars of CH4, methane lakes, a water ice crust, and enough methane underground to saturate the regolith to the surface. We assume that half of the impact energy is immediately available to the atmosphere and surface while the other half is buried at the site of the crater and is unavailable on time scales of interest. The atmosphere and surface are treated as isothermal. We make the simplifying assumptions that the crust is everywhere as methane saturated as it was at the Huygens landing site, that the concentration of methane in the regolith is the same as it is at the surface, and that the crust is made of water ice. Heat flow into and out of the crust is approximated by step-functions. If the impact is great enough, ice melts. The meltwater oceans cool to the atmosphere conductively through an ice lid while, at the base, melting their way into the interior, driven down in part through Rayleigh-Taylor instabilities between the dense water and the warm ice. Topography, CO2, and hydrocarbons other than methane are ignored. Methane and ethane clathrate hydrates are discussed quantitatively but not fully incorporated into the model.

  11. Big Data Analytics for Prostate Radiotherapy

    PubMed Central

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose–volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the “RadoncSpace”) in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches onto a cohort of hypofractionated prostate cancer patients taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for the refinement of the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches. PMID:27379211

  12. Probing the Big Bang with LEP

    NASA Technical Reports Server (NTRS)

    Schramm, David N.

    1990-01-01

    It is shown that LEP probes the Big Bang in two significant ways: (1) nucleosynthesis, and (2) dark matter constraints. In the first case, LEP verifies the cosmological standard model prediction on the number of neutrino types, thus strengthening the conclusion that the cosmological baryon density is approximately 6 percent of the critical value. In the second case, LEP shows that the remaining non-baryonic cosmological matter must be somewhat more massive and/or more weakly interacting than the favorite non-baryonic dark matter candidates of a few years ago.

  13. Nuclear Receptors, RXR, and the Big Bang.

    PubMed

    Evans, Ronald M; Mangelsdorf, David J

    2014-03-27

    Isolation of genes encoding the receptors for steroids, retinoids, vitamin D, and thyroid hormone and their structural and functional analysis revealed an evolutionarily conserved template for nuclear hormone receptors. This discovery sparked identification of numerous genes encoding related proteins, termed orphan receptors. Characterization of these orphan receptors and, in particular, of the retinoid X receptor (RXR) positioned nuclear receptors at the epicenter of the "Big Bang" of molecular endocrinology. This Review provides a personal perspective on nuclear receptors and explores their integrated and coordinated signaling networks that are essential for multicellular life, highlighting the RXR heterodimer and its associated ligands and transcriptional mechanism.

  14. Probing the Big Bang with LEP

    SciTech Connect

    Schramm, D.N. (Fermi National Accelerator Lab., Batavia, IL)

    1990-06-01

    It is shown that LEP probes the Big Bang in two significant ways: (1) nucleosynthesis and (2) dark matter constraints. In the first case, LEP verifies the cosmological standard model prediction on the number of neutrino types, thus strengthening the conclusion that the cosmological baryon density is {approximately}6% of the critical value. In the second case, LEP shows that the remaining non-baryonic cosmological matter must be somewhat more massive and/or more weakly interacting than the favorite non-baryonic dark matter candidates of a few years ago. 59 refs., 4 figs., 2 tabs.

  15. Nuclear Receptors, RXR & the Big Bang

    PubMed Central

    Evans, Ronald M.; Mangelsdorf, David J.

    2014-01-01

    Summary Isolation of genes encoding the receptors for steroids, retinoids, vitamin D and thyroid hormone, and their structural and functional analysis revealed an evolutionarily conserved template for nuclear hormone receptors. This discovery sparked identification of numerous genes encoding related proteins, termed orphan receptors. Characterization of these orphan receptors, and in particular of the retinoid X receptor (RXR), positioned nuclear receptors at the epicenter of the “Big Bang” of molecular endocrinology. This review provides a personal perspective on nuclear receptors and explores their integrated and coordinated signaling networks that are essential for multi-cellular life, highlighting the RXR heterodimer and its associated ligands and transcriptional mechanism. PMID:24679540

  16. The Big Bang and Cosmic Inflation

    NASA Astrophysics Data System (ADS)

    Guth, Alan H.

    2014-03-01

    A summary is given of the key developments of cosmology in the 20th century, from the work of Albert Einstein to the emergence of the generally accepted hot big bang model. The successes of this model are reviewed, but emphasis is placed on the questions that the model leaves unanswered. The remainder of the paper describes the inflationary universe model, which provides plausible answers to a number of these questions. It also offers a possible explanation for the origin of essentially all the matter and energy in the observed universe.

  17. Nanobiotech in big pharma: a business perspective.

    PubMed

    Würmseher, Martin; Firmin, Lea

    2017-03-01

    Since the early 2000s, numerous publications have presented major scientific opportunities that can be achieved through integrating insights from the area of nanotech into biotech (nanobiotech). This paper aims to explore the economic significance that nanobiotech has gained in the established pharmaceutical industry (big pharma). The empirical investigation draws on patent data as well as product revenue data; and to put the results into perspective, the amounts are compared with the established/traditional biotech sector. The results indicate that the new technology still plays only a minor role - at least from a commercial perspective.

  18. The Next Big Thing - Eric Haseltine

    ScienceCinema

    Eric Haseltine

    2016-07-12

    Eric Haseltine, Haseltine Partners president and former chief of Walt Disney Imagineering, presented "The Next Big Thing," on Sept. 11, at the ORNL. He described the four "early warning signs" that a scientific breakthrough is imminent, and then suggested practical ways to turn these insights into breakthrough innovations. Haseltine is former director of research at the National Security Agency and associate director for science and technology for the director of National Intelligence, former executive vice president of Walt Disney Imagineering and director of engineering for Hughes Aircraft. He has 15 patents in optics, special effects and electronic media, and more than 100 publications in science and technical journals, the web and Discover Magazine.

  19. The New Big Science at the NSLS

    NASA Astrophysics Data System (ADS)

    Crease, Robert

    2016-03-01

    The term ``New Big Science'' refers to a phase shift in the kind of large-scale science that was carried out throughout the U.S. National Laboratory system, when large-scale materials science accelerators rather than high-energy physics accelerators became marquee projects at most major basic research laboratories in the post-Cold War era, accompanied by important changes in the character and culture of the research ecosystem at these laboratories. This talk explores some aspects of this phase shift at BNL's National Synchrotron Light Source.

  20. [Research with big data: the European perspective].

    PubMed

    Bender, Stefan; Elias, P

    2015-08-01

    The article examines the impact that legislative developments in the European Union have had, still have, and will continue to have on cross-border access to microdata for research purposes. We describe the tension between two competing aims: the ambition of the EU to create a European Research Area within which research communities gain access to and share data across national boundaries, and the desire within the EU to establish a harmonious legislative framework that protects against the misuse of personal information. We then examine which new developments at the EU level will have an impact upon research plans, and the challenges researchers face when analysing big data.

  1. Pre - big bang inflation requires fine tuning

    SciTech Connect

    Turner, Michael S.; Weinberg, Erick J.

    1997-10-01

    The pre-big-bang cosmology inspired by superstring theories has been suggested as an alternative to slow-roll inflation. We analyze, in both the Jordan and Einstein frames, the effect of spatial curvature on this scenario and show that too much curvature --- of either sign --- reduces the duration of the inflationary era to such an extent that the flatness and horizon problems are not solved. Hence, a fine-tuning of initial conditions is required to obtain enough inflation to solve the cosmological problems.

  2. The Next Big Thing - Eric Haseltine

    SciTech Connect

    Eric Haseltine

    2009-09-16

    Eric Haseltine, Haseltine Partners president and former chief of Walt Disney Imagineering, presented "The Next Big Thing," on Sept. 11, at the ORNL. He described the four "early warning signs" that a scientific breakthrough is imminent, and then suggested practical ways to turn these insights into breakthrough innovations. Haseltine is former director of research at the National Security Agency and associate director for science and technology for the director of National Intelligence, former executive vice president of Walt Disney Imagineering and director of engineering for Hughes Aircraft. He has 15 patents in optics, special effects and electronic media, and more than 100 publications in science and technical journals, the web and Discover Magazine.

  3. Seed bank and big sagebrush plant community composition in a range margin for big sagebrush

    USGS Publications Warehouse

    Martyn, Trace E.; Bradford, John B.; Schlaepfer, Daniel R.; Burke, Ingrid C.; Laurenroth, William K.

    2016-01-01

    The potential influence of seed bank composition on range shifts of species due to climate change is unclear. Seed banks can provide a means of both species persistence in an area and local range expansion in the case of increasing habitat suitability, as may occur under future climate change. However, a mismatch between the seed bank and the established plant community may represent an obstacle to persistence and expansion. In big sagebrush (Artemisia tridentata) plant communities in Montana, USA, we compared the seed bank to the established plant community. There was less than a 20% similarity in the relative abundance of species between the established plant community and the seed bank. This difference was primarily driven by an overrepresentation of native annual forbs and an underrepresentation of big sagebrush in the seed bank compared to the established plant community. Even though we expect an increase in habitat suitability for big sagebrush under future climate conditions at our sites, the current mismatch between the plant community and the seed bank could impede big sagebrush range expansion into increasingly suitable habitat in the future.

  4. Making big sense from big data in toxicology by read-across.

    PubMed

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in the safety sciences, i.e., extremely large data sets that can be analyzed only computationally to reveal patterns, trends and associations. This happens through (1) the compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies, and (3) systematic robotized testing in a high-throughput manner. All three approaches, along with some other high-content technologies, leave us with big data; the challenge now is to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances, ensuring that the available data are fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Finally, REACH-across, a new web-based tool under development that aims to support and automate structure-based read-across, is presented.
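    Read-across, as described above, predicts a property of a query substance from the known properties of structurally similar substances. A minimal sketch of the idea: rank known substances by a fingerprint similarity measure and return a similarity-weighted average of the top matches. The Tanimoto measure on bit-set fingerprints is a common choice, but everything below (names, data, weighting scheme) is an illustrative assumption, not the REACH-across implementation.

    ```python
    def tanimoto(fp_a: set, fp_b: set) -> float:
        """Tanimoto similarity between two fingerprint bit sets."""
        union = len(fp_a | fp_b)
        return len(fp_a & fp_b) / union if union else 0.0

    def read_across(query_fp: set, knowns: list, k: int = 3) -> float:
        """Predict a property as the similarity-weighted mean of the
        k most similar substances with known values.

        `knowns` is a list of (fingerprint_set, property_value) pairs.
        """
        ranked = sorted(knowns, key=lambda kv: tanimoto(query_fp, kv[0]),
                        reverse=True)
        top = ranked[:k]
        weights = [tanimoto(query_fp, fp) for fp, _ in top]
        total = sum(weights)
        if total == 0:
            raise ValueError("no structurally similar analogues found")
        return sum(w * v for w, (_, v) in zip(weights, top)) / total
    ```

    In practice, the confidence in such a prediction rests on both structural and biological similarity of the analogues, as the abstract notes.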

  5. Big science and big administration. Confronting the governance, financial and legal challenges of FuturICT

    NASA Astrophysics Data System (ADS)

    Smart, J.; Scott, M.; McCarthy, J. B.; Tan, K. T.; Argyrakis, P.; Bishop, S.; Conte, R.; Havlin, S.; San Miguel, M.; Stauffacher, D.

    2012-11-01

    This paper considers the issues around managing large scientific projects, and draws conclusions for the governance and management of FuturICT, based on previous experience of Big Science projects, such as CERN and ATLAS. We also consider the legal and ethical issues of the FuturICT project as the funding instrument moves from the Seventh Framework Programme to Horizon 2020.

  6. Big Data, Little Data, and Care Coordination for Medicare Beneficiaries with Medigap Coverage.

    PubMed

    Ozminkowski, Ronald J; Wells, Timothy S; Hawkins, Kevin; Bhattarai, Gandhi R; Martel, Charles W; Yeh, Charlotte S

    2015-06-01

    Most healthcare data warehouses include big data such as health plan, medical, and pharmacy claims information for many thousands and sometimes millions of insured individuals. This makes it possible to identify those with multiple chronic conditions who may benefit from participation in care coordination programs meant to improve their health. The objective of this article is to describe how large databases, including individual and claims data, and other, smaller types of data from surveys and personal interviews, are used to support a care coordination program. The program described in this study was implemented for adults who are generally 65 years of age or older and have an AARP(®) Medicare Supplement Insurance Plan (i.e., a Medigap plan) insured by UnitedHealthcare Insurance Company (or, for New York residents, UnitedHealthcare Insurance Company of New York). Individual and claims data were used first to calculate risk scores that were then utilized to identify the majority of individuals who were qualified for program participation. For efficient use of time and resources, propensity to succeed modeling was used to prioritize referrals based upon their predicted probabilities of (1) engaging in the care coordination program, (2) saving money once engaged, and (3) receiving higher quality of care. To date, program evaluations have reported positive returns on investment and improved quality of healthcare among program participants. In conclusion, the use of data sources big and small can help guide program operations and determine if care coordination programs are working to help older adults live healthier lives.
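    The propensity-to-succeed step described above can be sketched as a scoring model that maps member features to a predicted probability of engaging in the program, which is then used to rank referrals. A logistic model is one common choice for this; the features and coefficients below are hypothetical placeholders, not the program's actual model.

    ```python
    import math

    def engagement_propensity(age: float, risk_score: float,
                              prior_contacts: int) -> float:
        """Hypothetical logistic model: probability of program engagement.

        Coefficients are illustrative, not taken from the study.
        """
        z = (-2.0            # intercept
             + 0.01 * age
             + 0.8 * risk_score
             + 0.3 * prior_contacts)
        return 1.0 / (1.0 + math.exp(-z))

    def prioritize(members):
        """Rank referral candidates by predicted engagement probability.

        Each member is a (member_id, age, risk_score, prior_contacts) tuple.
        """
        return sorted(members, key=lambda m: engagement_propensity(*m[1:]),
                      reverse=True)
    ```

    In the program described, separate models of this kind would score the probability of engaging, of saving money once engaged, and of receiving higher-quality care, and referrals would be prioritized accordingly.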

  7. The big bang? An eventful year in workers' compensation.

    PubMed

    Guidotti, Tee L

    2006-01-01

    Workers' compensation in the past two years has been dominated by events in California, which have been so fundamental as to merit the term big bang. Passage of Senate Bill 899 has led to a comprehensive program of reform in access to medical care, access to rehabilitation services, temporary and permanent disability, evidence-based management, dispute resolution, and system innovation. Two noteworthy developments thus arose: a new requirement for apportionment by cause in causation analysis, and the adoption of evidence-based criteria for impairment assessment, treatment guidelines, and, soon, utilization review. Elsewhere in the United States, changes were modest, but extensive legislative activity in Texas suggests that Texas will be next to make major changes. In Canada, the Workers' Compensation Board of British Columbia has adopted an ambitious strategic initiative, and there is a Canadawide movement to establish presumption for certain diseases in firefighters. Suggestions for future directions include an increased emphasis on prevention, integration of programs, worker participation, enhancing the expertise of health care professionals, evidence-based management, process evaluation, and opportunities for innovation.

  8. High energy neutrinos from big bang particles.

    NASA Astrophysics Data System (ADS)

    Berezinskij, V. S.

    1992-10-01

    The production of high energy neutrinos by big bang particles is reviewed. The big bang particles are divided into two categories: dark matter particles (DMP) and the exotic relics whose mass density can be smaller than the critical one. For the case of DMP the neutralino and the gravitino are considered. High energy neutrinos can be produced due to the capture of the neutralinos in the earth and the sun, with the subsequent annihilation of these particles there. If R-parity is weakly violated, the neutralino decay can be a source of high energy neutrinos. The gravitino as DMP is unobservable directly, unless R-parity is violated and the gravitino decays. For thermal exotic relics a very general conclusion is reached: the detectable neutrino flux can be produced only by long-lived particles with τ_X > t_0, where t_0 is the age of the universe. Very large neutrino fluxes can be produced by superheavy metastable relics in the particular cosmological scenario where the violent entropy production occurs.

  9. High energy neutrinos from big bang particles

    NASA Astrophysics Data System (ADS)

    Berezinsky, V. S.

    1993-04-01

    The production of high energy neutrinos by big bang particles is reviewed. The big bang particles are divided into two categories: dark matter particles (DMP), i.e. those with the critical mass density (ϱ_X = ϱ_c) at present, and the exotic relics whose mass density can be smaller than the critical one. For the case of DMP the neutralino and the gravitino are considered. High energy neutrinos can be produced due to the capture of the neutralinos in the earth and the sun, with the subsequent annihilation of these particles there. If R-parity is weakly violated, the neutralino decay can be a source of high energy neutrinos. The gravitino as DMP is unobservable directly, unless R-parity is violated and the gravitino decays. For thermal exotic relics a very general conclusion is reached: the detectable neutrino flux can be produced only by long-lived particles with τ_X > t_0, where t_0 is the age of the Universe (the exceptional case is the decay only to the neutrinos). Very large neutrino fluxes can be produced by superheavy (up to ~10^18 GeV) metastable relics in the particular cosmological scenario where the violent entropy production occurs.

  10. Big Bang Cosmic Titanic: Cause for Concern?

    NASA Astrophysics Data System (ADS)

    Gentry, Robert

    2013-04-01

    This abstract alerts physicists to a situation that, unless soon addressed, may yet affect PRL integrity. I refer to Stanley Brown's and DAE Robert Caldwell's rejection of PRL submission LJ12135, A Cosmic Titanic: Big Bang Cosmology Unravels Upon Discovery of Serious Flaws in Its Foundational Expansion Redshift Assumption, by their claim that BB is an established theory while ignoring our paper's Titanic, namely, that BB's foundational spacetime expansion redshifts assumption has now been proven to be irrefutably false because it is contradicted by our seminal discovery that GPS operation unequivocally proves that GR effects do not produce in-flight photon wavelength changes demanded by this central assumption. This discovery causes the big bang to collapse as quickly as did Ptolemaic cosmology when Copernicus discovered its foundational assumption was heliocentric, not geocentric. Additional evidence that something is amiss in PRL's treatment of LJ12135 comes from both Brown and EiC Gene Spouse agreeing to meet at my exhibit during last year's Atlanta APS to discuss this cover-up issue. Sprouse kept his commitment; Brown didn't. Question: If Brown could have refuted my claim of a cover-up, why didn't he come to present it before Gene Sprouse? I am appealing LJ12135's rejection.

  11. Rasdaman for Big Spatial Raster Data

    NASA Astrophysics Data System (ADS)

    Hu, F.; Huang, Q.; Scheele, C. J.; Yang, C. P.; Yu, M.; Liu, K.

    2015-12-01

    Spatial raster data have grown exponentially over the past decade. Recent advancements in data acquisition technology, such as remote sensing, have allowed us to collect massive observation data of various spatial resolution and domain coverage. The volume, velocity, and variety of such spatial data, along with the computationally intensive nature of spatial queries, pose a grand challenge to storage technologies for effective big data management. While high performance computing platforms (e.g., cloud computing) can be used to solve the computing-intensive issues in big data analysis, data have to be managed in a way that is suitable for distributed parallel processing. Recently, rasdaman (raster data manager) has emerged as a scalable and cost-effective database solution to store and retrieve massive multi-dimensional arrays, such as sensor, image, and statistics data. Within this paper, the pros and cons of using rasdaman to manage and query spatial raster data will be examined and compared with other common approaches, including file-based systems, relational databases (e.g., PostgreSQL/PostGIS), and NoSQL databases (e.g., MongoDB and Hive). Earth Observing System (EOS) data collected from NASA's Atmospheric Scientific Data Center (ASDC) will be used and stored in these selected database systems, and a set of spatial and non-spatial queries will be designed to benchmark their performance on retrieving large-scale, multi-dimensional arrays of EOS data. Lessons learnt from using rasdaman will be discussed as well.

  12. Data, Big Data, and Metadata in Anesthesiology.

    PubMed

    Levin, Matthew A; Wanderer, Jonathan P; Ehrenfeld, Jesse M

    2015-12-01

    The last decade has seen an explosion in the growth of digital data. Since 2005, the total amount of digital data created or replicated on all platforms and devices has been doubling every 2 years, from an estimated 132 exabytes (132 billion gigabytes) in 2005 to 4.4 zettabytes (4.4 trillion gigabytes) in 2013, and a projected 44 zettabytes (44 trillion gigabytes) in 2020. This growth has been driven in large part by the rise of social media along with more powerful and connected mobile devices, with an estimated 75% of information in the digital universe generated by individuals rather than entities. Transactions and communications including payments, instant messages, Web searches, social media updates, and online posts are all becoming part of a vast pool of data that live "in the cloud" on clusters of servers located in remote data centers. The amount of accumulating data has become so large that it has given rise to the term Big Data. In many ways, Big Data is just a buzzword, a phrase that is often misunderstood and misused to describe any sort of data, no matter the size or complexity. However, there is truth to the assertion that some data sets truly require new management and analysis techniques.
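    The growth figures quoted above can be checked with a line of arithmetic: two (year, volume) observations imply a doubling period of (y1 - y0) / log2(v1 / v0). The helper name below is ours, but the numbers come from the abstract.

    ```python
    import math

    def implied_doubling_years(v0: float, y0: int, v1: float, y1: int) -> float:
        """Doubling period implied by two (year, volume) observations,
        assuming steady exponential growth between them."""
        return (y1 - y0) / math.log2(v1 / v0)

    # Volumes in exabytes, from the abstract:
    # 132 EB (2005), 4.4 ZB = 4400 EB (2013), 44 ZB = 44000 EB (2020 projection).
    early = implied_doubling_years(132, 2005, 4400, 2013)    # ~1.6 years
    late = implied_doubling_years(4400, 2013, 44000, 2020)   # ~2.1 years
    ```

    The 2013-to-2020 projection is consistent with the stated "doubling every 2 years"; the 2005-to-2013 figures actually imply a somewhat faster early doubling period of about 1.6 years.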

  13. Harnessing Big Data for Systems Pharmacology.

    PubMed

    Xie, Lei; Draizen, Eli J; Bourne, Philip E

    2017-01-06

    Systems pharmacology aims to holistically understand mechanisms of drug actions to support drug discovery and clinical practice. Systems pharmacology modeling (SPM) is data driven. It integrates an exponentially growing amount of data at multiple scales (genetic, molecular, cellular, organismal, and environmental). The goal of SPM is to develop mechanistic or predictive multiscale models that are interpretable and actionable. The current explosions in genomics and other omics data, as well as the tremendous advances in big data technologies, have already enabled biologists to generate novel hypotheses and gain new knowledge through computational models of genome-wide, heterogeneous, and dynamic data sets. More work is needed to interpret and predict a drug response phenotype, which is dependent on many known and unknown factors. To gain a comprehensive understanding of drug actions, SPM requires close collaborations between domain experts from diverse fields and integration of heterogeneous models from biophysics, mathematics, statistics, machine learning, and semantic webs. This creates challenges in model management, model integration, model translation, and knowledge integration. In this review, we discuss several emergent issues in SPM and potential solutions using big data technology and analytics. The concurrent development of high-throughput techniques, cloud computing, data science, and the semantic web will likely allow SPM to be findable, accessible, interoperable, reusable, reliable, interpretable, and actionable.

  14. [Algorithms, machine intelligence, big data : general considerations].

    PubMed

    Radermacher, F J

    2015-08-01

    We are experiencing astonishing developments in the areas of big data and artificial intelligence. They follow a pattern that we have now been observing for decades: according to Moore's Law, performance and efficiency in elementary arithmetic operations increase a thousand-fold every 20 years. Although machines have not, in the sense of the singularity, become as "intelligent" as people, they are becoming steadily better. The Internet of Things has again helped to massively increase the efficiency of machines. Big data and suitable analytics do the same. If we simply let these processes continue, our civilization may be endangered in many respects. If the "containment" of these processes succeeds in the context of reasonable global political governance, a worldwide eco-social market economy, and an economy of green and inclusive markets, many desirable developments that are advantageous for our future may result. Then, at some point, the constant need for more and faster innovation may even stop. However, this is anything but certain. We are facing huge challenges.

  15. [To die alone, a big city problem?].

    PubMed

    Smith, E; Larsen, D F; Rosdahl, N

    2001-05-28

    The question whether being found dead is a problem related to living in a big city was examined by using death certificates from a 3-month period in 1994. The rates of persons found dead were higher for both sexes in the City of Copenhagen than in the rural County of Storstrøm. The age distribution showed that the rate increased with advancing age. The vast majority were found dead in their own home, although the rate was higher in Copenhagen than in Storstrøm. Whereas the manner of death was natural for the majority of cases in both areas, more deaths with an uncertain manner of death were recorded in Copenhagen than in Storstrøm (32% vs 9%). In general, suicide or accident was more often reported in those found dead in the County of Storstrøm than in those in Copenhagen. The cause of death was unknown in 46% of deaths in Copenhagen where 16% had died from external causes, proportions that were 58% and 21%, respectively, in Storstrøm. The rate of legal autopsies was higher in subjects with an uncertain manner of death than in the rest, and was also associated with an age below 40 years. We conclude that being found dead is related to living in a big city, although the reason(s) for this remains unidentified.

  16. Big 5 personality traits and interleukin-6: evidence for "healthy Neuroticism" in a US population sample.

    PubMed

    Turiano, Nicholas A; Mroczek, Daniel K; Moynihan, Jan; Chapman, Benjamin P

    2013-02-01

    The current study investigated whether the Big 5 personality traits predicted interleukin-6 (IL-6) levels in a national sample over the course of 5 years. In addition, interactions among the Big 5 were tested to provide a more accurate understanding of how personality traits may influence an inflammatory biomarker. Data included 1054 participants in the Midlife Development in the U.S. (MIDUS) biomarkers subproject. The Big 5 personality traits were assessed in 2005-2006 as part of the main MIDUS survey. Medication use, comorbid conditions, smoking behavior, alcohol use, body mass index, and serum levels of IL-6 were assessed in 2005-2009 as part of the biomarkers subproject. Linear regression analyses examined personality associations with IL-6. A significant Conscientiousness*Neuroticism interaction revealed that those high in both Conscientiousness and Neuroticism had lower circulating IL-6 levels than people with all other configurations of Conscientiousness and Neuroticism. Adjustment for health behaviors diminished the magnitude of this association but did not eliminate it, suggesting that lower comorbid conditions and obesity may partly explain the lower inflammation of those high in both Conscientiousness and Neuroticism. Our findings suggest, consistent with prior speculation, that average to higher levels of Neuroticism can in some cases be associated with health benefits - in this case when it is accompanied by high Conscientiousness. Using personality to identify those at risk may lead to greater personalization in the prevention and remediation of chronic inflammation.
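    The Conscientiousness*Neuroticism interaction reported above is the kind of term one adds to a linear regression as a product column. A minimal sketch, without the study's covariates and with synthetic data (the helper name and data are ours, not the authors'):

    ```python
    import numpy as np

    def ols_with_interaction(conscientiousness, neuroticism, il6):
        """Fit IL-6 ~ C + N + C*N by ordinary least squares.

        The C*N product column carries the interaction: a nonzero
        coefficient means the effect of one trait depends on the other.
        Returns [intercept, b_C, b_N, b_CxN].
        """
        X = np.column_stack([
            np.ones_like(conscientiousness),
            conscientiousness,
            neuroticism,
            conscientiousness * neuroticism,  # interaction term
        ])
        beta, *_ = np.linalg.lstsq(X, il6, rcond=None)
        return beta
    ```

    A negative interaction coefficient here would mirror the study's pattern: higher Neuroticism is associated with lower IL-6 only when Conscientiousness is also high.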

  17. Superfund at work: Hazardous waste cleanup efforts nationwide, Fall 1993 (Big D Campground Site Profile, Ashtabula County, Kingsville, Ohio)

    SciTech Connect

    Not Available

    1993-01-01

    A quarter mile from the old Big D Campground, a sand and gravel quarry in Ashtabula County, Ohio, served as a landfill for solvents, caustic chemicals and oily substances. Highlights of the overall effort included: destruction of 93,000 cubic yards and 14,000 drums of hazardous materials; extraction and treatment of ground water, including a 30-year monitoring program; and an interactive community relations program that fostered public participation in the cleanup process.

  18. 76 FR 63714 - Big Spring Rail System, Inc.; Operation Exemption; Transport Handling Specialists, Inc.

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-13

    ...] Big Spring Rail System, Inc.; Operation Exemption; Transport Handling Specialists, Inc. Big Spring Rail... Howard County, Tex., owned by the City of Big Spring, Tex. (City). BSRS will be operating the line...

  19. What's the Big Sweat about Dehydration? (For Kids)

    MedlinePlus

    When it's hot outside and you've been sweating, you get thirsty. Why? Thirst can be a ...

  20. The Whole Shebang: How Science Produced the Big Bang Model.

    ERIC Educational Resources Information Center

    Ferris, Timothy

    2002-01-01

    Offers an account of the accumulation of evidence that has led scientists to have confidence in the big bang theory of the creation of the universe. Discusses the early work of Ptolemy, Copernicus, Kepler, Galileo, and Newton, noting the rise of astrophysics, and highlighting the birth of the big bang model (the cosmic microwave background theory…

  1. Research on Implementing Big Data: Technology, People, & Processes

    ERIC Educational Resources Information Center

    Rankin, Jenny Grant; Johnson, Margie; Dennis, Randall

    2015-01-01

    When many people hear the term "big data", they primarily think of a technology tool for the collection and reporting of data of high variety, volume, and velocity. However, the complexity of big data lies not only in the technology but also in the supporting processes, policies, and people. This paper was written by three experts to…

  2. The Big Six & Electronic Resources: A Natural Fit.

    ERIC Educational Resources Information Center

    Eisenberg, Michael; Berkowitz, Robert E.

    1997-01-01

    Illustrates how the "Big Six Skills" model of information problem solving (Task Definition, Information Seeking Strategies, Location and Access, Use of Information, Synthesis, and Evaluation) is a perfect match to the influx of technology in student research. Outlines two typical assignments that focus the Big Six perspective on the…

  3. 3. BIG HOUSE (left) AND CORN CRIB (right) IN THE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. BIG HOUSE (left) AND CORN CRIB (right) IN THE BACKGROUND. See also individual HABS documentation: Walker Family Farm, Big House (HABS No. TN-121 A), and Walker Family Farm, Corn Crib (HABS No. TN-121 C). - Walker Family Farm (General views), Gatlinburg, Sevier County, TN

  4. What Is Big Data and Why Is It Important?

    ERIC Educational Resources Information Center

    Pence, Harry E.

    2014-01-01

    Big Data Analytics is a topic fraught with both positive and negative potential. Big Data is defined not just by the amount of information involved but also its variety and complexity, as well as the speed with which it must be analyzed or delivered. The amount of data being produced is already incredibly great, and current developments suggest…

  5. Toward a manifesto for the 'public understanding of big data'.

    PubMed

    Michael, Mike; Lupton, Deborah

    2016-01-01

    In this article, we sketch a 'manifesto' for the 'public understanding of big data'. On the one hand, this entails such public understanding of science and public engagement with science and technology-tinged questions as follows: How, when and where are people exposed to, or do they engage with, big data? Who are regarded as big data's trustworthy sources, or credible commentators and critics? What are the mechanisms by which big data systems are opened to public scrutiny? On the other hand, big data generate many challenges for public understanding of science and public engagement with science and technology: How do we address publics that are simultaneously the informant, the informed and the information of big data? What counts as understanding of, or engagement with, big data, when big data themselves are multiplying, fluid and recursive? As part of our manifesto, we propose a range of empirical, conceptual and methodological exhortations. We also provide Appendix 1 that outlines three novel methods for addressing some of the issues raised in the article.

  6. Big Earth Data Initiative: Metadata Improvement: Case Studies

    NASA Technical Reports Server (NTRS)

    Kozimor, John; Habermann, Ted; Farley, John

    2016-01-01

    The Big Earth Data Initiative (BEDI) invests in standardizing and optimizing the collection, management and delivery of the U.S. Government's civil Earth observation data to improve discovery, access and use, and understanding of Earth observations by the broader user community. Complete and consistent standard metadata helps address all three goals.

  7. Semantic Web technologies for the big data in life sciences.

    PubMed

    Wu, Hongyan; Yamaguchi, Atsuko

    2014-08-01

    The life sciences field is entering an era of big data thanks to breakthroughs in science and technology. More and more big data-related projects and activities are being carried out worldwide. Life sciences data generated by new technologies continue to grow rapidly, not only in size but also in variety and complexity. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources, and even across disciplines, is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are the two principal issues discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web that aims to provide information that not only humans but also computers can semantically process at large scale. The paper presents a survey of big data in the life sciences, big data-related projects and Semantic Web technologies. The paper introduces the main Semantic Web technologies and their current state, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps readers understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for big data in the life sciences.

  8. The BIG Data Center: from deposition to integration to translation.

    PubMed

    2017-01-04

    Biological data are generated at unprecedentedly exponential rates, posing considerable challenges in big data deposition, integration and translation. The BIG Data Center, established at Beijing Institute of Genomics (BIG), Chinese Academy of Sciences, provides a suite of database resources, including (i) Genome Sequence Archive, a data repository specialized for archiving raw sequence reads, (ii) Gene Expression Nebulas, a data portal of gene expression profiles based entirely on RNA-Seq data, (iii) Genome Variation Map, a comprehensive collection of genome variations for featured species, (iv) Genome Warehouse, a centralized resource housing genome-scale data with particular focus on economically important animals and plants, (v) Methylation Bank, an integrated database of whole-genome single-base resolution methylomes and (vi) Science Wikis, a central access point for biological wikis developed for community annotations. The BIG Data Center is dedicated to constructing and maintaining biological databases through big data integration and value-added curation, conducting basic research to translate big data into big knowledge and providing freely open access to a variety of data resources in support of worldwide research activities in both academia and industry. All of these resources are publicly available and can be found at http://bigd.big.ac.cn.

  9. Big data analytics to improve cardiovascular care: promise and challenges.

    PubMed

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  10. The BIG Data Center: from deposition to integration to translation

    PubMed Central

    2017-01-01

    Biological data are generated at unprecedentedly exponential rates, posing considerable challenges in big data deposition, integration and translation. The BIG Data Center, established at Beijing Institute of Genomics (BIG), Chinese Academy of Sciences, provides a suite of database resources, including (i) Genome Sequence Archive, a data repository specialized for archiving raw sequence reads, (ii) Gene Expression Nebulas, a data portal of gene expression profiles based entirely on RNA-Seq data, (iii) Genome Variation Map, a comprehensive collection of genome variations for featured species, (iv) Genome Warehouse, a centralized resource housing genome-scale data with particular focus on economically important animals and plants, (v) Methylation Bank, an integrated database of whole-genome single-base resolution methylomes and (vi) Science Wikis, a central access point for biological wikis developed for community annotations. The BIG Data Center is dedicated to constructing and maintaining biological databases through big data integration and value-added curation, conducting basic research to translate big data into big knowledge and providing freely open access to a variety of data resources in support of worldwide research activities in both academia and industry. All of these resources are publicly available and can be found at http://bigd.big.ac.cn. PMID:27899658

  11. 78 FR 52523 - Big Rivers Electric Corporation; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-23

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on August 16, 2013, Big Rivers Electric Corporation filed its proposed revenue requirements for reactive...

  12. Bioprospecting for podophyllotoxin in the Big Horn Mountains, Wyoming

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of this study was to evaluate variations in podophyllotoxin concentrations in Juniperus species found in the Big Horn Mountains in Wyoming. It was found that Juniperus species in the Big Horn Mountains included three species; J. communis L. (common juniper), J. horizontalis Moench. (c...

  13. Big Books from Little Voices: Reaching High Risk Beginning Readers.

    ERIC Educational Resources Information Center

    Trachtenburg, Phyllis; Ferruggia, Ann

    1989-01-01

    Discusses how interactive, whole class techniques (using a student-generated Big Book adaptation of "Corduroy") improved the reading skills of high risk first grade readers. Describes several activities, including sight word strategies, decoding techniques, and word processing, and suggests 27 Big Books for use with these activities. (MM)

  14. Will Big Data Mean the End of Privacy?

    ERIC Educational Resources Information Center

    Pence, Harry E.

    2015-01-01

    Big Data is currently a hot topic in the field of technology, and many campuses are considering the addition of this topic into their undergraduate courses. Big Data tools are not just playing an increasingly important role in many commercial enterprises; they are also combining with new digital devices to dramatically change privacy. This article…

  15. The Role of Big Data in the Social Sciences

    ERIC Educational Resources Information Center

    Ovadia, Steven

    2013-01-01

    Big Data is an increasingly popular term across scholarly and popular literature but lacks a formal definition (Lohr 2012). This is beneficial in that it keeps the term flexible. For librarians, Big Data represents a few important ideas. One is balancing accessibility with privacy. Librarians tend to want information to be as open…

  16. Curriculum: Big Decisions--Making Healthy, Informed Choices about Sex

    ERIC Educational Resources Information Center

    Davis, Melanie

    2009-01-01

    Big Decisions is a 10-lesson abstinence-plus curriculum for ages 12-18 that emphasizes sex as a big decision, abstinence as the healthiest choice, and the mandate that sexually active teens use condoms and be tested for sexually transmitted diseases. This program can be implemented with limited resources and facilitator training when abstinence…

  17. 33 CFR 117.267 - Big Carlos Pass.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Big Carlos Pass. 117.267 Section 117.267 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Florida § 117.267 Big Carlos Pass. The draw of...

  18. 36 CFR 7.41 - Big Bend National Park.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 1 2011-07-01 2011-07-01 false Big Bend National Park. 7.41 Section 7.41 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR SPECIAL REGULATIONS, AREAS OF THE NATIONAL PARK SYSTEM § 7.41 Big Bend National Park. (a) Fishing; closed...

  19. 36 CFR 7.41 - Big Bend National Park.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Big Bend National Park. 7.41 Section 7.41 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR SPECIAL REGULATIONS, AREAS OF THE NATIONAL PARK SYSTEM § 7.41 Big Bend National Park. (a) Fishing; closed...

  20. What's the Big Deal? Collection Evaluation at the National Level

    ERIC Educational Resources Information Center

    Jurczyk, Eva; Jacobs, Pamela

    2014-01-01

    This article discusses a project undertaken to assess the journals in a Big Deal package by applying a weighted value algorithm measuring quality, utility, and value of individual titles. Carried out by a national library consortium in Canada, the project confirmed the value of the Big Deal package while providing a quantitative approach for…
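    As a rough illustration of such a weighted-value algorithm, the sketch below combines made-up quality, utility, and value metrics into a single score per title. The weights, metric values, and journal names are all invented and need not match the consortium's actual algorithm:

```python
# Hypothetical per-title metrics, each already normalized to [0, 1].
journals = {
    "Journal A": {"quality": 0.9, "utility": 0.4, "value": 0.6},
    "Journal B": {"quality": 0.5, "utility": 0.9, "value": 0.8},
    "Journal C": {"quality": 0.2, "utility": 0.1, "value": 0.3},
}

# Illustrative weights; a consortium would calibrate these to its own priorities.
weights = {"quality": 0.4, "utility": 0.35, "value": 0.25}

def weighted_score(metrics: dict) -> float:
    """Combine the normalized metrics into a single weighted score."""
    return sum(weights[k] * metrics[k] for k in weights)

# Rank titles from highest to lowest weighted score.
ranked = sorted(journals, key=lambda j: weighted_score(journals[j]), reverse=True)
print(ranked)  # ['Journal B', 'Journal A', 'Journal C']
```

    Titles near the bottom of such a ranking would be the candidates for cancellation if the Big Deal were unbundled.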

  1. A proposed framework of big data readiness in public sectors

    NASA Astrophysics Data System (ADS)

    Ali, Raja Haslinda Raja Mohd; Mohamad, Rosli; Sudin, Suhizaz

    2016-08-01

    Growing interest in big data is mainly linked to its great potential to unveil unforeseen patterns or profiles that support an organisation's key business decisions. Following the private sector's moves to embrace big data, the government sector is now getting on the bandwagon. Big data has been considered one of the potential tools to enhance service delivery in the public sector within its financial resource constraints. The Malaysian government, in particular, has made big data part of the main national agenda. Regardless of government commitment to promote big data amongst government agencies, the degree of readiness of the government agencies as well as their employees is crucial in ensuring successful deployment of big data. This paper, therefore, proposes a conceptual framework to investigate perceived readiness for big data amongst Malaysian government agencies. Perceived readiness of 28 ministries and their respective employees will be assessed using both qualitative (interview) and quantitative (survey) approaches. The outcome of the study is expected to offer meaningful insight into the factors affecting change readiness among public agencies regarding big data and the outcomes expected from greater or lower change readiness among the public sectors.

  2. "Big Society" in the UK: A Policy Review

    ERIC Educational Resources Information Center

    Evans, Kathy

    2011-01-01

    Alongside the UK Coalition Government's historic public spending cuts, the "Big Society" has become a major narrative in UK political discourse. This article reviews key features of Big Society policies against their aims of rebalancing the economy and mending "Broken Britain", with particular reference to their implications…

  3. Deal or No Deal? Evaluating Big Deals and Their Journals

    ERIC Educational Resources Information Center

    Blecic, Deborah D.; Wiberley, Stephen E., Jr.; Fiscella, Joan B.; Bahnmaier-Blaszczak, Sara; Lowery, Rebecca

    2013-01-01

    This paper presents methods to develop metrics that compare Big Deal journal packages and the journals within those packages. Deal-level metrics guide selection of a Big Deal for termination. Journal-level metrics guide selection of individual subscriptions from journals previously provided by a terminated deal. The paper argues that, while the…

  4. 36 CFR 7.41 - Big Bend National Park.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 1 2014-07-01 2014-07-01 false Big Bend National Park. 7.41 Section 7.41 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR SPECIAL REGULATIONS, AREAS OF THE NATIONAL PARK SYSTEM § 7.41 Big Bend National Park. (a) Fishing; closed...

  5. 36 CFR 7.41 - Big Bend National Park.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 1 2013-07-01 2013-07-01 false Big Bend National Park. 7.41 Section 7.41 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR SPECIAL REGULATIONS, AREAS OF THE NATIONAL PARK SYSTEM § 7.41 Big Bend National Park. (a) Fishing; closed...

  6. 36 CFR 7.41 - Big Bend National Park.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 1 2012-07-01 2012-07-01 false Big Bend National Park. 7.41 Section 7.41 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR SPECIAL REGULATIONS, AREAS OF THE NATIONAL PARK SYSTEM § 7.41 Big Bend National Park. (a) Fishing; closed...

  7. 33 CFR 117.677 - Big Sunflower River.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Big Sunflower River. 117.677 Section 117.677 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Mississippi § 117.677 Big Sunflower River. The draw...

  8. 33 CFR 117.677 - Big Sunflower River.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Big Sunflower River. 117.677 Section 117.677 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Mississippi § 117.677 Big Sunflower River. The draw...

  9. 33 CFR 117.677 - Big Sunflower River.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Big Sunflower River. 117.677 Section 117.677 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Mississippi § 117.677 Big Sunflower River. The draw...

  10. 33 CFR 117.677 - Big Sunflower River.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Big Sunflower River. 117.677 Section 117.677 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Mississippi § 117.677 Big Sunflower River. The draw...

  11. 33 CFR 117.677 - Big Sunflower River.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Big Sunflower River. 117.677 Section 117.677 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Mississippi § 117.677 Big Sunflower River. The draw...

  12. Big Data: You Are Adding to . . . and Using It

    ERIC Educational Resources Information Center

    Makela, Carole J.

    2016-01-01

    "Big data" prompts a whole lexicon of terms--data flow; analytics; data mining; data science; smart you name it (cars, houses, cities, wearables, etc.); algorithms; learning analytics; predictive analytics; data aggregation; data dashboards; digital tracks; and big data brokers. New terms are being coined frequently. Are we paying…

  13. Big Canyon Creek Ecological Restoration Strategy.

    SciTech Connect

    Rasmussen, Lynn; Richardson, Shannon

    2007-10-01

    He-yey, Nez Perce for steelhead or rainbow trout (Oncorhynchus mykiss), are a culturally and ecologically significant resource within the Big Canyon Creek watershed; they are also part of the federally listed Snake River Basin Steelhead DPS. The majority of the Big Canyon Creek drainage is considered critical habitat for that DPS as well as for the federally listed Snake River fall chinook (Oncorhynchus tshawytscha) ESU. The Nez Perce Soil and Water Conservation District (District) and the Nez Perce Tribe Department of Fisheries Resources Management-Watershed (Tribe), in an effort to support the continued existence of these and other aquatic species, have developed this document to direct funding toward priority restoration projects in priority areas for the Big Canyon Creek watershed. In order to achieve this, the District and the Tribe: (1) Developed a working group and technical team composed of managers from a variety of stakeholders within the basin; (2) Established geographically distinct sub-watershed areas called Assessment Units (AUs); (3) Created a prioritization framework for the AUs and prioritized them; and (4) Developed treatment strategies to utilize within the prioritized AUs. Assessment Units were delineated by significant shifts in sampled juvenile O. mykiss (steelhead/rainbow trout) densities, which were found to fall at fish passage barriers. The prioritization framework considered four aspects critical to determining the relative importance of performing restoration in a certain area: density of critical fish species, physical condition of the AU, water quantity, and water quality. It was established, through vigorous data analysis within these four areas, that the geographic priority areas for restoration within the Big Canyon Creek watershed are Big Canyon Creek from stream km 45.5 to the headwaters, Little Canyon from km 15 to 30, the mainstem corridors of Big Canyon (mouth to 7km) and Little Canyon (mouth to 7km). 
    The District and the Tribe…

  14. A Solution to "Too Big to Fail"

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-10-01

    It's a tricky business to reconcile simulations of our galaxy's formation with our current observations of the Milky Way and its satellites. In a recent study, scientists have addressed one discrepancy between simulations and observations: the so-called "too big to fail" problem. From Missing Satellites to Too Big to Fail: the favored model of the universe is the lambda-cold-dark-matter (ΛCDM) cosmological model. This model does a great job of correctly predicting the large-scale structure of the universe, but there are still a few problems with it on smaller scales. [Hubble image of UGC 5497, a dwarf galaxy associated with Messier 81. In the missing-satellites problem, simulations of galaxy formation predict that there should be more such satellite galaxies than we observe. ESA/NASA] The first is the missing-satellites problem: ΛCDM cosmology predicts that galaxies like the Milky Way should have significantly more satellite galaxies than we observe. A proposed solution to this problem is the argument that there may exist many more satellites than we've observed, but these dwarf galaxies have had their stars stripped from them during tidal interactions, which prevents us from being able to see them. This solution creates a new problem, though: the "too big to fail" problem. This problem states that many of the satellites predicted by ΛCDM cosmology are simply so massive that there's no way they couldn't have visible stars. Another way of looking at it: the observed satellites of the Milky Way are not massive enough to be consistent with predictions from ΛCDM. [Artist's illustration of a supernova, a type of stellar feedback that can modify the dark-matter distribution of a satellite galaxy. NASA/CXC/M. Weiss] Density Profiles and Tidal Stirring: led by Mihai Tomozeiu (University of Zurich), a team of scientists has published a study in which they propose a solution to the "too big to fail" problem. By running detailed cosmological zoom simulations of our galaxy's formation, Tomozeiu and…

  15. Integrative methods for analyzing big data in precision medicine.

    PubMed

    Gligorijević, Vladimir; Malod-Dognin, Noël; Pržulj, Nataša

    2016-03-01

    We provide an overview of recent developments in big data analyses in the context of precision medicine and health informatics. With advances in the technologies capturing molecular and medical data, we have entered the era of "Big Data" in biology and medicine. These data offer many opportunities to advance precision medicine. We outline key challenges in precision medicine and present recent advances in data integration-based methods to uncover personalized information from big data produced by various omics studies. We survey recent integrative methods for disease subtyping, biomarker discovery, and drug repurposing, and list the tools that are available to domain scientists. Given the ever-growing nature of these big data, we highlight key issues that big data integration methods will face.

  16. Big data is essential for further development of integrative medicine.

    PubMed

    Li, Guo-zheng; Liu, Bao-yan

    2015-05-01

    To give a short summary of the achievements, opportunities and challenges of big data in integrative medicine (IM), and to explore future work on breaking the bottlenecks so that IM can develop rapidly, this paper presents the growing field of big data in IM, describes the systems of data collection and the techniques of data analytics, introduces the advances, and discusses future work, especially the challenges in this field. Big data is increasing dramatically as time goes by, whether we face it or not. Big data is evolving into a promising way to gain deep insight into IM, the ancient medicine now integrating with modern medicine. We have great achievements in data collection and data analysis, where existing results show it is possible to discover the knowledge and rules behind clinical records. In transferring from experience-based medicine to evidence-based medicine, IM depends on big data technology in this great era.

  17. Big data and visual analytics in anaesthesia and health care.

    PubMed

    Simpao, A F; Ahumada, L M; Rehman, M A

    2015-09-01

    Advances in computer technology, patient monitoring systems, and electronic health record systems have enabled rapid accumulation of patient data in electronic form (i.e. big data). Organizations such as the Anesthesia Quality Institute and Multicenter Perioperative Outcomes Group have spearheaded large-scale efforts to collect anaesthesia big data for outcomes research and quality improvement. Analytics--the systematic use of data combined with quantitative and qualitative analysis to make decisions--can be applied to big data for quality and performance improvements, such as predictive risk assessment, clinical decision support, and resource management. Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces, and it can facilitate performance of cognitive activities involving big data. Ongoing integration of big data and analytics within anaesthesia and health care will increase demand for anaesthesia professionals who are well versed in both the medical and the information sciences.

  18. [Utilization of Big Data in Medicine and Future Outlook].

    PubMed

    Kinosada, Yasutomi; Uematsu, Machiko; Fujiwara, Takuya

    2016-03-01

    "Big data" is a new buzzword. The point is not to be dazzled by the volume of data, but rather to analyze it and convert it into insights, innovations, and business value. There are also real differences between conventional analytics and big data. In this article, we show some results of big data analysis using open DPC (Diagnosis Procedure Combination) data from the central part of Japan: Toyama, Ishikawa, Fukui, Nagano, Gifu, Aichi, Shizuoka, and Mie Prefectures. These 8 prefectures contain 51 medical administration areas called secondary medical areas. By applying big data analysis techniques such as k-means, hierarchical clustering, and self-organizing maps to the DPC data, we can visualize the disease structure and detect similarities or variations among the 51 secondary medical areas. The combination of big data analysis techniques and open DPC data is a very powerful method for depicting the real patient distribution in Japan.
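    The clustering step described here can be sketched with a plain k-means implementation on invented per-area case-mix vectors. The data below are synthetic stand-ins (the real study used open DPC data for 51 secondary medical areas), so this only illustrates the technique:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented feature vectors: each row is a "medical area", each column a
# normalized share of some diagnosis group (a stand-in for DPC case mix).
areas = np.vstack([
    rng.normal([0.8, 0.1], 0.05, size=(5, 2)),  # areas dominated by group 1
    rng.normal([0.2, 0.7], 0.05, size=(5, 2)),  # areas dominated by group 2
])

def kmeans(data, k, iters=20, seed=0):
    """Plain k-means: alternate nearest-centroid assignment and centroid update."""
    r = np.random.default_rng(seed)
    centroids = data[r.choice(len(data), size=k, replace=False)]
    for _ in range(iters):
        # Assign each area to its nearest centroid (Euclidean distance).
        dists = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned areas.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = data[labels == j].mean(axis=0)
    return labels, centroids

labels, centroids = kmeans(areas, k=2)
print(labels)  # areas with similar case mixes share a cluster label
```

    Grouping areas this way is what lets similarities and variations in disease structure be visualized across regions.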

  19. Design Principles for Effective Knowledge Discovery from Big Data

    SciTech Connect

    Begoli, Edmon; Horey, James L

    2012-01-01

    The big data phenomenon refers to the practice of collecting and processing very large data sets, and the associated systems and algorithms used to analyze these massive datasets. Architectures for big data usually span multiple machines and clusters, and they commonly consist of multiple special-purpose sub-systems. Coupled with the knowledge discovery process, the big data movement offers many unique opportunities for organizations to benefit (with respect to new insights, business optimizations, etc.). However, due to the difficulty of analyzing such large datasets, big data presents unique systems engineering and architectural challenges. In this paper, we present three system design principles that can inform organizations on effective analytic and data collection processes, system organization, and data dissemination practices. The principles presented derive from our own research and development experiences with big data problems from various federal agencies, and we illustrate each principle with our own experiences and recommendations.

  20. Commentary: Epidemiology in the era of big data.

    PubMed

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-05-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called "three V's": variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field's future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future.

  1. Big Data and the Future of Radiology Informatics.

    PubMed

    Kansagra, Akash P; Yu, John-Paul J; Chatterjee, Arindam R; Lenchik, Leon; Chow, Daniel S; Prater, Adam B; Yeh, Jean; Doshi, Ankur M; Hawkins, C Matthew; Heilbrun, Marta E; Smith, Stacy E; Oselkin, Martin; Gupta, Pushpender; Ali, Sayed

    2016-01-01

    Rapid growth in the amount of data that is electronically recorded as part of routine clinical operations has generated great interest in the use of Big Data methodologies to address clinical and research questions. These methods can efficiently analyze and deliver insights from high-volume, high-variety, and high-growth rate datasets generated across the continuum of care, thereby forgoing the time, cost, and effort of more focused and controlled hypothesis-driven research. By virtue of an existing robust information technology infrastructure and years of archived digital data, radiology departments are particularly well positioned to take advantage of emerging Big Data techniques. In this review, we describe four areas in which Big Data is poised to have an immediate impact on radiology practice, research, and operations. In addition, we provide an overview of the Big Data adoption cycle and describe how academic radiology departments can promote Big Data development.

  2. Curating Big Data Made Simple: Perspectives from Scientific Communities.

    PubMed

    Sowe, Sulayman K; Zettsu, Koji

    2014-03-01

    The digital universe is exponentially producing an unprecedented volume of data that has brought benefits as well as fundamental challenges for enterprises and scientific communities alike. This trend is inherently exciting for the development and deployment of cloud platforms to support scientific communities curating big data. The excitement stems from the fact that scientists can now access and extract value from the big data corpus, establish relationships between bits and pieces of information from many types of data, and collaborate with a diverse community of researchers from various domains. However, despite these perceived benefits, to date little attention has been focused on the people or communities who are both beneficiaries and, at the same time, producers of big data. The technical challenges posed by big data are matched by the challenge of understanding the dynamics of communities working with big data, whether scientific or otherwise. Furthermore, the big data era also means that big data platforms for data-intensive research must be designed in such a way that research scientists can easily search for and find data for their research, upload and download datasets for onsite/offsite use, perform computations and analysis, share their findings and research experience, and seamlessly collaborate with their colleagues. In this article, we present the architecture and design of a cloud platform that meets some of these requirements, and a big data curation model that describes how a community of earth and environmental scientists is using the platform to curate data. Motivation for developing the platform, lessons learnt in overcoming some challenges associated with supporting scientists to curate big data, and future research directions are also presented.

  3. Big data and biomedical informatics: a challenging opportunity.

    PubMed

    Bellazzi, R

    2014-05-22

    Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations.

  4. Big Data and Biomedical Informatics: A Challenging Opportunity

    PubMed Central

    2014-01-01

    Summary Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations. PMID:24853034

  5. Social Mobility and Social Participation

    ERIC Educational Resources Information Center

    Sewell, William H.

    1978-01-01

    Examines data related to social mobility and social participation of Americans. Topics include educational and occupational mobility; voting; volunteer work; charitable giving; community participation; views on religion; and anomie. For journal availability, see SO 506 144. (Author/DB)

  6. 37 CFR 1.98 - Content of information disclosure statement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... information, of each patent, publication, or other information listed that is not in the English language. The... copy of the translation if a written English-language translation of a non-English-language document... statement. (2) A legible copy of: (i) Each foreign patent; (ii) Each publication or that portion...

  7. D-branes in a big bang/big crunch universe: Misner space

    NASA Astrophysics Data System (ADS)

    Hikida, Yasuaki; Nayak, Rashmi R.; Panigrahi, Kamal L.

    2005-09-01

    We study D-branes in a two-dimensional Lorentzian orbifold ℝ^{1,1}/Γ with a discrete boost Γ. This space is known as Misner or Milne space, and includes a big crunch/big bang singularity. In this space, there are D0-branes in spiral orbits and D1-branes with or without flux on them. In particular, we observe imaginary parts of partition functions, and interpret them as the rates of open string pair creation for D0-branes and of emission of winding closed strings for D1-branes. These phenomena occur due to the time-dependence of the background. The open string 2→2 scattering amplitude on a D1-brane is also computed and found to be less singular than in the closed string case.

  8. Bioinformatics clouds for big data manipulation

    PubMed Central

    2012-01-01

    Abstract As advances in life sciences and information technology bring profound influences on bioinformatics due to its interdisciplinary nature, bioinformatics is experiencing a new leap-forward from in-house computing infrastructure into utility-supplied cloud computing delivered over the Internet, in order to handle the vast quantities of biological data generated by high-throughput experimental technologies. Albeit relatively new, cloud computing promises to address big data storage and analysis issues in the bioinformatics field. Here we review extant cloud-based services in bioinformatics, classify them into Data as a Service (DaaS), Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), and present our perspectives on the adoption of cloud computing in bioinformatics. Reviewers This article was reviewed by Frank Eisenhaber, Igor Zhulin, and Sandor Pongor. PMID:23190475

  9. The Economics of Big Area Additive Manufacturing

    SciTech Connect

    Post, Brian; Lloyd, Peter D; Lindahl, John; Lind, Randall F; Love, Lonnie J; Kunc, Vlastimil

    2016-01-01

    Case studies on the economics of Additive Manufacturing (AM) suggest that processing time is the dominant cost in manufacturing. Most additive processes have similar performance metrics: small part sizes, low production rates, and expensive feedstocks. Big Area Additive Manufacturing (BAAM) is based on transitioning polymer extrusion technology from a wire to a pellet feedstock. Utilizing pellets significantly increases deposition speed and lowers material cost by using low-cost injection molding feedstock. The use of carbon fiber reinforced polymers eliminates the need for a heated chamber, significantly reducing machine power requirements and size constraints. We hypothesize that the increase in productivity, coupled with the decrease in feedstock and energy costs, will enable AM to become more competitive with conventional manufacturing processes for many applications. As a test case, we compare the cost of additively manufacturing composite tooling with traditional fused deposition modeling (FDM) versus BAAM.
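
    The cost argument in this abstract can be made concrete with a toy model in which part cost is material cost plus machine-time cost. This is illustrative only; the deposition rates, feedstock prices, and machine rates below are invented placeholders, not figures from the study.

```python
# Toy cost model for the FDM-vs-BAAM comparison sketched above.
# All numbers are hypothetical; the point is that when processing time
# dominates, a faster deposition rate swamps a higher machine hourly rate.

def part_cost(mass_kg, rate_kg_per_h, feedstock_usd_per_kg, machine_usd_per_h):
    """Cost = material cost + machine-time cost."""
    hours = mass_kg / rate_kg_per_h
    return mass_kg * feedstock_usd_per_kg + hours * machine_usd_per_h

mass = 50.0  # kg, a large composite tool (invented)
fdm  = part_cost(mass, rate_kg_per_h=0.1,  feedstock_usd_per_kg=200.0, machine_usd_per_h=30.0)
baam = part_cost(mass, rate_kg_per_h=45.0, feedstock_usd_per_kg=10.0,  machine_usd_per_h=100.0)
print(f"FDM:  ${fdm:,.0f}")   # slow wire feedstock: time cost dominates
print(f"BAAM: ${baam:,.0f}")  # fast pellet feedstock: cheaper on both terms
```

    Under these assumed parameters the 450× faster deposition rate, not the feedstock price alone, drives most of the difference, which mirrors the abstract's hypothesis.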

  10. The big data challenges of connectomics

    PubMed Central

    Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir

    2015-01-01

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them. PMID:25349911

  11. The big data challenges of connectomics

    SciTech Connect

    Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir

    2014-10-28

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here in this paper we describe some of the key difficulties that may arise and provide suggestions for managing them.

  12. The big data challenges of connectomics

    DOE PAGES

    Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir

    2014-10-28

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here in this paper we describe some of the key difficulties that may arise and provide suggestions for managing them.

  13. New nuclear physics for big bang nucleosynthesis

    SciTech Connect

    Boyd, Richard N.; Brune, Carl R.; Fuller, George M.; Smith, Christel J.

    2010-11-15

    We discuss nuclear reactions which could play a role in big bang nucleosynthesis. Most of these reactions involve lithium and beryllium isotopes and the rates for some of these have not previously been included in BBN calculations. Few of these reactions are well studied in the laboratory. We also discuss novel effects in these reactions, including thermal population of nuclear target states, resonant enhancement, and nonthermal neutron reaction products. We perform sensitivity studies which show that even given considerable nuclear physics uncertainties, most of these nuclear reactions have minimal leverage on the standard BBN abundance yields of ⁶Li and ⁷Li. Although a few have the potential to alter the yields significantly, we argue that this is unlikely.

  14. Big data era in meteor science

    NASA Astrophysics Data System (ADS)

    Vinković, D.; Gritsevich, M.; Srećković, V.; Pečnik, B.; Szabó, G.; Debattista, V.; Škoda, P.; Mahabal, A.; Peltoniemi, J.; Mönkölä, S.; Mickaelian, A.; Turunen, E.; Kákona, J.; Koskinen, J.; Grokhovsky, V.

    2016-01-01

    Over the last couple of decades technological advancements in observational techniques in meteor science have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced science goals. We review some of the developments that push meteor science into the big data era that requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of micro-physics of meteor plasma and its interaction with the atmosphere.

  15. The natural science underlying big history.

    PubMed

    Chaisson, Eric J

    2014-01-01

    Nature's many varied complex systems-including galaxies, stars, planets, life, and society-are islands of order within the increasingly disordered Universe. All organized systems are subject to physical, biological, or cultural evolution, which together comprise the grander interdisciplinary subject of cosmic evolution. A wealth of observational data supports the hypothesis that increasingly complex systems evolve unceasingly, uncaringly, and unpredictably from big bang to humankind. These are global history greatly extended, big history with a scientific basis, and natural history broadly portrayed across ∼14 billion years of time. Human beings and our cultural inventions are not special, unique, or apart from Nature; rather, we are an integral part of a universal evolutionary process connecting all such complex systems throughout space and time. Such evolution writ large has significant potential to unify the natural sciences into a holistic understanding of who we are and whence we came. No new science (beyond frontier, nonequilibrium thermodynamics) is needed to describe cosmic evolution's major milestones at a deep and empirical level. Quantitative models and experimental tests imply that a remarkable simplicity underlies the emergence and growth of complexity for a wide spectrum of known and diverse systems. Energy is a principal facilitator of the rising complexity of ordered systems within the expanding Universe; energy flows are as central to life and society as they are to stars and galaxies. In particular, energy rate density-contrasting with information content or entropy production-is an objective metric suitable to gauge relative degrees of complexity among a hierarchy of widely assorted systems observed throughout the material Universe. Operationally, those systems capable of utilizing optimum amounts of energy tend to survive, and those that cannot are nonrandomly eliminated.

  16. The Uses of Big Data in Cities.

    PubMed

    Bettencourt, Luís M A

    2014-03-01

    There is much enthusiasm currently about the possibilities created by new and more extensive sources of data to better understand and manage cities. Here, I explore how big data can be useful in urban planning by formalizing the planning process as a general computational problem. I show that, under general conditions, new sources of data coordinated with urban policy can be applied following fundamental principles of engineering to achieve new solutions to important age-old urban problems. I also show that comprehensive urban planning is computationally intractable (i.e., practically impossible) in large cities, regardless of the amounts of data available. This dilemma between the need for planning and coordination and its impossibility in detail is resolved by the recognition that cities are first and foremost self-organizing social networks embedded in space and enabled by urban infrastructure and services. As such, the primary role of big data in cities is to facilitate information flows and mechanisms of learning and coordination by heterogeneous individuals. However, processes of self-organization in cities, as well as of service improvement and expansion, must rely on general principles that enforce necessary conditions for cities to operate and evolve. Such ideas are the core of a developing scientific theory of cities, which is itself enabled by the growing availability of quantitative data on thousands of cities worldwide, across different geographies and levels of development. These three uses of data and information technologies in cities constitute then the necessary pillars for more successful urban policy and management that encourages, and does not stifle, the fundamental role of cities as engines of development and innovation in human societies.

  17. A Spectrograph for BigBOSS

    NASA Astrophysics Data System (ADS)

    CARTON, Pierre-Henri; Bebek, C.; Cazaux, S.; Ealet, A.; Eppelle, D.; Kneib, J.; Karst, P.; levi, M.; magneville, C.; Palanque-Delabrouille, N.; Ruhlmann-Kleider, V.; Schlegel, D.; Yeche, C.

    2012-01-01

    The BigBOSS spectrograph assembly carries the light from the fiber output to the detector, and includes the optics, gratings, mechanics, and cryostats. The 5000 fibers are split into 10 bundles of 500 fibers, each of which feeds one spectrograph. The full bandwidth from 0.36 µm to 1.05 µm is split into 3 bands. Each channel consists of a collimator (a lens doublet), a VPH grating, and a 6-lens camera. The 500 fiber spectra are imaged onto a 4k×4k detector by the F/2 camera, with each fiber core imaged onto 4 pixels. Each channel of the BigBOSS spectrograph will be equipped with a single-CCD camera, resulting in 30 cryostats in total for the instrument. Based on its experience with CCD cameras for projects like EROS and MegaCam, CEA/Saclay has designed small, autonomous cryogenic vessels that integrate cryo-cooling, CCD positioning, and slow-control interfacing capabilities. The use of a linear pulse tube with its own control unit, both developed by Thales Cryogenics BV, will ensure versatility, reliability, and operational flexibility. The CCDs will be cooled to 140 K, with stability better than 1 K, and positioned to within 15 µm along the optical axis and 50 µm in the XY plane. Slow-control machines will be interfaced directly to an Ethernet network, which will allow them to be operated remotely. This design leads to a very robust instrument with no moving mechanics (except the shutters), and the 30 channels are impressively compact, fitting within a 3 m³ volume. Producing this number of channels will call for a quasi-mass-production approach.

  18. The Natural Science Underlying Big History

    PubMed Central

    Chaisson, Eric J.

    2014-01-01

    Nature's many varied complex systems—including galaxies, stars, planets, life, and society—are islands of order within the increasingly disordered Universe. All organized systems are subject to physical, biological, or cultural evolution, which together comprise the grander interdisciplinary subject of cosmic evolution. A wealth of observational data supports the hypothesis that increasingly complex systems evolve unceasingly, uncaringly, and unpredictably from big bang to humankind. These are global history greatly extended, big history with a scientific basis, and natural history broadly portrayed across ∼14 billion years of time. Human beings and our cultural inventions are not special, unique, or apart from Nature; rather, we are an integral part of a universal evolutionary process connecting all such complex systems throughout space and time. Such evolution writ large has significant potential to unify the natural sciences into a holistic understanding of who we are and whence we came. No new science (beyond frontier, nonequilibrium thermodynamics) is needed to describe cosmic evolution's major milestones at a deep and empirical level. Quantitative models and experimental tests imply that a remarkable simplicity underlies the emergence and growth of complexity for a wide spectrum of known and diverse systems. Energy is a principal facilitator of the rising complexity of ordered systems within the expanding Universe; energy flows are as central to life and society as they are to stars and galaxies. In particular, energy rate density—contrasting with information content or entropy production—is an objective metric suitable to gauge relative degrees of complexity among a hierarchy of widely assorted systems observed throughout the material Universe. Operationally, those systems capable of utilizing optimum amounts of energy tend to survive, and those that cannot are nonrandomly eliminated. PMID:25032228

  19. Integrating the Apache Big Data Stack with HPC for Big Data

    NASA Astrophysics Data System (ADS)

    Fox, G. C.; Qiu, J.; Jha, S.

    2014-12-01

    There is perhaps a broad consensus as to the important issues in practical parallel computing as applied to large-scale simulations; this is reflected in supercomputer architectures, algorithms, libraries, languages, compilers, and best practice for application development. However, the same is much less true for data-intensive computing, even though commercial clouds devote much more resources to data analytics than supercomputers devote to simulations. We look at a sample of over 50 big data applications to identify characteristics of data-intensive applications and to deduce needed runtimes and architectures. We suggest a big data version of the famous Berkeley dwarfs and NAS parallel benchmarks and use these to identify a few key classes of hardware/software architectures. Our analysis builds on combining HPC with ABDS, the Apache Big Data Stack that is widely used in modern cloud computing. Initial results on clouds and HPC systems are encouraging. We propose the development of SPIDAL (Scalable Parallel Interoperable Data Analytics Library), built on system and data abstractions suggested by the HPC-ABDS architecture. We discuss how it can be used in several application areas, including polar science.

  20. Classical propagation of strings across a big crunch/big bang singularity

    SciTech Connect

    Niz, Gustavo; Turok, Neil

    2007-01-15

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z₂, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang [J. Khoury, B. A. Ovrut, P. J. Steinhardt, and N. Turok, Phys. Rev. D 64, 123522 (2001).][P. J. Steinhardt and N. Turok, Science 296, 1436 (2002).][N. Turok, M. Perry, and P. J. Steinhardt, Phys. Rev. D 70, 106004 (2004).]. When the two planes are near, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suffer no blue-shift as the M theory dimension collapses, and their equations of motion are regular across the transition from big crunch to big bang. In this paper, we study the classical evolution of fundamental strings across the singularity in some detail. We also develop a simple semiclassical approximation to the quantum evolution which allows one to compute the quantum production of excitations on the string and implement it in a simplified example.

  1. The Big-Fish-Little-Pond Effect for Academic Self-Concept, Test Anxiety, and School Grades in Gifted Children.

    PubMed

    Zeidner; Schleyer

    1999-10-01

    This study reports data extending work by Marsh and colleagues on the "big-fish-little-pond effect" (BFLPE). The BFLPE hypothesizes that it is better for academic self-concept to be a big fish in a little pond (gifted student in regular reference group) than to be a small fish in a big pond (gifted student in gifted reference group). The BFLPE was examined with respect to academic self-concept, test anxiety, and school grades in a sample of 1020 gifted Israeli children participating in two different educational programs: (a) special homogeneous classes for the gifted and (b) regular mixed-ability classes. The central hypothesis, deduced from social comparison and reference group theory, was that academically talented students enrolled in special gifted classes will perceive their academic ability and chances for academic success less favorably than students in regular mixed-ability classes. These negative self-perceptions, in turn, will serve to deflate students' academic self-concept, elevate their levels of evaluative anxiety, and result in depressed school grades. A path-analytic model linking reference group, academic self-concept, evaluative anxiety, and school performance was employed to test this conceptualization. Overall, the data lend additional support to reference group theory, with the big-fish-little-pond effect supported for all three variables tested. In addition, academic self-concept and test anxiety were observed to mediate the effects of reference group on school grades. Copyright 1999 Academic Press.

  2. Opportunities and Challenges for Drug Development: Public-Private Partnerships, Adaptive Designs and Big Data.

    PubMed

    Yildirim, Oktay; Gottwald, Matthias; Schüler, Peter; Michel, Martin C

    2016-01-01

    Drug development faces the double challenge of increasing costs and increasing pressure on pricing. To avoid a situation in which a perceived lack of commercial prospects leaves existing medical needs unmet, pharmaceutical companies and many other stakeholders are discussing ways to improve the efficiency of drug Research and Development. Based on an international symposium organized by the Medical School of the University of Duisburg-Essen (Germany) and held in January 2016, we discuss the opportunities and challenges of three specific areas, i.e., public-private partnerships, adaptive designs and big data. Public-private partnerships come in many different forms with regard to scope, duration and type and number of participants. They range from project-specific collaborations to strategic alliances to large multi-party consortia. Each of them offers specific opportunities and faces distinct challenges. Among types of collaboration, investigator-initiated studies are becoming increasingly popular but have legal, ethical, and financial implications. Adaptive trial designs are also increasingly discussed. However, adaptive should not be used as a euphemism for the repurposing of a failed trial; rather it requires careful planning and specification before a trial starts. Adaptive licensing can be a counterpart of adaptive trial design. The use of Big Data is another opportunity to leverage existing information into knowledge usable for drug discovery and development. Respecting limitations of informed consent and privacy is a key challenge in the use of Big Data. Speakers and participants at the symposium were convinced that appropriate use of the above new options may indeed help to increase the efficiency of future drug development.

  3. Opportunities and Challenges for Drug Development: Public–Private Partnerships, Adaptive Designs and Big Data

    PubMed Central

    Yildirim, Oktay; Gottwald, Matthias; Schüler, Peter; Michel, Martin C.

    2016-01-01

Drug development faces the double challenge of increasing costs and increasing pressure on pricing. To avoid a situation in which a perceived lack of commercial prospects leaves existing medical needs unmet, pharmaceutical companies and many other stakeholders are discussing ways to improve the efficiency of drug Research and Development. Based on an international symposium organized by the Medical School of the University of Duisburg-Essen (Germany) and held in January 2016, we discuss the opportunities and challenges of three specific areas, i.e., public–private partnerships, adaptive designs and big data. Public–private partnerships come in many different forms with regard to scope, duration, and the type and number of participants. They range from project-specific collaborations to strategic alliances to large multi-party consortia. Each of them offers specific opportunities and faces distinct challenges. Among types of collaboration, investigator-initiated studies are becoming increasingly popular but have legal, ethical, and financial implications. Adaptive trial designs are also increasingly discussed. However, adaptive should not be used as a euphemism for the repurposing of a failed trial; rather, it requires careful planning and specification before a trial starts. Adaptive licensing can be a counterpart of adaptive trial design. The use of Big Data is another opportunity to leverage existing information into knowledge usable for drug discovery and development. Respecting limitations of informed consent and privacy is a key challenge in the use of Big Data. Speakers and participants at the symposium were convinced that appropriate use of the above new options may indeed help to increase the efficiency of future drug development. PMID:27999543

  4. Participation in Quality Measurement Nationwide

    PubMed Central

    Irani, Jennifer Lynn

    2014-01-01

In the interest of improving patient care quality and reducing costs, many hospitals across the nation participate in quality measurements. The three programs most applicable to colon and rectal surgery are the National Surgical Quality Improvement Project, the Surgical Care Improvement Project (SCIP), and the Surgical Care and Outcomes Assessment Program. Participation in each is variable, although many hospitals are eligible and welcome to participate. Currently, SCIP is the only one with a financial incentive to participate. This article will focus on participation; however, the motivation for participating remains elusive in the literature. It is likely that a combination of resource utilization and faith in the concept that participation results in improvements in patient care actually drive participation. PMID:24587700

  5. Application and Exploration of Big Data Mining in Clinical Medicine

    PubMed Central

    Zhang, Yue; Guo, Shu-Li; Han, Li-Na; Li, Tie-Ling

    2016-01-01

Objective: To review theories and technologies of big data mining and their application in clinical medicine. Data Sources: Literature published in English or Chinese regarding theories and technologies of big data mining and the concrete applications of data mining technology in clinical medicine was obtained from PubMed and the Chinese Hospital Knowledge Database from 1975 to 2015. Study Selection: Original articles regarding big data mining theory/technology and big data mining's application in the medical field were selected. Results: This review characterized the basic theories and technologies of big data mining including fuzzy theory, rough set theory, cloud theory, Dempster–Shafer theory, artificial neural network, genetic algorithm, inductive learning theory, Bayesian network, decision tree, pattern recognition, high-performance computing, and statistical analysis. The application of big data mining in clinical medicine was analyzed in the fields of disease risk assessment, clinical decision support, prediction of disease development, guidance of rational use of drugs, medical management, and evidence-based medicine. Conclusion: Big data mining has the potential to play an important role in clinical medicine. PMID:26960378
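
Among the techniques the review lists, the decision tree is easy to make concrete: each split is chosen to minimize class impurity, commonly Gini impurity. A minimal sketch of how one split threshold is chosen on made-up disease-risk numbers (the feature, threshold search, and data are illustrative assumptions, not drawn from the review):

```python
def gini(labels):
    """Gini impurity of a list of 0/1 class labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(xs, ys):
    """Return (threshold, weighted impurity) of the best split on one feature."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best

# Toy data: systolic blood pressure vs. a hypothetical "high risk" label.
bp = [118, 121, 135, 142, 150, 155, 160, 170]
risk = [0, 0, 0, 1, 1, 1, 1, 1]
threshold, impurity = best_split(bp, risk)
print(threshold, impurity)  # splits at 135 with weighted impurity 0.0
```

A full tree applies this search recursively to each resulting subset; libraries also handle multiple features, pruning, and stopping criteria.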

  6. Big Data Management in US Hospitals: Benefits and Barriers.

    PubMed

    Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto

Big data has been considered an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate a hospital's ability to effectively use complex information, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that there were a number of benefits expected by hospitals when using big data analytics, including cost savings and business intelligence. Many hospitals using big data have also encountered challenges, including a lack of experience and the cost of developing the analytics. Many hospitals will need to invest in acquiring personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect among health care providers.

  7. Dreaming and personality: Wake-dream continuity, thought suppression, and the Big Five Inventory.

    PubMed

    Malinowski, Josie E

    2015-12-15

    Studies have found relationships between dream content and personality traits, but there are still many traits that have been underexplored or have had questionable conclusions drawn about them. Experimental work has found a 'rebound' effect in dreams when thoughts are suppressed prior to sleep, but the effect of trait thought suppression on dream content has not yet been researched. In the present study participants (N=106) reported their Most Recent Dream, answered questions about the content of the dream, and completed questionnaires measuring trait thought suppression and the 'Big Five' personality traits. Of these, 83 were suitably recent for analyses. A significant positive correlation was found between trait thought suppression and participants' ratings of dreaming of waking-life emotions, and high suppressors reported dreaming more of their waking-life emotions than low suppressors did. The results may lend support to the compensation theory of dreams, and/or the ironic process theory of mental control.

  8. Personality prototypes in individuals with compulsive buying based on the Big Five Model.

    PubMed

    Mueller, Astrid; Claes, Laurence; Mitchell, James E; Wonderlich, Steve A; Crosby, Ross D; de Zwaan, Martina

    2010-09-01

    Personality prototypes based on the Big Five factor model were investigated in a treatment-seeking sample of 68 individuals with compulsive buying (CB). Cluster analysis of the NEO Five-Factor Inventory (NEO-FFI) scales yielded two distinct personality clusters. Participants in cluster II scored significantly higher than those in cluster I on neuroticism and lower on the other four personality traits. Subjects in cluster II showed higher severity of CB, lower degree of control over CB symptoms, and were more anxious, interpersonally sensitive and impulsive. Furthermore, cluster II was characterized by higher rates of comorbid anxiety disorders, and cluster B personality disorders. The two personality prototypes did not differ with respect to obsessive-compulsive features. Finally and of considerable clinical significance, participants in cluster II reported lower remission rates after undergoing cognitive-behavioral therapy. Implications of the results for treatment are discussed.
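
The two-cluster solution above can be illustrated with a simple k-means sketch on synthetic Big Five profiles. The algorithm, the deterministic initialization, and the data here are assumptions for illustration only; the study's own cluster-analysis procedure is not specified in the abstract:

```python
def mean(points):
    """Component-wise mean of a list of equal-length tuples."""
    return tuple(sum(col) / len(points) for col in zip(*points))

def assign(p, centers):
    """Index of the center nearest to point p (squared Euclidean distance)."""
    return min(range(len(centers)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))

def kmeans(points, k=2, iters=25):
    centers = points[:k]  # deterministic init, fine for a small sketch
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[assign(p, centers)].append(p)
        centers = [mean(g) if g else centers[i] for i, g in enumerate(groups)]
    return centers

# Synthetic NEO-FFI-like profiles (N, E, O, A, C on a 0-48 scale):
# one "high neuroticism, low other traits" group and one opposite group.
cluster_a = [(38, 18, 17, 19, 16), (40, 16, 15, 18, 14), (36, 20, 18, 17, 15)]
cluster_b = [(14, 34, 33, 35, 36), (12, 36, 31, 33, 38), (16, 32, 34, 36, 34)]
profiles = cluster_a + cluster_b
centers = kmeans(profiles)
labels = [assign(p, centers) for p in profiles]
print(labels)  # with this init the two synthetic groups separate: [1, 1, 1, 0, 0, 0]
```

The cluster means then play the role of the "prototypes": one center has high neuroticism and low scores elsewhere, mirroring the paper's cluster II.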

  9. Big bang photosynthesis and pregalactic nucleosynthesis of light elements

    NASA Technical Reports Server (NTRS)

    Audouze, J.; Lindley, D.; Silk, J.

    1985-01-01

    Two nonstandard scenarios for pregalactic synthesis of the light elements (H-2, He-3, He-4, and Li-7) are developed. Big bang photosynthesis occurs if energetic photons, produced by the decay of massive neutrinos or gravitinos, partially photodisintegrate He-4 (formed in the standard hot big bang) to produce H-2 and He-3. In this case, primordial nucleosynthesis no longer constrains the baryon density of the universe, or the number of neutrino species. Alternatively, one may dispense partially or completely with the hot big bang and produce the light elements by bombardment of primordial gas, provided that He-4 is synthesized by a later generation of massive stars.

  10. ELM Meets Urban Big Data Analysis: Case Studies

    PubMed Central

    Chen, Huajun; Chen, Jiaoyan

    2016-01-01

In recent years, the rapid progress of urban computing has engendered big issues, which create both opportunities and challenges. The heterogeneous and big volume of data and the big difference between physical and virtual worlds have made it difficult to solve practical problems quickly in urban computing. In this paper, we propose a general application framework of ELM for urban computing. We present several real case studies of the framework, such as smog-related health hazard prediction and optimal retail store placement. Experiments involving urban data in China show the efficiency, accuracy, and flexibility of our proposed framework. PMID:27656203
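
ELM here is the Extreme Learning Machine, whose defining trick is that hidden-layer weights are drawn at random and frozen, so only the output weights are fitted, by linear least squares. A minimal single-input sketch on toy data (this is not the authors' urban-computing pipeline; the weight ranges, seed, and data are illustrative assumptions):

```python
import math
import random

def solve(A, y):
    """Solve the square system A @ beta = y by Gaussian elimination
    with partial pivoting (keeps the sketch dependency-free)."""
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    beta = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][k] * beta[k] for k in range(r + 1, n))
        beta[r] = (M[r][n] - s) / M[r][r]
    return beta

def elm_train(xs, ys, n_hidden, seed=7):
    """ELM: random (never-trained) hidden layer, output weights by least squares."""
    rng = random.Random(seed)
    w = [(rng.uniform(-4, 4), rng.uniform(-4, 4)) for _ in range(n_hidden)]
    H = [[math.tanh(a * x + b) for (a, b) in w] for x in xs]
    return w, solve(H, ys)  # H is square here, so the net interpolates the data

def elm_predict(x, w, beta):
    return sum(bj * math.tanh(a * x + b) for bj, (a, b) in zip(beta, w))

# Toy 1-D regression: one sine period sampled at five points.
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
ys = [math.sin(2 * math.pi * x) for x in xs]
w, beta = elm_train(xs, ys, n_hidden=len(xs))
err = max(abs(elm_predict(x, w, beta) - y) for x, y in zip(xs, ys))
print("max training error:", err)
```

Because training reduces to one linear solve, ELM is fast enough to retrain frequently on streaming urban data, which is presumably why the authors adopt it.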

  11. Research Activities at Fermilab for Big Data Movement

    SciTech Connect

    Mhashilkar, Parag; Wu, Wenji; Kim, Hyun W; Garzoglio, Gabriele; Dykstra, Dave; Slyz, Marko; DeMar, Phil

    2013-01-01

Adaptation of 100GE networking infrastructure is the next step towards management of big data. As the US Tier-1 center for the Large Hadron Collider's (LHC) Compact Muon Solenoid (CMS) experiment and the central data center for several other large-scale research collaborations, Fermilab constantly deals with the scaling and wide-area distribution challenges of big data. In this paper, we describe some of the challenges involved in the movement of big data over 100GE infrastructure and the research activities at Fermilab to address them.

  12. Cardio-oncology: The Role of Big Data.

    PubMed

    Mandawat, Anant; Williams, Andrew E; Francis, Sanjeev A

    2017-04-01

    Despite its challenges, a "big data" approach offers a unique opportunity within the field of cardio-oncology. A pharmacovigilant approach using large data sets can help characterize cardiovascular toxicities of the rapidly expanding armamentarium of targeted therapies. Creating a broad coalition of data sharing can provide insights into the incidence of cardiotoxicity and stimulate research into the underlying mechanisms. Population health necessitates the use of big data and can help inform public health interventions to prevent both cancer and cardiovascular disease. As a relatively new discipline, cardio-oncology is poised to take advantage of big data.

  13. pp wave big bangs: Matrix strings and shrinking fuzzy spheres

    SciTech Connect

    Das, Sumit R.; Michelson, Jeremy

    2005-10-15

    We find pp wave solutions in string theory with null-like linear dilatons. These provide toy models of big bang cosmologies. We formulate matrix string theory in these backgrounds. Near the big bang 'singularity', the string theory becomes strongly coupled but the Yang-Mills description of the matrix string is weakly coupled. The presence of a second length scale allows us to focus on a specific class of non-Abelian configurations, viz. fuzzy cylinders, for a suitable regime of parameters. We show that, for a class of pp waves, fuzzy cylinders which start out big at early times dynamically shrink into usual strings at sufficiently late times.

  14. [Discussion paper on participation and participative methods in gerontology].

    PubMed

    Aner, Kirsten

    2016-02-01

    The concept of "participation" and the demand for the use of "participative methods" in human, healthcare, nursing and gerontological research as well as the corresponding fields of practice are in great demand; however, the targets and organization of "participation" are not always sufficiently explicated. The working group on critical gerontology of the German Society of Gerontology and Geriatrics uses this phenomenon as an opportunity for positioning and develops a catalogue of criteria for reflection and assessment of participation of elderly people in science and practice, which can also be considered a stimulus for further discussions.

  15. Big, Dark Dunes Northeast of Syrtis Major

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Big sand dunes! Mars is home to some very large, windblown dunes. The dunes shown here rise to almost 100 meters (275 feet) at their crests. Unlike dunes on Earth, the larger dunes of Mars are composed of dark, rather than light grains. This is probably related to the composition of the sand, since different materials will have different brightnesses. For example, beaches on the island of Oahu in Hawaii are light colored because they consist of ground-up particles of seashells, while beaches in the southern shores of the island of Hawaii (the 'Big Island' in the Hawaiian island chain) are dark because they consist of sand derived from dark lava rock.

The dunes in this picture taken by the Mars Orbiter Camera (MOC) are located on the floor of an old, 72 km (45 mi) diameter crater northeast of Syrtis Major. The sand is being blown from the upper right toward the lower left. The surface that the dunes have been travelling across is pitted and cratered. The substrate is also hard and bright--i.e., it is composed of a material of different composition than the sand in the dunes. The dark streaks on the dune surfaces are a puzzle... at first glance one might conclude they are the result of holiday visitors with off-road vehicles. However, the streaks more likely result from passing dust devils or wind gusts that disturb the sand surface just enough to leave a streak. The image shown here covers an area approximately 2.6 km (1.6 mi) wide, and is illuminated from the lower right.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  16. Where Big Data and Prediction Meet

    SciTech Connect

    Ahrens, James; Brase, Jim M.; Hart, Bill; Kusnezov, Dimitri; Shalf, John

    2014-09-11

Our ability to assemble and analyze massive data sets, often referred to under the title of “big data”, is an increasingly important tool for shaping national policy. This in turn has introduced issues from privacy concerns to cyber security. But as IBM’s John Kelly emphasized in the last Innovation, making sense of the vast arrays of data will require radically new computing tools. In the past, technologies and tools for analysis of big data were viewed as quite different from the traditional realm of high performance computing (HPC) with its huge models of phenomena such as global climate or supporting the nuclear test moratorium. Looking ahead, this will change with very positive benefits for both worlds. Societal issues such as global security, economic planning and genetic analysis demand increased understanding that goes beyond existing data analysis and reduction. The modeling world often produces simulations that are complex compositions of mathematical models and experimental data. This has resulted in outstanding successes such as the annual assessment of the state of the US nuclear weapons stockpile without underground nuclear testing. Ironically, while many tests were historically conducted, this body of data provides only modest insight into the underlying physics of the system. A great deal of emphasis was thus placed on the level of confidence we can develop for the predictions. As data analytics and simulation come together, there is a growing need to assess the confidence levels in both the data being gathered and the complex models used to make predictions. An example of this is assuring the security or optimizing the performance of critical infrastructure systems such as the power grid. If one wants to understand the vulnerabilities of the system or the impacts of predicted threats, full-scale tests of the grid against threat scenarios are unlikely. Preventive measures would need to be predicated on well-defined margins of confidence in order

  17. Interoperability Outlook in the Big Data Future

    NASA Astrophysics Data System (ADS)

    Kuo, K. S.; Ramachandran, R.

    2015-12-01

The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file format by NASA's Earth Observing System Data Information System (EOSDIS) doubtlessly propelled interoperability of NASA Earth science data to unprecedented heights in the 1990s. However, two decades later we still find it wanting. We believe the inadequate interoperability we experience is a result of the current practice that data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has its own preference in the choice of data management practice as well as programming languages. Analysis results (derived data) so produced are thus subject to the differences of these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to more efficiently analyze large volumes and great varieties of data. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis service right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center

  18. Big Explosions and Strong Gravity: From Maryland to the Nation

    NASA Astrophysics Data System (ADS)

    Eyermann, Sarah E.; Hornschemeier, A.; Krishnamurthi, A.; Feaga, L.

    2008-05-01

    We are looking for national partners for our Big Explosions and Strong Gravity (BESG) Girl Scout collaboration. This is an activity designed to put real astronomers in the classroom, and was originally set into motion using money from the Chandra X-ray Observatory E/PO program. The girls who participate in this event range from 11 to 17 years old. Although all the activities are gender-neutral, we have chosen girls due to their underrepresentation in science. We target this age range due to the general decline in interest in math and science that occurs at or after children reach this critical age (meaning that we reach them early enough to have a positive effect). BESG is a Girl Scout patch-earning event where over the course of a day, girls explore Supernovae, Black Holes, the abundance of elements in the universe, and spectroscopy. This event has been run three times over the past four years with the Girl Scouts of Central Maryland, and by the time of this meeting will have been run once more as a test run of our new manual. Thanks to a NASA ROSES grant, we are now working to expand this program nationally. Within the next year, it will be run at a second test council, and then we would like for it to run in approximately 5 new locations around the country. Towards this end, we are looking for Girl Scout councils and astronomers who can partner up to run this activity. We can supply manuals, remote support through our experienced team, and through our NASA ROSES grant, may be able to help provide supplies for the first five Girl Scout/astronomer teams available to conduct BESG in 2009.

  19. Political participation of registered nurses.

    PubMed

    Vandenhouten, Christine L; Malakar, Crystalmichelle L; Kubsch, Sylvia; Block, Derryl E; Gallagher-Lepak, Susan

    2011-08-01

    Level of political participation and factors contributing to participation were measured among Midwest RNs (n = 468) via an online survey (Cronbach's α = .95). Respondents reported engaging in primarily "low cost" activities (e.g., voting, discussing politics, and contacting elected officials), with fewer reporting speaking at public gatherings, participating in demonstrations, and membership in nursing organizations. Psychological engagement was most predictive (p < .001) of political participation with the dimensions of political interest, political efficacy, and political information/knowledge highly significant (p < .001). Resources (time/money/civic skills) significantly contributed to political participation (p < .001). Less than half (40%) felt they could impact local decisions, and fewer (32%) felt they could impact state or national government decisions. Most respondents (80%) indicated their nursing courses lacked political content and did not prepare them for political participation. Findings showed that nurse educators and leaders of professional nursing organizations need to model and cultivate greater psychological engagement among students and nurses.
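
The survey's reliability figure (Cronbach's α = .95) comes from the standard formula α = k/(k−1) · (1 − Σ item variances / variance of the summed scale). A small sketch with hypothetical Likert responses (the data are invented for illustration; only the formula is standard):

```python
def variance(xs):
    """Unbiased sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: one list of respondent scores per questionnaire item."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent sums
    return k / (k - 1) * (1 - sum(variance(i) for i in items) / variance(totals))

# Hypothetical 5-point Likert responses: 3 items x 4 respondents.
items = [
    [4, 5, 3, 2],
    [4, 4, 3, 1],
    [5, 5, 2, 2],
]
print(round(cronbach_alpha(items), 2))  # -> 0.94: items move together across respondents
```

Values near 1 indicate that the items measure a single underlying construct; .95, as reported for this instrument, is considered excellent internal consistency.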

  20. Keeping up with Big Data--Designing an Introductory Data Analytics Class

    ERIC Educational Resources Information Center

    Hijazi, Sam

    2016-01-01

    Universities need to keep up with the demand of the business world when it comes to Big Data. The exponential increase in data has put additional demands on academia to meet the big gap in education. Business demand for Big Data has surpassed 1.9 million positions in 2015. Big Data, Business Intelligence, Data Analytics, and Data Mining are the…

  1. Astronaut Health Participant Summary Application

    NASA Technical Reports Server (NTRS)

    Johnson, Kathy; Krog, Ralph; Rodriguez, Seth; Wear, Mary; Volpe, Robert; Trevino, Gina; Eudy, Deborah; Parisian, Diane

    2011-01-01

The Longitudinal Study of Astronaut Health (LSAH) Participant Summary software captures data based on a custom information model designed to gather all relevant, discrete medical events for its study participants. This software provides a summarized view of the study participant's entire medical record. The manual collapsing of all the data in a participant's medical record into a summarized form eliminates redundancy, and allows for the capture of entire medical events. The coding tool could be incorporated into commercial electronic medical record software for use in areas like public health surveillance, hospital systems, clinics, and medical research programs.

  2. [Social participation after childhood craniopharyngioma].

    PubMed

    Olivari-Philiponnet, C; Roumenoff, F; Schneider, M; Chantran, C; Picot, M; Berlier, P; Mottolese, C; Bernard, J-C; Vuillerot, C

    2016-12-01

Craniopharyngioma is a rare, benign central nervous system tumor, which may be a source of multiple complications affecting endocrine, visual, neurological, and neurocognitive functions. This morbidity can lead to reduced participation in life activities, as described in the International Classification of Functioning, Disability, and Health. The primary objective of this study was to measure participation in life activities in a population of children and young adults affected by childhood craniopharyngioma, using the LIFE-H questionnaire (Assessment of Life Habits), validated as a social participation measurement tool in various pediatric disabilities. We also describe complications in our population and examined the potential links between tumor characteristics, complications, and participation in life activities.

  3. Re-Os Systematics in the Allende CAI: Big AL

    NASA Astrophysics Data System (ADS)

    Chen, J. H.; Papanastassiou, D. A.; Wasserburg, G. J.

    1999-03-01

A sample of the coarse-grained Allende CAI Big Al, analyzed using reverse aqua regia, plots on the IIA iron meteorite reference isochron, suggesting a very small time difference between the formation of CAIs, chondrites, and iron meteorites.

  4. Demonstration of Black Liquor Gasification at Big Island

    SciTech Connect

    Robert DeCarrera

    2007-04-14

    This Final Technical Report provides an account of the project for the demonstration of Black Liquor Gasification at Georgia-Pacific LLC's Big Island, VA facility. This report covers the period from May 5, 2000 through November 30, 2006.

  5. Machine learning for Big Data analytics in plants.

    PubMed

    Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng

    2014-12-01

    Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences.

  6. The big bang - the origin and evolution of the universe.

    NASA Astrophysics Data System (ADS)

    Nicolson, I.

The big bang theory, proposed by the Belgian abbé Georges Lemaître in 1927 and improved by George Gamow in the 1940s, has been very successful in explaining the expansion of the universe that we observe today.

  7. AirMSPI PODEX Big Sur Ellipsoid Images

    Atmospheric Science Data Center

    2013-12-11

Browse images from the PODEX 2013 campaign: Big Sur target, 02/03/2013, ellipsoid-projected. For more information, see the Data Product Specifications (DPS).

  8. Down Syndrome May Not Be Big Financial Burden on Families

    MedlinePlus

    ... page: https://medlineplus.gov/news/fullstory_162595.html Down Syndrome May Not Be Big Financial Burden on Families ... HealthDay News) -- Although families with a child with Down syndrome do face extra medical expenses, they probably won' ...

  9. Lack of Sleep Takes Big Bite Out of World Economies

    MedlinePlus

    ... medlineplus.gov/news/fullstory_162298.html Lack of Sleep Takes Big Bite Out of World Economies More ... increased risk of death linked to lack of sleep among U.S. workers cost the nation's economy as ...

  10. 10. GENERAL VIEW OF 'BIG RAILWAY' SHOWING CRADLE AND WINCH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    10. GENERAL VIEW OF 'BIG RAILWAY' SHOWING CRADLE AND WINCH MANUFACTURED BY MEAD-MORRISON MANUFACTURING COMPANY, HAVING A 450 TON CAPACITY - Anderson-Christofani Shipyard, Innes Avenue & Griffith Street, San Francisco, San Francisco County, CA

  11. "Small Steps, Big Rewards": You Can Prevent Type 2 Diabetes

    MedlinePlus

    ... Steps, Big Rewards": You Can Prevent Type 2 Diabetes Past Issues / Winter 2008 Table of Contents For ... million Americans are at risk for type 2 diabetes." "Fifty four million Americans are at risk for ...

  12. Skin Diseases Take Big Slice Out of America's Health, Economy

    MedlinePlus

    ... Diseases Take Big Slice Out of America's Health, Economy The sometimes deadly conditions cost $75 billion in ... a major impact on Americans and the U.S. economy, a new report finds. "The impact of skin ...

  13. Pre-Big Bang Cosmology: A Long History of Time?

    NASA Astrophysics Data System (ADS)

    Veneziano, Gabriele

The popular myth according to which the Universe - and time itself - started with/near a big bang singularity is questioned. After claiming that the two main puzzles of standard cosmology allow for two possible logical answers, the author argues that superstring theory strongly favours the pre-big bang (PBB) alternative. He then explains why PBB inflation is as generic as classical gravitational collapse, and why, as a result of symmetries in the latter problem, recent fine-tuning objections to the PBB scenario are unfounded. A hot big bang state naturally results from the powerful amplification of vacuum quantum fluctuations before the big bang, a phenomenon whose observable consequences will be briefly summarized.

  14. Big Data and Nursing: Implications for the Future.

    PubMed

    Topaz, Maxim; Pruinelli, Lisiane

    2017-01-01

Big data is becoming increasingly more prevalent and it affects the way nurses learn, practice, conduct research and develop policy. The discipline of nursing needs to maximize the benefits of big data to advance the vision of promoting human health and wellbeing. However, current practicing nurses, educators and nurse scientists often lack the required skills and competencies necessary for meaningful use of big data. Some of the key skills for further development include the ability to mine narrative and structured data for new care or outcome patterns, effective data visualization techniques, and further integration of nursing sensitive data into artificial intelligence systems for better clinical decision support. We provide growth-path vision recommendations for big data competencies for practicing nurses, nurse educators, researchers, and policy makers to help prepare the next generation of nurses and improve patient outcomes through better-quality connected health.

  15. Big Results From a Smaller Gearbox

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Many people will be sad to see the Hubble Space Telescope go, as it was the first instrument of its kind to provide us with such a wealth of imagery and information about the galaxy. The telescope has served us well since its launch in spring of 1990, but it is nearly time for its retirement. The science, however, will continue, as NASA plans the launch of a new, more modern orbiting telescope, the James Webb Space Telescope. Named after the man who ran NASA from 1961 to 1968, years fraught with the anxiety and uncertainty of the Space Race, the scope is scheduled for launch in fall of 2011. It is designed to study the earliest galaxies and some of the first stars formed after the Big Bang. NASA scientists at the Goddard Space Flight Center are busy developing the technologies to build this new machine. Many of the new technologies are available for commercial licensing and development. For example, the NASA Planetary Gear System technology developed to give precise nanometer positioning capabilities for the James Webb Space Telescope is now being employed by Turnkey Design Services, LLC (TDS), of Blue Island, Illinois, to improve electric motors. This revolutionary piece of technology allows more efficient operation of the motors, and is more cost- effective than traditional gearbox designs.

  16. Primordial Lithium and Big Bang Nucleosynthesis.

    PubMed

    Ryan; Beers; Olive; Fields; Norris

    2000-02-20

    Recent determinations of the abundance of the light element Li in very metal-poor stars show that its intrinsic dispersion is essentially zero and that the random error in the estimated mean Li abundance is negligible. However, a decreasing trend in the Li abundance toward lower metallicity indicates that the primordial abundance of Li can be inferred only after allowing for nucleosynthesis processes that must have been in operation in the early history of the Galaxy. We show that the observed Li versus Fe trend provides a strong discriminant between alternative models for Galactic chemical evolution of the light elements at early epochs. We critically assess current systematic uncertainties and determine the primordial Li abundance within new, much tighter limits: (Li/H)_p = 1.23 (+0.68, -0.32) x 10^-10. We show that the Li constraint on Omega_B is now limited as much by uncertainties in the nuclear cross sections used in big bang nucleosynthesis (BBN) calculations as by the observed abundance itself. A clearer understanding of systematics allows us to sharpen the comparison with ^4He and deuterium and the resulting test of BBN.

  17. Big Data of the Cosmic Web

    NASA Astrophysics Data System (ADS)

    Kitaura, Francisco-Shu

    2016-10-01

    One of the main goals in cosmology is to understand how the Universe evolves, how it forms structures, why it expands, and what the nature of dark matter and dark energy is. In the next decade, large and expensive observational projects will bring information on the structure and the distribution of many millions of galaxies at different redshifts, enabling us to make great progress in answering these questions. However, these data require a very special and complex set of analysis tools to extract the maximum valuable information. Statistical inference techniques are being developed, bridging the gaps between theory, simulations, and observations. In particular, we discuss the efforts to address the question: what is the underlying nonlinear matter distribution and dynamics at any cosmic time corresponding to a set of observed galaxies in redshift space? An accurate reconstruction of the initial conditions encodes the full phase-space information at any later cosmic time (given a particular structure formation model and a set of cosmological parameters). We present advances in solving this problem in a self-consistent way with Big Data techniques of the Cosmic Web.

  18. ATLAS: Big Data in a Small Package

    NASA Astrophysics Data System (ADS)

    Denneau, Larry; Tonry, John

    2015-08-01

    For even small telescope projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (ATLAS; Tonry 2011) will robotically survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids (NEAs) on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards -- two 0.5 m F/2.0 telescopes -- each year the ATLAS system will obtain ~10^3 measurements of 10^9 astronomical sources to a photometric accuracy of <5%. This ever-growing dataset must be searched in real time for moving objects and then archived for further analysis, with alerts for newly discovered NEAs disseminated within tens of minutes of detection. ATLAS's all-sky coverage ensures it will discover many "rifle shot" near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly trailed sources and discriminate them from the thousands of satellites and pieces of space junk that ATLAS will see each night. Additional interrogation will identify interesting phenomena from beyond the solar system occurring over millions of transient sources per night. The data processing and storage requirements for ATLAS demand a "big data" approach typical of commercial Internet enterprises. We describe our approach to deploying a nimble, scalable and reliable data processing infrastructure, and promote ATLAS as a stepping stone to eventual processing scales in the era of LSST.
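The measurement rates quoted above (~10^3 measurements per year of ~10^9 sources) can be turned into a rough data-volume estimate. This is a back-of-envelope sketch only; the bytes-per-measurement figure is an illustrative assumption, not an ATLAS specification.

```python
# Back-of-envelope estimate of the ATLAS catalog volume, using the
# ~10^3 measurements/year of ~10^9 sources quoted in the abstract.
# bytes_per_measurement is an assumed record size (time, flux, error,
# astrometry, flags), not a figure from the ATLAS project.
measurements_per_source_per_year = 1e3
n_sources = 1e9
bytes_per_measurement = 100  # assumed

total_measurements = measurements_per_source_per_year * n_sources
total_bytes = total_measurements * bytes_per_measurement

print(f"{total_measurements:.1e} measurements/year")
print(f"{total_bytes / 1e12:.0f} TB/year of catalog data (at {bytes_per_measurement} B each)")
```

Even the reduced catalog lands in the hundred-terabyte-per-year range under these assumptions; the raw images behind it are what push such surveys to the petabyte scale.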

  19. NSIDC: Coping with a Big Data World

    NASA Astrophysics Data System (ADS)

    Duerr, R.; Beitler, J.; Gallaher, D.; Heightley, K. A.; Serreze, M. C.; Weaver, R.

    2014-12-01

    The National Snow and Ice Data Center (NSIDC) has been serving cryospheric and related data to the research community since 1976, though our actual holdings include data extending back into the earliest days of polar exploration and in other cases back to the earliest days of scientific data collection itself. Reflecting diversity in both funding sources and research themes, NSIDC's data holdings vary dramatically in size and variety. A considerable amount of NSIDC's data is not yet even available digitally! Digital holdings range all the way from simple spreadsheets from an individual investigator to large and constantly updated NASA and NOAA satellite data sets to model output to photographs and visualized data. While most of our data holdings relate to the physical sciences, we are increasingly handling data from the social sciences that may include oral histories, photographs, and surveys. Over the years the audiences interested in these data have broadened dramatically, from investigators within the discipline that provided the data, to investigators in other disciplines, journalists, decision makers, and the motivated public. How is NSIDC addressing these challenges and evolving its data management practices? In this invited presentation NSIDC's approaches for tackling the Big Data challenges of volume, velocity, veracity, visibility, and variety are described.

  20. Big data analytics in the building industry

    DOE PAGES

    Berger, Michael A.; Mathew, Paul A.; Walter, Travis

    2016-07-01

    Catalyzed by recent market, technology, and policy trends, energy data collection in the building industry is becoming more widespread. This wealth of information allows more data-driven decision-making by designers, commissioning agents, facilities staff, and energy service providers during the course of building design, operation and retrofit. The U.S. Department of Energy’s Building Performance Database (BPD) has taken advantage of this wealth of building asset- and energy-related data by collecting, cleansing, and standardizing data from across the U.S. on over 870,000 buildings, and is designed to support building benchmarking, energy efficiency project design, and buildings-related policy development with real-world data. Here, this article explores the promises and perils energy professionals are faced with when leveraging such tools, presenting example analyses for commercial and residential buildings, highlighting potential issues, and discussing solutions and best practices that will enable designers, operators and commissioning agents to make the most of ‘big data’ resources such as the BPD.
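The benchmarking use case described above amounts to ranking one building's energy use intensity (EUI) against a peer group drawn from a dataset such as the BPD. A minimal sketch, with made-up peer values for illustration:

```python
# Sketch of peer-group benchmarking of the kind the BPD supports:
# rank one building's energy use intensity (EUI) against a peer sample.
# The peer EUI values below are invented for illustration only.
def eui_percentile(building_eui, peer_euis):
    """Percentage of peers whose EUI is lower than the given building's."""
    below = sum(1 for e in peer_euis if e < building_eui)
    return 100.0 * below / len(peer_euis)

peers = [45, 52, 60, 68, 75, 80, 95, 110, 130, 150]  # kBtu/ft^2/yr, illustrative
print(eui_percentile(72, peers))  # a higher percentile means higher use than more peers
```

A real analysis against the BPD would additionally filter peers by climate zone, building type, and size before ranking.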

  1. MISR Views the Big Island of Hawaii

    NASA Technical Reports Server (NTRS)

    2000-01-01

    MISR images of the Big Island of Hawaii. The images have been rotated so that north is at the left.

    Upper left: April 2, 2000 (Terra orbit 1551) Upper right: May 4, 2000 (Terra orbit 2017) Lower left: June 5, 2000 (Terra orbit 2483) Lower right: June 21, 2000 (Terra orbit 2716)

    The first three images are color views acquired by the vertical (nadir) camera. The last image is a stereo anaglyph generated from the aftward cameras viewing at 60.0 and 70.5 degree look angles. It requires red/blue glasses with the red filter over the left eye.

    The color images show the greater prevalence of vegetation on the eastern side of the island due to moisture brought in by the prevailing Pacific trade winds. The western (lee) side of the island is drier. In the center of the island, and poking through the clouds in the stereo image are the Mauna Kea and Mauna Loa volcanoes, each peaking at about 4.2 km above sea level. The southern face of a line of cumulus clouds off the north coast of Hawaii is also visible in the stereo image.

    MISR was built and is managed by NASA's Jet Propulsion Laboratory, Pasadena, CA, for NASA's Office of Earth Science, Washington, DC. The Terra satellite is managed by NASA's Goddard Space Flight Center, Greenbelt, MD. JPL is a division of the California Institute of Technology.

  2. Big data analytics in the building industry

    SciTech Connect

    Berger, Michael A.; Mathew, Paul A.; Walter, Travis

    2016-07-01

    Catalyzed by recent market, technology, and policy trends, energy data collection in the building industry is becoming more widespread. This wealth of information allows more data-driven decision-making by designers, commissioning agents, facilities staff, and energy service providers during the course of building design, operation and retrofit. The U.S. Department of Energy’s Building Performance Database (BPD) has taken advantage of this wealth of building asset- and energy-related data by collecting, cleansing, and standardizing data from across the U.S. on over 870,000 buildings, and is designed to support building benchmarking, energy efficiency project design, and buildings-related policy development with real-world data. Here, this article explores the promises and perils energy professionals are faced with when leveraging such tools, presenting example analyses for commercial and residential buildings, highlighting potential issues, and discussing solutions and best practices that will enable designers, operators and commissioning agents to make the most of ‘big data’ resources such as the BPD.

  3. Big Pharma: a former insider's view.

    PubMed

    Badcott, David

    2013-05-01

    There is no lack of criticisms frequently levelled against the international pharmaceutical industry (Big Pharma): excessive profits, dubious or even dishonest practices, exploiting the sick and selective use of research data. Neither is there a shortage of examples used to support such opinions. A recent book by Brody (Hooked: Ethics, the Medical Profession and the Pharmaceutical Industry, 2008) provides a précis of the main areas of criticism, adopting a twofold strategy: (1) An assumption that the special nature and human need for pharmaceutical medicines requires that such products should not be treated like other commodities and (2) A multilevel descriptive approach that facilitates an ethical analysis of relationships and practices. At the same time, Brody is fully aware of the nature of the fundamental dilemma: the apparent addiction to (and denial of) the widespread availability of gifts and financial support for conferences etc., but recognises that 'Remove the industry and its products, and a considerable portion of scientific medicine's power to help the patient vanishes' (Brody 2008, p. 5). The paper explores some of the relevant issues, and argues that despite the identified shortcomings and a need for rigorous and perhaps enhanced regulation, and realistic price control, the commercially competitive pharmaceutical industry remains the best option for developing safer and more effective medicinal treatments. At the same time, adoption of a broader ethical basis for the industry's activities, such as a triple bottom line policy, would register an important move in the right direction and go some way toward answering critics.

  4. Big Bang Nucleosynthesis in the New Cosmology

    SciTech Connect

    Fields, Brian D.

    2008-01-24

    Big bang nucleosynthesis (BBN) describes the production of the lightest elements in the first minutes of cosmic time. We review the physics of cosmological element production, and the observations of the primordial element abundances. The comparison between theory and observation has heretofore provided our earliest probe of the universe, and given the best measure of the cosmic baryon content. However, BBN has now taken a new role in cosmology, in light of new precision measurements of the cosmic microwave background (CMB). Recent CMB anisotropy data yield a wealth of cosmological parameters; in particular, the baryon-to-photon ratio eta = n_B/n_gamma is measured to high precision. The confrontation between the BBN and CMB "baryometers" poses a new and stringent test of the standard cosmology; the status of this test is discussed. Moreover, it is now possible to recast the role of BBN by using the CMB to fix the baryon density and even some light element abundances. This strategy sharpens BBN into a more powerful probe of early universe physics, and of galactic nucleosynthesis processes. The impact of the CMB results on particle physics beyond the Standard Model, and on non-standard cosmology, is illustrated. Prospects for improvement of these bounds via additional astronomical observations and nuclear experiments are discussed, as is the lingering "lithium problem."
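For reference, the baryon-to-photon ratio measured by the CMB relates to the baryon density parameter through the standard (approximate) conversion:

```latex
\eta \equiv \frac{n_B}{n_\gamma} \simeq 2.74 \times 10^{-8}\, \Omega_B h^2
```

so a CMB-derived value of Omega_B h^2 of roughly 0.02 corresponds to eta of a few times 10^-10, the range probed by BBN.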

  5. Natural regeneration processes in big sagebrush (Artemisia tridentata)

    USGS Publications Warehouse

    Schlaepfer, Daniel R.; Lauenroth, William K.; Bradford, John B.

    2014-01-01

    Big sagebrush, Artemisia tridentata Nuttall (Asteraceae), is the dominant plant species of large portions of semiarid western North America. However, much of historical big sagebrush vegetation has been removed or modified. Thus, regeneration is recognized as an important component for land management. Limited knowledge about key regeneration processes, however, represents an obstacle to identifying successful management practices and to gaining greater insight into the consequences of increasing disturbance frequency and global change. Therefore, our objective is to synthesize knowledge about natural big sagebrush regeneration. We identified and characterized the controls of big sagebrush seed production, germination, and establishment. The largest knowledge gaps and associated research needs include quiescence and dormancy of embryos and seedlings; variation in seed production and germination percentages; wet-thermal time model of germination; responses to frost events (including freezing/thawing of soils), CO2 concentration, and nutrients in combination with water availability; suitability of microsite vs. site conditions; competitive ability as well as seedling growth responses; and differences among subspecies and ecoregions. Potential impacts of climate change on big sagebrush regeneration could include that temperature increases may not have a large direct influence on regeneration due to the broad temperature optimum for regeneration, whereas indirect effects could include selection for populations with less stringent seed dormancy. Drier conditions will have direct negative effects on germination and seedling survival and could also lead to lighter seeds, which lowers germination success further. The short seed dispersal distance of big sagebrush may limit its tracking of suitable climate; whereas, the low competitive ability of big sagebrush seedlings may limit successful competition with species that track climate. An improved understanding of the

  6. Beyond simple charts: Design of visualizations for big health data.

    PubMed

    Ola, Oluwakemi; Sedig, Kamran

    2016-01-01

    Health data is often big data due to its high volume, low veracity, great variety, and high velocity. Big health data has the potential to improve productivity, eliminate waste, and support a broad range of tasks related to disease surveillance, patient care, research, and population health management. Interactive visualizations have the potential to amplify big data's utilization. Visualizations can be used to support a variety of tasks, such as tracking the geographic distribution of diseases, analyzing the prevalence of disease, triaging medical records, predicting outbreaks, and discovering at-risk populations. Currently, many health visualization tools use simple charts, such as bar charts and scatter plots, that represent only a few facets of data. These tools, while beneficial for simple perceptual and cognitive tasks, are ineffective when dealing with more complex sensemaking tasks that involve exploration of various facets and elements of big data simultaneously. There is a need for sophisticated and elaborate visualizations that encode many facets of data and support human-data interaction with big data and more complex tasks. When not approached systematically, design of such visualizations is labor-intensive, and the resulting designs may not facilitate big-data-driven tasks. Conceptual frameworks that guide the design of visualizations for big data can make the design process more manageable and result in more effective visualizations. In this paper, we demonstrate how a framework-based approach can help designers create novel, elaborate, non-trivial visualizations for big health data. We present four visualizations that are components of a larger tool for making sense of large-scale public health data.

  7. Beyond simple charts: Design of visualizations for big health data

    PubMed Central

    Ola, Oluwakemi; Sedig, Kamran

    2016-01-01

    Health data is often big data due to its high volume, low veracity, great variety, and high velocity. Big health data has the potential to improve productivity, eliminate waste, and support a broad range of tasks related to disease surveillance, patient care, research, and population health management. Interactive visualizations have the potential to amplify big data’s utilization. Visualizations can be used to support a variety of tasks, such as tracking the geographic distribution of diseases, analyzing the prevalence of disease, triaging medical records, predicting outbreaks, and discovering at-risk populations. Currently, many health visualization tools use simple charts, such as bar charts and scatter plots, that represent only a few facets of data. These tools, while beneficial for simple perceptual and cognitive tasks, are ineffective when dealing with more complex sensemaking tasks that involve exploration of various facets and elements of big data simultaneously. There is a need for sophisticated and elaborate visualizations that encode many facets of data and support human-data interaction with big data and more complex tasks. When not approached systematically, design of such visualizations is labor-intensive, and the resulting designs may not facilitate big-data-driven tasks. Conceptual frameworks that guide the design of visualizations for big data can make the design process more manageable and result in more effective visualizations. In this paper, we demonstrate how a framework-based approach can help designers create novel, elaborate, non-trivial visualizations for big health data. We present four visualizations that are components of a larger tool for making sense of large-scale public health data. PMID:28210416

  8. The 1980 Archeological Investigations at the Big Hill Lake, Kansas.

    DTIC Science & Technology

    1980-01-01

    rodents and other mammals such as coyotes, raccoons, bobcats, opossums, etc. The bottomland forests are also represented on a small scale in the Big... forests are adjacent to Big Hill creek and its feeder tributaries. These streams, some intermittent, provide suitable habitation for many groups of...points recovered from the areas near the hearths have suggested Preceramic cultural affiliations and have been identified as Afton, Ellis, Lange

  9. Watching Charlotte Climb: Little Steps toward Big Questions

    ERIC Educational Resources Information Center

    Connor, W. Robert

    2007-01-01

    In this article, the author talks about big questions of meaning and value that young people pose and how to respond to their concerns about big questions. He relates the story of his granddaughter, Charlotte, who, at the age of one, would climb up on the stairs not from choice or whim, but "because they're there." For her, it was not play, but…

  10. MX Siting Investigation. Gravity Survey - Big Smokey Valley, Nevada.

    DTIC Science & Technology

    1980-11-28

    ground-water resources. 1.2 LOCATION Big Smoky Valley is in northeastern Esmeralda and northwestern Nye counties, Nevada. The town of Tonopah, Nevada...sediments, predominantly of the Esmeralda Formation (sandstones, siltstones, and mudstones) (Kleinhampl and Ziony, 1967). The southern Toquima Range, at...Survey, Open file map, scale 1:200,000. Rush, F. E., and Schroer, C. V., 1979, Water resources of Big Smoky Valley, Landu, Nye and Esmeralda counties

  11. Comment on 'Heavy element production in inhomogeneous big bang nucleosynthesis'

    SciTech Connect

    Rauscher, Thomas

    2007-03-15

    The work of Matsuura et al. [Phys. Rev. D 72, 123505 (2005)] claims that heavy nuclei could have been produced in a combined p- and r-process in very high baryon density regions of an inhomogeneous big bang. However, they do not account for observational constraints and previous studies which show that such high baryon density regions did not significantly contribute to big bang abundances.

  12. Big Bang as Big Bounce among Myriads Others in the Eternal Universe

    NASA Astrophysics Data System (ADS)

    Nurgaliev, Ildus

    The standard cosmological dynamics omits two kinematic components: vorticity and shear. The averaged squared-vorticity term acts as a term of accelerated expansion, driven by the negative energy of the repulsive factor. The cosmological singularity was a consequence of an unrealistically restrictive cosmological principle (too detailed a symmetry of the flow), such as the Hubble law. A more realistic principle is also linear but has tensor character, and the cosmological principle is applied to the irregularities: they are homogeneous and isotropic on average. The Big Bang is then a reminiscence of a local bounce, typical among zillions of others. Exact solutions are presented.

  13. Clinical research of traditional Chinese medicine in big data era.

    PubMed

    Zhang, Junhua; Zhang, Boli

    2014-09-01

    With the advent of the big data era, our thinking, technology and methodology are being transformed. Data-intensive scientific discovery based on big data, named "The Fourth Paradigm," has become a new paradigm of scientific research. Along with the development and application of Internet information technology in the field of healthcare, individual health records, clinical data of diagnosis and treatment, and genomic data have accumulated dramatically, generating big data in the medical field for clinical research and assessment. With the support of big data, the defects and weaknesses of the conventional sampling-based methodology of clinical evaluation may be overcome. Our research target shifts from "causality inference" to "correlativity analysis." This not only facilitates the evaluation of individualized treatment, disease prediction, prevention and prognosis, but is also suitable for the practice of preventive healthcare and symptom pattern differentiation for treatment in traditional Chinese medicine (TCM), and for the post-marketing evaluation of Chinese patent medicines. To conduct clinical studies involving big data in the TCM domain, top-level design is needed and should be performed in an orderly fashion. Fundamental construction and innovation studies should be strengthened in the areas of data platform creation, data analysis technology, and the fostering and training of big-data professionals.

  14. The Big, the Bad, and the Boozed-Up: Weight Moderates the Effect of Alcohol on Aggression

    PubMed Central

    DeWall, C. Nathan; Bushman, Brad J.; Giancola, Peter R.; Webster, Gregory D.

    2010-01-01

    Most people avoid the “big, drunk guy” in bars because they don’t want to get assaulted. Is this stereotype supported by empirical evidence? Unfortunately, no scientific work has investigated this topic. Based on the recalibrational theory of anger and embodied cognition theory, we predicted that heavier men would behave the most aggressively when intoxicated. In two independent experiments (Ns = 553 and 327, respectively), participants consumed either alcohol or placebo beverages and then completed an aggression task in which they could administer painful electric shocks to a fictitious opponent. Both experiments showed that weight interacted with alcohol and gender to predict the highest amount of aggression among intoxicated heavy men. The results suggest that an embodied cognition approach is useful in understanding intoxicated aggression. Apparently there is a kernel of truth in the stereotype of the “big, drunk, aggressive guy.” PMID:20526451
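Statistically, the moderation effect reported here corresponds to testing an interaction term in a regression model (aggression ~ alcohol + weight + alcohol x weight). A minimal sketch on synthetic data, with illustrative coefficients that are not the study's estimates:

```python
import numpy as np

# Sketch of a moderation (interaction) test like the one reported:
# does alcohol's effect on aggression grow with body weight?
# All data and coefficients below are synthetic, for illustration only.
rng = np.random.default_rng(0)
n = 400
alcohol = rng.integers(0, 2, n)      # 0 = placebo, 1 = alcohol condition
weight = rng.normal(85, 15, n)       # body weight in kg
# Simulate a true interaction: alcohol raises aggression more in heavier men.
aggression = (2.0 + 0.5 * alcohol + 0.02 * weight
              + 0.03 * alcohol * weight + rng.normal(0, 1, n))

# Ordinary least squares with an explicit interaction column.
X = np.column_stack([np.ones(n), alcohol, weight, alcohol * weight])
beta, *_ = np.linalg.lstsq(X, aggression, rcond=None)
print(beta)  # beta[3] estimates the alcohol x weight interaction
```

A nonzero estimate of `beta[3]` is the regression analogue of the weight-by-alcohol moderation the abstract describes; the published analysis also included gender as a third moderator.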

  15. Fiscal Management Training. Participant's Guide.

    ERIC Educational Resources Information Center

    Office of Student Financial Assistance (ED), Washington, DC.

    This document is the participant's guide for fiscal management training for administrators managing an institution's Title IV program funds. The workshop is designed to prepare participants to understand an institution's responsibilities with regard to Title IV. It describes the recordkeeping requirements of the Title IV program and the accounting…

  16. Educational Participation and Inmate Misconduct

    ERIC Educational Resources Information Center

    Lahm, Karen F.

    2009-01-01

    The majority of extant literature on correctional education focuses on the relationship between program participation and recidivism while ignoring the possible relationship between educational program participation and inmate misconduct. The present study sought to fill in this gap in the literature by investigating the effect of several types of…

  17. Investigating Seed Longevity of Big Sagebrush (Artemisia tridentata)

    USGS Publications Warehouse

    Wijayratne, Upekala C.; Pyke, David A.

    2009-01-01

    The Intermountain West is dominated by big sagebrush communities (Artemisia tridentata subspecies) that provide habitat and forage for wildlife, prevent erosion, and are economically important to recreation and livestock industries. The two most prominent subspecies of big sagebrush in this region are Wyoming big sagebrush (A. t. ssp. wyomingensis) and mountain big sagebrush (A. t. ssp. vaseyana). Increased understanding of seed bank dynamics will assist with sustainable management and persistence of sagebrush communities. For example, mountain big sagebrush may be subjected to shorter fire return intervals and prescribed fire is a tool used often to rejuvenate stands and reduce tree (Juniperus sp. or Pinus sp.) encroachment into these communities. A persistent seed bank for mountain big sagebrush would be advantageous under these circumstances. Laboratory germination trials indicate that seed dormancy in big sagebrush may be habitat-specific, with collections from colder sites being more dormant. Our objective was to investigate seed longevity of both subspecies by evaluating viability of seeds in the field with a seed retrieval experiment and sampling for seeds in situ. We chose six study sites for each subspecies. These sites were dispersed across eastern Oregon, southern Idaho, northwestern Utah, and eastern Nevada. Ninety-six polyester mesh bags, each containing 100 seeds of a subspecies, were placed at each site during November 2006. Seed bags were placed in three locations: (1) at the soil surface above litter, (2) on the soil surface beneath litter, and (3) 3 cm below the soil surface to determine whether dormancy is affected by continued darkness or environmental conditions. Subsets of seeds were examined in April and November in both 2007 and 2008 to determine seed viability dynamics. Seed bank samples were taken at each site, separated into litter and soil fractions, and assessed for number of germinable seeds in a greenhouse. 
Community composition data

  18. Metal atom dynamics in superbulky metallocenes: a comparison of (Cp(BIG))2Sn and (Cp(BIG))2Eu.

    PubMed

    Harder, Sjoerd; Naglav, Dominik; Schwerdtfeger, Peter; Nowik, Israel; Herber, Rolfe H

    2014-02-17

    Cp(BIG)2Sn (Cp(BIG) = (4-n-Bu-C6H4)5cyclopentadienyl), prepared by reaction of 2 equiv of Cp(BIG)Na with SnCl2, crystallized isomorphously with other known metallocenes of this ligand (Ca, Sr, Ba, Sm, Eu, Yb). Similarly, it shows perfect linearity, C-H···C(π) bonding between the Cp(BIG) rings and out-of-plane bending of the aryl substituents toward the metal. Whereas all other Cp(BIG)2M complexes show large disorder in the metal position, the Sn atom in Cp(BIG)2Sn is perfectly ordered. In contrast, (119)Sn and (151)Eu Mößbauer investigations on the corresponding Cp(BIG)2M metallocenes show that Sn(II) is more dynamic and loosely bound than Eu(II). The large displacement factors in the group 2 and especially in the lanthanide(II) metallocenes Cp(BIG)2M can be explained by static metal disorder in a plane parallel to the Cp(BIG) rings. Despite parallel Cp(BIG) rings, these metallocenes have a nonlinear Cp(center)-M-Cp(center) geometry. This is explained by an ionic model in which metal atoms are polarized by the negatively charged Cp rings. The extent of nonlinearity is in line with trends found in M(2+) ion polarizabilities. The range of known calculated dipole polarizabilities at the Douglas-Kroll CCSD(T) level was extended with values (atomic units) for Sn(2+) 15.35, Sm(2+)(4f(6) (7)F) 9.82, Eu(2+)(4f(7) (8)S) 8.99, and Yb(2+)(4f(14) (1)S) 6.55. This polarizability model cannot be applied to predominantly covalently bound Cp(BIG)2Sn, which shows a perfectly ordered structure. The bent geometry of Cp*2Sn should therefore not be explained by metal polarizability but is due to van der Waals Cp*···Cp* attraction and (to some extent) to a small p-character component in the Sn lone pair.

  19. Effect of furosemide and dietary sodium on kidney and plasma big and small renin

    SciTech Connect

    Iwao, H.; Michelakis, A.M.

    1981-12-01

    Renin was found in mouse plasma in high-molecular-weight forms (big big renin, big renin) and a low-molecular-weight form (small renin). They were measured by a radioimmunoassay procedure for the direct measurement of renin. In the kidney, 89% of total renin was small renin and the rest was big big and big renin. This distribution pattern of renins was not changed when the kidney tissue was homogenized in the presence of protease inhibitors. Low-sodium or high-sodium diets changed renal renin content, but not the distribution pattern of renins in the kidney. Acute stimulation of renin release by furosemide increased small renin but not big big and big renin in plasma. However, dietary sodium depletion for 2 weeks significantly increased big big, big, and small renin in plasma of mice with or without submaxillary glands. In contrast, high-sodium intake significantly decreased big big, big, and small renin in plasma of mice with or without submaxillary glands.

  20. BigDebug: Debugging Primitives for Interactive Big Data Processing in Spark

    PubMed Central

    Gulzar, Muhammad Ali; Interlandi, Matteo; Yoo, Seunghyun; Tetali, Sai Deep; Condie, Tyson; Millstein, Todd; Kim, Miryung

    2016-01-01

    Developers use cloud computing platforms to process a large quantity of data in parallel when developing big data analytics. Debugging the massive parallel computations that run in today’s data-centers is time consuming and error-prone. To address this challenge, we design a set of interactive, real-time debugging primitives for big data processing in Apache Spark, the next generation data-intensive scalable cloud computing platform. This requires re-thinking the notion of step-through debugging in a traditional debugger such as gdb, because pausing the entire computation across distributed worker nodes causes significant delay and naively inspecting millions of records using a watchpoint is too time consuming for an end user. First, BIGDEBUG’s simulated breakpoints and on-demand watchpoints allow users to selectively examine distributed, intermediate data on the cloud with little overhead. Second, a user can also pinpoint a crash-inducing record and selectively resume relevant sub-computations after a quick fix. Third, a user can determine the root causes of errors (or delays) at the level of individual records through a fine-grained data provenance capability. Our evaluation shows that BIGDEBUG scales to terabytes and its record-level tracing incurs less than 25% overhead on average. It determines crash culprits orders of magnitude more accurately and provides up to 100% time saving compared to the baseline replay debugger. The results show that BIGDEBUG supports debugging at interactive speeds with minimal performance impact. PMID:27390389
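BIGDEBUG's on-demand watchpoints are implemented inside Spark itself, but the core idea can be illustrated in plain Python: a guard predicate embedded in a transformation captures matching records for inspection without pausing the pipeline. This is a conceptual analog only, not the BIGDEBUG API.

```python
# Plain-Python analog of an "on-demand watchpoint": records flowing through
# a transformation are tested against a predicate; matches are captured for
# later inspection while the pipeline keeps running at full speed.
captured = []

def watchpoint(predicate, record):
    """Capture records matching the predicate without stopping the pipeline."""
    if predicate(record):
        captured.append(record)
    return record

data = range(10)
# The transformation (here, tripling each record) proceeds uninterrupted.
result = [watchpoint(lambda r: r % 2 == 0, r) * 3 for r in data]
print(captured)  # the even input records, observed mid-pipeline
```

In the real system the captured records stay distributed across worker nodes and are fetched to the driver only on demand, which is what keeps the overhead low at terabyte scale.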