Science.gov

Sample records for big 1-98 participants

  1. Symptoms of endocrine treatment and outcome in the BIG 1-98 study.

    PubMed

    Huober, J; Cole, B F; Rabaglio, M; Giobbie-Hurder, A; Wu, J; Ejlertsen, B; Bonnefoi, H; Forbes, J F; Neven, P; Láng, I; Smith, I; Wardley, A; Price, K N; Goldhirsch, A; Coates, A S; Colleoni, M; Gelber, R D; Thürlimann, B

    2014-01-01

There may be a relationship between the incidence of vasomotor and arthralgia/myalgia symptoms and treatment outcomes for postmenopausal breast cancer patients with endocrine-responsive disease who received adjuvant letrozole or tamoxifen. Data on patients randomized into the monotherapy arms of the BIG 1-98 clinical trial who did not have either vasomotor or arthralgia/myalgia/carpal tunnel (AMC) symptoms reported at baseline, who started protocol treatment, and who were alive and disease-free at the 3-month landmark (n = 4,798) and at the 12-month landmark (n = 4,682) were used for this report. Cohorts of patients with vasomotor symptoms, AMC symptoms, neither, or both were defined at both 3 and 12 months from randomization. Landmark analyses were performed for disease-free survival (DFS) and for breast cancer-free interval (BCFI), using regression analysis to estimate hazard ratios (HR) and 95% confidence intervals (CI). Median follow-up was 7.0 years. Reporting of AMC symptoms was associated with better outcome in both the 3- and 12-month landmark analyses [e.g., 12-month landmark, HR (95% CI) for DFS = 0.65 (0.49-0.87), and for BCFI = 0.70 (0.49-0.99)]. By contrast, reporting of vasomotor symptoms was less clearly associated with DFS [12-month HR (95% CI) = 0.82 (0.70-0.96)] and BCFI [12-month HR (95% CI) = 0.97 (0.80-1.18)]. Interaction tests indicated no effect of treatment group on associations between symptoms and outcomes. While reporting of AMC symptoms was clearly associated with better DFS and BCFI, the association between vasomotor symptoms and outcome was less clear, especially with respect to breast cancer-related events.
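    The confidence intervals quoted in this abstract follow the usual normal approximation on the log hazard ratio scale. As a purely illustrative sketch (plain Python; the helper name is ours, not from the paper), the reported 12-month AMC DFS interval of 0.65 (0.49-0.87) can be recovered from the log HR and its implied standard error:

```python
import math

def hr_ci(hr, se_log_hr, z=1.96):
    """95% confidence interval for a hazard ratio via the log-scale normal approximation."""
    log_hr = math.log(hr)
    return (math.exp(log_hr - z * se_log_hr),
            math.exp(log_hr + z * se_log_hr))

# Back out the standard error implied by the reported interval 0.65 (0.49-0.87):
se = (math.log(0.87) - math.log(0.49)) / (2 * 1.96)
low, high = hr_ci(0.65, se)
print(round(low, 2), round(high, 2))  # → 0.49 0.87
```

    The same arithmetic applies to every HR/CI pair quoted in these records.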

  2. Letrozole as upfront endocrine therapy for postmenopausal women with hormone-sensitive breast cancer: BIG 1-98

    PubMed Central

    Thuerlimann, Beat

    2007-01-01

    The BIG 1-98 trial is a large, randomized, independently conducted clinical trial designed to compare the efficacy of upfront letrozole versus tamoxifen monotherapy and to compare sequential or up-front use of letrozole and/or tamoxifen as an early adjuvant therapy for patients with early breast cancer. We report on the results from the primary core analysis of the BIG 1-98 trial of 8,010 patients, which compares monotherapy with letrozole versus tamoxifen. This pre-planned core analysis allowed the use of patient data from the monotherapy arms of letrozole and tamoxifen and from the sequential arms prior to the drug switch point. Patients randomized to letrozole had a 19% improved disease-free survival (hazard ratio [HR] = 0.81; P = 0.003), due especially to reduced distant metastases (HR = 0.73; P = 0.001). A 14% risk reduction of fatal events in favor of letrozole was also observed (P = NS). The results from the monotherapy arms alone confirmed the findings from the primary core analysis. Based on the results from this trial, the aromatase inhibitor letrozole (Femara®) is currently recommended as a part of standard adjuvant therapy for postmenopausal women with endocrine-responsive breast cancer and has recently been approved in the early adjuvant setting in both Europe and the United States. A subsequent analysis after additional follow-up will address the question of monotherapy versus sequential therapy. PMID:17912636

  3. Relative Effectiveness of Letrozole Compared With Tamoxifen for Patients With Lobular Carcinoma in the BIG 1-98 Trial

    PubMed Central

    Metzger Filho, Otto; Giobbie-Hurder, Anita; Mallon, Elizabeth; Gusterson, Barry; Viale, Giuseppe; Winer, Eric P.; Thürlimann, Beat; Gelber, Richard D.; Colleoni, Marco; Ejlertsen, Bent; Debled, Marc; Price, Karen N.; Regan, Meredith M.; Coates, Alan S.; Goldhirsch, Aron

    2015-01-01

Purpose To evaluate the relative effectiveness of letrozole compared with tamoxifen for patients with invasive ductal or lobular carcinoma. Patients and Methods Patients diagnosed with early-stage invasive ductal carcinoma (IDC) or classic invasive lobular carcinoma (ILC) who were randomly assigned in the Breast International Group (BIG) 1-98 trial and who had centrally reviewed pathology data were included (N = 2,923). HER2-negative IDC and ILC were additionally classified as hormone receptor-positive with high (luminal B [LB]-like) or low (luminal A [LA]-like) proliferative activity by Ki-67 labeling index. Survival analyses were performed with weighted Cox models using inverse probability of censoring weighting. Results The median follow-up time was 8.1 years. In multivariable models for disease-free survival (DFS), significant interactions between treatment and histology (ILC or IDC; P = .006) and between treatment and subgroup (LB-like or LA-like; P = .01) were observed. In the ILC subset, there was a 66% reduction in the hazard of a DFS event with letrozole for the LB-like subtype (hazard ratio [HR], 0.34; 95% CI, 0.21 to 0.55) and a 50% reduction for the LA-like subtype (HR, 0.50; 95% CI, 0.32 to 0.78). In the IDC subset, there was a significant 35% reduction in the hazard of a DFS event with letrozole for the LB-like subtype (HR, 0.65; 95% CI, 0.53 to 0.79), but no difference between treatments was noted for IDC of the LA-like subtype (HR, 0.95; 95% CI, 0.76 to 1.20). Conclusion The magnitude of benefit of adjuvant letrozole is greater for patients diagnosed with lobular carcinoma versus ductal carcinoma. PMID:26215945
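    The "66% reduction", "50% reduction", and "35% reduction" figures in this abstract are simply 1 - HR expressed as a percentage. A one-line illustration (helper name ours) using the three reported hazard ratios:

```python
def pct_reduction(hr):
    """Percent reduction in hazard implied by a hazard ratio: (1 - HR) * 100."""
    return round((1 - hr) * 100)

# The three hazard ratios reported in this abstract:
print(pct_reduction(0.34), pct_reduction(0.50), pct_reduction(0.65))  # → 66 50 35
```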

  4. Bone fractures among postmenopausal patients with endocrine-responsive early breast cancer treated with 5 years of letrozole or tamoxifen in the BIG 1-98 trial

    PubMed Central

    Rabaglio, M.; Sun, Z.; Castiglione-Gertsch, M.; Hawle, H.; Thürlimann, B.; Mouridsen, H.; Campone, M.; Forbes, J. F.; Paridaens, R. J.; Colleoni, M.; Pienkowski, T.; Nogaret, J.-M.; Láng, I.; Smith, I.; Gelber, R. D.; Goldhirsch, A.; Coates, A. S.

    2009-01-01

    Background: To compare the incidence and timing of bone fractures in postmenopausal women treated with 5 years of adjuvant tamoxifen or letrozole for endocrine-responsive early breast cancer in the Breast International Group (BIG) 1-98 trial. Methods: We evaluated 4895 patients allocated to 5 years of letrozole or tamoxifen in the BIG 1-98 trial who received at least some study medication (median follow-up 60.3 months). Bone fracture information (grade, cause, site) was collected every 6 months during trial treatment. Results: The incidence of bone fractures was higher among patients treated with letrozole [228 of 2448 women (9.3%)] versus tamoxifen [160 of 2447 women (6.5%)]. The wrist was the most common site of fracture in both treatment groups. Statistically significant risk factors for bone fractures during treatment included age, smoking history, osteoporosis at baseline, previous bone fracture, and previous hormone replacement therapy. Conclusions: Consistent with other trials comparing aromatase inhibitors to tamoxifen, letrozole was associated with an increase in bone fractures. Benefits of superior disease control associated with letrozole and lower incidence of fracture with tamoxifen should be considered with the risk profile for individual patients. PMID:19474112
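    The fracture incidences in the Results are crude proportions of patients with at least one fracture. A minimal check (helper name ours) with the counts from this abstract:

```python
def incidence_pct(events, n):
    """Crude incidence: events per patients at risk, as a percentage rounded to one decimal."""
    return round(100 * events / n, 1)

print(incidence_pct(228, 2448))  # letrozole arm → 9.3
print(incidence_pct(160, 2447))  # tamoxifen arm → 6.5
```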

  5. The advantage of letrozole over tamoxifen in the BIG 1-98 trial is consistent in younger postmenopausal women and in those with chemotherapy-induced menopause

    PubMed Central

    Sun, Zhuoxin; Smith, Ian; Price, Karen N.; Thürlimann, Beat; Ejlertsen, Bent; Bonnefoi, Hervé; Regan, Meredith M.; Goldhirsch, Aron; Coates, Alan S.

    2016-01-01

    Letrozole, an aromatase inhibitor, is ineffective in the presence of ovarian estrogen production. Two subpopulations of apparently postmenopausal women might derive reduced benefit from letrozole due to residual or returning ovarian activity: younger women (who have the potential for residual subclinical ovarian estrogen production), and those with chemotherapy-induced menopause who may experience return of ovarian function. In these situations tamoxifen may be preferable to an aromatase inhibitor. Among 4,922 patients allocated to the monotherapy arms (5 years of letrozole or tamoxifen) in the BIG 1-98 trial we identified two relevant subpopulations: patients with potential residual ovarian function, defined as having natural menopause, treated without adjuvant or neoadjuvant chemotherapy and age ≤55 years (n = 641); and those with chemotherapy-induced menopause (n = 105). Neither of the subpopulations examined showed treatment effects differing from the trial population as a whole (interaction P values are 0.23 and 0.62, respectively). Indeed, both among the 641 patients aged ≤55 years with natural menopause and no chemotherapy (HR 0.77 [0.51, 1.16]) and among the 105 patients with chemotherapy-induced menopause (HR 0.51 [0.19, 1.39]), the disease-free survival (DFS) point estimate favoring letrozole was marginally more beneficial than in the trial as a whole (HR 0.84 [0.74, 0.95]). Contrary to our initial concern, DFS results for young postmenopausal patients who did not receive chemotherapy and patients with chemotherapy-induced menopause parallel the letrozole benefit seen in the BIG 1-98 population as a whole. These data support the use of letrozole even in such patients. PMID:21892704

  6. The advantage of letrozole over tamoxifen in the BIG 1-98 trial is consistent in younger postmenopausal women and in those with chemotherapy-induced menopause.

    PubMed

    Chirgwin, Jacquie; Sun, Zhuoxin; Smith, Ian; Price, Karen N; Thürlimann, Beat; Ejlertsen, Bent; Bonnefoi, Hervé; Regan, Meredith M; Goldhirsch, Aron; Coates, Alan S

    2012-01-01

    Letrozole, an aromatase inhibitor, is ineffective in the presence of ovarian estrogen production. Two subpopulations of apparently postmenopausal women might derive reduced benefit from letrozole due to residual or returning ovarian activity: younger women (who have the potential for residual subclinical ovarian estrogen production), and those with chemotherapy-induced menopause who may experience return of ovarian function. In these situations tamoxifen may be preferable to an aromatase inhibitor. Among 4,922 patients allocated to the monotherapy arms (5 years of letrozole or tamoxifen) in the BIG 1-98 trial we identified two relevant subpopulations: patients with potential residual ovarian function, defined as having natural menopause, treated without adjuvant or neoadjuvant chemotherapy and age ≤ 55 years (n = 641); and those with chemotherapy-induced menopause (n = 105). Neither of the subpopulations examined showed treatment effects differing from the trial population as a whole (interaction P values are 0.23 and 0.62, respectively). Indeed, both among the 641 patients aged ≤ 55 years with natural menopause and no chemotherapy (HR 0.77 [0.51, 1.16]) and among the 105 patients with chemotherapy-induced menopause (HR 0.51 [0.19, 1.39]), the disease-free survival (DFS) point estimate favoring letrozole was marginally more beneficial than in the trial as a whole (HR 0.84 [0.74, 0.95]). Contrary to our initial concern, DFS results for young postmenopausal patients who did not receive chemotherapy and patients with chemotherapy-induced menopause parallel the letrozole benefit seen in the BIG 1-98 population as a whole. These data support the use of letrozole even in such patients.

  7. Interpreting breast international group (BIG) 1-98: a randomized, double-blind, phase III trial comparing letrozole and tamoxifen as adjuvant endocrine therapy for postmenopausal women with hormone receptor-positive, early breast cancer

    PubMed Central

    2011-01-01

The Breast International Group (BIG) 1-98 study is a four-arm trial comparing 5 years of monotherapy with tamoxifen or with letrozole, or sequences of 2 years of one agent followed by 3 years of the other, for postmenopausal women with endocrine-responsive early invasive breast cancer. From 1998 to 2003, BIG 1-98 enrolled 8,010 women. The enhanced design of the trial enabled two complementary analyses of efficacy and safety. Collection of tumor specimens further enabled treatment comparisons based on tumor biology. Reports of BIG 1-98 should be interpreted in relation to each individual patient as she weighs the costs and benefits of available treatments. Clinicaltrials.gov ID: NCT00004205. PMID:21635709

  8. ESR1 and ESR2 polymorphisms in the BIG 1-98 trial comparing adjuvant letrozole versus tamoxifen or their sequence for early breast cancer.

    PubMed

    Leyland-Jones, Brian; Gray, Kathryn P; Abramovitz, Mark; Bouzyk, Mark; Young, Brandon; Long, Bradley; Kammler, Roswitha; Dell'Orto, Patrizia; Biasi, Maria Olivia; Thürlimann, Beat; Harvey, Vernon; Neven, Patrick; Arnould, Laurent; Maibach, Rudolf; Price, Karen N; Coates, Alan S; Goldhirsch, Aron; Gelber, Richard D; Pagani, Olivia; Viale, Giuseppe; Rae, James M; Regan, Meredith M

    2015-12-01

Estrogen receptor 1 (ESR1) and ESR2 gene polymorphisms have been associated with endocrine-mediated physiological mechanisms, and inconsistently with breast cancer risk and outcomes, bone mineral density changes, and hot flushes/night sweats. DNA was isolated and genotyped for six ESR1 and two ESR2 single-nucleotide polymorphisms (SNPs) from tumor specimens from 3691 postmenopausal women with hormone receptor-positive breast cancer enrolled in the BIG 1-98 trial to receive tamoxifen and/or letrozole for 5 years. Associations with recurrence and adverse events (AEs) were assessed using Cox proportional hazards models. A total of 3401 samples were successfully genotyped for five SNPs. ESR1 rs9340799 (XbaI) (T>C) variants CC or TC were associated with reduced breast cancer risk (HR = 0.82, 95% CI = 0.67-1.0), and ESR1 rs2077647 (T>C) variants CC or TC were associated with reduced distant recurrence risk (HR = 0.69, 95% CI = 0.53-0.90), both regardless of treatment. No differential treatment effects (letrozole vs. tamoxifen) were observed for the association of outcome with any of the SNPs. Letrozole-treated patients with rs2077647 (T>C) variants CC and TC had a reduced risk of bone AEs (HR = 0.75, 95% CI = 0.58-0.98, P interaction = 0.08), whereas patients with rs4986938 (G>A) genotype variants AA and AG had an increased risk of bone AEs (HR = 1.37, 95% CI = 1.01-1.84, P interaction = 0.07). We observed that (1) rare ESR1 homozygous polymorphisms were associated with lower recurrence, and (2) ESR1 and ESR2 SNPs were associated with bone AEs in letrozole-treated patients. Genes involved in estrogen signaling and synthesis have the potential to affect both breast cancer recurrence and side effects, suggesting that individual treatment strategies can incorporate not only oncogenic drivers but also SNPs related to estrogen activity. PMID:26590813

  10. Design, conduct, and analyses of Breast International Group (BIG) 1-98: A randomized, double-blind, phase-III study comparing letrozole and tamoxifen as adjuvant endocrine therapy for postmenopausal women with receptor-positive, early breast cancer

    PubMed Central

    Giobbie-Hurder, Anita; Price, Karen N; Gelber, Richard D

    2010-01-01

Background Aromatase inhibitors provide superior disease control when compared with tamoxifen as adjuvant therapy for postmenopausal women with endocrine-responsive early breast cancer. Purpose To present the design, history, and analytic challenges of the Breast International Group (BIG) 1-98 trial: an international, multicenter, randomized, double-blind, phase-III study comparing the aromatase inhibitor letrozole with tamoxifen in this clinical setting. Methods From 1998 to 2003, BIG 1-98 enrolled 8028 women to receive monotherapy with either tamoxifen or letrozole for 5 years, or sequential therapy of 2 years of one agent followed by 3 years of the other. Randomization to one of four treatment groups permitted two complementary analyses to be conducted several years apart. The first, reported in 2005, provided a head-to-head comparison of letrozole versus tamoxifen. Statistical power was increased by an enriched design, which included patients who were assigned sequential treatments until the time of the treatment switch. The second, reported in late 2008, used a conditional landmark approach to test the hypothesis that, for patients who are disease-free, switching endocrine agents at approximately 2 years from randomization is superior to continuing with the original agent. Results The 2005 analysis showed the superiority of letrozole compared with tamoxifen. The patients who were assigned tamoxifen alone were unblinded and offered the opportunity to switch to letrozole. Results from other trials increased the clinical relevance of the question of whether to start treatment with letrozole or tamoxifen, and analysis plans were expanded to evaluate sequential versus single-agent strategies from randomization. Limitations Due to the unblinding of patients assigned tamoxifen alone, analysis of updated data will require ascertainment of the influence of selective crossover from tamoxifen to letrozole. Conclusions BIG 1-98 is an example of an enriched design, involving…

  11. A Framework for Learning about Big Data with Mobile Technologies for Democratic Participation: Possibilities, Limitations, and Unanticipated Obstacles

    ERIC Educational Resources Information Center

    Philip, Thomas M.; Schuler-Brown, Sarah; Way, Winmar

    2013-01-01

    As Big Data becomes increasingly important in policy-making, research, marketing, and commercial applications, we argue that literacy in this domain is critical for engaged democratic participation and that peer-generated data from mobile technologies offer rich possibilities for students to learn about this new genre of data. Through the lens of…

  12. 49 CFR 1.98 - The Research and Innovative Technology Administration.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

49 CFR § 1.98 (2014 edition), Office of the Secretary of Transportation, Organization and Delegation of Powers and Duties, Operating Administrations: The Research and Innovative Technology Administration.

  13. 49 CFR 1.98 - The Research and Innovative Technology Administration.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

49 CFR § 1.98 (2012 edition), Office of the Secretary of Transportation, Organization and Delegation of Powers and Duties, Operating Administrations: The Research and Innovative Technology Administration.

  14. When the Big Fish Turns Small: Effects of Participating in Gifted Summer Programs on Academic Self-Concepts

    ERIC Educational Resources Information Center

    Dai, David Yun; Rinn, Anne N.; Tan, Xiaoyuan

    2013-01-01

    The purposes of this study were to (a) examine the presence and prevalence of the big-fish-little-pond effect (BFLPE) in summer programs for the gifted, (b) identify group and individual difference variables that help predict those who are more susceptible to the BFLPE, and (c) put the possible BFLPE on academic self-concept in a larger context of…

  15. 4. VIEW LOOKING SOUTHEAST AT BUILDING 444. (1/1/98) Rocky ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. VIEW LOOKING SOUTHEAST AT BUILDING 444. (1/1/98) - Rocky Flats Plant, Non-Nuclear Production Facility, South of Cottonwood Avenue, west of Seventh Avenue & east of Building 460, Golden, Jefferson County, CO

  16. 5. VIEW LOOKING NORTHWEST OF BUILDING 444. (1/1/98) Rocky ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. VIEW LOOKING NORTHWEST OF BUILDING 444. (1/1/98) - Rocky Flats Plant, Non-Nuclear Production Facility, South of Cottonwood Avenue, west of Seventh Avenue & east of Building 460, Golden, Jefferson County, CO

  17. Thermoelectric properties of Zintl compound Ca1-xNaxMg2Bi1.98

    NASA Astrophysics Data System (ADS)

    Shuai, Jing; Kim, Hee Seok; Liu, Zihang; He, Ran; Sui, Jiehe; Ren, Zhifeng

    2016-05-01

Motivated by the good thermoelectric performance of Bi-based Zintl compounds Ca1-xYbxMg2Biy, we further studied the thermoelectric properties of the Zintl compound CaMg2Bi1.98 by doping Na onto the Ca site as Ca1-xNaxMg2Bi1.98 via mechanical alloying and hot pressing. We found that the electrical conductivity, Seebeck coefficient, power factor, and carrier concentration can be effectively adjusted by tuning the Na concentration. Transport measurements and calculations revealed that an optimal doping of 0.5 at. % Na achieved better average ZT and efficiency. The enhancement in thermoelectric performance is attributed to the increased carrier concentration and power factor. The low cost and nontoxicity of Ca1-xNaxMg2Bi1.98 make it a potentially promising thermoelectric material for power generation in the mid-temperature range.
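    The "average ZT" in this abstract refers to the dimensionless thermoelectric figure of merit ZT = S²σT/κ, where S is the Seebeck coefficient, σ the electrical conductivity, T the absolute temperature, and κ the thermal conductivity; the power factor is S²σ. A sketch with hypothetical illustrative values (not measurements from this paper; the function name is ours):

```python
def figure_of_merit(seebeck_v_per_k, sigma_s_per_m, kappa_w_per_m_k, temp_k):
    """Thermoelectric figure of merit ZT = S^2 * sigma * T / kappa (dimensionless)."""
    power_factor = seebeck_v_per_k ** 2 * sigma_s_per_m  # S^2 * sigma, in W/(m*K^2)
    return power_factor * temp_k / kappa_w_per_m_k

# Hypothetical mid-temperature values: S = 200 uV/K, sigma = 5e4 S/m, kappa = 1.5 W/(m*K), T = 600 K
print(round(figure_of_merit(200e-6, 5e4, 1.5, 600), 2))  # → 0.8
```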

  18. ISO-PC Version 1.98: User's guide

    SciTech Connect

    Rittmann, P.D.

    1995-05-02

This document describes how to use Version 1.98 of the shielding program named ISO-PC. Version 1.98 corrects all known errors in ISOSHLD-II. In addition, a few numeric problems have been eliminated. There are three new namelist variables, 25 additional shielding materials, and 5 more energy groups. The two major differences with the original ISOSHLD-II are the removal of the RIBD (radioisotope buildup and decay) source generator and the removal of the non-uniform source distribution parameter, SSV1. This version of ISO-PC works with photon energies from 10 keV to 10 MeV using 30 energy groups.

  19. Big Society, Big Deal?

    ERIC Educational Resources Information Center

    Thomson, Alastair

    2011-01-01

    Political leaders like to put forward guiding ideas or themes which pull their individual decisions into a broader narrative. For John Major it was Back to Basics, for Tony Blair it was the Third Way and for David Cameron it is the Big Society. While Mr. Blair relied on Lord Giddens to add intellectual weight to his idea, Mr. Cameron's legacy idea…

  20. Obesity and Risk of Recurrence or Death After Adjuvant Endocrine Therapy With Letrozole or Tamoxifen in the Breast International Group 1-98 Trial

    PubMed Central

    Ewertz, Marianne; Gray, Kathryn P.; Regan, Meredith M.; Ejlertsen, Bent; Price, Karen N.; Thürlimann, Beat; Bonnefoi, Hervé; Forbes, John F.; Paridaens, Robert J.; Rabaglio, Manuela; Gelber, Richard D.; Colleoni, Marco; Láng, István; Smith, Ian E.; Coates, Alan S.; Goldhirsch, Aron; Mouridsen, Henning T.

    2012-01-01

    Purpose To examine the association of baseline body mass index (BMI) with the risk of recurrence or death in postmenopausal women with early-stage breast cancer receiving adjuvant tamoxifen or letrozole in the Breast International Group (BIG) 1-98 trial at 8.7 years of median follow-up. Patients and Methods This report analyzes 4,760 patients with breast cancer randomly assigned to 5 years of monotherapy with letrozole or tamoxifen in the BIG 1-98 trial with available information on BMI at randomization. Multivariable Cox modeling assessed the association of BMI with disease-free survival, overall survival (OS), breast cancer–free interval, and distant recurrence-free interval and tested for treatment-by-BMI interaction. Median follow-up was 8.7 years. Results Seventeen percent of patients have died. Obese patients (BMI ≥ 30 kg/m2) had slightly poorer OS (hazard ratio [HR] = 1.19; 95% CI, 0.99 to 1.44) than patients with normal BMI (< 25 kg/m2), whereas no trend in OS was observed in overweight (BMI 25 to < 30 kg/m2) versus normal-weight patients (HR = 1.02; 95% CI, 0.86 to 1.20). Treatment-by-BMI interactions were not statistically significant. The HRs for OS comparing obese versus normal BMI were HR = 1.22 (95% CI, 0.93 to 1.60) and HR = 1.18 (95% CI, 0.91 to 1.52) in the letrozole and tamoxifen groups, respectively. Conclusion There was no evidence that the benefit of letrozole over tamoxifen differed according to patients' BMI. PMID:23045588
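    The BMI cutoffs used in this analysis (normal < 25, overweight 25 to < 30, obese ≥ 30 kg/m2) can be sketched as a simple classifier (function name ours):

```python
def bmi_category(bmi_kg_m2):
    """Classify BMI using the cutoffs from the BIG 1-98 obesity analysis."""
    if bmi_kg_m2 < 25:
        return "normal"
    if bmi_kg_m2 < 30:
        return "overweight"
    return "obese"

print(bmi_category(22.0), bmi_category(27.5), bmi_category(31.2))  # → normal overweight obese
```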

  1. Big Science! Big Problems?

    ERIC Educational Resources Information Center

    Beigel, Allan

    1991-01-01

    Lessons learned by the University of Arizona through participation in two major scientific projects, construction of an astronomical observatory and a super cyclotron, are discussed. Four criteria for institutional participation in such projects are outlined, including consistency with institutional mission, adequate resources, leadership, and…

  2. How Big Is Too Big?

    ERIC Educational Resources Information Center

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  3. Big Surveys, Big Data Centres

    NASA Astrophysics Data System (ADS)

    Schade, D.

    2016-06-01

Well-designed astronomical surveys are powerful and have consistently been keystones of scientific progress. The Byurakan Surveys, using a Schmidt telescope with an objective prism, produced a list of about 3000 UV-excess Markarian galaxies, but these objects have stimulated an enormous amount of further study and appear in over 16,000 publications. The CFHT Legacy Surveys used a wide-field imager to cover thousands of square degrees, and those surveys are mentioned in over 1100 publications since 2002. Both ground- and space-based astronomy have been increasing their investments in survey work. Survey instrumentation strives toward fair samples and large sky coverage and therefore strives to produce massive datasets. Thus we are faced with the "big data" problem in astronomy. Survey datasets require specialized approaches to data management. Big data places additional challenging requirements on data management. If the term "big data" is defined as data collections that are too large to move, then there are profound implications for the infrastructure that supports big data science. The current model of data centres is obsolete. In the era of big data the central problem is how to create architectures that effectively manage the relationship between data collections, networks, processing capabilities, and software, given the science requirements of the projects that need to be executed. A stand-alone data silo cannot support big data science. I'll describe the current efforts of the Canadian community to deal with this situation and our successes and failures. I'll talk about how we are planning in the next decade to try to create a workable and adaptable solution to support big data science.

  4. Big Dreams

    ERIC Educational Resources Information Center

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  5. Big Opportunities and Big Concerns of Big Data in Education

    ERIC Educational Resources Information Center

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  8. Five Big Ideas

    ERIC Educational Resources Information Center

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  9. Who Benefits from Participative Management?

    ERIC Educational Resources Information Center

    Benoliel, Pascale; Somech, Anit

    2010-01-01

    Purpose: This study seeks to explore the moderating role of teachers' personality traits from the Big Five typology on the relationship between participative management and teacher outcomes with respect to performance, satisfaction and strain. The study suggests that participative management may produce different results depending on teachers'…

  10. The effects of surface spin on magnetic properties of weak magnetic ZnLa0.02Fe1.98O4 nanoparticles

    PubMed Central

    2014-01-01

    To isolate the effects of the surface spin on the magnetic properties, weak magnetic ZnLa0.02Fe1.98O4 nanoparticles were chosen as the objects of study, since they minimize the effects of interparticle dipolar interaction and crystalline anisotropy energies. By annealing undiluted and diluted ZnLa0.02Fe1.98O4 nanoparticles at different temperatures, we observed rich variations in magnetic ordering states (superparamagnetism, weak ferromagnetism, and paramagnetism). The magnetic properties can be well understood by considering the effects of the surface spin of the magnetic nanoparticles. Our results indicate that in nano-sized magnets with weak magnetism, the surface spin plays a crucial role in the magnetic properties. PMID:25294976

  11. Diode-pumped Tm:Sc₂SiO₅ laser (λ = 1.98 μm)

    SciTech Connect

    Zavartsev, Yu D; Zagumennyi, A I; Kalachev, Yu L; Kutovoi, S A; Mikhailov, Viktor A; Podreshetnikov, V V; Shcherbakov, Ivan A

    2011-05-31

    Lasing at a wavelength of 1.98 μm is obtained for the first time in a diode-pumped (λ = 792 nm) active element made of a Tm³⁺:Sc₂SiO₅ crystal grown by the Czochralski method. The laser slope efficiency reached 18.7% at output powers up to 520 mW.

  12. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    PubMed

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  14. Dual of big bang and big crunch

    SciTech Connect

    Bak, Dongsu

    2007-01-15

    Starting from the Janus solution and its gauge theory dual, we obtain the dual gauge theory description of the cosmological solution by the procedure of double analytic continuation. The coupling is driven either to zero or to infinity at the big-bang and big-crunch singularities, which are shown to be related by the S-duality symmetry. In the dual Yang-Mills theory description, these are nonsingular as the coupling goes to zero in the N=4 super Yang-Mills theory. The cosmological singularities simply signal the failure of the supergravity description of the full type IIB superstring theory.

  15. The Big Loser.

    ERIC Educational Resources Information Center

    Marks, Daniel

    1999-01-01

    Presents an activity in which the subject is the identity of the team in the greatest jeopardy of becoming the big loser in a basketball tournament. Explores several facts about the big loser, offering them in a hierarchy appropriate for creating various short- and long-term projects for a high school mathematics class. (ASK)

  16. Implementing Big History.

    ERIC Educational Resources Information Center

    Welter, Mark

    2000-01-01

    Contends that world history should be taught as "Big History," a view that includes all space and time beginning with the Big Bang. Discusses five "Cardinal Questions" that serve as a course structure and address the following concepts: perspectives, diversity, change and continuity, interdependence, and causes. (CMK)

  17. Prognostic and Predictive Value of Centrally Reviewed Ki-67 Labeling Index in Postmenopausal Women With Endocrine-Responsive Breast Cancer: Results From Breast International Group Trial 1-98 Comparing Adjuvant Tamoxifen With Letrozole

    PubMed Central

    Viale, Giuseppe; Giobbie-Hurder, Anita; Regan, Meredith M.; Coates, Alan S.; Mastropasqua, Mauro G.; Dell'Orto, Patrizia; Maiorano, Eugenio; MacGrogan, Gaëtan; Braye, Stephen G.; Öhlschlegel, Christian; Neven, Patrick; Orosz, Zsolt; Olszewski, Wojciech P.; Knox, Fiona; Thürlimann, Beat; Price, Karen N.; Castiglione-Gertsch, Monica; Gelber, Richard D.; Gusterson, Barry A.; Goldhirsch, Aron

    2008-01-01

    Purpose: To evaluate the prognostic and predictive value of Ki-67 labeling index (LI) in a trial comparing letrozole (Let) with tamoxifen (Tam) as adjuvant therapy in postmenopausal women with early breast cancer. Patients and Methods: Breast International Group (BIG) trial 1-98 randomly assigned 8,010 patients to four treatment arms comparing Let and Tam with sequences of each agent. Of 4,922 patients randomly assigned to receive 5 years of monotherapy with either agent, 2,685 had primary tumor material available for central pathology assessment of Ki-67 LI by immunohistochemistry and had tumors confirmed to express estrogen receptors after central review. The prognostic and predictive value of centrally measured Ki-67 LI on disease-free survival (DFS) was assessed among these patients using proportional hazards modeling, with Ki-67 LI values dichotomized at the median value of 11%. Results: Higher values of Ki-67 LI were associated with adverse prognostic factors and with worse DFS (hazard ratio [HR; high:low] = 1.8; 95% CI, 1.4 to 2.3). The magnitude of the treatment benefit for Let versus Tam was greater among patients with high tumor Ki-67 LI (HR [Let:Tam] = 0.53; 95% CI, 0.39 to 0.72) than among patients with low tumor Ki-67 LI (HR [Let:Tam] = 0.81; 95% CI, 0.57 to 1.15; interaction P = .09). Conclusion: Ki-67 LI is confirmed as a prognostic factor in this study. High Ki-67 LI levels may identify a patient group that particularly benefits from initial Let adjuvant therapy. PMID:18981464
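
    The hazard ratios quoted in this abstract come from Cox proportional hazards modeling. For context, the underlying model (standard survival analysis, not specific to this trial) is:

    ```latex
    h(t \mid Z) = h_0(t)\, e^{\beta Z}, \qquad
    \mathrm{HR} = \frac{h(t \mid Z = 1)}{h(t \mid Z = 0)} = e^{\beta}
    ```

    with $Z = 1$ for tumors with Ki-67 LI above the 11% median and $Z = 0$ otherwise; the reported HR of 1.8 means the estimated hazard of a DFS event in the high-Ki-67 group is 1.8 times that of the low-Ki-67 group.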

  18. Big data for health.

    PubMed

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled. PMID:26173222

  20. Big data, big knowledge: big data for personalized healthcare.

    PubMed

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ, and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority. PMID:26218867

  2. Big Data in industry

    NASA Astrophysics Data System (ADS)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon comes a need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. This growth creates a situation in which classic systems for the collection, storage, processing, and visualization of data are losing the battle with the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). The challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years, and it is increasingly recognized in the business world and in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. This paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  3. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    PubMed

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived; notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines.

  5. The Big Bang Theory

    ScienceCinema

    Lincoln, Don

    2016-07-12

    The Big Bang is the name of the most respected theory of the creation of the universe. Basically, the theory says that the universe was once smaller and denser and has been expanding for eons. One common misconception is that the Big Bang theory says something about the instant that set the expansion into motion; however, this isn't true. In this video, Fermilab's Dr. Don Lincoln tells about the Big Bang theory and sketches some speculative ideas about what caused the universe to come into existence.

  6. The Big Bang Theory

    SciTech Connect

    Lincoln, Don

    2014-09-30

    The Big Bang is the name of the most respected theory of the creation of the universe. Basically, the theory says that the universe was once smaller and denser and has been expanding for eons. One common misconception is that the Big Bang theory says something about the instant that set the expansion into motion; however, this isn't true. In this video, Fermilab's Dr. Don Lincoln tells about the Big Bang theory and sketches some speculative ideas about what caused the universe to come into existence.

  7. The Big Bang Singularity

    NASA Astrophysics Data System (ADS)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds, therefore we compare and contrast the two geometries throughout.
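
    For context, the FRW models mentioned above predict the singularity through the Friedmann equation (standard cosmology, stated here as a reminder rather than taken from this abstract):

    ```latex
    \left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\,\rho - \frac{k}{a^2}
    ```

    For a matter-dominated flat model this gives $a(t) \propto t^{2/3}$, so the scale factor $a \to 0$ and the density diverges as $t \to 0$: the big bang singularity.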

  8. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    ERIC Educational Resources Information Center

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own "big" words and dreams. During the one…

  9. Cruise report, RV Ocean Alert cruise A1-98-HW; January 30 through February 23, 1998, Honolulu to Honolulu, Hawaii

    USGS Publications Warehouse

    Gardner, James V.; Hughes-Clarke, John E.

    1998-01-01

    The major objective of cruise A1-98 was to map portions of the insular slopes of Oahu, Kauai, Maui, Molokai, and Hawaii and to survey in detail US Environmental Protection Agency (USEPA) ocean dumping sites using a Simrad EM300 high-resolution multibeam mapping system. The cruise was a jointly funded project between the US Army Corps of Engineers (USACE), USEPA, and the US Geological Survey (USGS). The USACE and USEPA are interested in these areas because of a series of ocean dump sites off Oahu, Kauai, Maui, and Hawaii (Fig. 1) that require high-resolution base maps for site monitoring purposes. The USGS Coastal and Marine Geology Program has several ongoing projects off Oahu and Maui that lack high-precision base maps for a variety of ongoing geological studies. The cruise was conducted under a Cooperative Agreement between the USGS and the Ocean Mapping Group, University of New Brunswick, Canada.

  10. Uniform Big Bang-Chaotic Big Crunch optimization

    NASA Astrophysics Data System (ADS)

    Alatas, Bilal

    2011-09-01

    This study proposes methods to improve the convergence of the novel optimization method, Big Bang-Big Crunch (BB-BC). Uniform population method has been used to generate uniformly distributed random points in the Big Bang phase. Chaos has been utilized to rapidly shrink those points to a single representative point via a center of mass in the Big Crunch phase. The proposed algorithm has been named as Uniform Big Bang-Chaotic Big Crunch (UBB-CBC). The performance of the UBB-CBC optimization algorithm demonstrates superiority over the BB-BC optimization for the benchmark functions.
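
    As a rough illustration of the baseline BB-BC scheme that UBB-CBC modifies, here is a minimal Python sketch. The function names, the 1/k shrinking schedule, and the inverse-fitness weighting are illustrative assumptions, not the paper's exact algorithm; the UBB-CBC variant additionally replaces random point generation with uniformly distributed points in the Big Bang phase and drives the Big Crunch contraction with a chaotic map.

    ```python
    import random

    def bb_bc_minimize(f, bounds, pop_size=50, iterations=100):
        """Minimal Big Bang-Big Crunch sketch for minimization.

        Big Bang: scatter candidates around the current center of mass,
        with a spread that shrinks as iterations proceed.
        Big Crunch: contract the population to an inverse-fitness-weighted
        center of mass (assumes a nonnegative objective, so better points
        pull the center harder).
        """
        dim = len(bounds)
        # First Big Bang: one random point over the whole search space.
        center = [random.uniform(lo, hi) for lo, hi in bounds]
        best_x, best_f = center[:], f(center)
        for k in range(1, iterations + 1):
            # Big Bang phase: new candidates around the center, spread ~ 1/k,
            # clipped to the search bounds.
            pop = []
            for _ in range(pop_size):
                x = [min(max(c + random.uniform(lo - hi, hi - lo) / k, lo), hi)
                     for c, (lo, hi) in zip(center, bounds)]
                pop.append(x)
            fits = [f(x) for x in pop]
            # Big Crunch phase: inverse-fitness-weighted center of mass.
            weights = [1.0 / (1e-12 + fx) for fx in fits]
            total = sum(weights)
            center = [sum(w * x[d] for w, x in zip(weights, pop)) / total
                      for d in range(dim)]
            for x, fx in zip(pop, fits):
                if fx < best_f:
                    best_x, best_f = x[:], fx
        return best_x, best_f
    ```

    On a simple test objective such as the sphere function over [-5, 5]², this alternation of scattering and contraction steadily pulls the population toward low-objective regions, which is the behavior the benchmark comparisons in the abstract measure.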

  11. Big data in biomedicine.

    PubMed

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science.

  12. Bayesian big bang

    NASA Astrophysics Data System (ADS)

    Daum, Fred; Huang, Jim

    2011-09-01

    We show that the flow of particles corresponding to Bayes' rule has a number of striking similarities with the big bang, including cosmic inflation and cosmic acceleration. We derive a PDE for this flow using a log-homotopy from the prior probability density to the posterior probability density. We solve this PDE using the gradient of the solution to Poisson's equation, which is computed using an exact Green's function and the standard Monte Carlo approximation of integrals. The resulting flow is analogous to Coulomb's law in electromagnetics. We have used no physics per se to derive this flow, but rather we have only used Bayes' rule and the definition of normalized probability and a log-homotopy parameter that could be interpreted as time. The details of this big bang resemble very recent theories much more closely than the so-called new inflation models, which postulate enormous inflation immediately after the big bang.
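
    The log-homotopy referred to in this abstract deforms the prior into the posterior as a parameter $\lambda$ runs from 0 to 1; in the standard particle-flow notation (written here for context, not quoted from the paper) it reads:

    ```latex
    \log p(x, \lambda) = \log g(x) + \lambda \log h(x) - \log K(\lambda)
    ```

    where $g$ is the prior density, $h$ the likelihood, and $K(\lambda)$ the normalizing constant. At $\lambda = 0$ the particles represent the prior and at $\lambda = 1$ the posterior, and the particle velocity along this deformation is what the Poisson-equation solve supplies.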

  13. Big Questions: Missing Antimatter

    SciTech Connect

    Lincoln, Don

    2013-08-27

    Einstein's equation E = mc2 is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.

  14. Big Questions: Missing Antimatter

    ScienceCinema

    Lincoln, Don

    2016-07-12

    Einstein's equation E = mc2 is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller, hotter and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.

  15. [Empowerment of women in difficult life situations: the BIG project].

    PubMed

    Rütten, A; Röger, U; Abu-Omar, K; Frahsa, A

    2008-12-01

    BIG is a project for the promotion of physical activity among women in difficult life situations. Following the main health promotion principles of the WHO, the women shall be enabled or empowered to take control of determinants of their health. A comprehensive participatory approach was applied and women were included in planning, implementing and evaluating the project. For measuring the effects of BIG on the empowerment of participating women, qualitative semi-structured interviews with 15 women participating in BIG were conducted. For data analysis, qualitative content analysis was used. Results showed the empowerment of the women on the individual level as they gained different competencies and perceived self-efficacy. These effects were supported through the empowerment process on the organizational and community levels where women gained control over their life situations and over policies influencing them. Therefore, the participatory approach of BIG is a key success factor for empowerment promotion of women in difficult life situations.

  16. A Sobering Big Idea

    ERIC Educational Resources Information Center

    Wineburg, Sam

    2006-01-01

    Since Susan Adler, Alberta Dougan, and Jesus Garcia like "big ideas," the author offers one to ponder: young people in this country cannot read with comprehension. The saddest thing about this crisis is that it is no secret. The 2001 results of the National Assessment of Educational Progress (NAEP) for reading, published in every major newspaper,…

  17. The Big Sky inside

    ERIC Educational Resources Information Center

    Adams, Earle; Ward, Tony J.; Vanek, Diana; Marra, Nancy; Hester, Carolyn; Knuth, Randy; Spangler, Todd; Jones, David; Henthorn, Melissa; Hammill, Brock; Smith, Paul; Salisbury, Rob; Reckin, Gene; Boulafentis, Johna

    2009-01-01

    The University of Montana (UM)-Missoula has implemented a problem-based program in which students perform scientific research focused on indoor air pollution. The Air Toxics Under the Big Sky program (Jones et al. 2007; Adams et al. 2008; Ward et al. 2008) provides a community-based framework for understanding the complex relationship between poor…

  18. The Big Fish

    ERIC Educational Resources Information Center

    DeLisle, Rebecca; Hargis, Jace

    2005-01-01

    The Killer Whale, Shamu jumps through hoops and splashes tourists in hopes for the big fish, not because of passion, desire or simply the enjoyment of doing so. What would happen if those fish were obsolete? Would this killer whale be able to find the passion to continue to entertain people? Or would Shamu find other exciting activities to do…

  19. Big-City Rules

    ERIC Educational Resources Information Center

    Gordon, Dan

    2011-01-01

    When it comes to implementing innovative classroom technology programs, urban school districts face significant challenges stemming from their big-city status. These range from large bureaucracies, to scalability, to how to meet the needs of a more diverse group of students. Because of their size, urban districts tend to have greater distance…

  20. A Big Bang Lab

    ERIC Educational Resources Information Center

    Scheider, Walter

    2005-01-01

    The February 2005 issue of The Science Teacher (TST) reminded everyone that by learning how scientists study stars, students gain an understanding of how science measures things that cannot be set up in a lab, either because they are too big, too far away, or happened in the very distant past. The authors of "How Far are the Stars?" show how the…

  1. Big Enough for Everyone?

    ERIC Educational Resources Information Center

    Coote, Anna

    2010-01-01

    The UK's coalition government wants to build a "Big Society." The Prime Minister says "we are all in this together" and building it is the responsibility of every citizen as well as every government department. The broad vision is welcome, but everything depends on how the vision is translated into policy and practice. The government aims to put…

  2. The big bang

    NASA Astrophysics Data System (ADS)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The big bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: Photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  3. Thinking Big, Aiming High

    ERIC Educational Resources Information Center

    Berkeley, Viv

    2010-01-01

    What do teachers, providers and policymakers need to do in order to support disabled learners to "think big and aim high"? That was the question put to delegates at NIACE's annual disability conference. Some clear themes emerged, with delegates raising concerns about funding, teacher training, partnership-working and employment for disabled…

  4. Big Data and Chemical Education

    ERIC Educational Resources Information Center

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  5. Business and Science - Big Data, Big Picture

    NASA Astrophysics Data System (ADS)

    Rosati, A.

    2013-12-01

    Data Science is more than the creation, manipulation, and transformation of data. It is more than Big Data. The business world seems to have a hold on the term 'data science' and, for now, it defines what it means. But business is very different from science. In this talk, I address how large datasets, Big Data, and data science are conceptually different in the business and science worlds. I focus on the types of questions each realm asks, the data needed, and the consequences of findings. Gone are the days of datasets being created or collected to serve only one purpose or project. The trick with data reuse is to become familiar enough with a dataset to be able to combine it with other data and extract accurate results. As a Data Curator for the Advanced Cooperative Arctic Data and Information Service (ACADIS), my specialty is communication. Our team enables Arctic sciences by ensuring datasets are well documented and can be understood by reusers. Previously, I served as a data community liaison for the North American Regional Climate Change Assessment Program (NARCCAP). Again, my specialty was communicating complex instructions and ideas to a broad audience of data users. Before entering the science world, I was an entrepreneur. I have a bachelor's degree in economics and a master's degree in environmental social science. I am currently pursuing a Ph.D. in Geography. Because my background has embraced both the business and science worlds, I would like to share my perspectives on data, data reuse, data documentation, and the presentation or communication of findings. My experiences show that each can inform and support the other.

  6. PEG Coating Reduces NMR Relaxivity of Mn0.5Zn0.5Gd0.02Fe1.98O4 Hyperthermia Nanoparticles

    PubMed Central

    Issa, Bashar; Qadri, Shahnaz; Obaidat, Ihab M.; Bowtell, Richard W.; Haik, Yousef

    2011-01-01

    Purpose To investigate both T1 and T2 MR relaxation enhancement of Gd substituted Zn-Mn ferrite magnetic nanoparticles. Both uncoated and polyethylene glycol (PEG) coated particles were used. Materials and Methods Chemical co-precipitation was used to synthesize particles in the form Mn0.5Zn0.5Gd0.02Fe1.98O4 suitable for hyperthermia applications. Physical characterization of the magnetic nanoparticles included SEM, TEM, ICP, and SQUID. T1 and T2 measurements were performed at 1.5 T. Results The saturation magnetization was 12.86 emu/g while the particle's magnetic moment was 1.86 × 10^−19 J/T. The particle size increased due to coating, while the 1/T1 and 1/T2 relaxivities (26 °C) decreased from 2.5 to 0.7 and from 201.3 to 76.6 s^−1 mM^−1, respectively, at a magnetic field of 1.5 T. Conclusion The reduction in both 1/T1 and 1/T2 is attributed to the increased distance of closest approach between the protons and the magnetic core caused by the shielding provided by the high molecular weight PEG. The 1/T2 data are compared to existing theoretical models using a modified radius that takes into account both possible agglomeration of the particles and increased inter-particle separation induced by the PEG coating. PMID:21928382

  7. Small turbines, big unknown

    SciTech Connect

    Gipe, P.

    1995-07-01

    While financial markets focus on the wheeling and dealing of the big wind companies, the small wind turbine industry quietly keeps churning out its smaller but effective machines. Some, the micro turbines, are so small they can be carried by hand. Though worldwide sales of small wind turbines fall far short of even one large windpower plant, figures reach $8 million to $10 million annually and could be as much as twice that if batteries and engineering services are included.

  8. DARPA's Big Mechanism program.

    PubMed

    Cohen, Paul R

    2015-07-16

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  9. DARPA's Big Mechanism program

    NASA Astrophysics Data System (ADS)

    Cohen, Paul R.

    2015-07-01

    Reductionist science produces causal models of small fragments of complicated systems. Causal models of entire systems can be hard to construct because what is known of them is distributed across a vast amount of literature. The Big Mechanism program aims to have machines read the literature and assemble the causal fragments found in individual papers into huge causal models, automatically. The current domain of the program is cell signalling associated with Ras-driven cancers.

  10. The Next Big Idea

    PubMed Central

    2013-01-01

    Abstract George S. Eisenbarth will remain in our memories as a brilliant scientist and great collaborator. His quest to discover the cause and prevention of type 1 (autoimmune) diabetes started from building predictive models based on immunogenetic markers. Despite his tremendous contributions to our understanding of the natural history of pre-type 1 diabetes and potential mechanisms, George left us with several big questions to answer before his quest is completed. PMID:23786296

  11. Big Bang Circus

    NASA Astrophysics Data System (ADS)

    Ambrosini, C.

    2011-06-01

    Big Bang Circus is an opera I composed in 2001 and which was premiered at the Venice Biennale Contemporary Music Festival in 2002. A chamber group, four singers and a ringmaster stage the story of the Universe confronting and interweaving two threads: how early man imagined it and how scientists described it. Surprisingly enough fancy, myths and scientific explanations often end up using the same images, metaphors and sometimes even words: a strong tension, a drumskin starting to vibrate, a shout…

  12. Big3. Editorial

    PubMed Central

    Lehmann, Christoph U.; Séroussi, Brigitte; Jaulent, Marie-Christine

    2014-01-01

    Summary Objectives To provide an editorial introduction into the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and upcoming 25th anniversary. Methods A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model is provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. Results 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths and weaknesses, as well as its risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions to the topic. Conclusions For the first time in its history, the IMIA Yearbook will be published in an open access online format, allowing a broader readership, especially in resource-poor countries. For the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016. PMID:24853037

  13. A big first step.

    PubMed

    Jones, Howard W

    2004-11-01

    The singleton, term gestation, live birth rate per cycle initiated (BESST) endpoint proposed at the beginning of 2004 is a big first step. It should be extended by considering multiple pregnancy rates in relation to singleton rates, and by recording fetal reductions and pregnancies resulting from cryopreserved material. After three or more such steps we may have an accurate reporting system that helps patients weigh the pros and cons of singleton term delivery. PMID:15479704

  14. Big Sky Carbon Atlas

    DOE Data Explorer

    The Big Sky Carbon Atlas is an online geoportal designed for you to discover, interpret, and access geospatial data and maps relevant to decision support and education on carbon sequestration in the Big Sky Region. In serving as the public face of the Partnership's spatial Data Libraries, the Atlas provides a gateway to geographic information characterizing CO2 sources, potential geologic sinks, terrestrial carbon fluxes, civil and energy infrastructure, energy use, and related themes. In addition to directly serving the BSCSP and its stakeholders, the Atlas feeds regional data to the NatCarb Portal, contributing to a national perspective on carbon sequestration. Established components of the Atlas include a gallery of thematic maps and an interactive map that allows you to: • Navigate and explore regional characterization data through a user-friendly interface • Print your map views or publish them as PDFs • Identify technical references relevant to specific areas of interest • Calculate straight-line or pipeline-constrained distances from point sources of CO2 to potential geologic sink features • Download regional data layers (feature under development) (Acknowledgment to the Big Sky Carbon Sequestration Partnership (BSCSP); see home page at http://www.bigskyco2.org/)

  15. Big Data Technologies

    PubMed Central

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities for diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patients' care processes and of single patients' behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extracting new knowledge from them. This article reviews the main concepts and definitions related to big data, presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried out in the MOSAIC project, funded by the European Commission. PMID:25910540

  16. How Big is Earth?

    NASA Astrophysics Data System (ADS)

    Thurber, Bonnie B.

    2015-08-01

    How Big is Earth celebrates the Year of Light. Using only the sunlight striking the Earth and a wooden dowel, students meet each other and then measure the circumference of the Earth. Eratosthenes did it over 2,000 years ago. In Cosmos, Carl Sagan shared the process by which Eratosthenes measured the angle of the shadow cast at local noon when sunlight strikes a stick positioned perpendicular to the ground. By comparing his measurement to another made a distance away, Eratosthenes was able to calculate the circumference of the Earth. How Big is Earth provides an online learning environment where students do science the same way Eratosthenes did. A notable project in which this was done was The Eratosthenes Project, conducted in 2005 as part of the World Year of Physics; in fact, we will be drawing on the teacher's guide developed by that project. How Big Is Earth? expands on the Eratosthenes Project through an online learning environment provided by the iCollaboratory, www.icollaboratory.org, where teachers and students from Sweden, China, Nepal, Russia, Morocco, and the United States collaborate, share data, and reflect on their learning of science and astronomy. They share their information and discuss and brainstorm solutions in a discussion forum. There is an ongoing database of student measurements and another database collecting data on both teacher and student learning from surveys, discussions, and self-reflection done online. We will share our research about the kinds of learning that take place only in global collaborations. The entrance address for the iCollaboratory is http://www.icollaboratory.org.
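Eratosthenes' calculation can be sketched in a few lines. The figures below are the conventional textbook values (the historical stadia-to-kilometer conversion is uncertain, so 800 km is an assumed round number):

```python
# Eratosthenes' method with the conventional textbook numbers (the historical
# stadia-to-km conversion is uncertain; 800 km is an assumed round figure).
shadow_angle_deg = 7.2  # shadow angle of a vertical stick at local noon in Alexandria
distance_km = 800.0     # north-south distance from Alexandria to Syene

# In Syene the sun was directly overhead at the same moment, so the shadow angle
# equals the difference in latitude; scale the distance up to a full 360 degrees.
circumference_km = distance_km * 360.0 / shadow_angle_deg
print(circumference_km)  # about 40,000 km, close to the modern ~40,075 km
```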

  17. Age and Gender Differences in Motivational Manifestations of the Big Five from Age 16 to 60

    ERIC Educational Resources Information Center

    Lehmann, Regula; Denissen, Jaap J. A.; Allemand, Mathias; Penke, Lars

    2013-01-01

    The present cross-sectional study investigated age and gender differences in motivational manifestations of the Big Five in a large German-speaking Internet sample (N = 19,022). Participants ranging in age from 16 to 60 years completed the Five Individual Reaction Norms Inventory (FIRNI; Denissen & Penke, 2008a), and two traditional Big Five…

  18. The International Big History Association

    ERIC Educational Resources Information Center

    Duffy, Michael; Duffy, D'Neil

    2013-01-01

    IBHA, the International Big History Association, was organized in 2010 and "promotes the unified, interdisciplinary study and teaching of history of the Cosmos, Earth, Life, and Humanity." This is the vision that Montessori embraced long before the discoveries of modern science fleshed out the story of the evolving universe. "Big History" is a…

  19. The Big Read: Case Studies

    ERIC Educational Resources Information Center

    National Endowment for the Arts, 2009

    2009-01-01

    The Big Read evaluation included a series of 35 case studies designed to gather more in-depth information on the program's implementation and impact. The case studies gave readers a valuable first-hand look at The Big Read in context. Both formal and informal interviews, focus groups, attendance at a wide range of events--all showed how…

  20. Think Big, Bigger ... and Smaller

    ERIC Educational Resources Information Center

    Nisbett, Richard E.

    2010-01-01

    One important principle of social psychology, writes Nisbett, is that some big-seeming interventions have little or no effect. This article discusses a number of cases from the field of education that confirm this principle. For example, Head Start seems like a big intervention, but research has indicated that its effects on academic achievement…

  1. The Rise of Big Data in Neurorehabilitation.

    PubMed

    Faroqi-Shah, Yasmeen

    2016-02-01

    In some fields, Big Data has been instrumental in analyzing, predicting, and influencing human behavior. However, Big Data approaches have so far been less central in speech-language pathology. This article introduces the concept of Big Data and provides examples of Big Data initiatives pertaining to adult neurorehabilitation. It also discusses the potential theoretical and clinical contributions that Big Data can make. The article also recognizes some impediments in building and using Big Data for scientific and clinical inquiry.

  2. Earth Science Big Data Activities at Research Data Alliance

    NASA Astrophysics Data System (ADS)

    Kuo, Kwo-Sen; Baumann, Peter; Evans, Ben; Riedel, Morris

    2016-04-01

    In this presentation we introduce Earth science related activities of the Big Data Interest Group (BDIG) in Research Data Alliance (RDA). "RDA is an international organization focused on the development of infrastructure and community activities that reduce barriers to data sharing and exchange, and the acceleration of data driven innovation worldwide." The participation of researchers in RDA is voluntary. As the name implies, an Interest Group is a collection of participants sharing the same interest. The BDIG seeks to address community needs on all things having to do with Big Data. The ultimate goal of RDA Big Data Interest Group is to produce a set of recommendation documents to advise diverse research communities with respect to: • How to select an appropriate Big Data solution for a particular science application to realize optimal value? and • What are the best practices in dealing with various data and computing issues associated with such a solution? The primary means to reaching such recommendations is through the establishment and work of Working Groups, each of which focuses on a specific issue. Although BDIG is not specific to Earth science, its recent activities revolve mostly around it. We introduce some of these activities that are designed to advance our knowledge and to characterize Big Data in Earth science.

  3. Big-bounce genesis

    NASA Astrophysics Data System (ADS)

    Li, Changhong; Brandenberger, Robert H.; Cheung, Yeuk-Kwan E.

    2014-12-01

    We report on the possibility of using a dark matter particle's mass and its interaction cross section as a smoking-gun signal of the existence of a big bounce at the early stage in the evolution of our currently observed universe. A model-independent study of dark matter production in the pre-bounce contraction and the post-bounce expansion epochs of the bounce universe reveals a new avenue for achieving the observed relic abundance of our present universe, in which a significantly smaller amount of dark matter with a smaller cross section—as compared to the prediction of standard cosmology—is produced, and the information about the evolution of the bounce universe is preserved by the out-of-thermal-equilibrium process. Once the values of the dark matter mass and interaction cross section are obtained by direct detection in laboratories, this alternative route becomes a signature prediction of the bounce-universe scenario.

  4. BIG DATA AND STATISTICS

    PubMed Central

    Rossell, David

    2016-01-01

    Big Data brings unprecedented power to address scientific, economic and societal issues, but also amplifies the possibility of certain pitfalls. These include using purely data-driven approaches that disregard understanding the phenomenon under study, aiming at a dynamically moving target, ignoring critical data collection issues, summarizing or preprocessing the data inadequately and mistaking noise for signal. We review some success stories and illustrate how statistical principles can help obtain more reliable information from data. We also touch upon current challenges that require active methodological research, such as strategies for efficient computation, integration of heterogeneous data, extending the underlying theory to increasingly complex questions and, perhaps most importantly, training a new generation of scientists to develop and deploy these strategies. PMID:27722040

  5. Big bang and big crunch in matrix string theory

    SciTech Connect

    Bedford, J.; Ward, J.; Papageorgakis, C.; Rodriguez-Gomez, D.

    2007-04-15

    Following the holographic description of linear dilaton null cosmologies with a big bang in terms of matrix string theory put forward by Craps, Sethi, and Verlinde, we propose an extended background describing a universe including both big bang and big crunch singularities. This belongs to a class of exact string backgrounds and is perturbative in the string coupling far away from the singularities, both of which can be resolved using matrix string theory. We provide a simple theory capable of describing the complete evolution of this closed universe.

  6. Participative Design for Participative Democracy.

    ERIC Educational Resources Information Center

    Emery, Merrelyn, Ed.

    This four-part volume addresses design principles for introducing democratic forms in workplaces, educational institutions, and social institutions, based on a trend toward participative democracy in Australia. Following an introduction, part I sets the context with two papers: "The Agenda for the Next Wave" and "Educational Paradigms: An…

  7. The challenges of big data.

    PubMed

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest.

  8. Homogeneous and isotropic big rips?

    SciTech Connect

    Giovannini, Massimo

    2005-10-15

    We investigate the way big rips are approached in a fully inhomogeneous description of the space-time geometry. If the pressure and energy densities are connected by a (supernegative) barotropic index, the spatial gradients and the anisotropic expansion decay as the big rip is approached. This behavior is contrasted with the usual big-bang singularities. A similar analysis is performed in the case of sudden (quiescent) singularities and it is argued that the spatial gradients may well be non-negligible in the vicinity of pressure singularities.

  9. Big Data and Ambulatory Care

    PubMed Central

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2015-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an overview of the legal framework governing health information, dispels misconceptions about privacy regulations, and highlights how ambulatory care providers in particular can maximize the utility of big data to improve care. PMID:25401945

  10. The challenges of big data

    PubMed Central

    2016-01-01

    ABSTRACT The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. PMID:27147249

  11. Big climate data analysis

    NASA Astrophysics Data System (ADS)

    Mudelsee, Manfred

    2015-04-01

    The Big Data era has begun also in the climate sciences, not only in economics or molecular biology. We measure climate at increasing spatial resolution by means of satellites and look farther back in time at increasing temporal resolution by means of natural archives and proxy data. We use powerful supercomputers to run climate models. The model output of the calculations made for the IPCC's Fifth Assessment Report amounts to ~650 TB. The 'scientific evolution' of grid computing has started, and the 'scientific revolution' of quantum computing is being prepared. This will increase computing power, and data amount, by several orders of magnitude in the future. However, more data does not automatically mean more knowledge. We need statisticians, who are at the core of transforming data into knowledge. Statisticians notably also explore the limits of our knowledge (uncertainties, that is, confidence intervals and P-values). Mudelsee (2014 Climate Time Series Analysis: Classical Statistical and Bootstrap Methods. Second edition. Springer, Cham, xxxii + 454 pp.) coined the term 'optimal estimation'. Consider the hyperspace of climate estimation. It has many, but not infinite, dimensions. It consists of the three subspaces Monte Carlo design, method and measure. The Monte Carlo design describes the data generating process. The method subspace describes the estimation and confidence interval construction. The measure subspace describes how to detect the optimal estimation method for the Monte Carlo experiment. The envisaged large increase in computing power may bring the following idea of optimal climate estimation into existence. Given a data sample, some prior information (e.g. measurement standard errors) and a set of questions (parameters to be estimated), the first task is simple: perform an initial estimation on the basis of existing knowledge and experience with such types of estimation problems. The second task requires the computing power: explore the hyperspace to…
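The bootstrap confidence intervals mentioned in the abstract can be sketched minimally; the data values and resampling parameters below are invented for illustration and are not taken from Mudelsee's book:

```python
import random
import statistics

# Percentile-bootstrap 95% confidence interval for a sample mean.
# The data values are invented for illustration.
random.seed(42)
sample = [14.2, 13.8, 15.1, 14.7, 13.9, 14.5, 15.3, 14.0, 14.9, 14.4]

boot_means = sorted(
    statistics.mean(random.choices(sample, k=len(sample)))  # resample with replacement
    for _ in range(10_000)
)
lo = boot_means[250]   # 2.5th percentile of the bootstrap distribution
hi = boot_means[9750]  # 97.5th percentile
print(f"mean = {statistics.mean(sample):.2f}, 95% CI ~ [{lo:.2f}, {hi:.2f}]")
```

The percentile method is the simplest bootstrap interval; the book referenced above covers more refined variants (e.g. for autocorrelated time series) that this sketch does not attempt.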

  12. The BigBOSS Experiment

    SciTech Connect

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna

    2011-01-01

    BigBOSS will obtain observational constraints that will bear on three of the four 'science frontier' questions identified by the Astro2010 Cosmology and Fundamental Physics Panel of the Decadal Survey: Why is the universe accelerating? What is dark matter, and what are the properties of neutrinos? Indeed, the BigBOSS project was recommended for substantial immediate R and D support in the PASAG report. The second highest ground-based priority from the Astro2010 Decadal Survey was the creation of a funding line within the NSF to support a 'Mid-Scale Innovations' program, and it used BigBOSS as a 'compelling' example for support. This choice was the result of the Decadal Survey's Program Prioritization panels reviewing 29 mid-scale projects and recommending BigBOSS 'very highly'.

  13. Big Spherules near 'Victoria'

    NASA Technical Reports Server (NTRS)

    2006-01-01

    This frame from the microscopic imager on NASA's Mars Exploration Rover Opportunity shows spherules up to about 5 millimeters (one-fifth of an inch) in diameter. The camera took this image during the 924th Martian day, or sol, of Opportunity's Mars-surface mission (Aug. 30, 2006), when the rover was about 200 meters (650 feet) north of 'Victoria Crater.'

    Opportunity discovered spherules like these, nicknamed 'blueberries,' at its landing site in 'Eagle Crater,' and investigations determined them to be iron-rich concretions that formed inside deposits soaked with groundwater. However, such concretions were much smaller or absent at the ground surface along much of the rover's trek of more than 5 kilometers (3 miles) southward to Victoria. The big ones showed up again when Opportunity got to the ring, or annulus, of material excavated and thrown outward by the impact that created Victoria Crater. Researchers hypothesize that some layer beneath the surface in Victoria's vicinity was once soaked with water long enough to form the concretions, that the crater-forming impact dispersed some material from that layer, and that Opportunity might encounter that layer in place if the rover drives down into the crater.

  14. Challenges of Big Data Analysis.

    PubMed

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features drive paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and, consequently, wrong scientific conclusions.
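The spurious-correlation phenomenon mentioned in the abstract lends itself to a small simulation. The sketch below is illustrative and not taken from the paper: with a modest sample size, the largest absolute correlation between an outcome and many completely independent predictors drifts far from zero as the number of predictors grows.

```python
import random

def max_spurious_corr(n=50, p=1000, seed=0):
    """With p predictors drawn independently of the response, every
    true correlation is zero -- yet the maximum absolute sample
    correlation grows with p (the spurious-correlation effect)."""
    rng = random.Random(seed)
    y = [rng.gauss(0, 1) for _ in range(n)]
    my = sum(y) / n
    dy = [v - my for v in y]
    sy = sum(d * d for d in dy) ** 0.5
    best = 0.0
    for _ in range(p):
        x = [rng.gauss(0, 1) for _ in range(n)]
        mx = sum(x) / n
        dx = [v - mx for v in x]
        sx = sum(d * d for d in dx) ** 0.5
        r = sum(a * b for a, b in zip(dx, dy)) / (sx * sy)
        best = max(best, abs(r))
    return best

# With n = 50 observations, the largest of 1,000 null correlations is
# typically several times larger than the largest of 10.
r_small = max_spurious_corr(p=10)
r_big = max_spurious_corr(p=1000)
```

A variable-selection procedure that ranks predictors by marginal correlation would pick up such a predictor even though it carries no signal, which is one concrete way high dimensionality corrupts naive inference.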

  15. Challenges of Big Data Analysis

    PubMed Central

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-01-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On the one hand, Big Data hold great promise for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require new computational and statistical paradigms. This article gives an overview of the salient features of Big Data and how these features drive paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in a high-confidence set and point out that the exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity. They can lead to wrong statistical inferences and, consequently, wrong scientific conclusions. PMID:25419469

  16. Powering Big Data for Nursing Through Partnership.

    PubMed

    Harper, Ellen M; Parkerson, Sara

    2015-01-01

    The Big Data Principles Workgroup (Workgroup) was established with support of the Healthcare Information and Management Systems Society. Building on the Triple Aim challenge, the Workgroup sought to identify Big Data principles, barriers, and challenges to nurse-sensitive data inclusion into Big Data sets. The product of this pioneering partnership Workgroup was the "Guiding Principles for Big Data in Nursing-Using Big Data to Improve the Quality of Care and Outcomes."

  17. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, many questions need to be answered and problems solved: data quality, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have adopted a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  18. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, many questions need to be answered and problems solved: data quality, data protection, privacy, and sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have adopted a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany. PMID:26077871

  19. GEOSS: Addressing Big Data Challenges

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  20. Sr{sub 1.98}Eu{sub 0.02}SiO{sub 4} luminescence whisker based on vapor-phase deposition: Facile synthesis, uniform morphology and enhanced luminescence properties

    SciTech Connect

    Xu, Jian; Hassan, Dhia A.; Zeng, Renjie; Peng, Dongliang

    2015-11-15

    Highlights: • For the first time, it is possible to obtain Sr{sub 1.98}Eu{sub 0.02}SiO{sub 4} whiskers. • The whiskers are smooth and uniform with an L/D ratio over 50. • Durability and thermal stability of the whisker are enhanced. - Abstract: A high-performance strontium silicate phosphor has been successfully synthesized through a facile vapor-phase deposition method. The product consists of single-crystal whiskers which are smooth and uniform, with a sectional equivalent diameter of around 5 μm; the aspect ratio is over 50 and no agglomeration can be observed. X-ray diffraction confirmed that the crystal structure of the whisker was α’-Sr{sub 2}SiO{sub 4}. The exact chemical composition, Sr{sub 1.98}Eu{sub 0.02}SiO{sub 4}, was determined by energy dispersive spectrometry and inductively coupled plasma-mass spectrometry. The whisker shows broad green emission peaking at 523 nm and ranging from 470 to 600 nm (excited at 370 nm). Compared with traditional Sr{sub 2}SiO{sub 4}:Eu phosphor, the durability (at 85% humidity and 85 °C) and thermal stability of the whisker are markedly improved. Moreover, the growth mechanism of the Sr{sub 1.98}Eu{sub 0.02}SiO{sub 4} whiskers is Vapor–Liquid–Solid. On a macro scale, the product is still a powder, which makes it suitable for the current packaging process of WLEDs.

  1. Multiwavelength astronomy and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.

    2016-09-01

    Two major characteristics of modern astronomy are multiwavelength (MW) studies (from γ-ray to radio) and big data (data acquisition, storage, and analysis). Present astronomical databases and archives contain billions of objects, both galactic and extragalactic, observed at various wavelengths, and the vast amount of data on them allows new studies and discoveries. Astronomers deal with big numbers. Surveys are the main source for the discovery of astronomical objects and the accumulation of observational data for further analysis, interpretation, and scientific results. We review the main characteristics of astronomical surveys, compare the photographic and digital eras of astronomical studies (including the development of wide-field observations), describe the present state of MW surveys, and discuss Big Data in astronomy and the related topics of Virtual Observatories and Computational Astrophysics. The review includes many numbers and data that can be compared to give an overall understanding of the Universe, cosmic numbers, and their relationship to modern computational facilities.

  2. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Navidi, Fatemeh; Beard, Daniel A.; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  3. Big Data: Astronomical or Genomical?

    PubMed

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade.

  4. Big Data: Astronomical or Genomical?

    PubMed Central

    Stephens, Zachary D.; Lee, Skylar Y.; Faghri, Faraz; Campbell, Roy H.; Zhai, Chengxiang; Efron, Miles J.; Iyer, Ravishankar; Schatz, Michael C.; Sinha, Saurabh; Robinson, Gene E.

    2015-01-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a “four-headed beast”—it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the “genomical” challenges of the next decade. PMID:26151137

  5. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  6. Big Data: Astronomical or Genomical?

    PubMed

    Stephens, Zachary D; Lee, Skylar Y; Faghri, Faraz; Campbell, Roy H; Zhai, Chengxiang; Efron, Miles J; Iyer, Ravishankar; Schatz, Michael C; Sinha, Saurabh; Robinson, Gene E

    2015-07-01

    Genomics is a Big Data science and is going to get much bigger, very soon, but it is not known whether the needs of genomics will exceed other Big Data domains. Projecting to the year 2025, we compared genomics with three other major generators of Big Data: astronomy, YouTube, and Twitter. Our estimates show that genomics is a "four-headed beast"--it is either on par with or the most demanding of the domains analyzed here in terms of data acquisition, storage, distribution, and analysis. We discuss aspects of new technologies that will need to be developed to rise up and meet the computational challenges that genomics poses for the near future. Now is the time for concerted, community-wide planning for the "genomical" challenges of the next decade. PMID:26151137

  7. [Big Data- challenges and risks].

    PubMed

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

    The term "Big Data" is commonly used to describe the growing mass of information now being created. New conclusions can be drawn and new services developed by connecting, processing, and analyzing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other fields. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues and the current actions toward their solution are also presented.

  8. [Big Data- challenges and risks].

    PubMed

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-01

    The term "Big Data" is commonly used to describe the growing mass of information now being created. New conclusions can be drawn and new services developed by connecting, processing, and analyzing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other fields. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues and the current actions toward their solution are also presented. PMID:26614539

  9. Little Science to Big Science: Big Scientists to Little Scientists?

    ERIC Educational Resources Information Center

    Simonton, Dean Keith

    2010-01-01

    This article presents the author's response to Hisham B. Ghassib's essay entitled "Where Does Creativity Fit into a Productivist Industrial Model of Knowledge Production?" Professor Ghassib's (2010) essay presents a provocative portrait of how the little science of the Babylonians, Greeks, and Arabs became the Big Science of the modern industrial…

  10. The "big win" and resistance to extinction when gambling.

    PubMed

    Weatherly, Jeffrey N; Sauter, John M; King, Brent M

    2004-11-01

    One hypothesis for the reason a person might become a pathological gambler is that the individual initially experiences a big win, which creates a fallacious expectation of winning, which may then lead to persistent gambling despite suffering large losses. Although this hypothesis has been around for several decades, only one controlled empirical study has addressed it, and that study reported null results. In the present experiment, the authors tested the "big win" hypothesis by having 4 groups of participants with little to no experience gambling play a computer-simulated slot machine for credits that were exchangeable for cash. One group experienced a large win on the very 1st play. Another experienced a large win on the 5th play. A 3rd group experienced 2 small wins on the 2nd and 5th plays. No other winning outcomes were programmed. The 4th group never experienced a win. The authors observed a significant effect of group. Participants who experienced a large win on the 1st play quit playing the simulation earlier than participants who experienced a large win on the 5th play. These results appear to question the "big win" as an explanation for pathological gambling. They are more consistent with a behavioral theory of gambling behavior. The present study should also promote the use of laboratory-based research to test long-standing hypotheses in the gambling literature.

  11. Internet-based brain training games, citizen scientists, and big data: ethical issues in unprecedented virtual territories.

    PubMed

    Purcell, Ryan H; Rommelfanger, Karen S

    2015-04-22

    Internet brain training programs, where consumers serve as both subjects and funders of the research, represent the closest engagement many individuals have with neuroscience. Safeguards are needed to protect participants' privacy and the evolving scientific enterprise of big data.

  12. Big Explosives Experimental Facility - BEEF

    SciTech Connect

    2014-10-31

    The Big Explosives Experimental Facility, or BEEF, is a ten-acre fenced high-explosives testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF, conventional high-explosives experiments are safely conducted with sophisticated diagnostics such as high-speed optics and x-ray radiography.

  13. China: Big Changes Coming Soon

    ERIC Educational Resources Information Center

    Rowen, Henry S.

    2011-01-01

    Big changes are ahead for China, probably abrupt ones. The economy has grown so rapidly for many years, over 30 years at an average of nine percent a year, that its size makes it a major player in trade and finance and increasingly in political and military matters. This growth is not only of great importance internationally, it is already having…

  14. True Randomness from Big Data

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
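The paper's extractor construction is more sophisticated, but the core idea of turning a biased source into almost-uniform bits can be illustrated with the classical von Neumann extractor. The sketch below is a textbook illustration, not the authors' method:

```python
import random

def von_neumann_extract(bits):
    """Classical von Neumann extractor: scan non-overlapping pairs,
    map 01 -> 0 and 10 -> 1, and discard 00 and 11. For independent
    bits with any fixed bias, the output bits are exactly unbiased,
    because P(01) = P(10) for every bias."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)
    return out

# Simulate a heavily biased source (~80% ones) and extract from it.
rng = random.Random(42)
biased = [1 if rng.random() < 0.8 else 0 for _ in range(100_000)]
extracted = von_neumann_extract(biased)
# The extracted stream is close to 50/50, at the cost of length:
# only about 2 * 0.8 * 0.2 = 32% of input pairs yield an output bit.
```

The method assumes independent, identically biased bits; extractors for big sources, as studied in the paper, must cope with correlated samples where this simple scheme gives no guarantee.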

  15. Big6 Turbotools and Synthesis

    ERIC Educational Resources Information Center

    Tooley, Melinda

    2005-01-01

    The different tools that are helpful during the Synthesis stage, their role in boosting students abilities in Synthesis and the way in which it can be customized to meet the needs of each group of students are discussed. Big6 TurboTools offers several tools to help complete the task. In Synthesis stage, these same tools along with Turbo Report and…

  16. True Randomness from Big Data

    PubMed Central

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  17. The Case for "Big History."

    ERIC Educational Resources Information Center

    Christian, David

    1991-01-01

    Urges an approach to the teaching of history that takes the largest possible perspective, crossing time as well as space. Discusses the problems and advantages of such an approach. Describes a course on "big" history that begins with time, creation myths, and astronomy, and moves on to paleontology and evolution. (DK)

  18. Big Explosives Experimental Facility - BEEF

    ScienceCinema

    None

    2016-07-12

    The Big Explosives Experimental Facility, or BEEF, is a ten-acre fenced high-explosives testing facility that provides data to support stockpile stewardship and other national security programs. At BEEF, conventional high-explosives experiments are safely conducted with sophisticated diagnostics such as high-speed optics and x-ray radiography.

  19. Fossils of big bang turbulence

    NASA Astrophysics Data System (ADS)

    Gibson, C. H.

    2004-12-01

    A model is proposed connecting turbulence, fossil turbulence, and the big bang origin of the universe. While details are incomplete, the model is consistent with our knowledge of these processes and is supported by observations. Turbulence arises in a hot-big-bang quantum-gravitational-dynamics scenario at Planck scales. Chaotic, eddy-like motions produce an exothermic Planck particle cascade from 10^-35 m at 10^32 K to 10^8 larger, 10^4 cooler, quark-gluon scales. A Planck-Kerr instability gives high-Reynolds-number (Re ~ 10^6) turbulent combustion, space-time-energy-entropy and turbulent mixing. Batchelor-Obukhov-Corrsin turbulent-temperature fluctuations are preserved as the first fossil turbulence by inflation, which stretches the patterns beyond the horizon ct of causal connection faster than light speed c in time t ~ 10^-33 seconds. Fossil big-bang temperature turbulence re-enters the horizon and imprints nucleosynthesis of H-He densities that seed fragmentation by gravity at 10^12 s in the low-Reynolds-number plasma before its transition to gas at t ~ 10^13 s and T ~ 3000 K. Multi-scaling coefficients of the cosmic-microwave-background (CMB) temperature anisotropies closely match those for high-Reynolds-number turbulence (Bershadskii and Sreenivasan 2002, 2003). CMB spectra support the interpretation that big-bang turbulence fossils triggered fragmentation of the viscous plasma at supercluster to galaxy mass scales from 10^46 to 10^42 kg (Gibson 1996, 2000, 2004ab).

  20. 1976 Big Thompson flood, Colorado

    USGS Publications Warehouse

    Jarrett, R. D.; Vandas, S.J.

    2006-01-01

    In the early evening of July 31, 1976, a large stationary thunderstorm released as much as 7.5 inches of rainfall in about an hour (about 12 inches in a few hours) in the upper reaches of the Big Thompson River drainage. This large amount of rainfall in such a short period of time produced a flash flood that caught residents and tourists by surprise. The immense volume of water that churned down the narrow Big Thompson Canyon scoured the river channel and destroyed everything in its path, including 418 homes, 52 businesses, numerous bridges, paved and unpaved roads, power and telephone lines, and many other structures. The tragedy claimed the lives of 144 people. Scores of other people narrowly escaped with their lives. The Big Thompson flood ranks among the deadliest of Colorado's recorded floods. It is one of several destructive floods in the United States that has shown the necessity of conducting research to determine the causes and effects of floods. The U.S. Geological Survey (USGS) conducts research and operates a Nationwide streamgage network to help understand and predict the magnitude and likelihood of large streamflow events such as the Big Thompson Flood. Such research and streamgage information are part of an ongoing USGS effort to reduce flood hazards and to increase public awareness.

  1. The BigBoss Experiment

    SciTech Connect

    Schelgel, D.; Abdalla, F.; Abraham, T.; Ahn, C.; Allende Prieto, C.; Annis, J.; Aubourg, E.; Azzaro, M.; Bailey, S.; Baltay, C.; Baugh, C.; Bebek, C.; Becerril, S.; Blanton, M.; Bolton, A.; Bromley, B.; Cahn, R.; Carton, P.-H.; Cervanted-Cota, J.L.; Chu, Y.; Cortes, M.; /APC, Paris /Brookhaven /IRFU, Saclay /Marseille, CPPM /Marseille, CPT /Durham U. / /IEU, Seoul /Fermilab /IAA, Granada /IAC, La Laguna / /IAC, Mexico / / /Madrid, IFT /Marseille, Lab. Astrophys. / / /New York U. /Valencia U.

    2012-06-07

    BigBOSS is a Stage IV ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a wide-area galaxy and quasar redshift survey over 14,000 square degrees. It has been conditionally accepted by NOAO in response to a call for major new instrumentation and a high-impact science program for the 4-m Mayall telescope at Kitt Peak. The BigBOSS instrument is a robotically-actuated, fiber-fed spectrograph capable of taking 5000 simultaneous spectra over a wavelength range from 340 nm to 1060 nm, with a resolution R = {lambda}/{Delta}{lambda} = 3000-4800. Using data from imaging surveys that are already underway, spectroscopic targets are selected that trace the underlying dark matter distribution. In particular, targets include luminous red galaxies (LRGs) up to z = 1.0, extending the BOSS LRG survey in both redshift and survey area. To probe the universe out to even higher redshift, BigBOSS will target bright [OII] emission line galaxies (ELGs) up to z = 1.7. In total, 20 million galaxy redshifts are obtained to measure the BAO feature, trace the matter power spectrum at smaller scales, and detect redshift space distortions. BigBOSS will provide additional constraints on early dark energy and on the curvature of the universe by measuring the Ly-alpha forest in the spectra of over 600,000 2.2 < z < 3.5 quasars. BigBOSS galaxy BAO measurements combined with an analysis of the broadband power, including the Ly-alpha forest in BigBOSS quasar spectra, achieves a FOM of 395 with Planck plus Stage III priors. This FOM is based on conservative assumptions for the analysis of broad band power (k{sub max} = 0.15), and could grow to over 600 if current work allows us to push the analysis to higher wave numbers (k{sub max} = 0.3). BigBOSS will also place constraints on theories of modified gravity and inflation, and will measure the sum of neutrino masses to 0.024 eV accuracy.

  2. Big Data: Implications for Health System Pharmacy.

    PubMed

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, addressing privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  3. Big City Education: Its Challenge to Governance.

    ERIC Educational Resources Information Center

    Haskew, Laurence D.

    This chapter traces the migration from farms to cities and the later movement from cities to suburbs and discusses the impact of the resulting big city environment on the governance of big city education. The author (1) suggests how local, State, and Federal governments can improve big city education; (2) discusses ways of planning for the future…

  4. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  5. Judging Big Deals: Challenges, Outcomes, and Advice

    ERIC Educational Resources Information Center

    Glasser, Sarah

    2013-01-01

    This article reports the results of an analysis of five Big Deal electronic journal packages to which Hofstra University's Axinn Library subscribes. COUNTER usage reports were used to judge the value of each Big Deal. Limitations of usage statistics are also discussed. In the end, the author concludes that four of the five Big Deals are good…

  7. A SWOT Analysis of Big Data

    ERIC Educational Resources Information Center

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  8. Big Sagebrush Seed Bank Densities Following Wildfires

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Big sagebrush (Artemisia sp.) is a critical shrub for such sagebrush obligate species as sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush does not sprout after wildfires, and big sagebrush seed is generally sho...

  9. Big Data - Smart Health Strategies

    PubMed Central

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we get closer to a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate reaching the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  10. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    ScienceCinema

    None

    2016-07-12

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.

  11. Solution of a braneworld big crunch/big bang cosmology

    SciTech Connect

    McFadden, Paul L.; Turok, Neil; Steinhardt, Paul J.

    2007-11-15

    We solve for the cosmological perturbations in a five-dimensional background consisting of two separating or colliding boundary branes, as an expansion in the collision speed V divided by the speed of light c. Our solution permits a detailed check of the validity of four-dimensional effective theory in the vicinity of the event corresponding to the big crunch/big bang singularity. We show that the four-dimensional description fails at the first nontrivial order in (V/c)^2. At this order, there is nontrivial mixing of the two relevant four-dimensional perturbation modes (the growing and decaying modes) as the boundary branes move from the narrowly separated limit described by Kaluza-Klein theory to the well-separated limit where gravity is confined to the positive-tension brane. We comment on the cosmological significance of the result and compute other quantities of interest in five-dimensional cosmological scenarios.

  12. Antigravity and the big crunch/big bang transition

    NASA Astrophysics Data System (ADS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  13. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    SciTech Connect

    2009-10-13

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.

  14. Big Data in Medicine is Driving Big Changes

    PubMed Central

    Verspoor, K.

    2014-01-01

    Summary Objectives To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  15. Big data and ophthalmic research.

    PubMed

    Clark, Antony; Ng, Jonathon Q; Morlet, Nigel; Semmens, James B

    2016-01-01

    Large population-based health administrative databases, clinical registries, and data linkage systems are a rapidly expanding resource for health research. Ophthalmic research has benefited from the use of these databases in expanding the breadth of knowledge in areas such as disease surveillance, disease etiology, health services utilization, and health outcomes. Furthermore, the quantity of data available for research has increased exponentially in recent times, particularly as e-health initiatives come online in health systems across the globe. We review some big data concepts, the databases and data linkage systems used in eye research-including their advantages and limitations, the types of studies previously undertaken, and the future direction for big data in eye research.

  16. District Bets Big on Standards

    ERIC Educational Resources Information Center

    Gewertz, Catherine

    2013-01-01

    The big clock in Dowan McNair-Lee's 8th grade classroom in the Stuart-Hobson Middle School is silent, but she can hear the minutes ticking away nonetheless. On this day, like any other, the clock is a constant reminder of how little time she has to prepare her students--for spring tests, and for high school and all that lies beyond it. The…

  17. Local structure and hyperfine interactions of 57Fe and 119Sn atoms in brownmillerite-like ferrite Sr2Fe1.98Sn0.02O5+x

    NASA Astrophysics Data System (ADS)

    Presniakov, Igor; Sobolev, Alexey; Pokholok, Konstantin; Demazeau, Gérard M.; Baranov, Alexey

    2004-10-01

    Mössbauer spectroscopy has been applied to study the local environment of 57Fe and 119Sn probe atoms within tin-doped Sr2Fe1.98Sn0.02O5+x (x ⩽ 0.02) ferrite with the brownmillerite-type structure. 57Fe Mössbauer spectra indicate no appreciable local distortions induced by the tin dopant atoms. The 119Sn spectra recorded below the magnetic ordering temperature (TN) can be described as a superposition of two Zeeman sextets, which indicates that Sn4+ dopant ions are located in two non-equivalent crystallographic and magnetic sites. The observed hyperfine parameters were discussed assuming that Sn4+ cations replace iron cations in the octahedral (SnO) and tetrahedral (SnT) sublattices. It is supposed that Sn4+ cations stabilized in the tetrahedral sublattice complete their nearest anion surrounding up to octahedral oxygen coordination "SnT4+". Annealing of Sr2Fe1.98Sn0.02O5+x in a helium flux at 950°C leads to the formation of divalent Sn2+ cations with a simultaneous decrease of the contribution of the SnT4+ sub-spectrum. The parameters of the combined electric and magnetic hyperfine interactions of the 119Sn2+ sub-spectrum indicate that the impurity atoms are stabilized in an sp3d-hybrid state in an oxygen-distorted tetragonal pyramid. Analysis of the 119Sn spectra indicates chemical reversibility of the process SnT2+ ⇌ SnT4+ within the tetrahedral sublattice of the brownmillerite-type ferrite.

  18. Dynamic Investigation of Release Characteristics of a Streamlined Internal Store from a Simulated Bomb Bay of the Republic F-105 Airplane at Mach Numbers of 0.8, 1.4, and 1.98, Coord. No. AF-222

    NASA Technical Reports Server (NTRS)

    Lee, John B.

    1956-01-01

    An investigation has been conducted in the 27- by 27-inch preflight jet of the Langley Pilotless Aircraft Research Station at Wallops Island, Va., of the release characteristics of a dynamically scaled streamlined-type internally carried store from a simulated bomb bay at Mach numbers M_o of 0.8, 1.4, and 1.98. A 1/17-scale model of the Republic F-105 half-fuselage and bomb-bay configuration was used with a streamlined store shape of a fineness ratio of 6.00. Simulated altitudes were 3,400 feet at M_o = 0.8; 3,400 and 29,000 feet at M_o = 1.4; and 29,000 feet at M_o = 1.98. At supersonic speeds, high pitching moments are induced on the store in the vicinity of the bomb bay at high dynamic pressures. Successful ejections could not be made with the original configuration at supersonic speeds at near sea-level conditions. The pitching moments caused by unsymmetrical pressures on the store in a disturbed flow field were overcome by replacing the high-aspect-ratio fin with a low-aspect-ratio fin that had a 30-percent area increase and was less subject to aeroelastic effects. Release characteristics of the store were improved by orienting the fins so that they were in a more uniform flow field at the point of store release. The store pitching moments were shown to be reduced by increasing the simulated altitude. Favorable ejections were made at subsonic speeds at near sea-level conditions.

  19. EHR Big Data Deep Phenotyping

    PubMed Central

    Lenert, L.; Lopez-Campos, G.

    2014-01-01

    Summary Objectives Given the quickening speed of discovery of variant disease drivers from combined patient genotype and phenotype data, the objective is to provide methodology using big data technology to support the definition of deep phenotypes in medical records. Methods As the vast stores of genomic information increase with next generation sequencing, the importance of deep phenotyping increases. The growth of genomic data and adoption of Electronic Health Records (EHR) in medicine provides a unique opportunity to integrate phenotype and genotype data into medical records. The method by which collections of clinical findings and other health related data are leveraged to form meaningful phenotypes is an active area of research. Longitudinal data stored in EHRs provide a wealth of information that can be used to construct phenotypes of patients. We focus on a practical problem around data integration for deep phenotype identification within EHR data. Big data approaches are described that enable scalable markup of EHR events, which can be used for semantic and temporal similarity analysis to support the identification of phenotype and genotype relationships. Conclusions Stead and colleagues' 2005 concept of using light standards to increase the productivity of software systems by riding on the wave of hardware/processing power is described as a harbinger for designing future healthcare systems. The big data solution, using flexible markup, provides a route to improved utilization of processing power for organizing patient records in genotype and phenotype research. PMID:25123744

  20. Big data: the management revolution.

    PubMed

    McAfee, Andrew; Brynjolfsson, Erik

    2012-10-01

    Big data, the authors write, is far more powerful than the analytics of the past. Executives can measure and therefore manage more precisely than ever before. They can make better predictions and smarter decisions. They can target more-effective interventions in areas that so far have been dominated by gut and intuition rather than by data and rigor. The differences between big data and analytics are a matter of volume, velocity, and variety: More data now cross the internet every second than were stored in the entire internet 20 years ago. Nearly real-time information makes it possible for a company to be much more agile than its competitors. And that information can come from social networks, images, sensors, the web, or other unstructured sources. The managerial challenges, however, are very real. Senior decision makers have to learn to ask the right questions and embrace evidence-based decision making. Organizations must hire scientists who can find patterns in very large data sets and translate them into useful business information. IT departments have to work hard to integrate all the relevant internal and external sources of data. The authors offer two success stories to illustrate how companies are using big data: PASSUR Aerospace enables airlines to match their actual and estimated arrival times. Sears Holdings directly analyzes its incoming store data to make promotions much more precise and faster.

  1. Genesis of the big bang

    NASA Astrophysics Data System (ADS)

    Alpher, Ralph A.; Herman, Robert

    The authors of this volume have been intimately connected with the conception of the big bang model since 1947. Following the late George Gamow's ideas in 1942 and more particularly in 1946 that the early universe was an appropriate site for the synthesis of the elements, they became deeply involved in the question of cosmic nucleosynthesis and particularly the synthesis of the light elements. In the course of this work they developed a general relativistic model of the expanding universe with physics folded in, which led in a progressive, logical sequence to their prediction of the existence of a present cosmic background radiation some seventeen years before the observation of such radiation was reported by Penzias and Wilson. In addition, they carried out with James W. Follin, Jr., a detailed study of the physics of what was then considered to be the very early universe, starting a few seconds after the big bang, which still provides a methodology for studies of light element nucleosynthesis. Because of their involvement, they bring a personal perspective to the subject. They present a picture of what is now believed to be the state of knowledge about the evolution of the expanding universe and delineate the story of the development of the big bang model as they have seen and lived it from their own unique vantage point.

  2. Turning big bang into big bounce. I. Classical dynamics

    SciTech Connect

    Dzierzak, Piotr; Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz

    2009-11-15

    The big bounce (BB) transition within a flat Friedmann-Robertson-Walker model is analyzed in the setting of loop geometry underlying the loop cosmology. We solve the constraint of the theory at the classical level to identify physical phase space and find the Lie algebra of the Dirac observables. We express energy density of matter and geometrical functions in terms of the observables. It is the modification of classical theory by the loop geometry that is responsible for BB. The classical energy scale specific to BB depends on a parameter that should be fixed either by cosmological data or determined theoretically at quantum level, otherwise the energy scale stays unknown.

  3. Big A affair. [Big Ambejackmockamus Falls, Penobscot River, Maine

    SciTech Connect

    Laitin, J.

    1985-02-01

    This article describes the conflict between proponents of a hydroelectric power plant on Maine's Penobscot River and recreation interests. Great Northern Paper Company filed application to build a dam on a wild stretch of the river used by sports fishermen and white-water enthusiasts. Great Northern claimed that it needed to replace 380,000 barrels of oil and to forego the purchase of 36,000 megawatt-hours of electricity annually to improve its competitive edge in the marketplace. The controversy will be a big issue for Maine's Land Use Regulatory Commission. The final decision will hinge on the Commission's perception of the greater public benefit - hydropower or recreation.

  4. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    SciTech Connect

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  5. Traffic information computing platform for big data

    SciTech Connect

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through in-depth analysis of the connotation and technical characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee safe and efficient traffic operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  6. Big data: an introduction for librarians.

    PubMed

    Hoy, Matthew B

    2014-01-01

    Modern life produces data at an astounding rate and shows no signs of slowing. This has led to new advances in data storage and analysis and the concept of "big data," that is, massive data sets that can yield surprising insights when analyzed. This column will briefly describe what big data is and why it is important. It will also briefly explore the possibilities and problems of big data and the implications it has for librarians. A list of big data projects and resources is also included.

  7. Urgent Call for Nursing Big Data.

    PubMed

    Delaney, Connie W

    2016-01-01

    The purpose of this panel is to expand internationally a National Action Plan for sharable and comparable nursing data for quality improvement and big data science. There is an urgent need to assure that nursing has sharable and comparable data for quality improvement and big data science. A national collaborative, Nursing Knowledge and Big Data Science, includes multi-stakeholder groups focused on a National Action Plan toward implementing and using sharable and comparable nursing big data. Panelists will share accomplishments and future plans with an eye toward international collaboration. This presentation is suitable for any audience attending the NI2016 conference. PMID:27332330

  9. The LHC's Next Big Mystery

    NASA Astrophysics Data System (ADS)

    Lincoln, Don

    2015-03-01

    When the sun rose over America on July 4, 2012, the world of science had radically changed. The Higgs boson had been discovered. Mind you, the press releases were more cautious than that, with "a new particle consistent with being the Higgs boson" being the carefully constructed phrase of the day. But, make no mistake, champagne corks were popped and backs were slapped. The data had spoken and a party was in order. Even if the observation turned out to be something other than the Higgs boson, the first big discovery from data taken at the Large Hadron Collider had been made.

  10. Big bang nucleosynthesis: An update

    SciTech Connect

    Olive, Keith A.

    2013-07-23

    An update on the standard model of big bang nucleosynthesis (BBN) is presented. With the value of the baryon-to-photon ratio determined to high precision by WMAP, standard BBN is a parameter-free theory. In this context, the theoretical prediction for the abundances of D, 4He, and 7Li is discussed and compared to their observational determination. While concordance for D and 4He is satisfactory, the prediction for 7Li exceeds the observational determination by a factor of about four. Possible solutions to this problem are discussed.
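    For context on the abstract above, the single free parameter of standard BBN can be written out explicitly; the numerical value below is the commonly quoted WMAP-era figure, not a number taken from this record:

    ```latex
    % Baryon-to-photon ratio, the single free parameter of standard BBN,
    % fixed to high precision by WMAP (value quoted here is approximate):
    \eta \equiv \frac{n_b}{n_\gamma} \simeq 6 \times 10^{-10},
    \qquad \eta_{10} \equiv 10^{10}\,\eta \approx 274\,\Omega_b h^2 .
    ```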

  11. The faces of Big Science.

    PubMed

    Schatz, Gottfried

    2014-06-01

    Fifty years ago, academic science was a calling with few regulations or financial rewards. Today, it is a huge enterprise confronted by a plethora of bureaucratic and political controls. This change was not triggered by specific events or decisions but reflects the explosive 'knee' in the exponential growth that science has sustained during the past three-and-a-half centuries. Coming to terms with the demands and benefits of 'Big Science' is a major challenge for today's scientific generation. Since its foundation 50 years ago, the European Molecular Biology Organization (EMBO) has been of invaluable help in meeting this challenge.

  12. Fitting ERGMs on big networks.

    PubMed

    An, Weihua

    2016-09-01

    The exponential random graph model (ERGM) has become a valuable tool for modeling social networks. In particular, ERGM provides great flexibility to account for both covariate effects on tie formation and endogenous network formation processes. However, there are both conceptual and computational issues in fitting ERGMs on big networks. This paper describes a framework and a series of methods (based on existing algorithms) to address these issues. It also outlines the advantages and disadvantages of the methods and the conditions to which they are most applicable. Selected methods are illustrated through examples. PMID:27480375
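    The computational issue the abstract alludes to can be made concrete with the standard ERGM form (the notation below is generic, not taken from this record): the normalizing constant sums over every possible network on the node set, which becomes intractable on big networks.

    ```latex
    % Exponential random graph model: network statistics g(y) (edge counts,
    % triangles, covariate terms) weighted by parameters \theta.
    P_\theta(Y = y) = \frac{\exp\{\theta^{\top} g(y)\}}{\kappa(\theta)},
    \qquad
    \kappa(\theta) = \sum_{y' \in \mathcal{Y}} \exp\{\theta^{\top} g(y')\} .
    % The sum runs over all networks \mathcal{Y} on n nodes, i.e.
    % 2^{\binom{n}{2}} terms for undirected graphs, which is why approximate
    % (e.g. MCMC-based) fitting methods are needed at scale.
    ```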

  13. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    ERIC Educational Resources Information Center

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  14. Untapped Potential: Fulfilling the Promise of Big Brothers Big Sisters and the Bigs and Littles They Represent

    ERIC Educational Resources Information Center

    Bridgeland, John M.; Moore, Laura A.

    2010-01-01

    American children represent a great untapped potential in our country. For many young people, choices are limited and the goal of a productive adulthood is a remote one. This report paints a picture of who these children are, shares their insights and reflections about the barriers they face, and offers ways forward for Big Brothers Big Sisters as…

  15. Preliminary development and validation of an Australian community participation questionnaire: types of participation and associations with distress in a coastal community.

    PubMed

    Berry, Helen Louise; Rodgers, Bryan; Dear, Keith B G

    2007-04-01

    Participating in the social and civic life of communities is protectively associated with the onset and course of physical and mental disorders, and is considered important in achieving health promotion goals. Despite its importance in health research, there is no systematically developed measure of community participation. Our aim was to undertake the preliminary development of a community participation questionnaire, including validating it against an external reference, general psychological distress. Participants were 963 randomly selected community members, aged 19-97, from coastal New South Wales, Australia, who completed an anonymous postal survey. There were 14 types of community participation, most of which were characterised by personal involvement, initiative and effort. Frequency of participation varied across types and between women and men. Based on multiple linear regression analyses, controlling for socio-demographic factors, nine types of participation were independently and significantly associated with general psychological distress. Unexpectedly, for two of these, "expressing opinions publicly" and "political protest", higher levels of participation were associated with higher levels of distress. The other seven were: contact with immediate household, extended family, friends, and neighbours; participating in organised community activities; taking an active interest in current affairs; and religious observance. We called these the "Big 7". Higher levels of participation in the Big 7 were associated with lower levels of distress. Participating in an increasing number of the Big 7 types of participation was strongly associated in linear fashion with decreasing distress. PMID:17241727
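    The multiple-linear-regression design described above can be sketched on synthetic data. Everything here (variable names, coefficients, noise level) is illustrative only; it assumes nothing beyond the abstract's description of an ordinary-least-squares model of distress on Big 7 participation with socio-demographic controls.

    ```python
    import numpy as np

    # Hypothetical illustration (not the study's data): OLS regression of a
    # distress score on a "Big 7 participation count" (0-7), controlling for
    # age and sex, mirroring the design described in the abstract.
    rng = np.random.default_rng(0)
    n = 963                                  # sample size reported in the abstract
    age = rng.uniform(19, 97, n)             # age range reported in the abstract
    sex = rng.integers(0, 2, n)              # 0 = male, 1 = female
    big7 = rng.integers(0, 8, n)             # number of Big 7 types engaged in

    # Simulated outcome: distress decreases linearly with Big 7 participation.
    distress = 20.0 - 1.5 * big7 + 0.02 * age + 0.5 * sex + rng.normal(0, 2, n)

    X = np.column_stack([np.ones(n), big7, age, sex])    # design matrix
    beta, *_ = np.linalg.lstsq(X, distress, rcond=None)  # OLS coefficients
    print(beta[1])  # estimated Big 7 slope; negative, near the simulated -1.5
    ```

    With a negative fitted slope on the participation count, each additional Big 7 type of participation is associated with lower distress, the linear association the abstract reports.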

  16. Baryon symmetric big bang cosmology

    NASA Technical Reports Server (NTRS)

    Stecker, F. W.

    1978-01-01

    Both the quantum theory and Einstein's theory of special relativity lead to the supposition that matter and antimatter were produced in equal quantities during the big bang. It is noted that local matter/antimatter asymmetries may be reconciled with universal symmetry by assuming (1) a slight imbalance of matter over antimatter in the early universe, annihilation, and a subsequent remainder of matter; (2) localized regions of excess for one or the other type of matter as an initial condition; and (3) an extremely dense, high-temperature state with zero net baryon number; i.e., matter/antimatter symmetry. Attention is given to the third assumption, which is the simplest and the most in keeping with current knowledge of the cosmos, especially as it pertains to the universality of the 3 K background radiation. Mechanisms of galaxy formation are discussed, whereby matter and antimatter might have collided and annihilated each other, or have coexisted (and continue to coexist) at vast distances. It is pointed out that baryon symmetric big bang cosmology could probably be proved if an antinucleus could be detected in cosmic radiation.

  17. The BigBOSS spectrograph

    NASA Astrophysics Data System (ADS)

    Jelinsky, Patrick; Bebek, Chris; Besuner, Robert; Carton, Pierre-Henri; Edelstein, Jerry; Lampton, Michael; Levi, Michael E.; Poppett, Claire; Prieto, Eric; Schlegel, David; Sholl, Michael

    2012-09-01

    BigBOSS is a proposed ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a 14,000 square degree galaxy and quasi-stellar object redshift survey. It consists of a 5,000-fiber-positioner focal plane feeding the spectrographs. The optical fibers are separated into ten 500-fiber slit heads at the entrance of ten identical spectrographs in a thermally insulated room. Each of the ten spectrographs has a spectral resolution (λ/Δλ) between 1500 and 4000 over a wavelength range from 360 to 980 nm. Each spectrograph uses two dichroic beam splitters to separate the spectrograph into three arms. It uses volume phase holographic (VPH) gratings for high efficiency and compactness. Each arm uses a 4096x4096 15 μm pixel charge coupled device (CCD) for the detector. We describe the requirements and current design of the BigBOSS spectrograph. Design trades (e.g. refractive versus reflective) and manufacturability are also discussed.

  18. Evidence of big bang turbulence

    NASA Astrophysics Data System (ADS)

    Gibson, Carl H.

    2002-11-01

    Chaotic, eddy-like motions dominated by inertial-vortex forces begin at Planck scales in a hot big-bang-turbulence (BBT) cosmological model where this version of the quantum-gravitational-dynamics epoch produces not only the first space-time-energy of the universe but the first high Reynolds number turbulence and turbulent mixing with Kolmogorov and Batchelor-Obukhov-Corrsin velocity and temperature gradient spectra. Strong-force-freeze-out and inflation produced the first fossil-temperature-turbulence by stretching the fluctuations beyond the horizon scale ct of causal connection for light speed c and time t. Recent Cosmic Background Imager spectra of the cosmic microwave background (CMB) temperature anisotropies at high wavenumbers support the prediction that fossil BBT fluctuation patterns imprinted by nucleosynthesis on light element densities and the associated Sachs-Wolfe temperature fluctuations should not decay by thermal diffusion as expected if the CMB anisotropies were acoustic as commonly assumed. Extended Self Similarity coefficients of the CMB anisotropies exactly match those of high Reynolds number turbulence (Bershadskii and Sreenivasan 2002), supporting the conclusion that fossil big-bang-turbulence seeded nucleosynthesis of light elements and the first hydro-gravitational structure formation.

  19. Astronomical surveys and big data

    NASA Astrophysics Data System (ADS)

    Mickaelian, Areg M.

    Recent all-sky and large-area astronomical surveys and their catalogued data over the whole range of the electromagnetic spectrum, from γ-rays to radio waves, are reviewed, including Fermi-GLAST and INTEGRAL in the γ-ray range, ROSAT, XMM and Chandra in X-ray, GALEX in UV, SDSS and several POSS I and POSS II-based catalogues (APM, MAPS, USNO, GSC) in the optical range, 2MASS in NIR, WISE and AKARI IRC in MIR, IRAS and AKARI FIS in FIR, NVSS and FIRST in the radio range, and many others, as well as the most important surveys giving optical images (DSS I and II, SDSS, etc.), proper motions (Tycho, USNO, Gaia), variability (GCVS, NSVS, ASAS, Catalina, Pan-STARRS), and spectroscopic data (FBS, SBS, Case, HQS, HES, SDSS, CALIFA, GAMA). An overall understanding of the coverage along the whole wavelength range and comparisons between various surveys are given: galaxy redshift surveys, QSO/AGN, radio, Galactic structure, and Dark Energy surveys. Astronomy has entered the Big Data era, with Astrophysical Virtual Observatories and Computational Astrophysics playing an important role in using and analyzing big data for new discoveries.

  20. An Investigation of Big Five Personality Traits and Career Decidedness among Early and Middle Adolescents

    ERIC Educational Resources Information Center

    Lounsbury, John W.; Hutchens, Teresa; Loveland, James M.

    2005-01-01

    Big Five personality traits were analyzed in relation to career decidedness among adolescents in middle and high school. Participants were 248 7th-grade, 321 10th-grade, and 282 12th-grade students. As hypothesized, Conscientiousness was positively and significantly correlated with career decidedness in all three grades. Openness and Agreeableness…

  1. Effects of a Preschool and Kindergarten Mathematics Curriculum: Big Math for Little Kids

    ERIC Educational Resources Information Center

    Presser, Ashley Lewis; Clements, Margaret; Ginsburg, Herbert; Ertle, Barbrina

    2012-01-01

    "Research Findings: Big Math for Little Kids" ("BMLK") is a mathematics curriculum designed for 4- and 5-year-old children. In this study, the curriculum was evaluated for effectiveness over two years, using a cluster-randomized controlled study. Over 750 children participated in the study and experienced either the "BMLK" curriculum or…

  2. Learning English through Social Interaction: The Case of "Big Brother 2006," Finland

    ERIC Educational Resources Information Center

    Kaanta, Leila; Jauni, Heidi; Leppanen, Sirpa; Peuronen, Saija; Paakkinen, Terhi

    2013-01-01

    In line with recent Conversation Analytic work on language learning as situated practice, this article investigates how interactants can create language learning opportunities for themselves and others in and through social interaction. The study shows how the participants of "Big Brother Finland," a reality TV show, whose main…

  3. Hardiness and the Big Five Personality Traits among Chinese University Students

    ERIC Educational Resources Information Center

    Zhang, Li-fang

    2011-01-01

    This study examines the construct of hardiness with the Big Five personality traits among 362 Chinese university students. Participants in the study responded to the Dispositional Hardiness Scale (Bartone, Ursano, Wright, & Ingraham, 1989) and the Revised NEO Personality Inventory (Costa & McCrae, 1992). Results indicate that personality traits…

  4. 3. EASTERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. EASTERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON DAM SHOWING THE SOUTHEAST END OF THE DAM, AND THE HOLLOW BAYS. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  5. 6. EASTERLY VIEW OF BIG DALTON DAM SHOWING THE SHELTER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. EASTERLY VIEW OF BIG DALTON DAM SHOWING THE SHELTER HOUSE IN THE BACKGROUND. PHOTO TAKEN FROM THE ACCESS ROAD LEADING TO THE CONTROL HOUSE. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  6. 2. NORTHERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. NORTHERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON DAM SHOWING THE NORTHWEST END OF THE DAM, THE CONTROL HOUSE, AND SPILLWAY CHUTE. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  7. Big sagebrush transplanting success in crested wheatgrass stands

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The conversion of formerly big sagebrush (Artemisia tridentata ssp. wyomingensis)/bunchgrass communities to annual grass dominance, primarily cheatgrass (Bromus tectorum), in Wyoming big sagebrush ecosystems has sparked the increasing demand to establish big sagebrush on disturbed rangelands. The e...

  8. 2. Big Creek Road, worm fence and road at trailhead. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. Big Creek Road, worm fence and road at trailhead. - Great Smoky Mountains National Park Roads & Bridges, Big Creek Road, Between State Route 284 & Big Creek Campground, Gatlinburg, Sevier County, TN

  9. Big system: Interactive graphics for the engineer

    NASA Technical Reports Server (NTRS)

    Quenneville, C. E.

    1975-01-01

    The BCS Interactive Graphics System (BIG System) approach to graphics was presented, along with several significant engineering applications. The BIG System precompiler, the graphics support library, and the function requirements of graphics applications are discussed. It was concluded that graphics standardization and a device independent code can be developed to assure maximum graphic terminal transferability.

  10. Efficiency, Corporate Power, and the Bigness Complex.

    ERIC Educational Resources Information Center

    Adams, Walter; Brock, James W.

    1990-01-01

    Concludes that (1) the current infatuation with corporate bigness is void of credible empirical support; (2) disproportionate corporate size and industry concentration are incompatible with and destructive to good economic performance; and (3) structurally oriented antitrust policy must be revitalized to combat the burdens of corporate bigness.…

  11. An embedding for the big bang

    NASA Technical Reports Server (NTRS)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  12. In Search of the Big Bubble

    ERIC Educational Resources Information Center

    Simoson, Andrew; Wentzky, Bethany

    2011-01-01

    Freely rising air bubbles in water sometimes assume the shape of a spherical cap, a shape also known as the "big bubble". Is it possible to find some objective function involving a combination of a bubble's attributes for which the big bubble is the optimal shape? Following the basic idea of the definite integral, we define a bubble's surface as…

  13. A New Look at Big History

    ERIC Educational Resources Information Center

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spatial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  14. Epidemiology in the Era of Big Data

    PubMed Central

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  15. The Big bang and the Quantum

    NASA Astrophysics Data System (ADS)

    Ashtekar, Abhay

    2010-06-01

    General relativity predicts that space-time comes to an end and physics comes to a halt at the big bang. Recent developments in loop quantum cosmology have shown that these predictions cannot be trusted. Quantum geometry effects can resolve singularities, thereby opening new vistas. Examples are: the big bang is replaced by a quantum bounce; the `horizon problem' disappears; immediately after the big bounce, there is a super-inflationary phase with its own phenomenological ramifications; and, in the presence of a standard inflation potential, initial conditions are naturally set for a long, slow-roll inflation independently of what happens in the pre-big bang branch. As in my talk at the conference, I will first discuss the foundational issues and then the implications of the new Planck scale physics near the Big Bang.

  16. Development and validation of Big Four personality scales for the Schedule for Nonadaptive and Adaptive Personality--Second Edition (SNAP-2).

    PubMed

    Calabrese, William R; Rudick, Monica M; Simms, Leonard J; Clark, Lee Anna

    2012-09-01

    Recently, integrative, hierarchical models of personality and personality disorder (PD)--such as the Big Three, Big Four, and Big Five trait models--have gained support as a unifying dimensional framework for describing PD. However, no measures to date can simultaneously represent each of these potentially interesting levels of the personality hierarchy. To unify these measurement models psychometrically, we sought to develop Big Five trait scales within the Schedule for Nonadaptive and Adaptive Personality--Second Edition (SNAP-2). Through structural and content analyses, we examined relations between the SNAP-2, the Big Five Inventory (BFI), and the NEO Five-Factor Inventory (NEO-FFI) ratings in a large data set (N = 8,690), including clinical, military, college, and community participants. Results yielded scales consistent with the Big Four model of personality (i.e., Neuroticism, Conscientiousness, Introversion, and Antagonism) and not the Big Five, as there were insufficient items related to Openness. Resulting scale scores demonstrated strong internal consistency and temporal stability. Structural validity and external validity were supported by strong convergent and discriminant validity patterns between Big Four scale scores and other personality trait scores and expectable patterns of self-peer agreement. Descriptive statistics and community-based norms are provided. The SNAP-2 Big Four Scales enable researchers and clinicians to assess personality at multiple levels of the trait hierarchy and facilitate comparisons among competing big-trait models.

  17. Development and Validation of Big Four Personality Scales for the Schedule for Nonadaptive and Adaptive Personality-2nd Edition (SNAP-2)

    PubMed Central

    Calabrese, William R.; Rudick, Monica M.; Simms, Leonard J.; Clark, Lee Anna

    2012-01-01

    Recently, integrative, hierarchical models of personality and personality disorder (PD)—such as the Big Three, Big Four and Big Five trait models—have gained support as a unifying dimensional framework for describing PD. However, no measures to date can simultaneously represent each of these potentially interesting levels of the personality hierarchy. To unify these measurement models psychometrically, we sought to develop Big Five trait scales within the Schedule for Nonadaptive and Adaptive Personality–2nd Edition (SNAP-2). Through structural and content analyses, we examined relations between the SNAP-2, Big Five Inventory (BFI), and NEO-Five Factor Inventory (NEO-FFI) ratings in a large data set (N = 8,690), including clinical, military, college, and community participants. Results yielded scales consistent with the Big Four model of personality (i.e., Neuroticism, Conscientiousness, Introversion, and Antagonism) and not the Big Five, as there were insufficient items related to Openness. Resulting scale scores demonstrated strong internal consistency and temporal stability. Structural and external validity were supported by strong convergent and discriminant validity patterns between Big Four scale scores and other personality trait scores and expectable patterns of self-peer agreement. Descriptive statistics and community-based norms are provided. The SNAP-2 Big Four Scales enable researchers and clinicians to assess personality at multiple levels of the trait hierarchy and facilitate comparisons among competing “Big Trait” models. PMID:22250598

  18. The big war over brackets.

    PubMed

    Alvarez, R O

    1994-01-01

    The Third Preparatory Committee Meeting for the International Conference on Population and Development (ICPD), PrepCom III, was held at UN headquarters in New York on April 4-22, 1994. It was the last big preparatory meeting leading to the ICPD to be held in Cairo, Egypt, in September 1994. The author attended the second week of meetings as the official delegate of the Institute for Social Studies and Action. Debates mostly focused upon reproductive health and rights, sexual health and rights, family planning, contraception, condom use, fertility regulation, pregnancy termination, and safe motherhood. The Vatican and its allies' preoccupation with discussing language which may imply abortion caused sustainable development, population, consumption patterns, internal and international migration, economic strategies, and budgetary allocations to be discussed less extensively than they should have been. The author describes points of controversy, the power of women at the meetings, and afterthoughts on the meetings.

  19. Exploring Relationships in Big Data

    NASA Astrophysics Data System (ADS)

    Mahabal, A.; Djorgovski, S. G.; Crichton, D. J.; Cinquini, L.; Kelly, S.; Colbert, M. A.; Kincaid, H.

    2015-12-01

    Big Data are characterized by several different 'V's: Volume, Veracity, Volatility, Value, and so on. For many datasets, Volume inflated by redundant features makes the data noisier and Value harder to extract. This is especially true when comparing or combining different datasets whose metadata are diverse. We have been exploring ways to exploit such datasets through a variety of statistical machinery and visualization. We show how we have applied it to time-series from large astronomical sky surveys. This was done in the Virtual Observatory framework. More recently we have been doing similar work for a completely different domain, viz. biology/cancer. The methodology reuse involves application to diverse datasets gathered through the various centers associated with the Early Detection Research Network (EDRN) for cancer, an initiative of the National Cancer Institute (NCI). Application to Geo datasets is a natural extension.
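
The point about redundant features inflating Volume can be illustrated with a simple correlation-based pruning pass. This is a generic sketch on simulated data, not the statistical machinery the authors actually used; the function name and threshold are invented for illustration.

```python
import numpy as np

def prune_redundant(X, threshold=0.95):
    """Greedily keep columns of X whose absolute Pearson correlation
    with every already-kept column stays below the threshold."""
    corr = np.abs(np.corrcoef(X, rowvar=False))
    keep = []
    for j in range(X.shape[1]):
        if all(corr[j, k] < threshold for k in keep):
            keep.append(j)
    return keep

# Simulated dataset: column b nearly duplicates column a.
rng = np.random.default_rng(0)
a = rng.normal(size=100)
b = a + 0.01 * rng.normal(size=100)   # redundant copy of a, plus tiny noise
c = rng.normal(size=100)              # independent feature
X = np.column_stack([a, b, c])

kept = prune_redundant(X)             # the redundant column b is dropped
```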

  20. Was the Big Bang hot?

    NASA Technical Reports Server (NTRS)

    Wright, E. L.

    1983-01-01

    Techniques for verifying the spectrum defined by Woody and Richards (WR, 1981), which serves as a base for dust-distorted models of the 3 K background, are discussed. WR detected a sharp deviation from the Planck curve in the 3 K background. The absolute intensity of the background may be determined by the frequency dependence of the dipole anisotropy of the background or the frequency dependence effect in galactic clusters. Both methods involve the Doppler shift; analytical formulae are defined for characterization of the dipole anisotropy. The measurement of the 30-300 GHz spectra of cold galactic dust may reveal the presence of significant amounts of needle-shaped grains, which would in turn support a theory of a cold Big Bang.

  1. Island Universe or Big Galaxy?

    NASA Astrophysics Data System (ADS)

    Wolfschmidt, Gudrun

    In 1920, the "great debate" took place: Harlow Shapley defended his model of the "Big Galaxy", i.e. we live in a large galaxy and all nebulous objects belong to it. He derived this result from the distribution of the globular clusters. Heber D. Curtis, on the other hand, analyzed novae and was convinced that the nebulae are far-distant objects, stellar systems like our own galaxy. The debate was resolved by Edwin P. Hubble, who confirmed the interpretation of the nebulae as extragalactic objects, i.e. galaxies, and established the redshift-distance relation for galaxies. The resulting expansion of the universe led to a new cosmological world view.

  2. Evidence of the big fix

    NASA Astrophysics Data System (ADS)

    Hamada, Yuta; Kawai, Hikaru; Kawana, Kiyoharu

    2014-06-01

    We give evidence for the Big Fix. The theory of wormholes and the multiverse suggests that the parameters of the Standard Model are fixed in such a way that the total entropy at the late stage of the universe is maximized, which we call the maximum entropy principle. In this paper, we discuss how this can be confirmed by experimental data, and we show that it is indeed true for the Higgs vacuum expectation value vh. We assume that the baryon number is produced by the sphaleron process, and that the current quark masses, the gauge couplings, and the Higgs self-coupling are fixed when we vary vh. It turns out that the existence of atomic nuclei plays a crucial role in maximizing the entropy. This is reminiscent of the anthropic principle; in our case, however, it is required by the fundamental law.

  3. Spectral observations of big objects

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.; Sargsyan, L. A.

    2010-12-01

    This is a summary and general analysis of optical spectroscopic data on 172 BIG (Byurakan-IRAS Galaxies) objects obtained with the BAO 2.6-m, SAO 6-m, and OHP 1.93-m telescopes. 102 galaxies with star formation regions, 29 galaxies with active nuclei, and 19 galaxies with a composite spectrum were identified. The spectra of 12 of the galaxies show signs of emission, but without the possibility of a more precise determination of their activity class, 9 galaxies appear to have star formation rates that do not exceed normal, and 1 is an absorption galaxy. In order to establish the nature of these galaxies and the place they occupy in the general picture of the evolution of the universe, we compare them with 128 infrared galaxies.

  4. Big Mysteries: The Higgs Mass

    ScienceCinema

    Lincoln, Don

    2016-07-12

    With the discovery of what looks to be the Higgs boson, LHC researchers are turning their attention to the next big question: the predicted mass of the newly discovered particle. When the effects of quantum mechanics are taken into account, the mass of the Higgs boson should be incredibly high, perhaps upwards of a quadrillion times higher than what was observed. In this video, Fermilab's Dr. Don Lincoln explains why the theory predicts such a large mass and gives at least one possible theoretical idea that might solve the problem. Whether the proposed idea is the answer or not, this question must be answered by experiments at the LHC or today's entire theoretical paradigm could be in jeopardy.

  5. Big Mysteries: The Higgs Mass

    SciTech Connect

    Lincoln, Don

    2014-04-28

    With the discovery of what looks to be the Higgs boson, LHC researchers are turning their attention to the next big question: the predicted mass of the newly discovered particle. When the effects of quantum mechanics are taken into account, the mass of the Higgs boson should be incredibly high, perhaps upwards of a quadrillion times higher than what was observed. In this video, Fermilab's Dr. Don Lincoln explains why the theory predicts such a large mass and gives at least one possible theoretical idea that might solve the problem. Whether the proposed idea is the answer or not, this question must be answered by experiments at the LHC or today's entire theoretical paradigm could be in jeopardy.

  6. Big Bang nucleosynthesis in crisis?

    NASA Astrophysics Data System (ADS)

    Hata, N.; Scherrer, R. J.; Steigman, G.; Thomas, D.; Walker, T. P.; Bludman, S.; Langacker, P.

    1995-11-01

    A new evaluation of the constraint on the number of light neutrino species (Nν) from big bang nucleosynthesis suggests a discrepancy between the predicted light element abundances and those inferred from observations, unless the inferred primordial 4He abundance has been underestimated by 0.014+/-0.004 (1σ) or less than 10% (95% C.L.) of 3He survives stellar processing. With the quoted systematic errors in the observed abundances and a conservative chemical evolution parametrization, the best fit to the combined data is Nν=2.1+/-0.3 (1σ) and the upper limit is Nν<2.6 (95% C.L.). The data are inconsistent with the standard model (Nν=3) at the 98.6% C.L.

  7. Microsystems - The next big thing

    SciTech Connect

    Stinnett, Regan W.

    2000-05-11

    Micro-Electro-Mechanical Systems (MEMS) is a big name for tiny devices that will soon make big changes in everyday life and the workplace. These and other types of Microsystems range in size from a few millimeters to a few microns, much smaller than a human hair. These Microsystems have the capability to enable new ways to solve problems in commercial applications ranging from automotive, aerospace, telecommunications, manufacturing equipment, medical diagnostics to robotics, and in national security applications such as nuclear weapons safety and security, battlefield intelligence, and protection against chemical and biological weapons. This broad range of applications of Microsystems reflects the broad capabilities of future Microsystems to provide the ability to sense, think, act, and communicate, all in a single integrated package. Microsystems have been called the next silicon revolution, but like many revolutions, they incorporate more elements than their predecessors. Microsystems do include MEMS components fabricated from polycrystalline silicon processed using techniques similar to those used in the manufacture of integrated electrical circuits. They also include optoelectronic components made from gallium arsenide and other semiconducting compounds from the III-V groups of the periodic table. Microsystems components are also being made from pure metals and metal alloys using the LIGA process, which utilizes lithography, etching, and casting at the micron scale. Generically, Microsystems are micron scale, integrated systems that have the potential to combine the ability to sense light, heat, pressure, acceleration, vibration, and chemicals with the ability to process the collected data using CMOS circuitry, execute an electrical, mechanical, or photonic response, and communicate either optically or with microwaves.

  8. [Big data in medicine and healthcare].

    PubMed

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today are often too large and heterogeneous, and change too quickly, to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce ever more data themselves (e.g., in social networks), and digitalization keeps advancing. Currently, several new trends towards new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data, and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and to organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions are coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  9. [Big five personality factors related to face recognition].

    PubMed

    Saito, Takako; Nakamura, Tomoyasu; Endo, Toshihiko

    2005-02-01

    The present study examined whether scores on the big five personality factors correlated with face-recognition response time in a visual search paradigm. Sixty adjectives were used to measure personality scores of 60 participants along the five factors of Extroversion, Neuroticism, Openness to Experience, Agreeableness, and Conscientiousness. Pictures of human faces or geometrical figures in a 4 x 4 array were used as stimuli. The sixteen faces or figures were either all identical (absent condition) or included one randomly placed target among 15 identical distracters (present condition). Participants were asked to respond 'present' or 'absent' as fast and accurately as possible. Results showed that response time differed significantly between high and low groups on each personality factor except Agreeableness. For Extroversion, Neuroticism, and Conscientiousness, the response-time difference was observed only for human face recognition. The results suggested that personality differences and face recognition are related. PMID:15782589

  10. Big questions, big science: meeting the challenges of global ecology.

    PubMed

    Schimel, David; Keller, Michael

    2015-04-01

    Ecologists are increasingly tackling questions that require significant infrastructure, large experiments, networks of observations, and complex data and computation. Key hypotheses in ecology increasingly require more investment, and larger data sets, than can be collected by a single investigator's or a group of investigators' labs, sustained for longer than a typical grant. Large-scale projects are expensive, so their scientific return on the investment has to justify the opportunity cost: the science forgone because resources were expended on a large project rather than on a number of individual projects. In addition, their management must be accountable and efficient in the use of significant resources, requiring formal systems engineering and project management to mitigate the risk of failure. Mapping the scientific method into formal project management requires both scientists able to work in that context and a project implementation team sensitive to the unique requirements of ecology. Sponsoring agencies, under external and internal forces, experience many pressures that push them towards counterproductive project management, but a scientific community aware of and experienced in large-project science can mitigate these tendencies. For big ecology to result in great science, ecologists must become informed, aware, and engaged in the advocacy and governance of large ecological projects.

  11. Processing Solutions for Big Data in Astronomy

    NASA Astrophysics Data System (ADS)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions for massive amounts of data, starting with a general presentation of the Big Data paradigm. The Hadoop framework, widely considered the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. The paper also presents the main tools for managing both the storage (NoSQL solutions) and computing capacity (the MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions such as Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving computing speed.
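    The MapReduce schema mentioned in this abstract can be illustrated with a toy word count in plain Python. This is a sketch of the map/shuffle/reduce pattern only, not the Hadoop API; the function names `map_phase`, `shuffle`, and `reduce_phase` are hypothetical:

```python
from collections import defaultdict

def map_phase(records):
    # Mapper: emit a (word, 1) pair for every word in each input record
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group intermediate values by key
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: aggregate the values collected for each key
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(map_phase(["big data", "big bang"])))
print(counts)  # {'big': 2, 'data': 1, 'bang': 1}
```

    In a real Hadoop or Spark deployment the map and reduce phases run in parallel on many machines, with the framework handling the shuffle, scheduling, and fault tolerance that this single-process sketch elides.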

  12. Big data and the electronic health record.

    PubMed

    Peters, Steve G; Buntrock, James D

    2014-01-01

    The electronic medical record has evolved from a digital representation of individual patient results and documents to information of large scale and complexity. Big Data refers to new technologies providing management and processing capabilities, targeting massive and disparate data sets. For an individual patient, techniques such as Natural Language Processing allow the integration and analysis of textual reports with structured results. For groups of patients, Big Data offers the promise of large-scale analysis of outcomes, patterns, temporal trends, and correlations. The evolution of Big Data analytics moves us from description and reporting to forecasting, predictive modeling, and decision optimization.

  13. NOAA Big Data Partnership RFI

    NASA Astrophysics Data System (ADS)

    de la Beaujardiere, J.

    2014-12-01

    In February 2014, the US National Oceanic and Atmospheric Administration (NOAA) issued a Big Data Request for Information (RFI) from industry and other organizations (e.g., non-profits, research laboratories, and universities) to assess capability and interest in establishing partnerships to position a copy of NOAA's vast data holdings in the Cloud, co-located with easy and affordable access to analytical capabilities. This RFI was motivated by a number of concerns. First, NOAA's data facilities do not necessarily have sufficient network infrastructure to transmit all available observations and numerical model outputs to all potential users, or sufficient infrastructure to support simultaneous computation by many users. Second, the available data are distributed across multiple services and data facilities, making it difficult to find and integrate data for cross-domain analysis and decision-making. Third, large datasets require users to have substantial network, storage, and computing capabilities of their own in order to fully interact with and exploit the latent value of the data. Finally, there may be commercial opportunities for value-added products and services derived from our data. Putting a working copy of data in the Cloud outside of NOAA's internal networks and infrastructures should reduce demands and risks on our systems, and should enable users to interact with multiple datasets and create new lines of business (much like the industries built on government-furnished weather or GPS data). The NOAA Big Data RFI therefore solicited information on technical and business approaches regarding possible partnership(s) that -- at no net cost to the government and minimum impact on existing data facilities -- would unleash the commercial potential of its environmental observations and model outputs. NOAA would retain the master archival copy of its data. Commercial partners would not be permitted to charge fees for access to the NOAA data they receive, but

  14. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2005-01-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; designing an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiating a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research. Efforts are underway to showcase the architecture of the GIS framework and initial results for sources and sinks. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern.
Research is

  15. Promoting People's Participation.

    ERIC Educational Resources Information Center

    Fraser, Colin

    1981-01-01

    Discusses problems associated with communication in rural areas to promote participation in development programs. Suggests that success of such programs depends on continued government policy in favor of citizen participation in agricultural and rural development. (SK)

  16. Federal participation in LEED

    SciTech Connect

    Payne, Christopher; Dyer, Beverly

    2004-11-10

    The federal government has been an active participant in the development and use of USGBC's Leadership in Energy & Environmental Design Green Building Rating System (LEED). This paper presents a review of this participation and some expectations for ongoing partnership.

  17. Participative Training Skills.

    ERIC Educational Resources Information Center

    Rodwell, John

    Based on extensive field experience, this two-part book is intended to be a practical guide for maximizing participative training methods. The first part of the book looks at the principles and the core skills involved in participative training. It shows how trainee participation corresponds to the processes of adult learning and describes each…

  18. School Lunch Program Participation.

    ERIC Educational Resources Information Center

    Zucchino, Lori; Ranney, Christine K.

    1990-01-01

    Reductions in participation in National School Lunch Program in 1981-82 are of concern to hunger groups and legislators. Extent to which Omnibus Budget Reconciliation Acts (OBRA) of 1980-81 contributes to participation decline was measured by simulation model in New York State. Results suggest that OBRA increased participation; declining…

  19. Cosmic relics from the big bang

    SciTech Connect

    Hall, L.J.

    1988-12-01

    A brief introduction to the big bang picture of the early universe is given. Dark matter is discussed; particularly its implications for elementary particle physics. A classification scheme for dark matter relics is given. 21 refs., 11 figs., 1 tab.

  20. Quantum nature of the big bang.

    PubMed

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime. PMID:16712061

  1. Big Data and Analytics in Healthcare.

    PubMed

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate, which has generated immense interest in leveraging healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and of big data especially, presents unique processing and analysis challenges. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, it presents approaches ranging from efficient methods of processing large clinical data sets to predictive models that could generate better predictions from healthcare data.

  2. Quantum nature of the big bang.

    PubMed

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.

  3. The NOAA Big Data Project

    NASA Astrophysics Data System (ADS)

    de la Beaujardiere, J.

    2015-12-01

    The US National Oceanic and Atmospheric Administration (NOAA) is a Big Data producer, generating tens of terabytes per day from hundreds of sensors on satellites, radars, aircraft, ships, and buoys, and from numerical models. These data are of critical importance and value for NOAA's mission to understand and predict changes in climate, weather, oceans, and coasts. In order to facilitate extracting additional value from this information, NOAA has established Cooperative Research and Development Agreements (CRADAs) with five Infrastructure-as-a-Service (IaaS) providers — Amazon, Google, IBM, Microsoft, Open Cloud Consortium — to determine whether hosting NOAA data in publicly-accessible Clouds alongside on-demand computational capability stimulates the creation of new value-added products and services and lines of business based on the data, and if the revenue generated by these new applications can support the costs of data transmission and hosting. Each IaaS provider is the anchor of a "Data Alliance" which organizations or entrepreneurs can join to develop and test new business or research avenues. This presentation will report on progress and lessons learned during the first 6 months of the 3-year CRADAs.

  4. Big Sky Carbon Sequestration Partnership

    SciTech Connect

    Susan Capalbo

    2005-12-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I are organized into four areas: (1) Evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; (2) Development of a GIS-based reporting framework that links with national networks; (3) Design of an integrated suite of monitoring, measuring, and verification technologies, market-based opportunities for carbon management, and an economic/risk assessment framework (referred to below as the Advanced Concepts component of the Phase I efforts); and (4) Initiation of a comprehensive education and outreach program. As a result of the Phase I activities, the groundwork is in place to provide an assessment of storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which complements the ongoing DOE research agenda in Carbon Sequestration. The geology of the Big Sky Carbon Sequestration Partnership Region is favorable for the potential sequestration of enormous volumes of CO{sub 2}. The United States Geological Survey (USGS 1995) identified 10 geologic provinces and 111 plays in the region. These provinces and plays include both sedimentary rock types characteristic of oil, gas, and coal production as well as large areas of mafic volcanic rocks. Of the 10 provinces and 111 plays, 1 province and 4 plays are located within Idaho. The remaining 9 provinces and 107 plays are dominated by sedimentary rocks and located in the states of Montana and Wyoming. The potential sequestration capacity of the 9 sedimentary provinces within the region ranges from 25,000 to almost 900,000 million metric tons of CO{sub 2}. 
Overall every sedimentary formation investigated

  5. Big-bang nucleosynthesis revisited

    NASA Technical Reports Server (NTRS)

    Olive, Keith A.; Schramm, David N.; Steigman, Gary; Walker, Terry P.

    1989-01-01

    The homogeneous big-bang nucleosynthesis yields of D, He-3, He-4, and Li-7 are computed taking into account recent measurements of the neutron mean-life as well as updates of several nuclear reaction rates which primarily affect the production of Li-7. The extraction of primordial abundances from observation and the likelihood that the primordial mass fraction of He-4, Y(sub p), is less than or equal to 0.24 are discussed. Using the primordial abundances of D + He-3 and Li-7, we limit the baryon-to-photon ratio to 2.6 less than or equal to eta(sub 10) less than or equal to 4.3, where eta(sub 10) is eta in units of 10 exp -10; from this we argue that baryons contribute between 0.02 and 0.11 to the critical energy density of the universe. An upper limit to Y(sub p) of 0.24 constrains the number of light neutrinos to N(sub nu) less than or equal to 3.4, in excellent agreement with the LEP and SLC collider results. We turn this argument around to show that the collider limit of 3 neutrino species can be used to bound the primordial abundance of He-4: 0.235 less than or equal to Y(sub p) less than or equal to 0.245.
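    In conventional notation, the plain-text bounds quoted in this abstract read:

```latex
2.6 \le \eta_{10} \le 4.3 \quad (\eta_{10} \equiv \eta \times 10^{10}), \qquad
0.02 \lesssim \Omega_b \lesssim 0.11, \qquad
N_\nu \le 3.4, \qquad
0.235 \le Y_p \le 0.245
```

    Here eta is the baryon-to-photon ratio, Omega_b the baryon fraction of the critical density, N_nu the number of light neutrino species, and Y_p the primordial He-4 mass fraction, as defined in the abstract.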

  6. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-06-01

    The Big Sky Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts during the second performance period fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts begun in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for

  7. "Big data" and "open data": What kind of access should researchers enjoy?

    PubMed

    Chatellier, Gilles; Varlet, Vincent; Blachier-Poisson, Corinne

    2016-02-01

    The healthcare sector is currently facing a new paradigm, the explosion of "big data". Coupled with advances in computer technology, the field of "big data" appears promising, allowing us to better understand the natural history of diseases, to follow up the implementation of new technologies (devices, drugs), and to participate in precision medicine, etc. Data sources are multiple (medical and administrative data, electronic medical records, data from rapidly developing technologies such as DNA sequencing, connected devices, etc.) and heterogeneous, while their use requires complex methods for accurate analysis. Moreover, faced with this new paradigm, we must determine who could (or should) have access to which data, how to combine collective interest with the protection of personal data, and how to finance in the long term both operating costs and database interrogation. This article analyses the opportunities and challenges related to the use of open and/or "big data", from the viewpoint of pharmacologists and representatives of the pharmaceutical and medical device industry.

  8. Rethinking Big Science. Modest, mezzo, grand science and the development of the Bevalac, 1971-1993.

    PubMed

    Westfall, Catherine

    2003-03-01

    Historians of science have tended to focus exclusively on scale in investigations of large-scale research, perhaps because it has been easy to assume that comprehending a phenomenon dubbed "Big Science" hinges on an understanding of bigness. A close look at Lawrence Berkeley Laboratory's Bevalac, a medium-scale "mezzo science" project formed by uniting two preexisting machines--the modest SuperHILAC and the grand Bevatron--shows what can be gained by overcoming this preoccupation with bigness. The Bevalac story reveals how interconnections, connections, and disconnections ultimately led to the development of a new kind of science that transformed the landscape of large-scale research in the United States. Important lessons in historiography also emerge: the value of framing discussions in terms of networks, the necessity of constantly expanding and refining methodology, and the importance of avoiding the rhetoric of participants and instead finding words to tell our own stories.

  9. Data Confidentiality Challenges in Big Data Applications

    SciTech Connect

    Yin, Jian; Zhao, Dongfang

    2015-12-15

    In this paper, we address the problem of data confidentiality in big data analytics. In many fields, useful patterns can be extracted by applying machine learning techniques to big data. However, data confidentiality must be protected; in many scenarios, it may well be a prerequisite for data to be shared at all. We present a scheme that provides provably secure data confidentiality and discuss various techniques to optimize the performance of such a system.

  10. Quality of Big Data in Healthcare

    SciTech Connect

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    2015-01-01

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  11. Dark energy, wormholes, and the big rip

    SciTech Connect

    Faraoni, V.; Israel, W.

    2005-03-15

    The time evolution of a wormhole in a Friedmann universe approaching the big rip is studied. The wormhole is modeled by a thin spherical shell accreting the superquintessence fluid--two different models are presented. Contrary to recent claims that the wormhole overtakes the expansion of the universe and engulfs it before the big rip is reached, it is found that the wormhole becomes asymptotically comoving with the cosmic fluid and the future evolution of the universe is fully causal.

  12. COBE looks back to the Big Bang

    NASA Technical Reports Server (NTRS)

    Mather, John C.

    1993-01-01

    An overview is presented of NASA-Goddard's Cosmic Background Explorer (COBE), the first NASA satellite designed to observe the primeval explosion of the universe. The spacecraft carries three extremely sensitive IR and microwave instruments designed to measure the faint residual radiation from the Big Bang and to search for the formation of the first galaxies. COBE's far IR absolute spectrophotometer has shown that the Big Bang radiation has a blackbody spectrum, proving that there was no large energy release after the explosion.
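    The blackbody spectrum referred to in this abstract is the standard Planck law, stated here for reference; COBE's FIRAS spectrophotometer showed the cosmic background follows it at a temperature of about 2.73 K:

```latex
B_\nu(T) = \frac{2 h \nu^3}{c^2} \, \frac{1}{e^{h\nu / k_B T} - 1},
\qquad T_{\mathrm{CMB}} \approx 2.73\ \mathrm{K}
```

    Any substantial energy release after the explosion would have distorted the spectrum away from this pure blackbody form, which is why the FIRAS measurement is the evidence cited.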

  13. Cincinnati Big Area Additive Manufacturing (BAAM)

    SciTech Connect

    Duty, Chad E.; Love, Lonnie J.

    2015-03-04

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  14. Big data: survey, technologies, opportunities, and challenges.

    PubMed

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  15. Big Data: Survey, Technologies, Opportunities, and Challenges

    PubMed Central

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  16. Big data: survey, technologies, opportunities, and challenges.

    PubMed

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  17. What can zookeepers tell us about interacting with big cats in captivity?

    PubMed

    Szokalski, Monika S; Litchfield, Carla A; Foster, Wendy K

    2013-03-01

    Despite the potential dangers involved, interactions between zookeepers and captive big cats are increasing. Research with other animals, particularly nonhuman primates, suggests that closer interactions can be beneficial not only for the animals and their keepers, but also for zoo visitors. This study sought to determine whether the same benefits may apply to keeper-big cat interactions. An online questionnaire was completed by 86 keepers worldwide, assessing which types of handling (hands-on, protected, hands-off) they practice with their big cats, whether they practice training, and what their opinions of these methods are (through a series of rating scales and open-ended questions). Protected contact was the most frequently used handling method among this sample, particularly with lions, tigers, and cheetahs, and training was practiced by the majority of participants with all big cat species. Participants perceived protected contact as the most beneficial handling practice for big cats, keepers, and visitors, noting how it can allow a close bond between keeper and cat, as well as its educational value for zoo visitors. Contrastingly, concerns were raised about the use of hands-on approaches, particularly with regard to the safety of all parties involved and the potential for wrong messages to be sent to visitors. Further, training was reported to be more beneficial for each group than any handling practice, yielding similar potential benefits as protected contact. Consistent with existing information with other species, these findings will be useful in directing objective research examining the use of different handling and training methods with big cats.

  18. Big Jobs: Planning for Competence

    ERIC Educational Resources Information Center

    Jones, Nancy P.

    2005-01-01

    Three- to five-year-olds grow emotionally by participating in meaningful and challenging physical, social, and problem-solving activities outdoors in an early childhood program on a farm. Caring for animals, planting, raking, shoveling, and engaging in meaningful indoor activities, under adult supervision, children learn to work collaboratively,…

  19. Big Books and Small Marvels

    ERIC Educational Resources Information Center

    Stanistreet, Paul

    2012-01-01

    The Reader Organisation's Get into Reading programme is all about getting people together in groups to engage with serious books. The groups are mixed and the participants sometimes challenging, but the outcomes are often remarkable. Jane Davis, who founded the Reader Organisation and continues to oversee Get into Reading, has witnessed a massive…

  20. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-06-30

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, is comprised of research institutions, public entities and private sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; designing an integrated suite of monitoring, measuring, and verification technologies; and initiating a comprehensive education and outreach program. At the first two Partnership meetings the groundwork was put in place to provide an assessment of capture and storage capabilities for CO{sub 2} utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research. During the third quarter, planning is underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analyses underway for assessing geological and terrestrial sequestration potential. The meeting will conclude with an ASME workshop (see attached agenda). The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement

  1. Boosting Big National Lab Data

    SciTech Connect

    Kleese van Dam, Kerstin

    2013-02-21

    Introduction: Big data. Love it or hate it, solving the world’s most intractable problems requires the ability to make sense of huge, complex sets of data, and to do it quickly. Speeding up the process, from hours to minutes or from weeks to days, is key to our success. One major source of such big data is physical experiments. As many will know, these physical experiments are commonly used to solve challenges in fields such as energy security, manufacturing, medicine, pharmacology, environmental protection and national security. Experiments use different instruments and sensor types to investigate, for example, the validity of new drugs, the root causes of diseases, more efficient energy sources, new materials for everyday goods, effective methods for environmental cleanup, the optimal ingredient composition for chocolate, or how to preserve valuable antiques. This is done by experimentally determining the structure, properties and processes that govern biological systems, chemical processes and materials. The speed and quality at which we can acquire new insights from experiments directly influence the rate of scientific progress, industrial innovation and competitiveness. And gaining groundbreaking insights faster is key to the economic success of our nations. Recent years have seen incredible advances in sensor technologies, from house-sized detector systems in large experiments such as the Large Hadron Collider and the ‘Eye of Gaia’ billion-pixel camera detector to high-throughput genome sequencing. These developments have led to an exponential increase in the volume, rate and variety of data produced by instruments used for experimental work. This increase coincides with a need to analyze the experimental results at the time they are collected. This speed is required to optimize the data taking and quality, and also to enable new adaptive experiments, where the sample is manipulated as it is observed, e.g. a substance is injected into a

  2. Big bang nucleosynthesis: Present status

    NASA Astrophysics Data System (ADS)

    Cyburt, Richard H.; Fields, Brian D.; Olive, Keith A.; Yeh, Tsung-Han

    2016-01-01

    Big bang nucleosynthesis (BBN) describes the production of the lightest nuclides via a dynamic interplay among the four fundamental forces during the first seconds of cosmic time. A brief overview of the essentials of this physics is given, and new calculations presented of light-element abundances through 6Li and 7Li, with updated nuclear reactions and uncertainties including those in the neutron lifetime. Fits are provided for these results as a function of baryon density and of the number of neutrino flavors Nν. Recent developments are reviewed in BBN, particularly new, precision Planck cosmic microwave background (CMB) measurements that now probe the baryon density, helium content, and the effective number of degrees of freedom Neff. These measurements allow for a tight test of BBN and cosmology using CMB data alone. Our likelihood analysis convolves the 2015 Planck data chains with our BBN output and observational data. Adding astronomical measurements of light elements strengthens the power of BBN. A new determination of the primordial helium abundance is included in our likelihood analysis. New D/H observations are now more precise than the corresponding theoretical predictions and are consistent with the standard model and the Planck baryon density. Moreover, D/H now provides a tight measurement of Nν when combined with the CMB baryon density and provides a 2 σ upper limit Nν<3.2 . The new precision of the CMB and D/H observations together leaves D/H predictions as the largest source of uncertainties. Future improvement in BBN calculations will therefore rely on improved nuclear cross-section data. In contrast with D/H and 4He, 7Li predictions continue to disagree with observations, perhaps pointing to new physics. This paper concludes with a look at future directions including key nuclear reactions, astronomical observations, and theoretical issues.
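    The freeze-out reasoning behind these helium predictions can be sketched with the standard back-of-the-envelope estimate (a textbook approximation, not the full reaction-network calculation used in the paper; the freeze-out temperature and delay time below are rough assumed values):

```python
import math

# Back-of-the-envelope estimate of the primordial 4He mass fraction.
# Standard freeze-out picture: the n/p ratio is set by the neutron-proton
# mass difference at weak freeze-out (~0.7 MeV assumed here), then the
# neutrons decay freely until nucleosynthesis begins (~200 s assumed).
DELTA_M_MEV = 1.293        # neutron-proton mass difference
T_FREEZE_MEV = 0.7         # approximate weak freeze-out temperature
TAU_N = 880.2              # neutron lifetime in seconds
T_NUC = 200.0              # approximate delay before the deuterium bottleneck breaks

n_over_p = math.exp(-DELTA_M_MEV / T_FREEZE_MEV)   # equilibrium ratio at freeze-out
n_over_p *= math.exp(-T_NUC / TAU_N)               # free neutron decay afterwards

# Nearly all surviving neutrons end up bound in 4He:
Y_p = 2.0 * n_over_p / (1.0 + n_over_p)
print(f"n/p ~ {n_over_p:.3f}, Y_p ~ {Y_p:.3f}")
```

With these rough inputs the estimate lands near the observed Y_p of about 0.25; the precision quoted in the abstract comes from the full network calculation, not from this sketch.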

  3. Pockmarks off Big Sur, California

    USGS Publications Warehouse

    Paull, C.; Ussler, W.; Maher, N.; Greene, H. Gary; Rehder, G.; Lorenson, T.; Lee, H.

    2002-01-01

    A pockmark field was discovered during EM-300 multi-beam bathymetric surveys on the lower continental slope off the Big Sur coast of California. The field contains ~1,500 pockmarks, between 130 and 260 m in diameter and typically 8-12 m deep, located within a 560 km² area. To investigate the origin of these features, piston cores were collected from both the interior and the flanks of the pockmarks, and remotely operated vehicle (ROV) video and sampling transects were conducted through 19 of the pockmarks. The water column within and above the pockmarks was sampled for methane concentration. Piston cores and ROV-collected push cores show that the pockmark field is composed of monotonous fine silts and clays, and the cores within the pockmarks are indistinguishable from those outside. No evidence for either sediment winnowing or diagenetic alteration suggestive of fluid venting was obtained. 14C measurements of the organic carbon in the sediments indicate continuous sedimentation throughout the time resolution of the radiocarbon technique (~45,000 yr BP), with a sedimentation rate of ~10 cm per 1,000 yr both within and between the pockmarks. Concentrations of methane, dissolved inorganic carbon, sulfate, chloride, and ammonium in pore water extracted from the cores are generally similar in composition to seawater and show little change with depth, suggesting low biogeochemical activity. These pore-water chemical gradients indicate that neither significant accumulations of gas in the shallow subsurface (~100 m) nor active fluid advection within the sampled sediments is likely. Taken together, the data indicate that these pockmarks are more than 45,000 yr old, are presently inactive, and contain no indications of earlier fluid or gas venting events. © 2002 Elsevier Science B.V. All rights reserved.
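    As a quick sanity check on the age argument, the reported sedimentation rate implies how much infill the radiocarbon record can account for (a reader's back-of-the-envelope calculation, not a figure from the paper):

```python
# At ~10 cm per 1,000 yr, how much sediment accumulates over the
# ~45,000 yr reach of the radiocarbon record?
rate_cm_per_kyr = 10.0
record_kyr = 45.0
accumulated_m = rate_cm_per_kyr * record_kyr / 100.0
print(f"{accumulated_m:.1f} m of sediment in {record_kyr:.0f} kyr")  # 4.5 m
```

Only ~4.5 m of infill over the whole record, less than the 8-12 m pockmark relief, which is consistent with the conclusion that the pockmarks predate the radiocarbon record.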

  4. Big Sky Carbon Sequestration Partnership

    SciTech Connect

    Susan M. Capalbo

    2005-11-01

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private-sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership in Phase I fall into four areas: evaluation of sources and carbon sequestration sinks that will be used to determine the location of pilot demonstrations in Phase II; development of a GIS-based reporting framework that links with national networks; design of an integrated suite of monitoring, measuring, and verification technologies and assessment frameworks; and initiation of a comprehensive education and outreach program. The groundwork is in place to provide an assessment of storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research agenda in carbon sequestration. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other DOE regional partnerships. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies to support not only carbon trading but all policies and programs that DOE and other agencies may want to pursue in support of GHG mitigation. The efforts in developing and implementing MMV technologies for geological sequestration reflect this concern. Research is also underway to identify and validate best management practices for soil C in the

  5. Big Data in Caenorhabditis elegans: quo vadis?

    PubMed

    Hutter, Harald; Moerman, Donald

    2015-11-01

    A clear definition of what constitutes "Big Data" is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of "complete" data sets for this organism is actually rather small--not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein-protein interaction--important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell.

  6. Big Data in Caenorhabditis elegans: quo vadis?

    PubMed Central

    Hutter, Harald; Moerman, Donald

    2015-01-01

    A clear definition of what constitutes “Big Data” is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of “complete” data sets for this organism is actually rather small—not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein–protein interaction—important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. PMID:26543198

  7. Making a Difference. An Impact Study of Big Brothers/Big Sisters.

    ERIC Educational Resources Information Center

    Tierney, Joseph P.; And Others

    This report provides reliable evidence that mentoring programs can positively affect young people. The evidence is derived from research conducted at local affiliates of Big Brothers/Big Sisters of America (BB/BSA), the oldest, best-known, and arguably most sophisticated of the country's mentoring programs. Public/Private Ventures, Inc. conducted…

  8. Benchmarking Big Data Systems and the BigData Top100 List.

    PubMed

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with ever more features for managing big datasets are announced almost weekly. Yet there is currently no means of comparing such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TPC), there is neither a clear definition of the performance of big data systems nor a generally agreed-upon metric for comparing these systems. In this article, we describe a community-based effort to define a big data benchmark. Over the past year, a Big Data Benchmarking Community has been established to fill this void. The effort focuses on defining an end-to-end, application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, we also solicit community input into this process.
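    To illustrate what "end-to-end, application-layer" means in practice, here is a minimal, hypothetical timing harness: it measures a whole toy workload rather than a micro-operation. The function names and the word-count workload are invented for illustration and are not part of the BigData Top100 specification:

```python
import time

def benchmark(workload, *args, repeats=3):
    """Time a full application-layer workload end to end.

    Returns the workload result and the best wall-clock time over
    `repeats` runs (taking the minimum damps scheduling noise). A real
    big-data benchmark would also pin data sizes, validate the output,
    and report a price/performance metric.
    """
    best = float("inf")
    result = None
    for _ in range(repeats):
        start = time.perf_counter()
        result = workload(*args)
        best = min(best, time.perf_counter() - start)
    return result, best

# Toy workload: a word count, standing in for a full data pipeline.
def word_count(lines):
    counts = {}
    for line in lines:
        for word in line.split():
            counts[word] = counts.get(word, 0) + 1
    return counts

data = ["big data big benchmarks"] * 10_000
result, seconds = benchmark(word_count, data)
print(f"{result['big']} occurrences of 'big' in {seconds:.4f}s")
```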

  10. Patterns of public participation.

    PubMed

    Slutsky, Jean; Tumilty, Emma; Max, Catherine; Lu, Lanting; Tantivess, Sripen; Hauegen, Renata Curi; Whitty, Jennifer A; Weale, Albert; Pearson, Steven D; Tugendhaft, Aviva; Wang, Hufeng; Staniszewska, Sophie; Weerasuriya, Krisantha; Ahn, Jeonghoon; Cubillos, Leonardo

    2016-08-15

    Purpose - The paper summarizes data from 12 countries, chosen to exhibit wide variation, on the role and place of public participation in the setting of priorities. The purpose of this paper is to exhibit cross-national patterns in respect of public participation, linking those differences to institutional features of the countries concerned. Design/methodology/approach - The approach is an example of case-orientated qualitative assessment of participation practices. It derives its data from the presentation of country case studies by experts on each system. The country cases are located within the historical development of democracy in each country. Findings - Patterns of participation are widely variable. Participation that is effective through routinized institutional processes appears to be inversely related to contestatory participation that uses political mobilization to challenge the legitimacy of the priority setting process. No system has resolved the conceptual ambiguities that are implicit in the idea of public participation. Originality/value - The paper draws on a unique collection of country case studies in participatory practice in prioritization, supplementing existing published sources. In showing that contestatory participation plays an important role in a sub-set of these countries it makes an important contribution to the field because it broadens the debate about public participation in priority setting beyond the use of minipublics and the observation of public representatives on decision-making bodies. PMID:27468773

  11. Transcriptome marker diagnostics using big data.

    PubMed

    Han, Henry; Liu, Ying

    2016-02-01

    Big omics data are challenging translational bioinformatics in an unprecedented way with their complexity and volume. How to employ big omics data to achieve reproducible disease diagnosis that rivals clinical practice is an urgent problem to be solved in translational bioinformatics and machine learning. In this study, the authors propose a novel transcriptome marker diagnosis that tackles this problem using big RNA-seq data by treating the whole transcriptome systematically as a profile marker. This systems diagnosis not only avoids the reproducibility issues of existing gene-/network-marker-based diagnostic methods, but also achieves clinically competitive diagnostic results by extracting true signals from big RNA-seq data. By exploiting systems-level information, the method attains better diagnostic performance than competing methods, making it well suited to personalised diagnostics and a good candidate for clinical use. To the best of the authors' knowledge, this is the first study on the topic, and it should inspire further investigation of big omics data diagnostics.
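    A minimal sketch of the "whole transcriptome as profile marker" idea, using synthetic data and a simple correlation-to-centroid rule; this illustrates the general concept only, not the authors' actual method:

```python
import numpy as np

# Treat the whole expression profile as one marker: classify a sample by
# which class centroid its full profile correlates with best.
# All data here are synthetic; no RNA-seq normalization is modeled.
rng = np.random.default_rng(0)
n_genes = 500

healthy_mean = rng.normal(5.0, 1.0, n_genes)
disease_mean = healthy_mean.copy()
disease_mean[:50] += 2.0            # 50 differentially expressed genes

healthy = rng.normal(healthy_mean, 0.5, (20, n_genes))   # 20 training samples
disease = rng.normal(disease_mean, 0.5, (20, n_genes))

centroids = {"healthy": healthy.mean(0), "disease": disease.mean(0)}

def classify(profile):
    # Pearson correlation of the whole profile against each class centroid.
    return max(centroids, key=lambda c: np.corrcoef(profile, centroids[c])[0, 1])

test_sample = rng.normal(disease_mean, 0.5)
print(classify(test_sample))
```

A gene-marker method would pick a handful of the 50 shifted genes; the profile approach above uses all 500 coordinates at once, which is the "systems" flavor the abstract describes.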

  12. Volume and Value of Big Healthcare Data

    PubMed Central

    Dinov, Ivo D.

    2016-01-01

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions such as: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309
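    One concrete idea behind compressive analytics is random projection, which shrinks high-dimensional records while approximately preserving pairwise distances (the Johnson-Lindenstrauss lemma). The sketch below is a generic illustration of that idea, not the CBDA framework described in the article:

```python
import numpy as np

# Compress 10,000-dimensional records to 300 dimensions with a random
# projection; downstream analysis can then run on the much smaller sketch
# because pairwise distances are approximately preserved.
rng = np.random.default_rng(42)
n, d, k = 200, 10_000, 300            # records, original dim, compressed dim

X = rng.normal(size=(n, d))
P = rng.normal(size=(d, k)) / np.sqrt(k)   # random projection matrix
Y = X @ P                                  # compressed representation

orig = np.linalg.norm(X[0] - X[1])
comp = np.linalg.norm(Y[0] - Y[1])
print(f"distance before: {orig:.1f}, after: {comp:.1f}")
```

With k = 300 the distance distortion is typically only a few percent, while storage drops by a factor of about 33.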

  13. The Confluence of Exascale and Big Data

    NASA Astrophysics Data System (ADS)

    Dosanjh, Sudip

    2014-04-01

    Exascale computing has rightly received considerable attention within the high performance computing community. In many fields, scientific progress requires a thousand-fold increase in supercomputing performance over the next decade. Science needs include performing single simulations that span a large portion of an exascale system, as well as high-throughput computing. The big data problem has also received considerable attention, but is sometimes viewed as being orthogonal to exascale computing. This talk focuses on the confluence of exascale and big data. Exascale and big data face many similar technical challenges, including increasing power/energy constraints, the growing mismatch between computing and data movement speeds, an explosion in concurrency and the reduced reliability of large computing systems. Even though exascale and data-intensive systems might have different system-level architectures, the fundamental building blocks will be similar. Analyzing all the information produced by exascale simulations will also generate a big data problem. And finally, many experimental facilities are being inundated with large quantities of data as sensors and sequencers improve at rates that surpass Moore's Law. It is becoming increasingly difficult to analyze all of the data from a single experiment, and it is often impossible to make comparisons across data sets. It will only be possible to accelerate scientific discovery if we bring together the high performance computing and big data communities.

  14. Children's Participation in Research

    ERIC Educational Resources Information Center

    Brostrom, Stig

    2012-01-01

    In (post)modern society, children are seen as active subjects and participants, a status grounded in the United Nations Convention on the Rights of the Child. As a consequence, children are able to play an active role in the planning of, and participation in, both education and research in their own preschool settings. This article…

  15. BIG SKY CARBON SEQUESTRATION PARTNERSHIP

    SciTech Connect

    Susan M. Capalbo

    2004-10-31

    The Big Sky Carbon Sequestration Partnership, led by Montana State University, comprises research institutions, public entities, private-sector organizations, and the Confederated Salish and Kootenai Tribes and the Nez Perce Tribe. Efforts under this Partnership fall into four areas: evaluation of sources and carbon sequestration sinks; development of a GIS-based reporting framework; design of an integrated suite of monitoring, measuring, and verification technologies; and initiation of a comprehensive education and outreach program. At the first two Partnership meetings, the groundwork was put in place to provide an assessment of capture and storage capabilities for CO2 utilizing the resources found in the Partnership region (both geological and terrestrial sinks), which would complement the ongoing DOE research. During the third quarter, planning efforts were underway for the next Partnership meeting, which will showcase the architecture of the GIS framework and initial results for sources and sinks, and discuss the methods and analysis underway for assessing geological and terrestrial sequestration potentials. The meeting will conclude with an ASME workshop. The region has a diverse array of geological formations that could provide storage options for carbon in one or more of its three states. Likewise, initial estimates of terrestrial sinks indicate a vast potential for increasing and maintaining soil C on forested, agricultural, and reclaimed lands. Both options include the potential for offsetting economic benefits to industry and society. Steps have been taken to assure that the GIS-based framework is consistent among types of sinks within the Big Sky Partnership area and with the efforts of other western DOE partnerships. Efforts are also being made to find funding to include Wyoming in the coverage areas for both geological and terrestrial sinks and sources. The Partnership recognizes the critical importance of measurement, monitoring, and verification technologies.

  16. Depression and Political Participation*

    PubMed Central

    Ojeda, Christopher

    2015-01-01

    In this paper, I propose that depression is a political phenomenon insofar as it has political sources and consequences. I then investigate one aspect of this argument—whether depression reduces participation. I hypothesize that individuals with depression lack the motivation and physical capacity to vote and engage in other forms of political participation due to somatic problems and feelings of hopelessness and apathy. Moreover, I examine how depression in adolescence can have downstream consequences for participation in young adulthood. The analyses, using both cross-sectional and longitudinal data, show that voter turnout and other forms of participation decrease as the severity of depressed mood increases. These findings are discussed in light of disability rights and potential efforts to boost participation among this group. PMID:26924857

  17. Are youth mentoring programs good value-for-money? An evaluation of the Big Brothers Big Sisters Melbourne Program

    PubMed Central

    Moodie, Marjory L; Fisher, Jane

    2009-01-01

    Background The Big Brothers Big Sisters (BBBS) program matches vulnerable young people with a trained, supervised adult volunteer as mentor. The young people are typically seriously disadvantaged, with multiple psychosocial problems. Methods Threshold analysis was undertaken to determine whether investment in the program was a worthwhile use of limited public funds. The potential cost savings were based on US estimates of life-time costs associated with high-risk youth who drop out-of-school and become adult criminals. The intervention was modelled for children aged 10–14 years residing in Melbourne in 2004. Results If the program serviced 2,208 of the most vulnerable young people, it would cost AUD 39.5 M. Assuming 50% were high-risk, the associated costs of their adult criminality would be AUD 3.3 billion. To break even, the program would need to avert high-risk behaviours in only 1.3% (14/1,104) of participants. Conclusion This indicative evaluation suggests that the BBBS program represents excellent 'value for money'. PMID:19178749
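    The abstract's break-even figure can be reproduced directly from the quoted numbers:

```python
import math

# Threshold ("break-even") analysis using the figures from the abstract:
# AUD 39.5 M program cost, 2,208 participants, 50% assumed high-risk,
# AUD 3.3 billion lifetime adult-criminality costs for that high-risk half.
program_cost = 39.5e6
participants = 2208
high_risk = participants // 2                 # 1,104
lifetime_cost_total = 3.3e9

cost_per_high_risk = lifetime_cost_total / high_risk      # ~AUD 3.0 M each
n_avert = math.ceil(program_cost / cost_per_high_risk)    # whole participants
pct = 100 * n_avert / high_risk
print(f"break even by averting {n_avert}/{high_risk} ({pct:.1f}%)")
```

This recovers the paper's 14/1,104 ≈ 1.3% threshold: averting high-risk outcomes in just over one percent of the high-risk participants pays for the whole program.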

  18. Adapting bioinformatics curricula for big data.

    PubMed

    Greene, Anna C; Giffin, Kristine A; Greene, Casey S; Moore, Jason H

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs.

  19. Unsupervised Tensor Mining for Big Data Practitioners.

    PubMed

    Papalexakis, Evangelos E; Faloutsos, Christos

    2016-09-01

    Multiaspect data are ubiquitous in modern Big Data applications. For instance, different aspects of a social network are the different types of communication between people, the time stamp of each interaction, and the location associated to each individual. How can we jointly model all those aspects and leverage the additional information that they introduce to our analysis? Tensors, which are multidimensional extensions of matrices, are a principled and mathematically sound way of modeling such multiaspect data. In this article, our goal is to popularize tensors and tensor decompositions to Big Data practitioners by demonstrating their effectiveness, outlining challenges that pertain to their application in Big Data scenarios, and presenting our recent work that tackles those challenges. We view this work as a step toward a fully automated, unsupervised tensor mining tool that can be easily and broadly adopted by practitioners in academia and industry.
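    A minimal sketch of the kind of decomposition the authors advocate: a rank-R CP decomposition of a 3-way tensor fitted by alternating least squares, in plain NumPy (a teaching sketch under simplifying assumptions, not a production tensor-mining tool):

```python
import numpy as np

def khatri_rao(X, Y):
    # Column-wise Kronecker product; row i*Y.shape[0]+j equals X[i] * Y[j].
    return (X[:, None, :] * Y[None, :, :]).reshape(-1, X.shape[1])

def cp_als(T, rank, iters=200, seed=0):
    """Rank-`rank` CP decomposition of a 3-way tensor by alternating
    least squares: fix two factor matrices, solve for the third, repeat."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # Mode-n unfoldings, with column order matching khatri_rao above.
    T1 = T.reshape(I, J * K)
    T2 = T.transpose(1, 0, 2).reshape(J, I * K)
    T3 = T.transpose(2, 0, 1).reshape(K, I * J)
    for _ in range(iters):
        A = T1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = T2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = T3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Recover the factors of a synthetic rank-2 tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((d, 2)) for d in (4, 5, 6))
T = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
A, B, C = cp_als(T, rank=2)
T_hat = np.einsum("ir,jr,kr->ijk", A, B, C)
print("relative error:", np.linalg.norm(T - T_hat) / np.linalg.norm(T))
```

In the social-network example from the abstract, the three modes would be sender, receiver, and time stamp, and each rank-one component would capture one communication pattern.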

  20. Implications of Big Data for cell biology

    PubMed Central

    Dolinski, Kara; Troyanskaya, Olga G.

    2015-01-01

    “Big Data” has surpassed “systems biology” and “omics” as the hottest buzzword in the biological sciences, but is there any substance behind the hype? Certainly, we have learned about various aspects of cell and molecular biology from the many individual high-throughput data sets that have been published in the past 15–20 years. These data, although useful as individual data sets, can provide much more knowledge when interrogated with Big Data approaches, such as applying integrative methods that leverage the heterogeneous data compendia in their entirety. Here we discuss the benefits and challenges of such Big Data approaches in biology and how cell and molecular biologists can best take advantage of them. PMID:26174066

  1. The dominance of big pharma: power.

    PubMed

    Edgar, Andrew

    2013-05-01

    The purpose of this paper is to provide a normative model for the assessment of the exercise of power by Big Pharma. By drawing on the work of Steven Lukes, it will be argued that while Big Pharma is overtly highly regulated, so that its power is indeed restricted in the interests of patients and the general public, the industry is still able to exercise what Lukes describes as a third dimension of power. This entails concealing the conflicts of interest and grievances that Big Pharma may have with the health care system, physicians and patients, crucially through rhetorical engagements with Patient Advocacy Groups that seek to shape public opinion, and also by marginalising certain groups, excluding them from debates over health care resource allocation. Three issues will be examined: the construction of a conception of the patient as expert patient or consumer; the phenomenon of disease mongering; and the suppression or distortion of debates over resource allocation.

  2. Little Big Horn River Water Quality Project

    SciTech Connect

    Bad Bear, D.J.; Hooker, D.

    1995-10-01

    This report summarizes the accomplishments of the Water Quality Project on the Little Big Horn River during the summer of 1995. Most of the summer was spent collecting samples from the Little Big Horn River and running a battery of water-quality tests on them at Little Big Horn College in Crow Agency, Montana. The intent of this study is to perform stream-quality analysis in order to understand the quality of a selected portion of the river, to assess any impact that existing developments may be causing to the environment, and to gather baseline data that will provide information concerning the proposed development. Citizens of the reservation have expressed concern about the quality of water on the reservation: surface waters, ground water, and well waters.

  3. Big Science and the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Giudice, Gian Francesco

    2012-03-01

    The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.

  4. Unsupervised Tensor Mining for Big Data Practitioners.

    PubMed

    Papalexakis, Evangelos E; Faloutsos, Christos

    2016-09-01

    Multiaspect data are ubiquitous in modern Big Data applications. For instance, different aspects of a social network are the different types of communication between people, the time stamp of each interaction, and the location associated with each individual. How can we jointly model all those aspects and leverage the additional information that they introduce to our analysis? Tensors, which are multidimensional extensions of matrices, are a principled and mathematically sound way of modeling such multiaspect data. In this article, our goal is to popularize tensors and tensor decompositions to Big Data practitioners by demonstrating their effectiveness, outlining challenges that pertain to their application in Big Data scenarios, and presenting our recent work that tackles those challenges. We view this work as a step toward a fully automated, unsupervised tensor mining tool that can be easily and broadly adopted by practitioners in academia and industry. PMID:27642720
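
    As a concrete illustration of the multiaspect modeling this abstract describes, interaction logs can be arranged as a three-way tensor and matricized ("unfolded") for analysis. The sketch below is a minimal pure-Python example under assumed data (the triples, dimensions, and the k-outer/j-inner unfolding order are illustrative, not taken from the paper).

```python
# Sketch: build a dense 3-way count tensor T[sender][receiver][hour]
# from (sender, receiver, hour) triples, then unfold it along mode 0.
# Note that conventions for the column ordering of an unfolding vary.

def build_tensor(interactions, I, J, K):
    """Count tensor T[i][j][k] from (sender, receiver, time) triples."""
    T = [[[0] * K for _ in range(J)] for _ in range(I)]
    for i, j, k in interactions:
        T[i][j][k] += 1
    return T

def unfold_mode0(T):
    """Mode-0 matricization: row i collects all (j, k) entries of sender i."""
    I, J, K = len(T), len(T[0]), len(T[0][0])
    return [[T[i][j][k] for k in range(K) for j in range(J)] for i in range(I)]

logs = [(0, 1, 0), (0, 1, 0), (1, 0, 2)]   # hypothetical interactions
T = build_tensor(logs, I=2, J=2, K=3)
M = unfold_mode0(T)                        # a 2 x 6 matrix
```

Decomposing such an unfolded matrix (or the tensor directly, e.g. via CP decomposition) is what recovers the joint structure across aspects.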

  5. BigMouth: a multi-institutional dental data repository

    PubMed Central

    Walji, Muhammad F; Kalenderian, Elsbeth; Stark, Paul C; White, Joel M; Kookal, Krishna K; Phan, Dat; Tran, Duong; Bernstam, Elmer V; Ramoni, Rachel

    2014-01-01

    Few oral health databases are available for research and the advancement of evidence-based dentistry. In this work we developed a centralized data repository derived from electronic health records (EHRs) at four dental schools participating in the Consortium of Oral Health Research and Informatics. A multi-stakeholder committee developed a data governance framework that encouraged data sharing while allowing control of contributed data. We adopted the i2b2 data warehousing platform and mapped data from each institution to a common reference terminology. We realized that dental EHRs urgently need to adopt common terminologies. While all used the same treatment code set, only three of the four sites used a common diagnostic terminology, and there were wide discrepancies in how medical and dental histories were documented. BigMouth was successfully launched in August 2012 with data on 1.1 million patients, and made available to users at the contributing institutions. PMID:24993547
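
    The mapping of site-local EHR codes to a common reference terminology that this abstract describes can be sketched as a per-site lookup table; the site names, codes, and target terms below are hypothetical, not BigMouth's actual vocabularies.

```python
# Hypothetical sketch of harmonizing site-local diagnostic codes to a
# shared reference terminology, as a multi-institutional repository
# like BigMouth must do. All codes and terms are invented.

SITE_MAPS = {
    "site_a": {"CAR-01": "dental_caries", "PERI-02": "periodontitis"},
    "site_b": {"D100": "dental_caries", "D200": "gingivitis"},
}

def to_reference(site, local_code):
    """Return the common reference term, or None when unmapped."""
    return SITE_MAPS.get(site, {}).get(local_code)

records = [("site_a", "CAR-01"), ("site_b", "D100"), ("site_b", "X999")]
mapped = [to_reference(s, c) for s, c in records]
# Unmapped codes surface as None, so terminology gaps can be audited
# rather than silently dropped.
```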

  6. How do we identify big rivers? And how big is big?

    NASA Astrophysics Data System (ADS)

    Miall, Andrew D.

    2006-04-01

    "Big rivers" are the trunk rivers that carry the water and sediment load from major orogens, or that drain large areas of a continent. Identifying such rivers in the ancient record is a challenge. Some guidance may be provided by tectonic setting and sedimentological evidence, including the scale of architectural elements, and clues from provenance studies, but such data are not infallible guides to river magnitude. The scale of depositional elements is the most obvious clue to channel size, but evidence is typically sparse and inadequate, and may be misleading. For example, thick fining-upward successions may be tectonic cyclothems. Two examples of the analysis of large ancient river systems are discussed here in order to highlight problems of methodology and interpretation. The Hawkesbury Sandstone (Triassic) of the Sydney Basin, Australia, is commonly cited as the deposit of a large river, on the basis of abundant very large-scale crossbedding. An examination of very large outcrops of this unit, including a coastal cliff section 6 km long near Sydney, showed that even with 100% exposure there are ambiguities in the determination of channel scale. It was concluded in this case that the channel dimensions of the Hawkesbury rivers were about half the size of the modern Brahmaputra River. The tectonic setting of a major ancient fluvial system is commonly not a useful clue to river scale. The Hawkesbury Sandstone is a system draining transversely from a cratonic source into a foreland basin, whereas most large rivers in foreland basins flow axially and are derived mainly from the orogenic uplifts (e.g., the large tidally influenced rivers of the Athabasca Oil Sands, Alberta). Epeirogenic tilting of a continent by the dynamic topography process may generate drainages in unexpected directions. For example, analyses of detrital zircons in Upper Paleozoic-Mesozoic nonmarine successions in the SW United States suggest significant derivation from the Appalachian orogen

  7. Energy scale of the Big Bounce

    SciTech Connect

    Malkiewicz, Przemyslaw; Piechocki, Wlodzimierz

    2009-09-15

    We examine the nature of the cosmological Big Bounce transition within the loop geometry underlying loop quantum cosmology at the classical and quantum levels. Our canonical quantization method is an alternative to standard loop quantum cosmology. The evolution parameter we use has a clear interpretation. Our method opens the door for analyses of the spectra of physical observables such as the energy density and the volume operator. We find that one cannot determine the energy scale specific to the Big Bounce by making use of the loop geometry without extra input from observational cosmology.

  8. How quantum is the big bang?

    PubMed

    Bojowald, Martin

    2008-06-01

    When quantum gravity is used to discuss the big bang singularity, the most important, though rarely addressed, question is what role genuine quantum degrees of freedom play. Here, complete effective equations are derived for isotropic models with an interacting scalar to all orders in the expansions involved. The resulting coupling terms show that quantum fluctuations do not affect the bounce much. Quantum correlations, however, do have an important role and could even eliminate the bounce. How quantum gravity regularizes the big bang depends crucially on properties of the quantum state. PMID:18643411

  9. Effective dynamics of the matrix big bang

    SciTech Connect

    Craps, Ben; Rajaraman, Arvind; Sethi, Savdeep

    2006-05-15

    We study the leading quantum effects in the recently introduced matrix big bang model. This amounts to a study of supersymmetric Yang-Mills theory compactified on the Milne orbifold. We find a one-loop potential that is attractive near the big bang. Surprisingly, the potential decays very rapidly at late times where it appears to be generated by D-brane effects. Usually, general covariance constrains the form of any effective action generated by renormalization group flow. However, the form of our one-loop potential seems to violate these constraints in a manner that suggests a connection between the cosmological singularity and long wavelength, late time physics.

  10. Harnessing the Heart of Big Data

    PubMed Central

    Scruggs, Sarah B.; Watson, Karol; Su, Andrew I.; Hermjakob, Henning; Yates, John R.; Lindsey, Merry L.; Ping, Peipei

    2015-01-01

    The exponential increase in Big Data generation combined with limited capitalization on the wealth of information embedded within Big Data have prompted us to revisit our scientific discovery paradigms. A successful transition into this digital era of medicine holds great promise for advancing fundamental knowledge in biology, innovating human health and driving personalized medicine, however, this will require a drastic shift of research culture in how we conceptualize science and use data. An e-transformation will require global adoption and synergism among computational science, biomedical research and clinical domains. PMID:25814682

  11. Livermore Big Trees Park: 1998 Results

    SciTech Connect

    Mac Queen, D; Gallegos, G; Surano, K

    2002-04-18

    This report is an in-depth study of results from environmental sampling conducted in 1998 by the Lawrence Livermore National Laboratory (LLNL) at Big Trees Park in the city of Livermore. The purpose of the sampling was to determine the extent and origin of plutonium found in soil at concentrations above fallout-background levels in the park. This report describes the sampling that was conducted, the chemical and radio-chemical analyses of the samples, the quality control assessments and statistical analyses of the analytical results, and LLNL's interpretations of the results. It includes a number of data analyses not presented in LLNL's previous reports on Big Trees Park.

  13. Use of Big-Screen Films in Multiple Childbirth Education Classroom Settings

    PubMed Central

    Kaufman, Tamara

    2010-01-01

    Although two recent films, Orgasmic Birth and Pregnant in America, were intended for the big screen, they can also serve as valuable teaching resources in multiple childbirth education settings. Each film conveys powerful messages about birth and today's birthing culture. Depending on a childbirth educator's classroom setting (hospital, birthing center, or home birth environment), particular portions in each film, along with extra clips featured on the films' DVDs, can enhance an educator's curriculum and spark compelling discussions with class participants. PMID:21358831

  14. Internet-based brain training games, citizen scientists, and big data: ethical issues in unprecedented virtual territories.

    PubMed

    Purcell, Ryan H; Rommelfanger, Karen S

    2015-04-22

    Internet brain training programs, where consumers serve as both subjects and funders of the research, represent the closest engagement many individuals have with neuroscience. Safeguards are needed to protect participants' privacy and the evolving scientific enterprise of big data. PMID:25905809

  15. A Comparative Investigation of the BigCAT and Erickson S-24 Measures of Speech-Associated Attitude

    ERIC Educational Resources Information Center

    Vanryckeghem, Martine; Brutten, Gene J.

    2012-01-01

    The BigCAT and the Erickson S-24, self-report measures of communication attitude, were administered in a randomly determined order to 72 adults who stuttered (PWS) and 72 who did not (PWNS). The two groups of participants differed from each other to a statistically significant extent on both of these measures of speech-associated attitude,…

  16. Participating in Clinical Trials

    MedlinePlus

    ... Participating in Clinical Trials About Clinical Trials A Research Study With Human Subjects A clinical ... to treat or cure a disease. Phases of Clinical Trials Clinical trials of drugs are usually described based ...

  17. Understanding Participation in Programs.

    ERIC Educational Resources Information Center

    Hanson, Alan L.

    1991-01-01

    Adherence to program planning principles does not guarantee participation. Attention must be paid to characteristics that make a program responsive: target audience, promotion and marketing, competition, and logistics. (SK)

  18. Clinical Trials - Participants

    MedlinePlus

    ... participating in was reviewed by an IRB. Further Reading For more information about research protections, see: Office ... data and decide whether the results have medical importance. Results from clinical trials are often published in ...

  19. Learning through Participation

    ERIC Educational Resources Information Center

    Leeb, David; Prentiss, William C.

    1970-01-01

    An experimental program at Valencia Junior College (Florida) allows every student to actively participate in all phases of the political science course. A variety of multimedia materials, which the students help to develop and evaluate, are used. (BB)

  20. Public Participation Plan

    SciTech Connect

    Not Available

    1993-07-01

    The purpose of this Public Participation Plan is to describe the US Department of Energy's (DOE) plan for involving the public in the decision-making process for the Uranium Mill Tailings Remedial Action (UMTRA) Project. The plan describes how the DOE will meet the public participation requirements of the Uranium Mill Tailings Radiation Control Act (UMTRCA) of 1978, as amended, and of the National Environmental Policy Act (NEPA) of 1969. It includes the UMTRA Project Office plans for complying with DOE Order 5440.1D and for implementing the DOE's Public Participation Policy for Environmental Restoration and Waste Management (1992) and Public Participation Guidance for Environmental Restoration and Waste Management (1993).

  1. Research recruitment using Facebook advertising: big potential, big challenges.

    PubMed

    Kapp, Julie M; Peters, Colleen; Oliver, Debra Parker

    2013-03-01

    To our knowledge, ours is the first study to report on Facebook advertising as an exclusive mechanism for recruiting women ages 35-49 years residing in the USA into a health-related research study. We directed our survey to women ages 35-49 years who resided in the USA exclusively using three Facebook advertisements. Women were then redirected to our survey site. There were 20,568,960 women on Facebook that met the eligibility criteria. The three ads resulted in 899,998 impressions with a reach of 374,225 women. Of the women reached, 280 women (0.075 %) clicked the ad. Of the women who clicked the ad, nine women (3.2 %) proceeded past the introductory page. Social networking, and in particular Facebook, is an innovative venue for recruiting participants for research studies. Challenges include developing an ad to foster interest without biasing the sample, and motivating women who click the ad to complete the survey. There is still much to learn about this potential method of recruitment. PMID:23292877
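
    The recruitment funnel reported above can be checked arithmetically; the short sketch below simply recomputes the click-through and completion rates from the counts given in the abstract.

```python
# Recompute the Facebook-recruitment funnel rates from the abstract.
eligible    = 20_568_960  # women on Facebook meeting eligibility criteria
impressions = 899_998     # ad impressions served
reach       = 374_225     # distinct women shown an ad
clicks      = 280         # women who clicked an ad
proceeded   = 9           # women who went past the introductory page

click_rate   = clicks / reach       # fraction of women reached who clicked
proceed_rate = proceeded / clicks   # fraction of clickers who proceeded

print(f"{click_rate:.3%}")   # 0.075%
print(f"{proceed_rate:.1%}") # 3.2%
```

Both figures match the percentages quoted in the abstract, which underscores how steep the drop-off is at each stage of the funnel.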

  2. AirMSPI PODEX BigSur Terrain Images

    Atmospheric Science Data Center

    2013-12-13

    ... from the PODEX 2013 Campaign   Big Sur target (Big Sur, California) 02/03/2013 Terrain-projected ... For more information, see the Data Product Specifications (DPS) ...

  3. Big Creek Hydroelectric System, East & West Transmission Line, 241mile ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Big Creek Hydroelectric System, East & West Transmission Line, 241-mile transmission corridor extending between the Big Creek Hydroelectric System in the Sierra National Forest in Fresno County and the Eagle Rock Substation in Los Angeles, California, Visalia, Tulare County, CA

  4. What's the Big Sweat about Dehydration? (For Kids)

    MedlinePlus

    ... What's the Big Sweat About Dehydration? KidsHealth > For Kids > What's the Big Sweat About Dehydration? ...

  5. "Small Steps, Big Rewards": Preventing Type 2 Diabetes

    MedlinePlus

    ... Feature: Diabetes "Small Steps, Big Rewards": Preventing Type 2 Diabetes Past Issues / Fall ... These are the plain facts in "Small Steps. Big Rewards: Prevent Type 2 Diabetes," an education campaign ...

  6. Big-Time Fundraising for Today's Schools

    ERIC Educational Resources Information Center

    Levenson, Stanley

    2006-01-01

    In this enlightening book, nationally recognized author and fundraising consultant Stanley Levenson shows school leaders how to move away from labor-intensive, nickel-and-dime bake sales and car washes, and into the world of big-time fundraising. Following the model used by colleges and universities, the author presents a wealth of practical…

  7. Big-Time Sports in American Universities

    ERIC Educational Resources Information Center

    Clotfelter, Charles T.

    2011-01-01

    For almost a century, big-time college sports has been a wildly popular but consistently problematic part of American higher education. The challenges it poses to traditional academic values have been recognized from the start, but they have grown more ominous in recent decades, as cable television has become ubiquitous, commercial opportunities…

  8. Big physics quartet win government backing

    NASA Astrophysics Data System (ADS)

    Banks, Michael

    2014-09-01

    Four major physics-based projects are among 10 to have been selected by Japan’s Ministry of Education, Culture, Sports, Science and Technology for funding in the coming decade as part of its “roadmap” of big-science projects.

  9. Integrating "big data" into surgical practice.

    PubMed

    Mathias, Brittany; Lipori, Gigi; Moldawer, Lyle L; Efron, Philip A

    2016-02-01

    'Big data' is the next frontier of medicine. We now have the ability to generate and analyze large quantities of healthcare data. Although interpreting and integrating this information into clinical practice poses many challenges, the potential benefits of personalized medicine are seemingly without limit.

  10. A Big Problem for Magellan: Food Preservation

    ERIC Educational Resources Information Center

    Galvao, Cecilia; Reis, Pedro; Freire, Sofia

    2008-01-01

    In this paper, we present data related to how a Portuguese teacher developed the module "A big problem for Magellan: Food preservation." Students were asked to plan an investigation in order to identify which were the best food preservation methods in the XV and XVI centuries of Portuguese overseas navigation, and then establish a parallel between…

  11. Challenges of Big Data in Educational Assessment

    ERIC Educational Resources Information Center

    Gibson, David C.; Webb, Mary; Ifenthaler, Dirk

    2015-01-01

    This paper briefly discusses four measurement challenges of data science or "big data" in educational assessments that are enabled by technology: 1. Dealing with change over time via time-based data. 2. How a digital performance space's relationships interact with learner actions, communications and products. 3. How layers of…

  12. Big Island Demonstration Project - Black Liquor

    SciTech Connect

    2006-08-01

    Black liquor is a papermaking byproduct that also serves as a fuel for pulp and paper mills. This project involves the design, construction, and operation of a black liquor gasifier that will be integrated into Georgia-Pacific's Big Island facility in Virginia, a mill that has been in operation for more than 100 years.

  13. Marketing Your Library with the Big Read

    ERIC Educational Resources Information Center

    Johnson, Wendell G.

    2012-01-01

    The Big Read was developed by the National Endowment for the Arts to revitalize the role of culture in American society and encourage the reading of landmark literature. Each year since 2007, the DeKalb Public Library, Northern Illinois University, and Kishwaukee Community College have partnered to foster literacy in the community. This article…

  14. More on Sports and the Big6.

    ERIC Educational Resources Information Center

    Eisenberg, Mike

    1998-01-01

    Presents strategies for relating the Big6 information problem-solving process to sports to gain students' attention, sustain it, and make instruction relevant to their interests. Lectures by coaches, computer-based sports games, sports information sources, the use of technology in sports, and judging sports events are discussed. (LRW)

  15. Data Needs for Big City Schools.

    ERIC Educational Resources Information Center

    Eubanks, Eugene E.

    Public schools in the big cities and urban areas will become proportionally more minority and poor in the 1980's and 1990's. The traditional measures used to collect data on minority population have proved to be inaccurate. The following items are needed and will be of value to people working in urban public schools: (1) data which distinguish…

  16. The Big Ideas behind Whole System Reform

    ERIC Educational Resources Information Center

    Fullan, Michael

    2010-01-01

    Whole system reform means that every vital part of the system--school, community, district, and government--contributes individually and in concert to forward movement and success, using practice, not research, as the driver of reform. With this in mind, several "big ideas", based on successful implementation, informed Ontario's reform strategy:…

  17. Science Literacy Circles: Big Ideas about Science

    ERIC Educational Resources Information Center

    Devick-Fry, Jane; LeSage, Teresa

    2010-01-01

    Science literacy circles incorporate the organization of both science notebooks and literature circles to help K-8 students internalize big ideas about science. Using science literacy circles gives students opportunities to engage in critical thinking as they inductively develop understanding about science concepts. (Contains 1 table and 7…

  18. Big Broadband Connectivity in the United States

    ERIC Educational Resources Information Center

    Windhausen, John, Jr.

    2008-01-01

    The economic and social future of the United States depends on answering the growing demand for very high-speed broadband connectivity, a capability termed "big broadband." Failure to take on the challenge could lead to a decline in global competitiveness and an inability to educate students. (Contains 20 notes.)

  19. Big Bubbles in Boiling Liquids: Students' Views

    ERIC Educational Resources Information Center

    Costu, Bayram

    2008-01-01

    The aim of this study was to elicit students' conceptions about big bubbles in boiling liquids (water, ethanol and aqueous CuSO[subscript 4] solution). The study is based on twenty-four students at different ages and grades. The clinical interviews technique was conducted to solicit students' conceptions and the interviews were analyzed to…

  20. Black Hole Blows Big Bubble

    NASA Astrophysics Data System (ADS)

    2010-07-01

    astronomers understand the similarity between small black holes formed from exploded stars and the supermassive black holes at the centres of galaxies. Very powerful jets have been seen from supermassive black holes, but are thought to be less frequent in the smaller microquasar variety. The new discovery suggests that many of them may simply have gone unnoticed so far. The gas-blowing black hole is located 12 million light-years away, in the outskirts of the spiral galaxy NGC 7793 (eso0914b). From the size and expansion velocity of the bubble the astronomers have found that the jet activity must have been ongoing for at least 200 000 years. Note: [1] Astronomers do not yet have any means of measuring the size of the black hole itself. The smallest stellar black hole discovered so far has a radius of about 15 km. An average stellar black hole of about 10 solar masses has a radius of about 30 km, while a "big" stellar black hole may have a radius of up to 300 km. This is still much smaller than the jets, which extend out to 1000 light-years, or about 9000 million million km! More Information: This result appears in a paper published in this week's issue of the journal Nature (A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793, by Manfred W. Pakull, Roberto Soria and Christian Motch). ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries. ESO also plays a leading role in promoting and organising cooperation in astronomical research.
ESO operates
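
    The jet-length figure quoted in the record above can be sanity-checked with a quick unit conversion; the kilometres-per-light-year constant below is the standard IAU value, rounded.

```python
# Sanity-check the press release's "1000 light-years = about
# 9000 million million km" conversion.
LY_KM = 9.4607e12            # kilometres in one light-year (IAU, rounded)

jet_km = 1000 * LY_KM        # jet extent of 1000 light-years, in km
# "million million" = 1e12, so express the length in those units:
print(round(jet_km / 1e12))  # ~9461, i.e. "about 9000 million million km"
```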

  1. Black Hole Blows Big Bubble

    NASA Astrophysics Data System (ADS)

    2010-07-01

    astronomers understand the similarity between small black holes formed from exploded stars and the supermassive black holes at the centres of galaxies. Very powerful jets have been seen from supermassive black holes, but are thought to be less frequent in the smaller microquasar variety. The new discovery suggests that many of them may simply have gone unnoticed so far. The gas-blowing black hole is located 12 million light-years away, in the outskirts of the spiral galaxy NGC 7793 (eso0914b). From the size and expansion velocity of the bubble the astronomers have found that the jet activity must have been ongoing for at least 200 000 years. Notes [1] Astronomers do not yet have any means of measuring the size of the black hole itself. The smallest stellar black hole discovered so far has a radius of about 15 km. An average stellar black hole of about 10 solar masses has a radius of about 30 km, while a "big" stellar black hole may have a radius of up to 300 km. This is still much smaller than the jets, which extend out to several hundred light-years on each side of the black hole, or about several thousand million million km! More information This result appears in a paper published in this week's issue of the journal Nature (A 300 parsec long jet-inflated bubble around a powerful microquasar in the galaxy NGC 7793, by Manfred W. Pakull, Roberto Soria and Christian Motch). ESO, the European Southern Observatory, is the foremost intergovernmental astronomy organisation in Europe and the world's most productive astronomical observatory. It is supported by 14 countries: Austria, Belgium, the Czech Republic, Denmark, France, Finland, Germany, Italy, the Netherlands, Portugal, Spain, Sweden, Switzerland and the United Kingdom. ESO carries out an ambitious programme focused on the design, construction and operation of powerful ground-based observing facilities enabling astronomers to make important scientific discoveries.
ESO also plays a leading role in promoting and organising

  2. Big Crater as Viewed by Pathfinder Lander

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.

    The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft.

    This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each scene consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be.

    Mars Pathfinder is the second in NASA

  3. 8. NORTHERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. NORTHERLY VIEW OF THE DOWNSTREAM ELEVATION OF BIG DALTON DAM SHOWING THE HOLLOW BAYS 2, 3, 4, 5, AND 6 AND THE PLUNGE POOL IN THE FOREGROUND. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  4. Big Ideas in Primary Mathematics: Issues and Directions

    ERIC Educational Resources Information Center

    Askew, Mike

    2013-01-01

    This article is located within the literature arguing for attention to Big Ideas in teaching and learning mathematics for understanding. The focus is on surveying the literature of Big Ideas and clarifying what might constitute Big Ideas in the primary Mathematics Curriculum based on both theoretical and pragmatic considerations. This is…

  5. 76 FR 26240 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-06

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming..., 2011, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big Horn County Weed...

  6. 78 FR 33326 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-04

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming... at 3:00 p.m. ADDRESSES: The meeting will be held at Big Horn County Weed and Pest Building,...

  7. 76 FR 47141 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-04

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ] ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming..., 2011 and will begin at 3 p.m. ADDRESSES: The meeting will be held at the Big Horn County Weed and...

  8. 76 FR 7810 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Lovell, Wyoming..., 2011, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big Horn Federal...

  9. 77 FR 49779 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-17

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming... 11, 2012 and will begin at 3 p.m. ADDRESSES: The meeting will be held at the Big Horn County Weed...

  10. 75 FR 71069 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee will meet in Greybull, Wyoming... December 1, 2010, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big Horn County...

  11. Sports and the Big6: The Information Advantage.

    ERIC Educational Resources Information Center

    Eisenberg, Mike

    1997-01-01

    Explores the connection between sports and the Big6 information problem-solving process and how sports provides an ideal setting for learning and teaching about the Big6. Topics include information aspects of baseball, football, soccer, basketball, figure skating, track and field, and golf; and the Big6 process applied to sports. (LRW)

  12. ["Big data" - large data, a lot of knowledge?].

    PubMed

    Hothorn, Torsten

    2015-01-28

    For several years now, the term Big Data has described technologies for extracting knowledge from data. Applications of Big Data and their consequences are also increasingly discussed in the mass media. Because medicine is an empirical science, we discuss the meaning of Big Data and its potential for future medical research.

  13. 11. VIEW OF UPSTREAM ELEVATION OF BIG TUJUNGA DAM SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. VIEW OF UPSTREAM ELEVATION OF BIG TUJUNGA DAM SHOWING CONSTRUCTION OF THE ARCH, TAKEN ON NOVEMBER 26, 1930, (PHOTOGRAPHER UNKNOWN). PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON JUNE 5, 1973, BY PHOTOGRAPHER GATSON OF L.A. COUNTY PUBLIC WORKS. - Big Tujunga Dam, 809 West Big Tujunga Road, Sunland, Los Angeles County, CA

  14. 14. VIEW OF UPSTREAM ELEVATION SHOWING CONSTRUCTION OF BIG TUJUNGA ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. VIEW OF UPSTREAM ELEVATION SHOWING CONSTRUCTION OF BIG TUJUNGA DAM, TAKEN ON MAY 27, 1931, (PHOTOGRAPHER UNKNOWN). PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON JUNE 5, 1973, BY PHOTOGRAPHER GATSON OF L.A. COUNTY PUBLIC WORKS. - Big Tujunga Dam, 809 West Big Tujunga Road, Sunland, Los Angeles County, CA

  15. 12. VIEW OF UPSTREAM ELEVATION OF BIG TUJUNGA DAM SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. VIEW OF UPSTREAM ELEVATION OF BIG TUJUNGA DAM SHOWING CONSTRUCTION OF THE ARCH, TAKEN ON JANUARY 28, 1931, (PHOTOGRAPHER UNKNOWN). PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON JUNE 5, 1973, BY PHOTOGRAPHER GATSON OF L.A. COUNTY PUBLIC WORKS. - Big Tujunga Dam, 809 West Big Tujunga Road, Sunland, Los Angeles County, CA

  16. 7. SOUTHEAST VIEW OF BIG DALTON DAM SHOWING THE MULTIPLE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. SOUTHEAST VIEW OF BIG DALTON DAM SHOWING THE MULTIPLE ARCHES, AN UPSTREAM VIEW OF THE PARAPET WALL ALONG THE CREST OF THE DAM, AND THE SHELTER HOUSE AT THE EAST END OF THE DAM. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  17. 11. VIEW OF UPSTREAM ELEVATION OF BIG DALTON DAM SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. VIEW OF UPSTREAM ELEVATION OF BIG DALTON DAM SHOWING CONSTRUCTION OF THE ARCH WALLS, TAKEN ON SEPTEMBER 11, 1928 (PHOTOGRAPHER UNKNOWN). PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON 6/5/1973 BY PHOTOGRAPHER GATSON OF L.A. COUNTY PUBLIC WORKS. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  18. 15. UPSTREAM VIEW (PHOTOGRAPHER UNKNOWN) SHOWING BIG DALTON DAM NEAR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. UPSTREAM VIEW (PHOTOGRAPHER UNKNOWN) SHOWING BIG DALTON DAM NEAR FULL CAPACITY AFTER CONSTRUCTION. PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON 2-15-1973 BY PHOTOGRAPHER D. MEIER OF L.A. COUNTY PUBLIC WORKS. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  16. AERIAL VIEW OF BIG DALTON DAM TAKEN ON 2-16-1962 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. AERIAL VIEW OF BIG DALTON DAM TAKEN ON 2-16-1962 BY L.A. COUNTY PUBLIC WORKS PHOTOGRAPHER SINGER. PHOTO SHOWS THE RESERVOIR NEAR FULL CAPACITY AND WATER BEING RELEASED ON THE DOWNSTREAM SIDE. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  20. 13. VIEW OF DOWNSTREAM ELEVATION OF BIG DALTON DAM SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. VIEW OF DOWNSTREAM ELEVATION OF BIG DALTON DAM SHOWING CONSTRUCTION OF THE ARCHES AND ARCH WALLS TAKEN IN 1928-1929 (PHOTOGRAPHER UNKNOWN). PICTURE WAS DEVELOPED FROM COPY NEGATIVES WHICH WERE TAKEN ON 2-15-1973 BY PHOTOGRAPHER D. MEIER OF L.A. COUNTY PUBLIC WORKS. - Big Dalton Dam, 2600 Big Dalton Canyon Road, Glendora, Los Angeles County, CA

  1. Clarity and causality needed in claims about Big Gods.

    PubMed

    Watts, Joseph; Bulbulia, Joseph; Gray, Russell D; Atkinson, Quentin D

    2016-01-01

    We welcome Norenzayan et al.'s claim that the prosocial effects of beliefs in supernatural agents extend beyond Big Gods. To date, however, supporting evidence has focused on the Abrahamic Big God, making generalisations difficult. We discuss a recent study that highlights the need for clarity about the causal path by which supernatural beliefs affect the evolution of big societies. PMID:26948745

  2. View of New Big Oak Flat Road seen from Old ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of New Big Oak Flat Road seen from Old Wawona Road near location of photograph HAER CA-148-17. Note road cuts, alignment, and tunnels. Devils Dance Floor at left distance. Looking northwest - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  3. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its...

  4. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    PubMed

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or will solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet that it has been touted to be. Here our main concern is the overall impact of big data; the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is no longer simply "lots of data" as a phenomenon; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking in problem definition to address science challenges.

  5. Becoming a Runner: Big, Middle and Small Stories about Physical Activity Participation in Later Life

    ERIC Educational Resources Information Center

    Griffin, Meridith; Phoenix, Cassandra

    2016-01-01

    How do older adults learn to tell a "new" story about, through, and with the body? We know that narratives are embodied, lived and central to the process of meaning-making--and as such, they do not lie in the waiting for telling, but are an active part of everyday interaction. Telling stories about ourselves to others is one way in which…

  6. Analysis of operator participation

    NASA Technical Reports Server (NTRS)

    Zarakovskiy, G. M.; Zinchenko, V. P.

    1973-01-01

    The problem of providing a psychological conception of the analysis of operator participation in a form that will allow the qualitative approach to be combined with the quantitative approach is examined. This conception is based on an understanding of the essence of human endeavor in automated control systems that now determine the development of society's productive forces and that are the main object of ergonomic research. Two main types of operator participation were examined: information retrieval with immediate service and information retrieval with delayed service.

  7. High School Students as Mentors: Findings from the Big Brothers Big Sisters School-Based Mentoring Impact Study

    ERIC Educational Resources Information Center

    Herrera, Carla; Kauh, Tina J.; Cooney, Siobhan M.; Grossman, Jean Baldwin; McMaken, Jennifer

    2008-01-01

    High schools have recently become a popular source of mentors for school-based mentoring (SBM) programs. The high school Bigs program of Big Brothers Big Sisters of America, for example, currently involves close to 50,000 high-school-aged mentors across the country. While the use of these young mentors has several potential advantages, their age…

  8. SETI as a part of Big History

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2014-08-01

    Big History is an emerging academic discipline which examines history scientifically from the Big Bang to the present. It uses a multidisciplinary approach based on combining numerous disciplines from science and the humanities, and explores human existence in the context of this bigger picture. It is taught at some universities. In a series of recent papers ([11] through [15] and [17] through [18]) and in a book [16], we developed a new mathematical model embracing Darwinian Evolution (RNA to Humans; see, in particular, [17]) and Human History (Aztecs to USA; see [16]), and then we extrapolated even that into the future up to ten million years (see [18]), the minimum time requested for a civilization to expand to the whole Milky Way (Fermi paradox). In this paper, we further extend that model into the past so as to let it start at the Big Bang (13.8 billion years ago), thus merging Big History, Evolution on Earth and SETI (the modern Search for ExtraTerrestrial Intelligence) into a single body of knowledge of a statistical type. Our idea is that the Geometric Brownian Motion (GBM), so far used as the key stochastic process of financial mathematics (Black-Scholes models and the related 1997 Nobel Prize in Economics), may be successfully applied to the whole of Big History. In particular, in this paper we derive Big History Theory based on GBMs: just as the GBM is the "movie" unfolding in time, so the Statistical Drake Equation is its "still picture", static in time, and the GBM is the time-extension of the Drake Equation. Darwinian Evolution on Earth may be easily described as an increasing GBM in the number of living species on Earth over the last 3.5 billion years. The first of them was RNA 3.5 billion years ago, and now 50
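    The stochastic process at the heart of this abstract, the Geometric Brownian Motion, can be illustrated with a short simulation. The following is an illustrative sketch only, not code from the cited papers; the drift mu, volatility sigma, and initial value n0 are arbitrary assumed parameters, not values fitted to Big History data.

```python
import math
import random

def simulate_gbm(n0=1.0, mu=0.05, sigma=0.2, t_max=10.0, steps=1000, seed=0):
    """Simulate one path of a Geometric Brownian Motion N(t).

    All parameter values are illustrative placeholders.
    """
    rng = random.Random(seed)
    dt = t_max / steps
    n = n0
    path = [n]
    for _ in range(steps):
        # Exact GBM update: N(t+dt) = N(t) * exp((mu - sigma^2/2)*dt + sigma*dW),
        # where dW is a Gaussian increment with variance dt.
        dw = rng.gauss(0.0, math.sqrt(dt))
        n *= math.exp((mu - 0.5 * sigma * sigma) * dt + sigma * dw)
        path.append(n)
    return path

path = simulate_gbm()
print(len(path))  # one initial value plus 1000 steps
```

    A GBM path is strictly positive at every step, which is what makes it a plausible model for a count-like quantity such as the number of living species.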

  9. Big Data - What is it and why it matters.

    PubMed

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time. Yet, despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets and the potential for new roles for library and information workers.

  10. Big data are coming to psychiatry: a general introduction.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Bauer, Michael

    2015-12-01

    Big data are coming to the study of bipolar disorder and all of psychiatry. Data are coming from providers and payers (including EMR, imaging, insurance claims and pharmacy data), from omics (genomic, proteomic, and metabolomic data), and from patients and non-providers (data from smart phone and Internet activities, sensors and monitoring tools). Analysis of the big data will provide unprecedented opportunities for exploration, descriptive observation, hypothesis generation, and prediction, and the results of big data studies will be incorporated into clinical practice. Technical challenges remain in the quality, analysis and management of big data. This paper discusses some of the fundamental opportunities and challenges of big data for psychiatry.

  12. Katimavik Participant Information Guide.

    ERIC Educational Resources Information Center

    OPCAN, Montreal (Quebec).

    The guide provides prospective participants with an overview of Katimavik, a 9-month community volunteer service and learning program for 17- to 21-year-olds sponsored since 1977 by the Canadian Government. The guide describes the application process and computerized random selection procedures; work projects, which may range from building…

  13. Narrowing Participation Gaps

    ERIC Educational Resources Information Center

    Hand, Victoria; Kirtley, Karmen; Matassa, Michael

    2015-01-01

    Shrinking the achievement gap in mathematics is a tall order. One way to approach this challenge is to think about how the achievement gap manifests itself in the classroom and take concrete action. For example, opportunities to participate in activities that involve mathematical reasoning and argumentation in a safe and supportive manner are…

  14. Participative Decision-Making.

    ERIC Educational Resources Information Center

    Lindelow, John; And Others

    Chapter 6 in a volume on school leadership, this chapter makes a case for the use of participative decision-making (PDM) at the school-site level, outlines guidelines for its implementation, and describes the experiences of some schools with PDM systems. It begins by citing research indicating the advantages of PDM, including better decisions,…

  15. Participative Decision-Making.

    ERIC Educational Resources Information Center

    Lindelow, John; And Others

    Chapter 7 of a revised volume on school leadership, this chapter advocates the use of participative decision-making (PDM) at the school site level, outlines implementation guidelines, and describes the experiences of some schools with PDM systems. A cornerstone of a reform movement to make organizational operations more democratic and less…

  16. Communication Games: Participant's Manual.

    ERIC Educational Resources Information Center

    Krupar, Karen R.

    Using a series of communicational games, the author leads the participant through self-awareness, verbal and nonverbal communication, decision-making, problem-solving, and skills in perception, listening, and small group, organizational, and cultural communications. The thesis behind the book is that model-making, role-playing, or other forms of…

  17. Canada's Participation in TIMSS.

    ERIC Educational Resources Information Center

    McConaghy, Tom

    1998-01-01

    In the grade 12 portion of the Third International Mathematics and Science Study, Canadian students performed better than other participating G-8 countries. In fact, Canada scored consistently above the international mean for all three age groups tested. However, some educators and reformers have expressed dissatisfaction with these results. (MLH)

  18. Epilepsy and sports participation.

    PubMed

    Howard, Gregory M; Radloff, Monika; Sevier, Thomas L

    2004-02-01

    Epilepsy is a common disease found in 2% of the population, affecting both young and old. Unfortunately, epileptics have previously been discouraged from participation in physical activity and sports for fear of inducing seizures or increasing seizure frequency. Despite a shift in medical recommendations toward encouraging rather than restricting participation, the stigma remains and epileptics continue to be less active than the general population. This results in increased body mass index, decreased aerobic endurance, poorer self-esteem, and higher levels of anxiety and depression. Although there are rare cases of exercise-induced seizures, studies have shown that physical activity can decrease seizure frequency, as well as lead to improved cardiovascular and psychologic health. The majority of sports are safe for epileptics to participate in with special attention to adequate seizure control, close monitoring of medications, and preparation of family, coaches, or trainers. Contact sports including football, hockey, and soccer have not been shown to induce seizures, and epileptics should not be precluded from participation. Water sports and swimming are felt to be safe if seizures are well controlled and direct supervision is present. Additional care must be taken in sports involving heights such as gymnastics, harnessed rock climbing, or horseback riding. Sports such as hang-gliding, scuba diving, or free climbing are not recommended, given the risk of severe injury or death, if a seizure were to occur during the activity. This article reviews the risks and benefits of physical activity in epileptics, discusses sports in which epileptics may participate, and addresses how to decrease possible risks for injury.

  19. Consumer participation in power market balancing. A real-life step towards smart grids

    NASA Astrophysics Data System (ADS)

    Lund, Per

    2014-09-01

    With the increasing role of wind and solar power, the power balance authorities are facing a big challenge: How to manage the increasing need for fast balancing power brought on by increased penetration of variable and difficult-to-forecast renewable generation? Could more active participation by the residential customers in managing electricity demand be a smart way to go?

  20. The big data-big model (BDBM) challenges in ecological research

    NASA Astrophysics Data System (ADS)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies across the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2, among others, have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedbacks of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from sensors in those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe major processes that underlie complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big-data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  1. Eastbound. Pioneering community participation.

    PubMed

    1997-03-01

    The government of Indonesia has taken steps to foster economic growth in the eastern region of the country by creating a transportation infrastructure. While many parts of the region still lack sea transportation, the rate of economic growth in the region has outstripped that of the nation since 1993. In this region, clove was the major cash crop, but trade regulations have reduced the profit in clove farming. Maluku was once renowned for its fishing, but this industry is now dominated by big investors, and the region depends upon aid for backward villages and upon loans from nongovernmental organizations (NGOs) that are funneled through local self-help groups. The Yayasan Indonesia Sejahtera (YIS) has worked in eastern Indonesia since 1974 and is planning to collect data on its most urgent problems and to fund the income-generation project proposals submitted by NGOs. Among these programs are foundations that provide funds for fishermen to purchase motorized boats, loans to groups of traders, and loans for fishermen and for breeding goats and chickens. Women can also procure loans to sell groceries or for farming activities. The self-help groups of fishermen face obstacles when the weather prohibits fishing, and the farmers have problems obtaining good breeders and combating disease among their livestock. The NGOs combat problems caused by a lack of field workers and by limited funds as well as the obstacles caused by the geography of the region. Members of the self-help groups are improving their knowledge and skills, and the NGOs are improving their management capabilities to deal with these challenges.

  2. The good body: when big is better.

    PubMed

    Cassidy, C M

    1991-09-01

    An important cultural question is, "What is a 'good'--desirable, beautiful, impressive--body?" The answers are legion; here I examine why bigger bodies represent survival skill, and how this power symbolism is embodied by behaviors that guide larger persons toward the top of the social hierarchy. Bigness is a complex concept comprising tallness, boniness, muscularity and fattiness. Data show that most people worldwide want to be big--both tall and fat. Those who achieve the ideal are disproportionately among the society's most socially powerful. In the food-secure West, fascination with power and the body has not waned, but has been redefined such that thinness is desired. This apparent anomaly is resolved by realizing that thinness in the midst of abundance--as long as one is also tall and muscular--still projects the traditional message of power, and brings such social boons as upward mobility. PMID:1961102

  3. Big Crunch-based omnidirectional light concentrators

    NASA Astrophysics Data System (ADS)

    Smolyaninov, Igor I.; Hung, Yu-Ju

    2014-12-01

    Omnidirectional light concentration remains an unsolved problem despite such important practical applications as the design of efficient mobile photovoltaic cells. Recently developed optical black hole designs offer partial solutions to this problem. However, even these solutions are not truly omnidirectional since they do not exhibit a horizon, and at large enough incidence angles the light may be trapped into quasi-stationary orbits around such imperfect optical black holes. Here, we propose and realize experimentally another gravity-inspired design of a broadband omnidirectional light concentrator based on the cosmological Big Crunch solutions. By mimicking the Big Crunch spacetime via a corresponding effective optical metric, we make sure that every photon world line terminates in a single point.

  4. Spectral Observations of BIG Objects. III

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.

    2004-07-01

    Results of spectral observations of 66 objects from the BIG (Byurakan IRAS Galaxies) sample made with the 1.93 m telescope at the Observatoire de Haute Provence (OHP, France) are presented. Emission lines are observed from 64 of the galaxies. The redshifts are determined; the radial velocities, distances, and absolute stellar magnitudes are calculated; the spectrum line parameters are determined; diagnostic diagrams are constructed; the objects are classified according to activity type; and their IR and far-IR luminosities are calculated. Of the 66 objects (corresponding to 61 IRAS sources), 6 are Sy2, 2 are LINERs, 8 are AGN (Sy2 or LINER), 10 are composite, 34 are HII, and 4 are Em of undetermined type. It is found that IRAS 07479+7832 = BIG d141a is an ultraluminous IR galaxy (ULIG), and 21 are LIGs.

  5. FAST TRACK COMMUNICATION: Big Bounce and inhomogeneities

    NASA Astrophysics Data System (ADS)

    Brizuela, David; Mena Marugán, Guillermo A.; Pawłowski, Tomasz

    2010-03-01

    The dynamics of an inhomogeneous universe is studied with the methods of loop quantum cosmology, via a so-called hybrid quantization, as an example of the quantization of vacuum cosmological spacetimes containing gravitational waves (Gowdy spacetimes). The analysis of this model with an infinite number of degrees of freedom, performed at the effective level, shows that (i) the initial Big Bang singularity is replaced (as in the case of homogeneous cosmological models) by a Big Bounce, joining deterministically two large universes, (ii) the universe size at the bounce is at least of the same order of magnitude as that of the background homogeneous universe and (iii) for each gravitational wave mode, the difference in amplitude at very early and very late times has a vanishing statistical average when the bounce dynamics is strongly dominated by the inhomogeneities, whereas this average is positive when the dynamics is in a near-vacuum regime, so that statistically the inhomogeneities are amplified.

  6. Big Bend National Park, TX, USA, Mexico

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Sierra del Carmen of Mexico, across the Rio Grande River from Big Bend National Park, TX, (28.5N, 104.0W) is centered in this photo. The Rio Grande River bisects the scene; Mexico to the east, USA to the west. The thousand ft. Boquillas limestone cliff on the Mexican side of the river changes colors from white to pink to lavender at sunset. This severely eroded sedimentary landscape was once an ancient seabed later overlaid with volcanic activity.

  7. Big Data Challenges for Large Radio Arrays

    NASA Technical Reports Server (NTRS)

    Jones, Dayton L.; Wagstaff, Kiri; Thompson, David; D'Addario, Larry; Navarro, Robert; Mattmann, Chris; Majid, Walid; Lazio, Joseph; Preston, Robert; Rebbapragada, Umaa

    2012-01-01

    Future large radio astronomy arrays, particularly the Square Kilometre Array (SKA), will be able to generate data at rates far higher than can be analyzed or stored affordably with current practices. This is, by definition, a "big data" problem, and requires an end-to-end solution if future radio arrays are to reach their full scientific potential. Similar data processing, transport, storage, and management challenges face next-generation facilities in many other fields.

  8. Funding big research with small money.

    PubMed

    Hickey, Joanne V; Koithan, Mary; Unruh, Lynn; Lundmark, Vicki

    2014-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the guest authors introduce crowdsourcing as a strategy for funding big research with small money. PMID:24853791

  9. Dark radiation emerging after big bang nucleosynthesis?

    SciTech Connect

    Fischler, Willy; Meyers, Joel

    2011-03-15

    We show how recent data from observations of the cosmic microwave background may suggest the presence of additional radiation density which appeared after big bang nucleosynthesis. We propose a general scheme by which this radiation could be produced from the decay of nonrelativistic matter, we place constraints on the properties of such matter, and we give specific examples of scenarios in which this general scheme may be realized.

  11. Can big business save health care?

    PubMed

    Dunn, Philip

    2007-01-01

    Corporate America has decided to stop bellyaching about the cost and quality of the health care it helps buy for its employees. Now it's taking concrete action. Large employers such as Wal-Mart, Oracle, Cisco, BP America and many, many others are pressuring providers to meet performance standards, adopt information technology and transform the efficiency of their operations. Big Business wants value for its buck, and it's now putting money where its mouth is.

  13. Temperament and Character Inventory-R (TCI-R) and Big Five Questionnaire (BFQ): convergence and divergence.

    PubMed

    Capanna, Cristina; Struglia, Francesca; Riccardi, Ilaria; Daneluzzo, Enrico; Stratta, Paolo; Rossi, Alessandro

    2012-06-01

    This study evaluated the correspondence between measures of two competing theories of personality, the five-factor model as measured by the Big Five Questionnaire (BFQ), and Cloninger's psychobiological theory measured by the Temperament and Character Inventory-Revised (TCI-R). A sample of 900 Italian participants, balanced with respect to sex (393 men and 507 women), and representative of the adult population with respect to age (range 18 to 70 years; M = 39.6, SD = 15.7) completed the TCI-R and the Big Five Questionnaire. All TCI-R personality dimensions except Self-Transcendence were moderately correlated with one or more of the Big Five dimensions (from r = .40 to .61), and the two instruments showed areas of convergence. However, the differences outweighed the similarities, indicating that these current conceptualizations and measures of personality are somewhat inconsistent with each other.

  14. Body image and personality among British men: associations between the Big Five personality domains, drive for muscularity, and body appreciation.

    PubMed

    Benford, Karis; Swami, Viren

    2014-09-01

    The present study examined associations between the Big Five personality domains and measures of men's body image. A total of 509 men from the community in London, UK, completed measures of drive for muscularity, body appreciation, the Big Five domains, and subjective social status, and provided their demographic details. The results of a hierarchical regression showed that, once the effects of participant body mass index (BMI) and subjective social status had been accounted for, men's drive for muscularity was significantly predicted by Neuroticism (β=.29). In addition, taking into account the effects of BMI and subjective social status, men's body appreciation was significantly predicted by Neuroticism (β=-.35) and Extraversion (β=.12). These findings highlight potential avenues for the development of intervention approaches based on the relationship between the Big Five personality traits and body image.

  15. Big Data Analytics for Prostate Radiotherapy.

    PubMed

    Coates, James; Souhami, Luis; El Naqa, Issam

    2016-01-01

    Radiation therapy is a first-line treatment option for localized prostate cancer, and radiation-induced normal tissue damage is often the main limiting factor for modern radiotherapy regimens. Conversely, under-dosing of target volumes in an attempt to spare adjacent healthy tissues limits the likelihood of achieving local, long-term control. Thus, the ability to generate personalized, data-driven risk profiles for radiotherapy outcomes would provide valuable prognostic information to help guide both clinicians and patients alike. Big data applied to radiation oncology promises to deliver a better understanding of outcomes by harvesting and integrating heterogeneous data types, including patient-specific clinical parameters, treatment-related dose-volume metrics, and biological risk factors. When taken together, such variables make up the basis for a multi-dimensional space (the "RadoncSpace") in which the presented modeling techniques search in order to identify significant predictors. Herein, we review outcome modeling and big data-mining techniques for both tumor control and radiotherapy-induced normal tissue effects. We apply many of the presented modeling approaches to a cohort of hypofractionated prostate cancer patients, taking into account different data types and a large heterogeneous mix of physical and biological parameters. Cross-validation techniques are also reviewed for refining the proposed framework architecture and checking individual model performance. We conclude by considering advanced modeling techniques that borrow concepts from big data analytics, such as machine learning and artificial intelligence, before discussing the potential future impact of systems radiobiology approaches.
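
    The cross-validation step the review highlights can be sketched generically. The toy "outcome model" below (a single dose cutoff predicting a binary toxicity label) and every number in it are invented for illustration; the review's actual models are far richer.

    ```python
    import random

    def k_fold_indices(n, k, rng):
        # Shuffle indices 0..n-1 and deal them into k roughly equal folds.
        idx = list(range(n))
        rng.shuffle(idx)
        return [idx[i::k] for i in range(k)]

    def fit_threshold(doses, toxic):
        # Toy outcome model: choose the dose cutoff that best separates
        # toxicity (True) from no toxicity (False) on the training data.
        best_t, best_acc = 0.0, -1.0
        for t in sorted(set(doses)):
            acc = sum((d >= t) == y for d, y in zip(doses, toxic)) / len(doses)
            if acc > best_acc:
                best_t, best_acc = t, acc
        return best_t

    def cv_accuracy(doses, toxic, k, seed):
        # k-fold cross-validation: fit on k-1 folds, score on the held-out fold.
        rng = random.Random(seed)
        folds = k_fold_indices(len(doses), k, rng)
        scores = []
        for held_out in folds:
            train = [j for f in folds if f is not held_out for j in f]
            t = fit_threshold([doses[j] for j in train], [toxic[j] for j in train])
            hits = sum((doses[j] >= t) == toxic[j] for j in held_out)
            scores.append(hits / len(held_out))
        return sum(scores) / len(scores)

    # Synthetic cohort: toxicity becomes common above ~65 Gy (numbers are made up).
    doses = [50, 52, 55, 58, 60, 62, 66, 68, 70, 72, 75, 78]
    toxic = [False] * 6 + [True] * 6
    print(cv_accuracy(doses, toxic, k=4, seed=1))
    ```

    The held-out score estimates how the fitted cutoff generalizes, which is the point of the cross-validation check the review describes.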

  16. Statistical methods and computing for big data

    PubMed Central

    Wang, Chun; Chen, Ming-Hui; Schifano, Elizabeth; Wu, Jing

    2016-01-01

    Big data are data on a massive scale in terms of volume, intensity, and complexity that exceed the capacity of standard analytic tools. They present opportunities as well as challenges to statisticians. The role of computational statisticians in scientific discovery from big data analyses has been under-recognized even by peer statisticians. This article summarizes recent methodological and software developments in statistics that address the big data challenges. Methodologies are grouped into three classes: subsampling-based, divide-and-conquer, and online updating for stream data. As a new contribution, the online updating approach is extended to variable selection with commonly used criteria, and their performances are assessed in a simulation study with stream data. Software packages are summarized with a focus on open-source R and its packages, covering recent tools that help break the barriers of computer memory and computing power. Some of the tools are illustrated in a case study with a logistic regression for the chance of airline delay. PMID:27695593
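
    The "online updating for stream data" class can be illustrated with the simplest streaming estimator: a stochastic-gradient fit of a logistic regression, where each record updates the coefficients once and is then discarded. This is a generic sketch with simulated data, not the specific cumulative updating estimator the article extends.

    ```python
    import math
    import random

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    def online_update(beta, x, y, lr):
        # One stochastic-gradient step for logistic regression:
        # beta <- beta + lr * (y - p) * x, where p = sigmoid(beta . x).
        p = sigmoid(sum(b * xi for b, xi in zip(beta, x)))
        return [b + lr * (y - p) * xi for b, xi in zip(beta, x)]

    rng = random.Random(0)
    true_beta = [1.5, -2.0]        # generating coefficients (intercept, slope)
    beta = [0.0, 0.0]              # running estimate

    # Simulate a stream: each record is seen once, used to update beta, then dropped,
    # so memory use stays constant no matter how long the stream runs.
    for t in range(30000):
        x = [1.0, rng.uniform(-1, 1)]
        p_true = sigmoid(sum(b * xi for b, xi in zip(true_beta, x)))
        y = 1.0 if rng.random() < p_true else 0.0
        beta = online_update(beta, x, y, lr=20.0 / (100.0 + t))  # decaying step size

    print([round(b, 2) for b in beta])  # drifts toward the generating coefficients
    ```

    The decaying step size is the standard Robbins-Monro choice; more refined online estimators (including the article's) track additional summary statistics to recover near-MLE efficiency.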

  17. Vertical landscraping, a big regionalism for Dubai.

    PubMed

    Wilson, Matthew

    2010-01-01

    Dubai's ecologic and economic complications are exacerbated by six years of accelerated expansion, a fixed top-down approach to urbanism and the construction of iconic single-phase mega-projects. With recent construction delays, project cancellations and growing landscape issues, Dubai's tower typologies have been unresponsive to changing environmental, socio-cultural and economic patterns (BBC, 2009; Gillet, 2009; Lewis, 2009). In this essay, a theory of "Big Regionalism" guides an argument for an economically and ecologically linked tower typology called the Condenser. This phased "box-to-tower" typology is part of a greater Landscape Urbanist strategy called Vertical Landscraping. Within this strategy, the Condenser's role is to densify the city, facilitating the creation of ecologic voids that order the urban region. Delineating "Big Regional" principles, the Condenser provides a time-based, global-local urban growth approach that weaves Bigness into a series of urban-regional, economic and ecological relationships, builds upon the environmental performance of the city's regional architecture and planning, promotes a continuity of Dubai's urban history, and responds to its landscape issues while condensing development. These speculations permit consideration of the overlooked opportunities embedded within Dubai's mega-projects and their long-term impact on the urban morphology. PMID:21132951

  18. Adapting bioinformatics curricula for big data

    PubMed Central

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  1. Bohmian quantization of the big rip

    NASA Astrophysics Data System (ADS)

    Pinto-Neto, Nelson; Pantoja, Diego Moraes

    2009-10-01

    It is shown in this paper that minisuperspace quantization of homogeneous and isotropic geometries with phantom scalar fields, when examined in the light of the Bohm-de Broglie interpretation of quantum mechanics, does not eliminate, in general, the classical big rip singularity present in the classical model. For some values of the Hamilton-Jacobi separation constant present in a class of quantum state solutions of the Wheeler-DeWitt equation, the big rip can be either completely eliminated or may still constitute a future attractor for all expanding solutions. This is contrary to the conclusion presented in [M. P. Dabrowski, C. Kiefer, and B. Sandhofer, Phys. Rev. D 74, 044022 (2006); doi:10.1103/PhysRevD.74.044022], using a different interpretation of the wave function, where the big rip singularity is completely eliminated ("smoothed out") through quantization, independently of such a separation constant and for all members of the above-mentioned class of solutions. This is an example of the very peculiar situation where different interpretations of the same quantum state of a system predict different physical facts, instead of just giving different descriptions of the same observable facts: in fact, there is nothing more observable than the fate of the whole Universe.

  2. Abraham Pais Prize for History of Physics Lecture: Big, Bigger, Too Big? From Los Alamos to Fermilab and the SSC

    NASA Astrophysics Data System (ADS)

    Hoddeson, Lillian

    2012-03-01

    The modern era of big science emerged during World War II. Oppenheimer's Los Alamos laboratory offered the quintessential model of a government-funded, mission-oriented facility directed by a strong charismatic leader. The postwar beneficiaries of this model included the increasingly ambitious large laboratories that participated in particle physics, in particular Brookhaven, SLAC, and Fermilab. They carried the big science they practiced into a new realm where experiments eventually became as large and costly as entire laboratories had been. Meanwhile the available funding grew more limited, causing physics research to be concentrated into fewer and bigger experiments that appeared never to end. The next phase in American high-energy physics was the Superconducting Super Collider, the most costly pure physics project ever attempted. The SSC's termination was a tragedy for American science, but for historians it offers an opportunity to understand what made the success of earlier large high-energy physics laboratories possible, and what made the continuation of the SSC impossible. The most obvious reason for the SSC's failure was its enormous and escalating budget, which Congress would no longer support. Other factors need to be recognized, however: no leader could be found with directing skills as strong as those of Wilson, Panofsky, Lederman, or Richter; the scale of the project subjected it to uncomfortable public and Congressional scrutiny; and the DOE's enforcement of management procedures of the military-industrial complex, which clashed with those typical of the scientific community, led to the alienation and withdrawal of many of the most creative scientists, and to the perception and the reality of poor management. These factors, exacerbated by negative pressure from scientists in other fields and a post-Cold War climate in which physicists had little of their earlier cultural prestige, discouraged efforts to gain international support. They made the SSC

  3. Big Five personality traits: are they really important for the subjective well-being of Indians?

    PubMed

    Tanksale, Deepa

    2015-02-01

    This study empirically examined the relationship between the Big Five personality traits and subjective well-being (SWB) in India. SWB variables used were life satisfaction, positive affect and negative affect. A total of 183 participants in the age range 30-40 years from Pune, India, completed the personality and SWB measures. Backward stepwise regression analysis showed that the Big Five traits accounted for 17% of the variance in life satisfaction, 35% variance in positive affect and 28% variance in negative affect. Conscientiousness emerged as the strongest predictor of life satisfaction. In line with the earlier research findings, neuroticism and extraversion were found to predict negative affect and positive affect, respectively. Neither openness to experience nor agreeableness contributed to SWB. The research emphasises the need to revisit the association between personality and SWB across different cultures, especially non-western cultures.

  4. Big Data, Big Problems: Incorporating Mission, Values, and Culture in Provider Affiliations.

    PubMed

    Shaha, Steven H; Sayeed, Zain; Anoushiravani, Afshin A; El-Othmani, Mouhanad M; Saleh, Khaled J

    2016-10-01

    This article explores how integration of data from clinical registries and electronic health records produces a quality impact within orthopedic practices. Data are differentiated from information, and several types of data that are collected and used in orthopedic outcome measurement are defined. Furthermore, the concept of comparative effectiveness and its impact on orthopedic clinical research are assessed. This article places emphasis on how the concept of big data produces health care challenges balanced with benefits that may be faced by patients and orthopedic surgeons. Finally, essential characteristics of an electronic health record that interlinks musculoskeletal care and big data initiatives are reviewed.

  6. Community-Academic Partnership Participation.

    PubMed

    Meza, Rosemary; Drahota, Amy; Spurgeon, Emily

    2016-10-01

    Community-academic partnerships (CAPs) improve the research process and its outcomes, and yield benefits for both the community and researchers. This exploratory study examined factors important in community stakeholders' decision to participate in CAPs. Autism spectrum disorder (ASD) community stakeholders previously contacted to participate in a CAP (n = 18) completed the 15-item Decision to Participate Questionnaire (DPQ). The DPQ assessed reasons for participating or declining to participate in the ASD CAP. CAP participants rated networking with other providers, fit of the collaboration with agency philosophy, and the opportunity for future training/consultations as more important in their decision to participate in the ASD CAP than nonparticipants did. Nonparticipants rated the number of requests to participate in research as more important in their decision to decline than participants did. Findings reveal important factors in community stakeholders' decision to participate in CAPs that may provide guidance on increasing community engagement in CAPs and help close the science-to-service gap.

  7. [Women's participation in science].

    PubMed

    Sánchez-Guzmán, María Alejandra; Corona-Vázquez, Teresa

    2009-01-01

    The participation of women in higher education in Mexico began in the late 19th and early 20th century. The rise in women's enrollment in universities, known as the "feminization of enrollment," occurred in the last thirty years. In this review we analyze how the new conditions that facilitated better access to higher education are reflected in the inclusion of women in science. We include an overview of the issues associated with the change in the demographics of enrollment, the segregation of academic areas between men and women, and participation in postgraduate degrees. We also review the proportion of women in science. While in higher education the ratio between men and women is almost 50-50, and in some areas the presence of women is even higher, in the field of scientific research women account for barely 30% of professionals. This is largely due to structural conditions that limit the access of women to higher positions of power, which have been predominantly held by men.

  8. Data management by using R: big data clinical research series.

    PubMed

    Zhang, Zhongheng

    2015-11-01

    Electronic medical record (EMR) systems are widely used in clinical practice. By replacing traditional handwritten records, the EMR makes big data clinical research feasible. The most important feature of big data research is its real-world setting; furthermore, it can provide information on all aspects of healthcare. However, big data research requires data management skills that are rarely covered in medical curricula, which greatly hinders doctors from testing their clinical hypotheses against EMR data. To bridge this gap, a series of articles introducing data management techniques has been put forward to guide clinicians toward big data clinical research. This educational article first introduces some basic knowledge of the R language, followed by data management skills for creating, recoding, and renaming variables. These are very basic skills that may be used in every big data research project.
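
    The article's examples are in R; as a rough analogue, the same three skills (creating, recoding, and renaming variables) can be sketched in Python with pandas. The toy EMR extract and all column names below are invented for illustration.

    ```python
    import pandas as pd

    # Toy EMR extract; every column name here is invented.
    emr = pd.DataFrame({
        "pat_id": [1, 2, 3, 4],
        "age":    [34, 61, 47, 72],
        "sbp":    [118, 152, 139, 161],   # systolic blood pressure, mmHg
    })

    # 1. Create a new variable derived from an existing one.
    emr["elderly"] = emr["age"] >= 65

    # 2. Recode a continuous variable into clinical categories.
    emr["bp_cat"] = pd.cut(emr["sbp"], bins=[0, 120, 140, 999],
                           labels=["normal", "elevated", "high"])

    # 3. Rename variables to clearer names.
    emr = emr.rename(columns={"pat_id": "patient_id", "sbp": "systolic_bp"})

    print(emr)
    ```

    The R equivalents discussed in this series (`transform`, `cut`, and `names<-`/`rename`) follow the same create/recode/rename pattern.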

  10. Relations of the Big-Five personality dimensions to autodestructive behavior in clinical and non-clinical adolescent populations

    PubMed Central

    Kotrla Topić, Marina; Perković Kovačević, Marina; Mlačić, Boris

    2012-01-01

    Aim To examine the relationship between the Big-Five personality model and autodestructive behavior symptoms, namely Autodestructiveness and Suicidal Depression, in two groups of participants: clinical and non-clinical adolescents. Methods Two groups of participants, clinical (adolescents with a diagnosis of psychiatric disorder based on clinical impression and valid diagnostic criteria, N = 92) and non-clinical (high-school students, N = 87), completed two sets of questionnaires: the Autodestructiveness Scale, which provided data on Autodestructiveness and Suicidal Depression, and the International Personality Item Pool (IPIP), which provided data on the Big-Five personality dimensions. Results The clinical group showed significantly higher values than the non-clinical group on the Autodestructiveness scale in general, as well as on the Suicidal Depression, Aggressiveness, and Borderline subscales. Some of the dimensions of the Big-Five personality model, ie, Emotional Stability, Conscientiousness, and Agreeableness, showed a significant relationship (hierarchical regression analyses, P values for β coefficients from <0.001 to 0.021) with Autodestructiveness and Suicidal Depression, even after controlling for sex and group effects or, when analyzing Suicidal Depression, after controlling for the effect of the other subscales. Conclusion The results indicate that the dimensions of the Big-Five model are important when evaluating adolescent psychiatric patients and adolescents from the general population at risk of self-destructive behavior. PMID:23100207

  11. Age-related trends of inhibitory control in Stroop-like big-small task in 3 to 12-year-old children and young adults.

    PubMed

    Ikeda, Yoshifumi; Okuzumi, Hideyuki; Kokubun, Mitsuru

    2014-01-01

    Inhibitory control is the ability to suppress competing, dominant, automatic, or prepotent cognitive processing at the perceptual, intermediate, and output stages. It is a key cognitive function in typical and atypical child development. This study examined age-related trends in Stroop-like interference in 3- to 12-year-old children and young adults by administering a computerized Stroop-like big-small task with a reduced working memory demand. The task used a set of pictures displaying a big and a small circle in black and included a same condition and an opposite condition. In the same condition, each participant was instructed to say "big" when viewing the big circle and "small" when viewing the small circle. In the opposite condition, each participant was instructed to say "small" when viewing the big circle and "big" when viewing the small circle. The opposite condition thus required participants to inhibit the prepotent response of saying the same word, a familiar response to a perceptual stimulus. The results showed that Stroop-like interference decreased markedly in children in terms of error rates and correct response times. Performance did not deteriorate between the early and the late trials within sessions, as has been reported for the day-night task, and the pretest failure rate was relatively low. The Stroop-like big-small task is therefore a useful tool for assessing the development of inhibitory control in young children, in that it is easy to understand and places a small demand on working memory.

  13. Observational hints on the Big Bounce

    SciTech Connect

    Mielczarek, Jakub; Kurek, Aleksandra; Szydłowski, Marek; Kamionka, Michał

    2010-07-01

    In this paper we study possible observational consequences of bouncing cosmology. We consider a model where a phase of inflation is preceded by a cosmic bounce. While in this paper we attribute the bounce to loop quantum gravity, most of the results presented here apply to other bouncing cosmologies as well. We concentrate on the scenario where the scalar field, as a result of the contraction of the universe, is driven from the bottom of the potential well. The field is amplified, and finally the phase of standard slow-roll inflation is realized. Such an evolution modifies the standard inflationary spectrum of perturbations by additional oscillations and damping on large scales. We extract the parameters of the model from observations of the cosmic microwave background radiation. In particular, the value of the inflaton mass is m = (1.7±0.6)×10^13 GeV. Our considerations are based on the seven years of observations made by the WMAP satellite. We propose a new observational consistency check for the phase of slow-roll inflation. We investigate the conditions that have to be fulfilled to make observations of Big Bounce effects possible, translate them into requirements on the parameters of the model, and then put observational constraints on the model. Based on an assumption usually made in loop quantum cosmology, the Barbero-Immirzi parameter is constrained to γ < 1100 by the cosmological observations. We have compared the Big Bounce model with the standard Big Bang scenario and show that the present observational data are not informative enough to distinguish between these models.

  14. Big Impacts and Transient Oceans on Titan

    NASA Technical Reports Server (NTRS)

    Zahnle, K. J.; Korycansky, D. G.; Nixon, C. A.

    2014-01-01

    We have studied the thermal consequences of very big impacts on Titan [1]. Titan's thick atmosphere and volatile-rich surface cause it to respond to big impacts in a somewhat Earth-like manner. Here we construct a simple globally-averaged model that tracks the flow of energy through the environment in the weeks, years, and millenia after a big comet strikes Titan. The model Titan is endowed with 1.4 bars of N2 and 0.07 bars of CH4, methane lakes, a water ice crust, and enough methane underground to saturate the regolith to the surface. We assume that half of the impact energy is immediately available to the atmosphere and surface while the other half is buried at the site of the crater and is unavailable on time scales of interest. The atmosphere and surface are treated as isothermal. We make the simplifying assumptions that the crust is everywhere as methane saturated as it was at the Huygens landing site, that the concentration of methane in the regolith is the same as it is at the surface, and that the crust is made of water ice. Heat flow into and out of the crust is approximated by step-functions. If the impact is great enough, ice melts. The meltwater oceans cool to the atmosphere conductively through an ice lid while at the base melting their way into the interior, driven down in part through Rayleigh-Taylor instabilities between the dense water and the warm ice. Topography, CO2, and hydrocarbons other than methane are ignored. Methane and ethane clathrate hydrates are discussed quantitatively but not fully incorporated into the model.

  16. Differential Privacy Preserving in Big Data Analytics for Connected Health.

    PubMed

    Lin, Chi; Song, Zihao; Song, Houbing; Zhou, Yanhong; Wang, Yi; Wu, Guowei

    2016-04-01

    In Body Area Networks (BANs), the big data collected by wearable sensors usually contain sensitive information, which must be appropriately protected. Previous methods neglected the privacy protection issue, leading to privacy exposure. In this paper, a differential privacy protection scheme for big data in body sensor networks is developed. Compared with previous methods, this scheme provides privacy protection with higher availability and reliability. We introduce the concept of dynamic noise thresholds, which makes our scheme more suitable for processing big data. Experimental results demonstrate that, even when the attacker has full background knowledge, the proposed scheme can still add enough interference to sensitive big data to preserve privacy.
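
    The paper's dynamic-noise-threshold scheme is not specified in the abstract, but the standard differential-privacy building block it extends, the Laplace mechanism, can be sketched. The function names below (`laplace_noise`, `dp_count`) are hypothetical:

```python
import math
import random

def laplace_noise(scale, rng):
    """Draw one Laplace(0, scale) sample via the inverse-CDF transform."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(true_count, epsilon, sensitivity=1.0, rng=None):
    """Release a count perturbed to satisfy epsilon-differential privacy.

    A counting query has sensitivity 1 (one person changes the count by
    at most 1), so noise drawn from Laplace(0, sensitivity/epsilon)
    suffices for the epsilon-DP guarantee.
    """
    rng = rng or random.Random()
    return true_count + laplace_noise(sensitivity / epsilon, rng)
```

    Smaller `epsilon` means stronger privacy and noisier releases; a dynamic threshold scheme would, roughly speaking, adapt this noise level to the data stream rather than fixing it in advance.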

  17. 'Big data' in pharmaceutical science: challenges and opportunities.

    PubMed

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science. PMID:24962278

  18. Probing the Big Bang with LEP

    NASA Technical Reports Server (NTRS)

    Schramm, David N.

    1990-01-01

    It is shown that LEP probes the Big Bang in two significant ways: (1) nucleosynthesis, and (2) dark matter constraints. In the first case, LEP verifies the cosmological standard model prediction on the number of neutrino types, thus strengthening the conclusion that the cosmological baryon density is approximately 6 percent of the critical value. In the second case, LEP shows that the remaining non-baryonic cosmological matter must be somewhat more massive and/or more weakly interacting than the favorite non-baryonic dark matter candidates of a few years ago.

  19. The New Big Science at the NSLS

    NASA Astrophysics Data System (ADS)

    Crease, Robert

    2016-03-01

    The term ``New Big Science'' refers to a phase shift in the kind of large-scale science that was carried out throughout the U.S. National Laboratory system, when large-scale materials science accelerators rather than high-energy physics accelerators became marquee projects at most major basic research laboratories in the post-Cold War era, accompanied by important changes in the character and culture of the research ecosystem at these laboratories. This talk explores some aspects of this phase shift at BNL's National Synchrotron Light Source.

  20. The Next Big Thing - Eric Haseltine

    ScienceCinema

    Eric Haseltine

    2016-07-12

    Eric Haseltine, Haseltine Partners president and former chief of Walt Disney Imagineering, presented "The Next Big Thing," on Sept. 11, at the ORNL. He described the four "early warning signs" that a scientific breakthrough is imminent, and then suggested practical ways to turn these insights into breakthrough innovations. Haseltine is former director of research at the National Security Agency and associate director for science and technology for the director of National Intelligence, former executive vice president of Walt Disney Imagineering and director of engineering for Hughes Aircraft. He has 15 patents in optics, special effects and electronic media, and more than 100 publications in science and technical journals, the web and Discover Magazine.

  1. np -> d gamma for big bang nucleosynthesis

    SciTech Connect

    Jiunn-Wei Chen; Martin J. Savage

    1999-12-01

    The cross section for np -> d gamma is calculated at energies relevant to big-bang nucleosynthesis using the recently developed effective field theory that describes the two-nucleon sector. The E1 amplitude is computed up to N{sup 3}LO and depends only upon nucleon-nucleon phase shift data. In contrast, the M1 contribution is determined by the cross section for cold neutron capture. The uncertainty in the calculation for nucleon energies up to E{approx}1 MeV is estimated to be <= 4%.

  2. The big unknown: plant virus biodiversity.

    PubMed

    Roossinck, Marilyn J

    2011-07-01

    Studies on plant virus biodiversity are in their infancy, but with new technologies we can expect to see more information about novel plant viruses in the near future. The challenge for virus biodiversity work is that viruses do not have any universal coding sequence, such as ribosomal RNAs found in all cellular life. These obstacles are being overcome in clever ways. Understanding what exists in our natural environment will help us to tackle big issues in agriculture, such as disease emergence and the use of beneficial viruses and other microbes.

  3. The Big Bang and Cosmic Inflation

    NASA Astrophysics Data System (ADS)

    Guth, Alan H.

    2014-03-01

    A summary is given of the key developments of cosmology in the 20th century, from the work of Albert Einstein to the emergence of the generally accepted hot big bang model. The successes of this model are reviewed, but emphasis is placed on the questions that the model leaves unanswered. The remainder of the paper describes the inflationary universe model, which provides plausible answers to a number of these questions. It also offers a possible explanation for the origin of essentially all the matter and energy in the observed universe.

  4. Nuclear Receptors, RXR, and the Big Bang.

    PubMed

    Evans, Ronald M; Mangelsdorf, David J

    2014-03-27

    Isolation of genes encoding the receptors for steroids, retinoids, vitamin D, and thyroid hormone and their structural and functional analysis revealed an evolutionarily conserved template for nuclear hormone receptors. This discovery sparked identification of numerous genes encoding related proteins, termed orphan receptors. Characterization of these orphan receptors and, in particular, of the retinoid X receptor (RXR) positioned nuclear receptors at the epicenter of the "Big Bang" of molecular endocrinology. This Review provides a personal perspective on nuclear receptors and explores their integrated and coordinated signaling networks that are essential for multicellular life, highlighting the RXR heterodimer and its associated ligands and transcriptional mechanism.

  5. Nuclear Receptors, RXR & the Big Bang

    PubMed Central

    Evans, Ronald M.; Mangelsdorf, David J.

    2014-01-01

    Summary Isolation of genes encoding the receptors for steroids, retinoids, vitamin D and thyroid hormone, and their structural and functional analysis revealed an evolutionarily conserved template for nuclear hormone receptors. This discovery sparked identification of numerous genes encoding related proteins, termed orphan receptors. Characterization of these orphan receptors, and in particular of the retinoid X receptor (RXR), positioned nuclear receptors at the epicenter of the “Big Bang” of molecular endocrinology. This review provides a personal perspective on nuclear receptors and explores their integrated and coordinated signaling networks that are essential for multi-cellular life, highlighting the RXR heterodimer and its associated ligands and transcriptional mechanism. PMID:24679540

  6. Nuclear Receptors, RXR, and the Big Bang.

    PubMed

    Evans, Ronald M; Mangelsdorf, David J

    2014-03-27

    Isolation of genes encoding the receptors for steroids, retinoids, vitamin D, and thyroid hormone and their structural and functional analysis revealed an evolutionarily conserved template for nuclear hormone receptors. This discovery sparked identification of numerous genes encoding related proteins, termed orphan receptors. Characterization of these orphan receptors and, in particular, of the retinoid X receptor (RXR) positioned nuclear receptors at the epicenter of the "Big Bang" of molecular endocrinology. This Review provides a personal perspective on nuclear receptors and explores their integrated and coordinated signaling networks that are essential for multicellular life, highlighting the RXR heterodimer and its associated ligands and transcriptional mechanism. PMID:24679540

  7. Pre - big bang inflation requires fine tuning

    SciTech Connect

    Turner, Michael S.; Weinberg, Erick J.

    1997-10-01

    The pre-big-bang cosmology inspired by superstring theories has been suggested as an alternative to slow-roll inflation. We analyze, in both the Jordan and Einstein frames, the effect of spatial curvature on this scenario and show that too much curvature --- of either sign --- reduces the duration of the inflationary era to such an extent that the flatness and horizon problems are not solved. Hence, a fine-tuning of initial conditions is required to obtain enough inflation to solve the cosmological problems.

  8. Ocean Networks Canada's "Big Data" Initiative

    NASA Astrophysics Data System (ADS)

    Dewey, R. K.; Hoeberechts, M.; Moran, K.; Pirenne, B.; Owens, D.

    2013-12-01

    Ocean Networks Canada operates two large undersea observatories that collect, archive, and deliver data in real time over the Internet. These data contribute to our understanding of the complex changes taking place on our ocean planet. Ocean Networks Canada's VENUS was the world's first cabled seafloor observatory to enable researchers anywhere to connect in real time to undersea experiments and observations. Its NEPTUNE observatory is the largest cabled ocean observatory, spanning a wide range of ocean environments. Most recently, we installed a new small observatory in the Arctic. Together, these observatories deliver "Big Data" across many disciplines in a cohesive manner using the Oceans 2.0 data management and archiving system that provides national and international users with open access to real-time and archived data while also supporting a collaborative work environment. Ocean Networks Canada operates these observatories to support science, innovation, and learning in four priority areas: study of the impact of climate change on the ocean; the exploration and understanding the unique life forms in the extreme environments of the deep ocean and below the seafloor; the exchange of heat, fluids, and gases that move throughout the ocean and atmosphere; and the dynamics of earthquakes, tsunamis, and undersea landslides. To date, the Ocean Networks Canada archive contains over 130 TB (collected over 7 years) and the current rate of data acquisition is ~50 TB per year. This data set is complex and diverse. Making these "Big Data" accessible and attractive to users is our priority. In this presentation, we share our experience as a "Big Data" institution where we deliver simple and multi-dimensional calibrated data cubes to a diverse pool of users. Ocean Networks Canada also conducts extensive user testing. Test results guide future tool design and development of "Big Data" products. We strive to bridge the gap between the raw, archived data and the needs and

  9. The Next Big Thing - Eric Haseltine

    SciTech Connect

    Eric Haseltine

    2009-09-16

    Eric Haseltine, Haseltine Partners president and former chief of Walt Disney Imagineering, presented "The Next Big Thing," on Sept. 11, at the ORNL. He described the four "early warning signs" that a scientific breakthrough is imminent, and then suggested practical ways to turn these insights into breakthrough innovations. Haseltine is former director of research at the National Security Agency and associate director for science and technology for the director of National Intelligence, former executive vice president of Walt Disney Imagineering and director of engineering for Hughes Aircraft. He has 15 patents in optics, special effects and electronic media, and more than 100 publications in science and technical journals, the web and Discover Magazine.

  10. Probing the Big Bang with LEP

    SciTech Connect

    Schramm, D.N. (Fermi National Accelerator Lab., Batavia, IL)

    1990-06-01

    It is shown that LEP probes the Big Bang in two significant ways: (1) nucleosynthesis and (2) dark matter constraints. In the first case, LEP verifies the cosmological standard model prediction on the number of neutrino types, thus strengthening the conclusion that the cosmological baryon density is {approximately}6% of the critical value. In the second case, LEP shows that the remaining non-baryonic cosmological matter must be somewhat more massive and/or more weakly interacting than the favorite non-baryonic dark matter candidates of a few years ago. 59 refs., 4 figs., 2 tabs.

  11. Youth, literacy and participation

    NASA Astrophysics Data System (ADS)

    Gillette, Arthur

    1985-12-01

    The number of illiterates in the world continues to grow. Simultaneously, there are few if any literacy efforts in the world today that do not depend upon the energies and skills (and sometimes ideas) of young people. Youth's participation in the provision of literacy, in some industrialized as well as in many developing countries, is classified according to three patterns: the project pattern, the programme pattern, and the campaign pattern. The project pattern is not seen to hold out the prospect of enabling youth to make serious inroads into growing illiteracy. Conversely, the campaign pattern seemed largely exceptional. Suggestions are made to draw on elements of both the project and the campaign patterns to show ways of enriching, systematizing and generalizing the programme pattern.

  12. Researching participant recruitment times.

    PubMed

    O'Brien, Rachel; Black, Polly

    2015-11-01

    Conducting research in emergency departments is relatively new, and there are a number of ethical and practical challenges to recruiting patients in these settings. In 2008, the Emergency Medicine Research Group Edinburgh (EMERGE) was set up at the Royal Infirmary of Edinburgh emergency department to support researchers and encourage the growth of research in emergency medicine. As part of a review of their working methods, the group's clinical nurse researchers undertook a small study to identify participant recruitment times. The results showed a significant difference between perceived and actual recruitment times, which has implications for planning staff numbers and budgets. This article describes the evaluation process and methods of data collection, and discusses the results. PMID:26542924

  13. Making big sense from big data in toxicology by read-across.

    PubMed

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches and some other high-content technologies leave us with big data--the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances and ensure that the available data is fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Here, a new web-based tool under development called REACH-across, which aims to support and automate structure-based read-across, is presented among other approaches.
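
    As a rough illustration of similarity-based read-across (not the REACH-across tool itself), a nearest-neighbor prediction over set-based structural fingerprints might look like the sketch below; the Tanimoto weighting and all names are illustrative assumptions:

```python
def tanimoto(a, b):
    """Jaccard/Tanimoto similarity between two fingerprint sets."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def read_across(query_fp, knowns, k=3):
    """Predict a property as the similarity-weighted mean over the k most
    similar data-rich substances -- the essence of read-across.

    knowns: list of (fingerprint_set, measured_property) pairs.
    """
    ranked = sorted(knowns, key=lambda s: tanimoto(query_fp, s[0]),
                    reverse=True)[:k]
    weights = [tanimoto(query_fp, fp) for fp, _ in ranked]
    if sum(weights) == 0:
        raise ValueError("no similar substances found")
    return sum(w * val for w, (_, val) in zip(weights, ranked)) / sum(weights)
```

    In practice the confidence of such a prediction rests on combining structural with biological similarity, as the abstract notes; a purely structural neighbor search is only the first filter.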

  14. Making big sense from big data in toxicology by read-across.

    PubMed

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches and some other high-content technologies leave us with big data--the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are first of all repositories for finding similar substances and ensure that the available data is fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Here, a new web-based tool under development called REACH-across, which aims to support and automate structure-based read-across, is presented among other approaches. PMID:27032088

  15. Big science and big administration. Confronting the governance, financial and legal challenges of FuturICT

    NASA Astrophysics Data System (ADS)

    Smart, J.; Scott, M.; McCarthy, J. B.; Tan, K. T.; Argyrakis, P.; Bishop, S.; Conte, R.; Havlin, S.; San Miguel, M.; Stauffacher, D.

    2012-11-01

    This paper considers the issues around managing large scientific projects, and draws conclusions for the governance and management of FuturICT, based on previous experience of Big Science projects, such as CERN and ATLAS. We also consider the legal and ethical issues of the FuturICT project as the funding instrument moves from the Seventh Framework Programme to Horizon 2020.

  16. Revisiting the Big Six and the Big Five among Hong Kong University Students

    ERIC Educational Resources Information Center

    Zhang, Li-fang

    2008-01-01

    The present study replicated investigation of the link between Holland's six career interest types and Costa and McCrae's big five personality traits in a Chinese context. A sample of 79 university students from Hong Kong evaluated their own abilities and responded to the Short-Version Self-Directed Search (SVSDS) and the NEO Five-Factor…

  17. "Big Blue Marble" Fact Sheet and "Big Blue Marble" Program Content (Shows 1 through 78).

    ERIC Educational Resources Information Center

    International Telephone and Telegraph Corp., New York, NY.

    This booklet describes the content of 78 programs presented in the "Big Blue Marble" series, an international series of children's television shows sponsored by the International Telephone and Telegraph Corporation. The major sequence of subjects is given, as well as a description of each program's folktale adaptation (a regular feature) and…

  18. OSEP Research Institutes: Bridging Research and Practice. Big Ideas (Plus a Little Effort) Produce Big Results.

    ERIC Educational Resources Information Center

    Grossen, Bonnie; Caros, Jennifer; Carnine, Doug; Davis, Betsy; Deshler, Don; Schumaker, Jean; Bulgren, Janis; Lenz, Keith; Adams, Gary; Jantzen, Jean-Ellen; Marquis, Janet

    2002-01-01

    This article describes the three major components of the BIG Accommodation Model, a program that focuses on restructuring traditional instruction through alternative programs designed to accelerate learning in students with disabilities and other students. The components include curricula engineered to accelerate learning; early, intensive…

  19. Seed bank and big sagebrush plant community composition in a range margin for big sagebrush

    USGS Publications Warehouse

    Martyn, Trace E.; Bradford, John B.; Schlaepfer, Daniel R.; Burke, Ingrid C.; Laurenroth, William K.

    2016-01-01

    The potential influence of seed bank composition on range shifts of species due to climate change is unclear. Seed banks can provide a means of both species persistence in an area and local range expansion in the case of increasing habitat suitability, as may occur under future climate change. However, a mismatch between the seed bank and the established plant community may represent an obstacle to persistence and expansion. In big sagebrush (Artemisia tridentata) plant communities in Montana, USA, we compared the seed bank to the established plant community. There was less than a 20% similarity in the relative abundance of species between the established plant community and the seed bank. This difference was primarily driven by an overrepresentation of native annual forbs and an underrepresentation of big sagebrush in the seed bank compared to the established plant community. Even though we expect an increase in habitat suitability for big sagebrush under future climate conditions at our sites, the current mismatch between the plant community and the seed bank could impede big sagebrush range expansion into increasingly suitable habitat in the future.

  20. Teaching Information & Technology Skills: The Big6[TM] in Elementary Schools. Professional Growth Series.

    ERIC Educational Resources Information Center

    Eisenberg, Michael B.; Berkowitz, Robert E.

    This book about using the Big6 information problem solving process model in elementary schools is organized into two parts. Providing an overview of the Big6 approach, Part 1 includes the following chapters: "Introduction: The Need," including the information problem, the Big6 and other process models, and teaching/learning the Big6; "The Big6…

  1. Big Data, Little Data, and Care Coordination for Medicare Beneficiaries with Medigap Coverage.

    PubMed

    Ozminkowski, Ronald J; Wells, Timothy S; Hawkins, Kevin; Bhattarai, Gandhi R; Martel, Charles W; Yeh, Charlotte S

    2015-06-01

    Most healthcare data warehouses include big data such as health plan, medical, and pharmacy claims information for many thousands and sometimes millions of insured individuals. This makes it possible to identify those with multiple chronic conditions who may benefit from participation in care coordination programs meant to improve their health. The objective of this article is to describe how large databases, including individual and claims data, and other, smaller types of data from surveys and personal interviews, are used to support a care coordination program. The program described in this study was implemented for adults who are generally 65 years of age or older and have an AARP(®) Medicare Supplement Insurance Plan (i.e., a Medigap plan) insured by UnitedHealthcare Insurance Company (or, for New York residents, UnitedHealthcare Insurance Company of New York). Individual and claims data were used first to calculate risk scores that were then utilized to identify the majority of individuals who were qualified for program participation. For efficient use of time and resources, propensity to succeed modeling was used to prioritize referrals based upon their predicted probabilities of (1) engaging in the care coordination program, (2) saving money once engaged, and (3) receiving higher quality of care. To date, program evaluations have reported positive returns on investment and improved quality of healthcare among program participants. In conclusion, the use of data sources big and small can help guide program operations and determine if care coordination programs are working to help older adults live healthier lives.
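
    The propensity-to-succeed prioritization described above can be sketched as several fitted probability models whose predictions are combined to rank referrals. Everything below (function names, weights, the logistic form, the multiplicative combination) is a hypothetical illustration, not the program's actual models:

```python
import math

def logistic(z):
    """Map a linear score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def propensity_score(features, weights, bias=0.0):
    """Score one member on one outcome with a (hypothetical) fitted logistic model."""
    return logistic(sum(w * x for w, x in zip(weights, features)) + bias)

def prioritize(members, models):
    """Rank members by the product of their predicted probabilities of
    (1) engaging, (2) saving money once engaged, and (3) receiving
    higher-quality care. members: list of (member_id, feature_vector);
    models: list of (weights, bias) pairs, one per outcome."""
    def combined(feats):
        p = 1.0
        for weights, bias in models:
            p *= propensity_score(feats, weights, bias)
        return p
    return sorted(members, key=lambda m: combined(m[1]), reverse=True)
```

    Multiplying the three probabilities is one simple way to favor members likely to succeed on all outcomes at once; a real program might instead use a weighted combination tuned against observed program results.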

  2. The big bang? An eventful year in workers' compensation.

    PubMed

    Guidotti, Tee L

    2006-01-01

    Workers' compensation in the past two years has been dominated by events in California, which have been so fundamental as to merit the term big bang. Passage of Senate Bill 899 has led to a comprehensive program of reform in access to medical care, access to rehabilitation services, temporary and permanent disability, evidence-based management, dispute resolution, and system innovation. Two noteworthy developments thus arose: a new requirement for apportionment by cause in causation analysis, and the adoption of evidence-based criteria for impairment assessment, treatment guidelines, and, soon, utilization review. Elsewhere in the United States, changes were modest, but extensive legislative activity in Texas suggests that Texas will be next to make major changes. In Canada, the Workers' Compensation Board of British Columbia has adopted an ambitious strategic initiative, and there is a Canada-wide movement to establish presumption for certain diseases in firefighters. Suggestions for future directions include an increased emphasis on prevention, integration of programs, worker participation, enhancing the expertise of health care professionals, evidence-based management, process evaluation, and opportunities for innovation.

  3. Big Explosions and Strong Gravity: Packaged Activities for Girl Scouts

    NASA Astrophysics Data System (ADS)

    Hornschemeier, A. E.; Lochner, J.; Feaga, L.; Ganguly, R.; Ford, K. S.

    2004-12-01

    We report on our experiences with middle-school age girls in a Girl Scout activity that centers around black holes and supernova remnants. These activities are packaged into a patch activity called ``Big Explosions and Strong Gravity'' (BESG) and utilize science ``kits'' that we have developed. This program draws its excitement from observing supernova remnants and black holes. These astronomical objects are used as a basis for exploration of the electromagnetic spectrum, chemistry, stellar evolution, and orbital motion. These topics address national middle-school and high-school science standards for earth and space science and are approached using inquiry-based teaching methods. We held an intensive BESG day in July 2004 with sixty participants and a teacher evaluator and have now posted updated activity descriptions to a website. We gratefully acknowledge funding from the Chandra X-ray Center through an Education and Public Outreach proposal grant for Chandra Cycle 5. This work was done in collaboration with the Girl Scouts of Central Maryland.

  4. [Algorithms, machine intelligence, big data : general considerations].

    PubMed

    Radermacher, F J

    2015-08-01

    We are experiencing astonishing developments in the areas of big data and artificial intelligence. They follow a pattern that we have now been observing for decades: according to Moore's Law, the performance and efficiency in the area of elementary arithmetic operations increases a thousand-fold every 20 years. Although we have not achieved the status where in the singular sense machines have become as "intelligent" as people, machines are becoming increasingly better. The Internet of Things has again helped to massively increase the efficiency of machines. Big data and suitable analytics do the same. If we let these processes simply continue, our civilization may be endangered in many instances. If the "containment" of these processes succeeds in the context of a reasonable political global governance, a worldwide eco-social market economy, and an economy of green and inclusive markets, many desirable developments that are advantageous for our future may result. Then, at some point in time, the constant need for more and faster innovation may even stop. However, this is anything but certain. We are facing huge challenges.

  5. Gravitational waves from the big bounce

    SciTech Connect

    Mielczarek, Jakub

    2008-11-15

    In this paper we investigate gravitational wave production during the big bounce phase, inspired by loop quantum cosmology. We consider the influence of the holonomy corrections to the equation for tensor modes. We show that they act like additional effective graviton mass, suppressing gravitational wave creation. However, such effects can be treated perturbatively. We investigate a simplified model without holonomy corrections to the equation for modes and find its exact analytical solution. Assuming the form for matter {rho}{proportional_to}a{sup -2} we calculate the full spectrum of the gravitational waves from the big bounce phase. The spectrum obtained decreases to zero for the low energy modes. On the basis of this observation we infer that this effect can lead to low cosmic microwave background (CMB) multipole suppression and gives a potential way for testing loop quantum cosmology models. We also consider a scenario with a post-bounce inflationary phase. The power spectrum obtained gives a qualitative explanation of the CMB spectra, including low multipole suppression.
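
    The statement that holonomy corrections act like an additional effective graviton mass can be made schematic. In conformal time, tensor-mode Fourier amplitudes in such models satisfy an equation of the form (a generic sketch, not the paper's exact expression):

```latex
h_k'' + 2\,\frac{a'}{a}\,h_k' + \left(k^2 + m_{\mathrm{eff}}^2\,a^2\right) h_k = 0,
```

    where primes denote conformal-time derivatives, $a$ is the scale factor, and $m_{\mathrm{eff}}$ collects the holonomy corrections. Setting $m_{\mathrm{eff}} = 0$ recovers the classical tensor-mode equation, which is why a positive effective mass suppresses gravitational-wave creation, as the abstract states.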

  6. [Algorithms, machine intelligence, big data : general considerations].

    PubMed

    Radermacher, F J

    2015-08-01

    We are experiencing astonishing developments in the areas of big data and artificial intelligence. They follow a pattern that we have now been observing for decades: according to Moore's Law, the performance and efficiency in the area of elementary arithmetic operations increases a thousand-fold every 20 years. Although we have not achieved the status where in the singular sense machines have become as "intelligent" as people, machines are becoming increasingly better. The Internet of Things has again helped to massively increase the efficiency of machines. Big data and suitable analytics do the same. If we let these processes simply continue, our civilization may be endangered in many instances. If the "containment" of these processes succeeds in the context of a reasonable political global governance, a worldwide eco-social market economy, and an economy of green and inclusive markets, many desirable developments that are advantageous for our future may result. Then, at some point in time, the constant need for more and faster innovation may even stop. However, this is anything but certain. We are facing huge challenges. PMID:26141245

  7. Big Bang Cosmic Titanic: Cause for Concern?

    NASA Astrophysics Data System (ADS)

    Gentry, Robert

    2013-04-01

    This abstract alerts physicists to a situation that, unless soon addressed, may yet affect PRL integrity. I refer to Stanley Brown's and DAE Robert Caldwell's rejection of PRL submission LJ12135, A Cosmic Titanic: Big Bang Cosmology Unravels Upon Discovery of Serious Flaws in Its Foundational Expansion Redshift Assumption, by their claim that BB is an established theory while ignoring our paper's Titanic, namely, that BB's foundational spacetime expansion redshifts assumption has now been proven to be irrefutably false because it is contradicted by our seminal discovery that GPS operation unequivocally proves that GR effects do not produce in-flight photon wavelength changes demanded by this central assumption. This discovery causes the big bang to collapse as quickly as did Ptolemaic cosmology when Copernicus showed that its foundational geocentric assumption was false. Additional evidence that something is amiss in PRL's treatment of LJ12135 comes from both Brown and EiC Gene Sprouse agreeing to meet at my exhibit during last year's Atlanta APS to discuss this cover-up issue. Sprouse kept his commitment; Brown didn't. Question: If Brown could have refuted my claim of a cover-up, why didn't he come to present it before Gene Sprouse? I am appealing LJ12135's rejection.

  8. Particle Production and Big Rip Singularities

    NASA Astrophysics Data System (ADS)

    Bates, Jason

    2010-02-01

    In 1929, Edwin Hubble found that objects in our Universe generally recede from us at a rate proportional to their distance, suggesting that the Universe as a whole is expanding. More recently, astronomers have observed that this expansion is accelerating. According to Einstein's theory of gravity, all normal matter in the Universe should act to slow the rate of expansion, so there must be something new which is causing this acceleration. Cosmologists call this ``Dark Energy.'' One of the possibilities for dark energy leads to a Universe which expands to an infinite size in a finite amount of time. This scenario is called a ``Big Rip,'' because near the end of time this expansion overcomes all other forces in the Universe - even atoms are ripped apart. However, Quantum Mechanics predicts that as the Universe expands particles will be created. If enough particles are created, this process could slow or even halt the expansion, and the ``Big Rip'' might be avoided. Using numerical methods, we considered the quantum effects for massive and massless scalar fields, and found that while at late times quantum effects do grow large, they do not become comparable to the dark energy until very near the singularity when the curvature of the Universe approaches the Planck scale.

  9. Big Data Issues under the Copernicus Programme

    NASA Astrophysics Data System (ADS)

    Schulte-Braucks, R. L.

    2014-12-01

    The Copernicus Programme of Earth observation satellites (http://copernicus.eu) will be affected by a growing volume of data and information. The first satellite (Sentinel 1A) has just been launched. Seven additional satellites are to be launched by the end of the decade. These will produce 8 TB of data per day, i.e. considerably more than can be downloaded via normal Internet connections. There is no definitive answer to the many challenges of big data, but there are gradual solutions for Copernicus in view of the progressive roll-out of the space infrastructure and the thematic services which the European Commission will develop. This presentation will outline several approaches to the big data issue. It will start from the needs of the Copernicus users, which are far from being homogeneous. As their needs are different, the European Commission and ESA will have to propose different solutions to fulfil these needs, taking into account the present and future state of technology. The presentation will discuss these solutions, both with regard to a better use of the network and with regard to hosted processing.

  10. Data, Big Data, and Metadata in Anesthesiology.

    PubMed

    Levin, Matthew A; Wanderer, Jonathan P; Ehrenfeld, Jesse M

    2015-12-01

    The last decade has seen an explosion in the growth of digital data. Since 2005, the total amount of digital data created or replicated on all platforms and devices has been doubling every 2 years, from an estimated 132 exabytes (132 billion gigabytes) in 2005 to 4.4 zettabytes (4.4 trillion gigabytes) in 2013, and a projected 44 zettabytes (44 trillion gigabytes) in 2020. This growth has been driven in large part by the rise of social media along with more powerful and connected mobile devices, with an estimated 75% of information in the digital universe generated by individuals rather than entities. Transactions and communications including payments, instant messages, Web searches, social media updates, and online posts are all becoming part of a vast pool of data that live "in the cloud" on clusters of servers located in remote data centers. The amount of accumulating data has become so large that it has given rise to the term Big Data. In many ways, Big Data is just a buzzword, a phrase that is often misunderstood and misused to describe any sort of data, no matter the size or complexity. However, there is truth to the assertion that some data sets truly require new management and analysis techniques. PMID:26579664
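    As a quick sanity check, the quoted figures can be compared against a strict "doubling every 2 years" model; the helper function below is purely illustrative. The 2005-2013 growth actually outpaced a clean two-year doubling, while the 2020 projection sits close to one:

    ```python
    # Compare the abstract's figures (132 EB in 2005, 4.4 ZB = 4400 EB in 2013,
    # projected 44 ZB in 2020) with a strict two-year doubling model.
    def doubled(start_value, start_year, end_year, period=2):
        """Value after doubling every `period` years from start_year to end_year."""
        return start_value * 2 ** ((end_year - start_year) / period)

    eb_2013 = doubled(132, 2005, 2013)   # 2112 EB under strict doubling, vs. 4400 EB reported
    zb_2020 = doubled(4.4, 2013, 2020)   # ~49.8 ZB under strict doubling, vs. 44 ZB projected
    print(round(eb_2013), round(zb_2020, 1))
    ```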

  11. The Big Five default brain: functional evidence.

    PubMed

    Sampaio, Adriana; Soares, José Miguel; Coutinho, Joana; Sousa, Nuno; Gonçalves, Óscar F

    2014-11-01

    Recent neuroimaging studies have provided evidence that different dimensions of human personality may be associated with specific structural neuroanatomic correlates. Identifying brain correlates of a situation-independent personality structure would require evidence of a stable default mode of brain functioning. In this study, we investigated the correlates of the Big Five personality dimensions (Extraversion, Neuroticism, Openness/Intellect, Agreeableness, and Conscientiousness) and the default mode network (DMN). Forty-nine healthy adults completed the NEO Five-Factor Inventory. The results showed that Extraversion (E) and Agreeableness (A) were positively correlated with activity in the midline core of the DMN, whereas Neuroticism (N), Openness (O), and Conscientiousness (C) were correlated with the parietal cortex system. Activity of the anterior cingulate cortex was positively associated with A and negatively with C. Regions of the parietal lobe were differentially associated with each personality dimension. The present study not only confirms previous functional correlates regarding the Big Five personality dimensions, but it also expands our knowledge showing the association between different personality dimensions and specific patterns of brain activation at rest.

  12. Rasdaman for Big Spatial Raster Data

    NASA Astrophysics Data System (ADS)

    Hu, F.; Huang, Q.; Scheele, C. J.; Yang, C. P.; Yu, M.; Liu, K.

    2015-12-01

    Spatial raster data have grown exponentially over the past decade. Recent advancements in data acquisition technology, such as remote sensing, have allowed us to collect massive observation data of various spatial resolutions and domain coverage. The volume, velocity, and variety of such spatial data, along with the computationally intensive nature of spatial queries, pose a grand challenge to storage technologies for effective big data management. While high performance computing platforms (e.g., cloud computing) can be used to solve the computing-intensive issues in big data analysis, data has to be managed in a way that is suitable for distributed parallel processing. Recently, rasdaman (raster data manager) has emerged as a scalable and cost-effective database solution to store and retrieve massive multi-dimensional arrays, such as sensor, image, and statistics data. Within this paper, the pros and cons of using rasdaman to manage and query spatial raster data will be examined and compared with other common approaches, including file-based systems, relational databases (e.g., PostgreSQL/PostGIS), and NoSQL databases (e.g., MongoDB and Hive). Earth Observing System (EOS) data collected from NASA's Atmospheric Scientific Data Center (ASDC) will be used and stored in these selected database systems, and a set of spatial and non-spatial queries will be designed to benchmark their performance on retrieving large-scale, multi-dimensional arrays of EOS data. Lessons learnt from using rasdaman will be discussed as well.
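    The access pattern such benchmarks exercise can be illustrated with a toy stand-in. Rasdaman itself is queried through its own array query language (rasql), so the numpy array, its dimensions, and the index ranges below are purely hypothetical; the sketch only shows the spatial-window/time-range subsetting the abstract describes:

    ```python
    import time

    import numpy as np

    # Toy stand-in for an EOS-style multi-dimensional array: (time, lat, lon).
    # A real deployment would hold this in rasdaman (or PostGIS, MongoDB, etc.)
    # and issue the equivalent range query to the database.
    data = np.random.rand(365, 180, 360)

    t0 = time.perf_counter()
    # Benchmark-style query: a 10x10-degree spatial window over the first 30 days.
    subset = data[0:30, 80:90, 170:180]
    mean_value = subset.mean()
    elapsed = time.perf_counter() - t0

    print(subset.shape)   # (30, 10, 10)
    ```

    Timing many such window queries of varying size against each backend is one simple way to produce the comparative numbers the paper aims for.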

  13. "Big data" and "open data": What kind of access should researchers enjoy?

    PubMed

    Chatellier, Gilles; Varlet, Vincent; Blachier-Poisson, Corinne

    2016-02-01

    The healthcare sector is currently facing a new paradigm, the explosion of "big data". Coupled with advances in computer technology, the field of "big data" appears promising, allowing us to better understand the natural history of diseases, to follow up the implementation of new technologies (devices, drugs), and to participate in precision medicine, etc. Data sources are multiple (medical and administrative data, electronic medical records, data from rapidly developing technologies such as DNA sequencing, connected devices, etc.) and heterogeneous, while their use requires complex methods for accurate analysis. Moreover, faced with this new paradigm, we must determine who could (or should) have access to which data, how to combine collective interest and protection of personal data, and how to finance in the long term both operating costs and database interrogation. This article analyses the opportunities and challenges related to the use of open and/or "big data", from the viewpoint of pharmacologists and representatives of the pharmaceutical and medical device industry. PMID:27080635

  14. The Big-Five factor structure as an integrative framework: an analysis of Clarke's AVA model.

    PubMed

    Goldberg, L R; Sweeney, D; Merenda, P F; Hughes, J E

    1996-06-01

    Using a large (N = 3,629) sample of participants selected to be representative of U.S. working adults in the year 2000, we provide links between the constructs in 2 personality models that have been derived from quite different rationales. We demonstrate the use of a novel procedure for providing orthogonal Big-Five factor scores and use those scores to analyze the scales of the Activity Vector Analysis (AVA). We discuss the implications of our many findings both for the science of personality assessment and for future research using the AVA model.

  15. Big Sky Telegraph: Telecommunications Guide to Community Action.

    ERIC Educational Resources Information Center

    Odasz, Frank B., Comp.

    This document contains a wide assortment of papers and promotional materials concerning the Big Sky Telegraph, a Montana-based telecommunications network serving rural economic development organizations. Funded by the US West Foundation and Western Montana College, Big Sky was created to stimulate grassroots innovation in rural education,…

  16. The Big Picture College: A Model High School Program Graduates

    ERIC Educational Resources Information Center

    Scurry, Jamie E.; Littky, Dennis

    2007-01-01

    The Providence-based Big Picture Company has transformed the American high school experience for low-income, urban students. Now it is ready to take on a new challenge: redesigning the American college. In this article, the authors discuss how the Big Picture College will build a curriculum that emphasizes students' interests, integrates…

  17. 33 CFR 117.267 - Big Carlos Pass.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Big Carlos Pass. 117.267 Section 117.267 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Florida § 117.267 Big Carlos Pass. The draw of...

  18. 33 CFR 117.267 - Big Carlos Pass.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Big Carlos Pass. 117.267 Section 117.267 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Florida § 117.267 Big Carlos Pass. The draw of...

  19. 33 CFR 117.267 - Big Carlos Pass.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Big Carlos Pass. 117.267 Section 117.267 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Florida § 117.267 Big Carlos Pass. The draw of...

  20. 33 CFR 117.267 - Big Carlos Pass.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Big Carlos Pass. 117.267 Section 117.267 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Florida § 117.267 Big Carlos Pass. The draw of...

  1. 33 CFR 117.267 - Big Carlos Pass.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Big Carlos Pass. 117.267 Section 117.267 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Florida § 117.267 Big Carlos Pass. The draw of...

  2. 33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 3 2014-07-01 2014-07-01 false Big Fork River, Minn.; logging. 207.370 Section 207.370 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE NAVIGATION REGULATIONS § 207.370 Big Fork River, Minn.; logging. (a) During the...

  3. 33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 3 2013-07-01 2013-07-01 false Big Fork River, Minn.; logging. 207.370 Section 207.370 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE NAVIGATION REGULATIONS § 207.370 Big Fork River, Minn.; logging. (a) During the...

  4. 33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 3 2012-07-01 2012-07-01 false Big Fork River, Minn.; logging. 207.370 Section 207.370 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE NAVIGATION REGULATIONS § 207.370 Big Fork River, Minn.; logging. (a) During the...

  5. 33 CFR 207.370 - Big Fork River, Minn.; logging.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 3 2011-07-01 2011-07-01 false Big Fork River, Minn.; logging. 207.370 Section 207.370 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE NAVIGATION REGULATIONS § 207.370 Big Fork River, Minn.; logging. (a) During the...

  6. Will Big Data Mean the End of Privacy?

    ERIC Educational Resources Information Center

    Pence, Harry E.

    2015-01-01

    Big Data is currently a hot topic in the field of technology, and many campuses are considering the addition of this topic into their undergraduate courses. Big Data tools are not just playing an increasingly important role in many commercial enterprises; they are also combining with new digital devices to dramatically change privacy. This article…

  7. Big Earth Data Initiative: Metadata Improvement: Case Studies

    NASA Technical Reports Server (NTRS)

    Kozimor, John; Habermann, Ted; Farley, John

    2016-01-01

    The Big Earth Data Initiative (BEDI) invests in standardizing and optimizing the collection, management, and delivery of the U.S. Government's civil Earth observation data to improve discovery, access, use, and understanding of Earth observations by the broader user community. Complete and consistent standard metadata helps address all three goals.

  8. The Role of Big Data in the Social Sciences

    ERIC Educational Resources Information Center

    Ovadia, Steven

    2013-01-01

    Big Data is an increasingly popular term across scholarly and popular literature but lacks a formal definition (Lohr 2012). This is beneficial in that it keeps the term flexible. For librarians, Big Data represents a few important ideas. One is the idea of balancing accessibility with privacy. Librarians tend to want information to be as open…

  9. Toward a manifesto for the 'public understanding of big data'.

    PubMed

    Michael, Mike; Lupton, Deborah

    2016-01-01

    In this article, we sketch a 'manifesto' for the 'public understanding of big data'. On the one hand, this entails such public understanding of science and public engagement with science and technology-tinged questions as follows: How, when and where are people exposed to, or do they engage with, big data? Who are regarded as big data's trustworthy sources, or credible commentators and critics? What are the mechanisms by which big data systems are opened to public scrutiny? On the other hand, big data generate many challenges for public understanding of science and public engagement with science and technology: How do we address publics that are simultaneously the informant, the informed and the information of big data? What counts as understanding of, or engagement with, big data, when big data themselves are multiplying, fluid and recursive? As part of our manifesto, we propose a range of empirical, conceptual and methodological exhortations. We also provide Appendix 1 that outlines three novel methods for addressing some of the issues raised in the article.

  10. Big data analytics to improve cardiovascular care: promise and challenges.

    PubMed

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system. PMID:27009423

  11. Semantic Web technologies for the big data in life sciences.

    PubMed

    Wu, Hongyan; Yamaguchi, Atsuko

    2014-08-01

    The life sciences field is entering an era of big data with the breakthroughs of science and technology. More and more big data-related projects and activities are being performed in the world. Life sciences data generated by new technologies are continuing to grow in not only size but also variety and complexity, with great speed. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources and even across disciplines is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are two principal issues mainly discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web, aiming to provide information for not only humans but also computers to semantically process large-scale data. The paper presents a survey of big data in life sciences, big data-related projects, and Semantic Web technologies. The paper introduces the main Semantic Web technologies and their current situation, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps readers understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for big data in the life sciences. PMID:25224624

  12. Revitalizing Education in the Big Cities. A Report.

    ERIC Educational Resources Information Center

    Morphet, Edgar L.; And Others

    This report is concerned with the problems and procedures relative to planning and effecting needed improvements in big city education and is based on the assumption that States should continue to be primarily responsible for education. Five authors discuss (1) big city education: its challenge to governance; (2) urban learning environments,…

  13. 3. BIG HOUSE (left) AND CORN CRIB (right) IN THE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. BIG HOUSE (left) AND CORN CRIB (right) IN THE BACKGROUND. See also individual HABS documentation: Walker Family Farm, Big House (HABS No. TN-121 A), and Walker Family Farm, Corn Crib (HABS No. TN-121 C). - Walker Family Farm (General views), Gatlinburg, Sevier County, TN

  14. 36 CFR 7.41 - Big Bend National Park.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 1 2010-07-01 2010-07-01 false Big Bend National Park. 7.41 Section 7.41 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR SPECIAL REGULATIONS, AREAS OF THE NATIONAL PARK SYSTEM § 7.41 Big Bend National Park. (a) Fishing; closed...

  15. 36 CFR 7.41 - Big Bend National Park.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 1 2014-07-01 2014-07-01 false Big Bend National Park. 7.41 Section 7.41 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR SPECIAL REGULATIONS, AREAS OF THE NATIONAL PARK SYSTEM § 7.41 Big Bend National Park. (a) Fishing; closed...

  16. 36 CFR 7.41 - Big Bend National Park.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 1 2011-07-01 2011-07-01 false Big Bend National Park. 7.41 Section 7.41 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR SPECIAL REGULATIONS, AREAS OF THE NATIONAL PARK SYSTEM § 7.41 Big Bend National Park. (a) Fishing; closed...

  17. 36 CFR 7.41 - Big Bend National Park.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 1 2012-07-01 2012-07-01 false Big Bend National Park. 7.41 Section 7.41 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR SPECIAL REGULATIONS, AREAS OF THE NATIONAL PARK SYSTEM § 7.41 Big Bend National Park. (a) Fishing; closed...

  18. 36 CFR 7.41 - Big Bend National Park.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 1 2013-07-01 2013-07-01 false Big Bend National Park. 7.41 Section 7.41 Parks, Forests, and Public Property NATIONAL PARK SERVICE, DEPARTMENT OF THE INTERIOR SPECIAL REGULATIONS, AREAS OF THE NATIONAL PARK SYSTEM § 7.41 Big Bend National Park. (a) Fishing; closed...

  19. 33 CFR 117.677 - Big Sunflower River.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 33 Navigation and Navigable Waters 1 2013-07-01 2013-07-01 false Big Sunflower River. 117.677 Section 117.677 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Mississippi § 117.677 Big Sunflower River. The draw...

  20. 33 CFR 117.677 - Big Sunflower River.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 33 Navigation and Navigable Waters 1 2011-07-01 2011-07-01 false Big Sunflower River. 117.677 Section 117.677 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Mississippi § 117.677 Big Sunflower River. The draw...

  1. 33 CFR 117.677 - Big Sunflower River.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 33 Navigation and Navigable Waters 1 2014-07-01 2014-07-01 false Big Sunflower River. 117.677 Section 117.677 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Mississippi § 117.677 Big Sunflower River. The draw...

  2. 33 CFR 117.677 - Big Sunflower River.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 33 Navigation and Navigable Waters 1 2012-07-01 2012-07-01 false Big Sunflower River. 117.677 Section 117.677 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Mississippi § 117.677 Big Sunflower River. The draw...

  3. 33 CFR 117.677 - Big Sunflower River.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 33 Navigation and Navigable Waters 1 2010-07-01 2010-07-01 false Big Sunflower River. 117.677 Section 117.677 Navigation and Navigable Waters COAST GUARD, DEPARTMENT OF HOMELAND SECURITY BRIDGES DRAWBRIDGE OPERATION REGULATIONS Specific Requirements Mississippi § 117.677 Big Sunflower River. The draw...

  4. Who is the Effective Volunteer: Characteristics of Successful Big Brothers.

    ERIC Educational Resources Information Center

    Thorelli, Irene M.; Appel, Victor H.

    The demographic characteristics of the typical volunteer, taken from the personnel files of 208 current and previous volunteers of a Big Brothers agency, indicate the following profile. The modal Big Brother is usually Anglo-American, is a young adult aged 18 to 25, is a student or a full-time employed person, has some college education, lives in…

  5. "Big Society" in the UK: A Policy Review

    ERIC Educational Resources Information Center

    Evans, Kathy

    2011-01-01

    Alongside the UK Coalition Government's historic public spending cuts, the "Big Society" has become a major narrative in UK political discourse. This article reviews key features of Big Society policies against their aims of rebalancing the economy and mending "Broken Britain", with particular reference to their implications for children and young…

  6. The Whole Shebang: How Science Produced the Big Bang Model.

    ERIC Educational Resources Information Center

    Ferris, Timothy

    2002-01-01

    Offers an account of the accumulation of evidence that has led scientists to have confidence in the big bang theory of the creation of the universe. Discusses the early work of Ptolemy, Copernicus, Kepler, Galileo, and Newton, noting the rise of astrophysics, and highlighting the birth of the big bang model (the cosmic microwave background theory…

  7. Build the Big Society on What We Know Works

    ERIC Educational Resources Information Center

    Low, John

    2011-01-01

    Hardly anyone can have failed to pick up on the recent flurry of stories in the national press about the Big Society, sparked off by Dame Elisabeth Hoodless's remarks that funds for volunteers are disappearing at an alarming rate, and that this is undermining the very idea of a "Big Society". At the same time, under the government's radical…

  8. Curriculum: Big Decisions--Making Healthy, Informed Choices about Sex

    ERIC Educational Resources Information Center

    Davis, Melanie

    2009-01-01

    Big Decisions is a 10-lesson abstinence-plus curriculum for ages 12-18 that emphasizes sex as a big decision, abstinence as the healthiest choice, and the mandate that sexually active teens use condoms and be tested for sexually transmitted diseases. This program can be implemented with limited resources and facilitator training when abstinence…

  9. How to Help Students Confront Life's "Big Questions"

    ERIC Educational Resources Information Center

    Walvoord, Barbara E.

    2008-01-01

    Many college students are interested in spirituality and the "big questions" about life's meaning and values, but many professors seem not to know how to respond to that interest. In this article, the author offers several strategies to help students confront the "big questions". One way is to structure assignments and discussions so that students…

  11. 78 FR 52523 - Big Rivers Electric Corporation; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-23

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing Take notice that on August 16, 2013, Big Rivers Electric Corporation filed its proposed revenue requirements for reactive...

  12. 75 FR 141 - Big Rivers Electric Corporation; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-04

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Big Rivers Electric Corporation; Notice of Filing December 23, 2009. Take... (2009) (September 17 Order), Big Rivers Electric Corporation filed revised tariff sheets to its...

  13. Research on Implementing Big Data: Technology, People, & Processes

    ERIC Educational Resources Information Center

    Rankin, Jenny Grant; Johnson, Margie; Dennis, Randall

    2015-01-01

    When many people hear the term "big data", they primarily think of a technology tool for the collection and reporting of data of high variety, volume, and velocity. However, the complexity of big data is not only the technology, but the supporting processes, policies, and people supporting it. This paper was written by three experts to…

  14. Deal or No Deal? Evaluating Big Deals and Their Journals

    ERIC Educational Resources Information Center

    Blecic, Deborah D.; Wiberley, Stephen E., Jr.; Fiscella, Joan B.; Bahnmaier-Blaszczak, Sara; Lowery, Rebecca

    2013-01-01

    This paper presents methods to develop metrics that compare Big Deal journal packages and the journals within those packages. Deal-level metrics guide selection of a Big Deal for termination. Journal-level metrics guide selection of individual subscriptions from journals previously provided by a terminated deal. The paper argues that, while the…

  16. A proposed framework of big data readiness in public sectors

    NASA Astrophysics Data System (ADS)

    Ali, Raja Haslinda Raja Mohd; Mohamad, Rosli; Sudin, Suhizaz

    2016-08-01

    Growing interest in big data is mainly linked to its great potential to unveil unforeseen patterns or profiles that support an organisation's key business decisions. Following the private sector's moves to embrace big data, the government sector is now getting on the bandwagon. Big data has been considered one of the potential tools to enhance service delivery of the public sector within its financial resource constraints. The Malaysian government, in particular, has made big data one of its main national agendas. Regardless of government commitment to promote big data amongst government agencies, the degree of readiness of the government agencies as well as their employees is crucial in ensuring successful deployment of big data. This paper, therefore, proposes a conceptual framework to investigate the perceived readiness for big data amongst Malaysian government agencies. The perceived readiness of 28 ministries and their respective employees will be assessed using both qualitative (interview) and quantitative (survey) approaches. The outcome of the study is expected to offer meaningful insight into the factors affecting change readiness among public agencies regarding big data's potential, and into the outcomes expected from greater or lower change readiness among the public sectors.

  17. Literature, Community and Cooperation: The Big Read at Regent University

    ERIC Educational Resources Information Center

    Hillery, Leanne B.; Henkel, Harold L.

    2010-01-01

    This article recounts the experience of the Regent University Library in planning and implementing a festival of Tolstoy and Russian culture as part of the National Endowment for the Arts (NEA) Big Read initiative. The NEA launched the Big Read in 2006 to counter the alarming decline of literary reading documented in its 2002 report Reading at…

  18. Big Canyon Creek Ecological Restoration Strategy.

    SciTech Connect

    Rasmussen, Lynn; Richardson, Shannon

    2007-10-01

    He-yey, Nez Perce for steelhead or rainbow trout (Oncorhynchus mykiss), are a culturally and ecologically significant resource within the Big Canyon Creek watershed; they are also part of the federally listed Snake River Basin Steelhead DPS. The majority of the Big Canyon Creek drainage is considered critical habitat for that DPS as well as for the federally listed Snake River fall chinook (Oncorhynchus tshawytscha) ESU. The Nez Perce Soil and Water Conservation District (District) and the Nez Perce Tribe Department of Fisheries Resources Management-Watershed (Tribe), in an effort to support the continued existence of these and other aquatic species, have developed this document to direct funding toward priority restoration projects in priority areas for the Big Canyon Creek watershed. In order to achieve this, the District and the Tribe: (1) Developed a working group and technical team composed of managers from a variety of stakeholders within the basin; (2) Established geographically distinct sub-watershed areas called Assessment Units (AUs); (3) Created a prioritization framework for the AUs and prioritized them; and (4) Developed treatment strategies to utilize within the prioritized AUs. Assessment Units were delineated by significant shifts in sampled juvenile O. mykiss (steelhead/rainbow trout) densities, which were found to fall at fish passage barriers. The prioritization framework considered four aspects critical to determining the relative importance of performing restoration in a certain area: density of critical fish species, physical condition of the AU, water quantity, and water quality. It was established, through vigorous data analysis within these four areas, that the geographic priority areas for restoration within the Big Canyon Creek watershed are Big Canyon Creek from stream km 45.5 to the headwaters, Little Canyon from km 15 to 30, and the mainstem corridors of Big Canyon (mouth to 7 km) and Little Canyon (mouth to 7 km). The District and the Tribe
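    A prioritization framework over four criteria like the one described, density of critical fish species, physical condition, water quantity, and water quality, typically reduces to ranking units by a weighted composite score. The sketch below illustrates that pattern only; the AU names, scores, and weights are all invented for illustration and are not values from the strategy document:

    ```python
    # Hypothetical composite-scoring sketch: each Assessment Unit (AU) is
    # rated 0-1 on the four criteria and ranked by a weighted sum.
    CRITERIA = ("fish_density", "physical_condition", "water_quantity", "water_quality")
    WEIGHTS = {"fish_density": 0.4, "physical_condition": 0.2,
               "water_quantity": 0.2, "water_quality": 0.2}  # invented weights

    def priority(au_scores):
        """Weighted sum of the four criterion scores."""
        return sum(WEIGHTS[c] * au_scores[c] for c in CRITERIA)

    aus = {  # invented example AUs and scores
        "Upper Big Canyon": {"fish_density": 0.9, "physical_condition": 0.6,
                             "water_quantity": 0.7, "water_quality": 0.5},
        "Lower Little Canyon": {"fish_density": 0.4, "physical_condition": 0.8,
                                "water_quantity": 0.5, "water_quality": 0.6},
    }
    ranked = sorted(aus, key=lambda au: priority(aus[au]), reverse=True)
    print(ranked)  # ['Upper Big Canyon', 'Lower Little Canyon']
    ```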

  19. A Solution to ``Too Big to Fail''

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-10-01

    It's a tricky business to reconcile simulations of our galaxy's formation with our current observations of the Milky Way and its satellites. In a recent study, scientists have addressed one discrepancy between simulations and observations: the so-called "too big to fail" problem. From Missing Satellites to Too Big to Fail: The favored model of the universe is the lambda cold dark matter (ΛCDM) cosmological model. This model does a great job of correctly predicting the large-scale structure of the universe, but there are still a few problems with it on smaller scales. [Figure caption: Hubble image of UGC 5497, a dwarf galaxy associated with Messier 81. In the missing satellite problem, simulations of galaxy formation predict that there should be more such satellite galaxies than we observe. ESA/NASA] The first is the missing satellites problem: ΛCDM cosmology predicts that galaxies like the Milky Way should have significantly more satellite galaxies than we observe. A proposed solution to this problem is the argument that there may exist many more satellites than we've observed, but these dwarf galaxies have had their stars stripped from them during tidal interactions, which prevents us from being able to see them. This solution creates a new problem, though: the too-big-to-fail problem. This problem states that many of the satellites predicted by ΛCDM cosmology are simply so massive that there's no way they couldn't have visible stars. Another way of looking at it: the observed satellites of the Milky Way are not massive enough to be consistent with predictions from ΛCDM. [Figure caption: Artist's illustration of a supernova, a type of stellar feedback that can modify the dark-matter distribution of a satellite galaxy. NASA/CXC/M. Weiss] Density Profiles and Tidal Stirring: Led by Mihai Tomozeiu (University of Zurich), a team of scientists has published a study in which they propose a solution to the too-big-to-fail problem. By running detailed cosmological zoom simulations of our galaxy's formation, Tomozeiu and

  20. Big Data for Big Questions: Global Soil Change and the National Soil Carbon Network

    NASA Astrophysics Data System (ADS)

    Nave, L. E.; Swanston, C.

    2010-12-01

    studies, the mean recovery time for forest floor C was 128 yr. In a broader context, these results demonstrate that combining database work with quantitative synthesis (such as meta-analysis) allows scientists to detect large-scale patterns that are obscured by variation within individual studies. And, in addition to improving analytical capacity for addressing large questions, large databases are useful for identifying data gaps in global soil change research. In light of these benefits, now is an opportune time to advance the study of global soil change by networking and sharing data with the National Soil Carbon Network. The NSCN seeks participants in an effort to compile databases, answer big-picture, predictive questions about soil C vulnerability, and identify and fill data gaps and research needs. The NSCN seeks to be a facilitator that links existing resources rather than reinvents them, and offers opportunities for a variety of activities, including sharing sites, data, archives, and lab infrastructure. The NSCN is fundamentally collaborative, and operates under the assumption that our shared scientific interest in global soil change will be best advanced if we work together rather than in isolation.
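    The quantitative synthesis this abstract describes, pooling per-study estimates so that large-scale patterns emerge despite within-study variation, is commonly done with an inverse-variance weighted mean (fixed-effect meta-analysis). The sketch below shows that calculation; the per-study numbers are invented placeholders, not data from the NSCN:

    ```python
    # Minimal fixed-effect meta-analysis: weight each study's estimate by
    # the inverse of its variance, so precise studies count for more.
    def pooled_estimate(estimates, variances):
        """Return the inverse-variance weighted mean and its variance."""
        weights = [1.0 / v for v in variances]
        total = sum(weights)
        mean = sum(w * e for w, e in zip(weights, estimates)) / total
        return mean, 1.0 / total

    # Hypothetical per-study estimates of forest-floor C recovery time (yr)
    # with their sampling variances.
    estimates = [120.0, 135.0, 128.0]
    variances = [100.0, 225.0, 64.0]

    mean, var = pooled_estimate(estimates, variances)
    print(round(mean, 1), round(var ** 0.5, 1))  # pooled mean and its SE
    ```

    Random-effects models add a between-study variance term, but the weighting principle is the same.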

  1. Social Mobility and Social Participation

    ERIC Educational Resources Information Center

    Sewell, William H.

    1978-01-01

    Examines data related to social mobility and social participation of Americans. Topics include educational and occupational mobility; voting; volunteer work; charitable giving; community participation; views on religion; and anomie. For journal availability, see SO 506 144. (Author/DB)

  2. Design Principles for Effective Knowledge Discovery from Big Data

    SciTech Connect

    Begoli, Edmon; Horey, James L

    2012-01-01

    The big data phenomenon refers to the practice of collecting and processing very large data sets, together with the systems and algorithms used to analyze these massive datasets. Architectures for big data usually span multiple machines and clusters, and they commonly consist of multiple special-purpose sub-systems. Coupled with the knowledge discovery process, the big data movement offers many unique opportunities for organizations to benefit (with respect to new insights, business optimizations, etc.). However, due to the difficulty of analyzing such large datasets, big data presents unique systems engineering and architectural challenges. In this paper, we present three system design principles that can inform organizations on effective analytic and data collection processes, system organization, and data dissemination practices. The principles presented derive from our own research and development experiences with big data problems from various federal agencies, and we illustrate each principle with our own experiences and recommendations.

  3. Integrative methods for analyzing big data in precision medicine.

    PubMed

    Gligorijević, Vladimir; Malod-Dognin, Noël; Pržulj, Nataša

    2016-03-01

    We provide an overview of recent developments in big data analyses in the context of precision medicine and health informatics. With advances in technologies capturing molecular and medical data, we have entered the era of "Big Data" in biology and medicine. These data offer many opportunities to advance precision medicine. We outline key challenges in precision medicine and present recent advances in data integration-based methods to uncover personalized information from big data produced by various omics studies. We survey recent integrative methods for disease subtyping, biomarker discovery, and drug repurposing, and list the tools that are available to domain scientists. Given the ever-growing nature of these big data, we highlight key issues that big data integration methods will face.

  4. [Utilization of Big Data in Medicine and Future Outlook].

    PubMed

    Kinosada, Yasutomi; Uematsu, Machiko; Fujiwara, Takuya

    2016-03-01

    "Big data" is a new buzzword. The point is not to be dazzled by the volume of data, but rather to analyze it, and convert it into insights, innovations, and business value. There are also real differences between conventional analytics and big data. In this article, we show some results of big data analysis using open DPC (Diagnosis Procedure Combination) data in areas of the central part of JAPAN: Toyama, Ishikawa, Fukui, Nagano, Gifu, Aichi, Shizuoka, and Mie Prefectures. These 8 prefectures contain 51 medical administration areas called the second medical area. By applying big data analysis techniques such as k-means, hierarchical clustering, and self-organizing maps to DPC data, we can visualize the disease structure and detect similarities or variations among the 51 second medical areas. The combination of a big data analysis technique and open DPC data is a very powerful method to depict real figures on patient distribution in Japan.

  6. Big data is essential for further development of integrative medicine.

    PubMed

    Li, Guo-zheng; Liu, Bao-yan

    2015-05-01

    This paper gives a short summary of the achievements, opportunities, and challenges of big data in integrative medicine (IM) and explores future work on breaking the bottlenecks that hold back IM's rapid development. It presents the growing field of big data in IM, describes systems for data collection and techniques for data analytics, introduces recent advances, and discusses future work, especially the challenges in this field. Big data is increasing dramatically as time passes, whether we face it or not. Big data is evolving into a promising way to gain deep insight into IM, an ancient medicine integrated with modern medicine. Great achievements have been made in data collection and data analysis, where existing results show it is possible to discover the knowledge and rules behind clinical records. To transition from experience-based medicine to evidence-based medicine, IM depends on big data technology in this era.

  7. Big Data and the Future of Radiology Informatics.

    PubMed

    Kansagra, Akash P; Yu, John-Paul J; Chatterjee, Arindam R; Lenchik, Leon; Chow, Daniel S; Prater, Adam B; Yeh, Jean; Doshi, Ankur M; Hawkins, C Matthew; Heilbrun, Marta E; Smith, Stacy E; Oselkin, Martin; Gupta, Pushpender; Ali, Sayed

    2016-01-01

    Rapid growth in the amount of data that is electronically recorded as part of routine clinical operations has generated great interest in the use of Big Data methodologies to address clinical and research questions. These methods can efficiently analyze and deliver insights from high-volume, high-variety, and high-growth rate datasets generated across the continuum of care, thereby forgoing the time, cost, and effort of more focused and controlled hypothesis-driven research. By virtue of an existing robust information technology infrastructure and years of archived digital data, radiology departments are particularly well positioned to take advantage of emerging Big Data techniques. In this review, we describe four areas in which Big Data is poised to have an immediate impact on radiology practice, research, and operations. In addition, we provide an overview of the Big Data adoption cycle and describe how academic radiology departments can promote Big Data development.

  9. Commentary: Epidemiology in the era of big data.

    PubMed

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-05-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called "three V's": variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field's future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future.

  10. Big data and visual analytics in anaesthesia and health care.

    PubMed

    Simpao, A F; Ahumada, L M; Rehman, M A

    2015-09-01

    Advances in computer technology, patient monitoring systems, and electronic health record systems have enabled rapid accumulation of patient data in electronic form (i.e. big data). Organizations such as the Anesthesia Quality Institute and Multicenter Perioperative Outcomes Group have spearheaded large-scale efforts to collect anaesthesia big data for outcomes research and quality improvement. Analytics--the systematic use of data combined with quantitative and qualitative analysis to make decisions--can be applied to big data for quality and performance improvements, such as predictive risk assessment, clinical decision support, and resource management. Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces, and it can facilitate performance of cognitive activities involving big data. Ongoing integration of big data and analytics within anaesthesia and health care will increase demand for anaesthesia professionals who are well versed in both the medical and the information sciences.

  12. Participative management: a contingency approach.

    PubMed

    Callahan, C B; Wall, L L

    1987-09-01

    The participative management trend has been misinterpreted by staff to mean that they make all the decisions. To decrease the discrepancy between the management philosophy of participation and the subordinate interpretation of the system, the selection of appropriate decision participation procedures is essential. When the leaders communicate the degree of influence that subordinates will have, the staff learn to trust and support the participative management system. PMID:3655932

  13. Children's Participation Rights in Research

    ERIC Educational Resources Information Center

    Powell, Mary Ann; Smith, Anne B.

    2009-01-01

    This article explores children's participation in research, from the perspectives of researchers who have conducted research with children. Researchers' reports, gained using an email interviewing method, suggest that children's participation rights are particularly compromised when the potential child participants are considered vulnerable and…

  14. 37 CFR 1.98 - Content of information disclosure statement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... must be identified by inventor, patent number, and issue date. (2) Each U.S. patent application... statement must be identified by the inventor, application number, and filing date. (4) Each foreign...

  15. Curating Big Data Made Simple: Perspectives from Scientific Communities.

    PubMed

    Sowe, Sulayman K; Zettsu, Koji

    2014-03-01

    The digital universe is exponentially producing an unprecedented volume of data that has brought benefits as well as fundamental challenges for enterprises and scientific communities alike. This trend is inherently exciting for the development and deployment of cloud platforms to support scientific communities curating big data. The excitement stems from the fact that scientists can now access and extract value from the big data corpus, establish relationships between bits and pieces of information from many types of data, and collaborate with a diverse community of researchers from various domains. However, despite these perceived benefits, to date, little attention has been focused on the people or communities who are both beneficiaries and, at the same time, producers of big data. The technical challenges posed by big data are matched by the challenge of understanding the dynamics of communities working with big data, whether scientific or otherwise. Furthermore, the big data era also means that big data platforms for data-intensive research must be designed in such a way that research scientists can easily search and find data for their research, upload and download datasets for onsite/offsite use, perform computations and analysis, share their findings and research experience, and seamlessly collaborate with their colleagues. In this article, we present the architecture and design of a cloud platform that meets some of these requirements, and a big data curation model that describes how a community of earth and environmental scientists is using the platform to curate data. Motivation for developing the platform, lessons learnt in overcoming some of the challenges of supporting scientists in curating big data, and future research directions are also presented.

  16. Big data and biomedical informatics: a challenging opportunity.

    PubMed

    Bellazzi, R

    2014-05-22

    Big data are receiving an increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries on relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasms, to fully exploit the available technologies, and to improve data processing and data management regulations.
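    Among the tools the abstract names, map-reduce is the easiest to demystify with a toy example: "map" emits (key, value) pairs per record, a "shuffle" groups them by key, and "reduce" folds each group. The diagnosis codes below are invented sample data, and real frameworks distribute these phases across machines:

    ```python
    # Map-reduce word-count pattern in plain Python, counting hypothetical
    # diagnosis codes across a stream of records.
    from collections import defaultdict
    from functools import reduce

    records = ["E11", "I10", "E11", "J45", "I10", "E11"]  # invented codes

    # Map: emit one (code, 1) pair per record.
    mapped = [(code, 1) for code in records]

    # Shuffle: group values by key.
    groups = defaultdict(list)
    for key, value in mapped:
        groups[key].append(value)

    # Reduce: fold each group down to a count.
    counts = {key: reduce(lambda a, b: a + b, values)
              for key, values in groups.items()}
    print(counts)  # {'E11': 3, 'I10': 2, 'J45': 1}
    ```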

  19. How 'Big data' will drive future innovation.

    PubMed

    Baillie, Jonathan

    2016-03-01

    Giving the opening keynote speech at last year's Healthcare Estates conference, Mike Hobbs, managing director of Carillion Health, drew on his 25 years' experience to discuss how innovation can help drive the greater efficiency and productivity that the NHS is charged with delivering, in the process cutting costs at a time when the service faces the tightest economic pressures in its history. He argued that as we enter a new world of 'Big data', the availability of accurate, comprehensive data on which to base key decisions will be the major enabler for the design and construction of high quality healthcare facilities in the future. It will equally be key, he said, to their efficient, low-cost, and optimal utilisation to provide the higher 'productivity' the Government says is essential. PMID:27132307

  20. Livermore Big Trees Park Soil Survey

    SciTech Connect

    McConachie, W.A.; Failor, R.A.

    1995-01-01

    Lawrence Livermore National Laboratory (LLNL) will sample and analyze soil in the Big Trees Park area in Livermore, California, to determine whether the initial level of plutonium (Pu) in a soil sample taken by the U.S. Environmental Protection Agency (EPA) in September 1993 can be confirmed. Nineteen samples will be collected and analyzed: 4 in the area where the initial EPA sample was taken, 2 in the nearby Arroyo Seco, 12 in scattered uncovered soil areas in the park and nearby school, and 1 from the sandbox of a nearby apartment complex. Two quality control (QC) samples (field duplicates of the preceding samples) will also be collected and analyzed. This document briefly describes the purpose behind the sampling, the sampling rationale, and the methodology.

  1. Statistical Analysis of Big Data on Pharmacogenomics

    PubMed Central

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating large covariance matrix for understanding correlation structure, inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differently expressed genes and proteins and genetic markers for complex diseases, and high dimensional variable selection for identifying important molecules for understanding molecule mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905
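    A concrete instance of the large-scale simultaneous testing the paper surveys is the Benjamini-Hochberg (BH) step-up procedure, which controls the false discovery rate when thousands of genes are tested at once. The sketch below is a generic BH implementation, not code from the paper, and the p-values are invented:

    ```python
    # Benjamini-Hochberg step-up procedure for FDR control.
    def benjamini_hochberg(pvals, q=0.05):
        """Return sorted indices of hypotheses rejected at FDR level q."""
        m = len(pvals)
        order = sorted(range(m), key=lambda i: pvals[i])
        k = 0  # largest rank whose p-value clears its BH threshold q*rank/m
        for rank, i in enumerate(order, start=1):
            if pvals[i] <= q * rank / m:
                k = rank
        # Reject the k hypotheses with the smallest p-values.
        return sorted(order[:k])

    pvals = [0.001, 0.008, 0.039, 0.041, 0.20, 0.74]  # invented p-values
    print(benjamini_hochberg(pvals, q=0.05))  # [0, 1]
    ```

    Note the step-up structure: a hypothesis can be rejected even if its own p-value misses its threshold, provided a later-ranked one clears its own, which is why the loop tracks the largest passing rank rather than stopping early.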

  2. Big data era in meteor science

    NASA Astrophysics Data System (ADS)

    Vinković, D.; Gritsevich, M.; Srećković, V.; Pečnik, B.; Szabó, G.; Debattista, V.; Škoda, P.; Mahabal, A.; Peltoniemi, J.; Mönkölä, S.; Mickaelian, A.; Turunen, E.; Kákona, J.; Koskinen, J.; Grokhovsky, V.

    2016-01-01

    Over the last couple of decades technological advancements in observational techniques in meteor science have yielded drastic improvements in the quality, quantity and diversity of meteor data, while even more ambitious instruments are about to become operational. This empowers meteor science to boost its experimental and theoretical horizons and seek more advanced science goals. We review some of the developments that push meteor science into the big data era that requires more complex methodological approaches through interdisciplinary collaborations with other branches of physics and computer science. We argue that meteor science should become an integral part of large surveys in astronomy, aeronomy and space physics, and tackle the complexity of micro-physics of meteor plasma and its interaction with the atmosphere.

  3. The Economics of Big Area Additive Manufacturing

    SciTech Connect

    Post, Brian; Lloyd, Peter D; Lindahl, John; Lind, Randall F; Love, Lonnie J; Kunc, Vlastimil

    2016-01-01

    Case studies on the economics of Additive Manufacturing (AM) suggest that processing time is the dominant cost in manufacturing. Most additive processes have similar performance metrics: small part sizes, low production rates and expensive feedstocks. Big Area Additive Manufacturing is based on transitioning polymer extrusion technology from a wire to a pellet feedstock. Utilizing pellets significantly increases deposition speed and lowers material cost by utilizing low cost injection molding feedstock. The use of carbon fiber reinforced polymers eliminates the need for a heated chamber, significantly reducing machine power requirements and size constraints. We hypothesize that the increase in productivity coupled with decrease in feedstock and energy costs will enable AM to become more competitive with conventional manufacturing processes for many applications. As a test case, we compare the cost of using traditional fused deposition modeling (FDM) with BAAM for additively manufacturing composite tooling.
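    The hypothesis above can be made concrete with a toy cost model in which total cost is machine time plus feedstock; the rates and prices below are illustrative assumptions, not figures from the study:

```python
def part_cost(mass_kg, rate_kg_per_h, machine_usd_per_h, feedstock_usd_per_kg):
    """Toy AM cost model: build-time cost plus material cost."""
    return (mass_kg / rate_kg_per_h) * machine_usd_per_h + mass_kg * feedstock_usd_per_kg

# Illustrative only: FDM wire deposits slowly and costs more per kg; BAAM
# pellets deposit orders of magnitude faster from cheap injection-molding stock.
fdm_cost = part_cost(10, 0.05, 30, 200)
baam_cost = part_cost(10, 35, 120, 5)
```

    Because processing time dominates, the higher hourly rate of a large BAAM machine is swamped by its far shorter build time.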

  4. The Big Bang as scientific fact.

    PubMed

    Faber, S M

    2001-12-01

    In the year 1900, humanity had barely a notion of our place on the cosmic stage, and no inkling at all of how we got here. The one hundred short years of the twentieth century sufficed to unravel 14 billion years of cosmic history and how those grand events, after 9 billion years or so, set the stage for the birth of our own home, the Solar System. The key events in this history are not hard to comprehend; they can be sketched in a few brief pages. This precious knowledge is part of our shared heritage as human beings and is fundamental to the future prospects of our species. Without it, we are ignorant of the powerful forces that have shaped our past and that will shape our destiny in the future. Read here the cosmic history of humanity, beginning with the Big Bang.

  5. New nuclear physics for big bang nucleosynthesis

    SciTech Connect

    Boyd, Richard N.; Brune, Carl R.; Fuller, George M.; Smith, Christel J.

    2010-11-15

    We discuss nuclear reactions which could play a role in big bang nucleosynthesis. Most of these reactions involve lithium and beryllium isotopes and the rates for some of these have not previously been included in BBN calculations. Few of these reactions are well studied in the laboratory. We also discuss novel effects in these reactions, including thermal population of nuclear target states, resonant enhancement, and nonthermal neutron reaction products. We perform sensitivity studies which show that even given considerable nuclear physics uncertainties, most of these nuclear reactions have minimal leverage on the standard BBN abundance yields of ⁶Li and ⁷Li. Although a few have the potential to alter the yields significantly, we argue that this is unlikely.

  7. Home theater projectors: the next big thing?

    NASA Astrophysics Data System (ADS)

    Chinnock, Christopher B.

    2002-04-01

    The business presentation market has traditionally been the mainstay of the projection business, but as these users find the projectors work well at showing movies at home, interest in the home entertainment market is heating up. The idea of creating a theater environment in the home, complete with big screen projector and quality audio system, is not new. Wealthy patrons have been doing it for years. But can the concept be extended to ordinary living rooms? Many think so. Already pioneers like Sony, InFocus, Toshiba and Plus Vision are offering first generation products - and others will follow. But this market will require projectors that have different performance characteristics than those designed for data projection. In this paper, we will discuss how the requirements for a home theater projector differ from those of a data projector. We will provide updated information on who is doing what in this segment and give some insight into the growth potential.

  8. Big bang nucleosynthesis limit on Nν

    NASA Astrophysics Data System (ADS)

    Lisi, E.; Sarkar, S.; Villante, F. L.

    1999-06-01

    Recently we presented a simple method for determining the correlated uncertainties of the light element abundances expected from big bang nucleosynthesis, which avoids the need for lengthy Monte Carlo simulations. We now extend this approach to consider departures from the standard model, in particular to constrain any new light degrees of freedom present in the thermal plasma during nucleosynthesis. Since the observational situation regarding the inferred primordial abundances has not yet stabilized, we present illustrative bounds on the equivalent number of neutrino species Nν for various combinations of individual abundance determinations. Our 95% C.L. bounds on Nν range between 2 and 4, and can easily be reevaluated using the technique provided when the abundances are known more accurately.

  9. Big biology is here to stay

    SciTech Connect

    Wiley, H. S.

    2008-08-01

    The new, large-scale research centers started by the Roadmap initiative created new research opportunities. The purpose of many of them, in fact, is to provide resources to the scientific community that can be exploited to enable new research ideas and directions. Research grants are also available for investigators to contribute to many of these centers. The NIH is now actively soliciting ideas for new Roadmap projects, so if you have an opinion on the most useful types of projects to fund, let them know. However, just complaining about big science is not useful. The success of large, high profile NIH projects is the best way to get increased funding for all of NIH and to accelerate scientific advances in biology in the process.

  10. The big data challenges of connectomics

    PubMed Central

    Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir

    2015-01-01

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them. PMID:25349911
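    The scale of the 'big data' problem is easy to see with back-of-envelope arithmetic; the voxel size below is an assumption typical of serial-section electron microscopy, not a figure from the paper:

```python
# Imaging 1 cubic millimetre of brain at 4 x 4 x 30 nm voxels, 1 byte per voxel.
nm_per_mm = 1e6
voxels = (nm_per_mm / 4) * (nm_per_mm / 4) * (nm_per_mm / 30)
petabytes = voxels * 1 / 1e15   # 8-bit grayscale EM data
```

    A single cubic millimetre, a tiny fraction of a mouse brain, already yields on the order of 2 petabytes under these assumptions, which is why connectomics needs the kind of algorithmic breakthroughs genomics once did.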

  11. Affective forecasting and the Big Five

    PubMed Central

    Hoerger, Michael; Quirk, Stuart W.

    2011-01-01

    Recent studies on affective forecasting clarify that the emotional reactions people anticipate often differ markedly from those they actually experience in response to affective stimuli and events. However, core personality differences in affective forecasting have received limited attention, despite their potential relevance to choice behavior. In the present study, 226 college undergraduates rated their anticipated and experienced reactions to the emotionally-evocative event of Valentine’s Day and completed a measure of the Big Five personality traits – neuroticism, extraversion, openness to experience, agreeableness, and conscientiousness – and their facet scales. Neuroticism and extraversion were associated with baseline mood, experienced emotional reactions, and anticipated emotional reactions. The present findings hold implications for the study of individual differences in affective forecasting, personality theory, and interventions research. PMID:22021944

  12. No crisis for big bang nucleosynthesis

    NASA Astrophysics Data System (ADS)

    Kernan, Peter J.; Sarkar, Subir

    1996-09-01

    Contrary to a recent claim, the inferred primordial abundances of the light elements are quite consistent with the expectations from standard big bang nucleosynthesis when attention is restricted to direct observations rather than results from chemical evolution models. The number of light neutrino (or equivalent particle) species (Nν) can be as high as 4.53 if the nucleon-to-photon ratio (η) is at its lower limit of 1.65×10-10, as constrained by the upper bound on the deuterium abundance in high redshift quasar absorption systems. Alternatively, with Nν=3, η can be as high as 8.90×10-10 if the deuterium abundance is bounded from below by its interstellar value. These conclusions follow from the upward revision of the primordial helium abundance inferred from recent observations of blue compact galaxies, using updated atomic physics inputs.

  13. A Spectrograph for BigBOSS

    NASA Astrophysics Data System (ADS)

    CARTON, Pierre-Henri; Bebek, C.; Cazaux, S.; Ealet, A.; Eppelle, D.; Kneib, J.; Karst, P.; levi, M.; magneville, C.; Palanque-Delabrouille, N.; Ruhlmann-Kleider, V.; Schlegel, D.; Yeche, C.

    2012-01-01

    The BigBOSS spectrograph assembly carries the light from the fiber output to the detector, including the optics, gratings, mechanics, and cryostats. The 5000 fibers are split into 10 bundles of 500 fibers, and each bundle feeds one spectrograph. The full bandwidth from 0.36 µm to 1.05 µm is split into 3 bands, and each channel is composed of a collimator (doublet lenses), a VPH grating, and a 6-lens camera. The 500 fiber spectra are imaged onto a 4k x 4k detector by the F/2 camera, with each fiber core imaged onto 4 pixels. Each channel of the BigBOSS spectrograph will be equipped with a single-CCD camera, resulting in 30 cryostats in total for the instrument. Based on its experience with CCD cameras for projects like EROS and MegaCam, CEA/Saclay has designed small, autonomous cryogenic vessels which integrate cryo-cooling, CCD positioning, and slow-control interfacing capabilities. The use of a linear pulse tube with its own control unit, both developed by Thales Cryogenics BV, will ensure versatility, reliability, and operational flexibility. The CCDs will be cooled to 140 K, with stability better than 1 K, and positioned to within 15 µm along the optical axis and 50 µm in the XY plane. Slow-control machines will be interfaced directly to an Ethernet network, allowing them to be operated remotely. The design is very robust, with no mechanisms other than the shutters, and the 30 channels are impressively compact, occupying a volume of only 3 m³. Producing this many channels will call for a quasi-mass-production approach.
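    The instrument counts quoted above are self-consistent, as a line of arithmetic confirms (all numbers taken directly from the abstract):

```python
fibers, bundles, bands = 5000, 10, 3
fibers_per_channel = fibers // bundles   # each bundle of fibers feeds one spectrograph
channels = bundles * bands               # one single-CCD cryostat per channel
```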

  14. The natural science underlying big history.

    PubMed

    Chaisson, Eric J

    2014-01-01

    Nature's many varied complex systems, including galaxies, stars, planets, life, and society, are islands of order within the increasingly disordered Universe. All organized systems are subject to physical, biological, or cultural evolution, which together comprise the grander interdisciplinary subject of cosmic evolution. A wealth of observational data supports the hypothesis that increasingly complex systems evolve unceasingly, uncaringly, and unpredictably from big bang to humankind. These are global history greatly extended, big history with a scientific basis, and natural history broadly portrayed across ∼14 billion years of time. Human beings and our cultural inventions are not special, unique, or apart from Nature; rather, we are an integral part of a universal evolutionary process connecting all such complex systems throughout space and time. Such evolution writ large has significant potential to unify the natural sciences into a holistic understanding of who we are and whence we came. No new science (beyond frontier, nonequilibrium thermodynamics) is needed to describe cosmic evolution's major milestones at a deep and empirical level. Quantitative models and experimental tests imply that a remarkable simplicity underlies the emergence and growth of complexity for a wide spectrum of known and diverse systems. Energy is a principal facilitator of the rising complexity of ordered systems within the expanding Universe; energy flows are as central to life and society as they are to stars and galaxies. In particular, energy rate density, contrasting with information content or entropy production, is an objective metric suitable to gauge relative degrees of complexity among a hierarchy of widely assorted systems observed throughout the material Universe. Operationally, those systems capable of utilizing optimum amounts of energy tend to survive, and those that cannot are nonrandomly eliminated.

  15. The Uses of Big Data in Cities.

    PubMed

    Bettencourt, Luís M A

    2014-03-01

    There is much enthusiasm currently about the possibilities created by new and more extensive sources of data to better understand and manage cities. Here, I explore how big data can be useful in urban planning by formalizing the planning process as a general computational problem. I show that, under general conditions, new sources of data coordinated with urban policy can be applied following fundamental principles of engineering to achieve new solutions to important age-old urban problems. I also show that comprehensive urban planning is computationally intractable (i.e., practically impossible) in large cities, regardless of the amounts of data available. This dilemma between the need for planning and coordination and its impossibility in detail is resolved by the recognition that cities are first and foremost self-organizing social networks embedded in space and enabled by urban infrastructure and services. As such, the primary role of big data in cities is to facilitate information flows and mechanisms of learning and coordination by heterogeneous individuals. However, processes of self-organization in cities, as well as of service improvement and expansion, must rely on general principles that enforce necessary conditions for cities to operate and evolve. Such ideas are the core of a developing scientific theory of cities, which is itself enabled by the growing availability of quantitative data on thousands of cities worldwide, across different geographies and levels of development. These three uses of data and information technologies in cities constitute then the necessary pillars for more successful urban policy and management that encourages, and does not stifle, the fundamental role of cities as engines of development and innovation in human societies.

  18. Big Data for a Big Ocean at the NOAA National Oceanographic Data Center

    NASA Astrophysics Data System (ADS)

    Casey, K. S.

    2014-12-01

    Covering most of planet Earth, the vast, physically challenging ocean environment was once the sole domain of hardy, sea-going oceanographers. More recently, however, ocean observing systems have become more operational as well as more diverse. With observations coming from satellites, automated ship-based systems, autonomous underwater and airborne vehicles, in situ observing systems, and numerical models the field of oceanography is now clearly in the domain of Big Data. The NOAA National Oceanographic Data Center (NODC) and its partners around the world are addressing the entire range of Big Data issues for the ocean environment. A growing variety, volume, and velocity of incoming "Big Ocean" data streams are being managed through numerous approaches including the automated ingest and archive of incoming data; deployment of standardized, machine-consumable data discovery services; and interoperable data access, visualization, and subset mechanisms. In addition, support to the community of data producers to help them create more machine-ready ocean observation data streams is being provided and pilot projects to effectively incorporate commercial and hybrid cloud storage, access, and processing services into existing workflows and systems are being conducted. NODC is also engaging more actively than ever in the broader community of environmental data facilities to address these challenges. Details on these efforts at NODC and its partners will be provided and input sought on new and evolving user requirements.

  19. Classical propagation of strings across a big crunch/big bang singularity

    SciTech Connect

    Niz, Gustavo; Turok, Neil

    2007-01-15

    One of the simplest time-dependent solutions of M theory consists of nine-dimensional Euclidean space times 1+1-dimensional compactified Milne space-time. With a further modding out by Z₂, the space-time represents two orbifold planes which collide and re-emerge, a process proposed as an explanation of the hot big bang [J. Khoury, B. A. Ovrut, P. J. Steinhardt, and N. Turok, Phys. Rev. D 64, 123522 (2001).][P. J. Steinhardt and N. Turok, Science 296, 1436 (2002).][N. Turok, M. Perry, and P. J. Steinhardt, Phys. Rev. D 70, 106004 (2004).]. When the two planes are near, the light states of the theory consist of winding M2-branes, describing fundamental strings in a particular ten-dimensional background. They suffer no blue-shift as the M theory dimension collapses, and their equations of motion are regular across the transition from big crunch to big bang. In this paper, we study the classical evolution of fundamental strings across the singularity in some detail. We also develop a simple semiclassical approximation to the quantum evolution which allows one to compute the quantum production of excitations on the string and implement it in a simplified example.

  20. Integrating the Apache Big Data Stack with HPC for Big Data

    NASA Astrophysics Data System (ADS)

    Fox, G. C.; Qiu, J.; Jha, S.

    2014-12-01

    There is perhaps a broad consensus as to important issues in practical parallel computing as applied to large scale simulations; this is reflected in supercomputer architectures, algorithms, libraries, languages, compilers and best practice for application development. However, the same is not so true for data intensive computing, even though commercial clouds devote far more resources to data analytics than supercomputers devote to simulations. We look at a sample of over 50 big data applications to identify characteristics of data intensive applications and to deduce needed runtime and architectures. We suggest a big data version of the famous Berkeley dwarfs and NAS parallel benchmarks and use these to identify a few key classes of hardware/software architectures. Our analysis builds on combining HPC with ABDS, the Apache Big Data Stack that is widely used in modern cloud computing. Initial results on clouds and HPC systems are encouraging. We propose the development of SPIDAL, the Scalable Parallel Interoperable Data Analytics Library, built on system and data abstractions suggested by the HPC-ABDS architecture. We discuss how it can be used in several application areas, including Polar Science.

  2. Uncovering a Pressure-Tuned Electronic Transition in Bi1.98Sr2.06Y0.68Cu2O8+δ Using Raman Scattering and X-Ray Diffraction

    SciTech Connect

    Cuk, T.; Struzhkin, V.V.; Devereaux, T.P.; Goncharov, A.F.; Kendziora, C.A.; Eisaki, H.; Mao, H.-K.; Shen, Z.-X.

    2008-06-03

    We report pressure-tuned Raman and x-ray diffraction data of Bi1.98Sr2.06Y0.68Cu2O8+δ revealing a critical pressure at 21 GPa with anomalies in the electronic Raman background, the electron-phonon coupling λ, spectral weight transfer, the density-dependent behavior of phonons and magnons, and a compressibility change along the c axis. For the first time in a cuprate, mobile charge carriers, lattice, and magnetism all show anomalies at a distinct critical pressure in the same experimental setting. Furthermore, the spectral changes suggest that the critical pressure at 21 GPa is related to the critical point at optimal doping.

  3. Blue Shield Plan Physician Participation

    PubMed Central

    Yett, Donald E.; Der, William; Ernst, Richard L.; Hay, Joel W.

    1981-01-01

    Many Blue Shield Plans offer participation agreements to physicians that are structurally similar to the participation provisions of Medicaid programs. This paper examines physicians' participation decisions in two such Blue Shield Plans where the participation agreements were on an all-or-nothing basis. The major results show that increases in the Plans' reasonable fees or fee schedules significantly raise the probability of participation, and that physicians with characteristics associated with “low quality” are significantly more likely to participate than are physicians with characteristics associated with “high quality.” In this sense the results highlight the tradeoff that must be faced in administering governmental health insurance policy. On the one hand, restricting reasonable and scheduled fees is the principal current tool for containing expenditures on physicians' services. Yet these restrictions tend to depress physicians' willingness to participate in government programs, thereby reducing access to high quality care by the populations those programs were designed to serve. PMID:10309468

  4. Male Adolescents' Reasons for Participating in Physical Activity, Barriers to Participation, and Suggestions for Increasing Participation

    ERIC Educational Resources Information Center

    Allison, Kenneth R.; Dwyer, John J. M.; Goldenberg, Ellie; Fein, Allan; Yoshida, Karen K.; Boutilier, Marie

    2005-01-01

    This study explored male adolescents' reasons for participating in moderate and vigorous physical activity, perceived barriers to moderate and vigorous physical activity, and suggestions as to what can be done to increase participation in physical activity. A total of 26 male 15- and 16-year-old adolescents participated in focus group sessions,…

  5. Application and Exploration of Big Data Mining in Clinical Medicine

    PubMed Central

    Zhang, Yue; Guo, Shu-Li; Han, Li-Na; Li, Tie-Ling

    2016-01-01

    Objective: To review theories and technologies of big data mining and their application in clinical medicine. Data Sources: Literatures published in English or Chinese regarding theories and technologies of big data mining and the concrete applications of data mining technology in clinical medicine were obtained from PubMed and Chinese Hospital Knowledge Database from 1975 to 2015. Study Selection: Original articles regarding big data mining theory/technology and big data mining's application in the medical field were selected. Results: This review characterized the basic theories and technologies of big data mining including fuzzy theory, rough set theory, cloud theory, Dempster–Shafer theory, artificial neural network, genetic algorithm, inductive learning theory, Bayesian network, decision tree, pattern recognition, high-performance computing, and statistical analysis. The application of big data mining in clinical medicine was analyzed in the fields of disease risk assessment, clinical decision support, prediction of disease development, guidance of rational use of drugs, medical management, and evidence-based medicine. Conclusion: Big data mining has the potential to play an important role in clinical medicine. PMID:26960378
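    As a minimal, hypothetical illustration of one listed technique, a decision tree for disease risk assessment, here is a sketch using scikit-learn's bundled breast-cancer dataset (not a dataset from the review):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# A shallow tree keeps the model interpretable, which matters for
# clinical decision support.
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

    The depth limit trades a little accuracy for rules a clinician can read off the tree.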

  6. [Discussion paper on participation and participative methods in gerontology].

    PubMed

    Aner, Kirsten

    2016-02-01

    The concepts of "participation" and "participative methods" are in great demand in human, healthcare, nursing, and gerontological research as well as in the corresponding fields of practice; however, the aims and organization of "participation" are not always sufficiently explicated. The working group on critical gerontology of the German Society of Gerontology and Geriatrics takes this phenomenon as an occasion to state its position and develops a catalogue of criteria for reflecting on and assessing the participation of elderly people in science and practice, which can also be considered a stimulus for further discussion.

  7. Dreaming and personality: Wake-dream continuity, thought suppression, and the Big Five Inventory.

    PubMed

    Malinowski, Josie E

    2015-12-15

    Studies have found relationships between dream content and personality traits, but there are still many traits that have been underexplored or have had questionable conclusions drawn about them. Experimental work has found a 'rebound' effect in dreams when thoughts are suppressed prior to sleep, but the effect of trait thought suppression on dream content has not yet been researched. In the present study participants (N=106) reported their Most Recent Dream, answered questions about the content of the dream, and completed questionnaires measuring trait thought suppression and the 'Big Five' personality traits. Of these, 83 were suitably recent for analyses. A significant positive correlation was found between trait thought suppression and participants' ratings of dreaming of waking-life emotions, and high suppressors reported dreaming more of their waking-life emotions than low suppressors did. The results may lend support to the compensation theory of dreams, and/or the ironic process theory of mental control. PMID:26496477
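    The correlation the study reports can be sketched with a plain Pearson coefficient. The scores below are invented stand-ins (suppression scores in a WBSI-style range, dream-emotion ratings on a 1–5 scale); the paper's actual data are not reproduced here.

```python
# Pearson correlation between hypothetical trait thought-suppression scores
# and ratings of dreaming of waking-life emotions. Data are invented.
import math

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

suppression   = [30, 42, 45, 51, 58, 63, 70]  # hypothetical questionnaire scores
dream_emotion = [2, 3, 3, 4, 4, 5, 5]         # hypothetical 1-5 ratings

r = pearson_r(suppression, dream_emotion)
print(round(r, 2))
```

    In a real analysis the coefficient would be accompanied by a significance test against the null hypothesis of zero correlation.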

  9. pp wave big bangs: Matrix strings and shrinking fuzzy spheres

    SciTech Connect

    Das, Sumit R.; Michelson, Jeremy

    2005-10-15

    We find pp wave solutions in string theory with null-like linear dilatons. These provide toy models of big bang cosmologies. We formulate matrix string theory in these backgrounds. Near the big bang 'singularity', the string theory becomes strongly coupled but the Yang-Mills description of the matrix string is weakly coupled. The presence of a second length scale allows us to focus on a specific class of non-Abelian configurations, viz. fuzzy cylinders, for a suitable regime of parameters. We show that, for a class of pp waves, fuzzy cylinders which start out big at early times dynamically shrink into usual strings at sufficiently late times.

  10. Research Activities at Fermilab for Big Data Movement

    SciTech Connect

    Mhashilkar, Parag; Wu, Wenji; Kim, Hyun W; Garzoglio, Gabriele; Dykstra, Dave; Slyz, Marko; DeMar, Phil

    2013-01-01

    Adoption of 100GE networking infrastructure is the next step towards management of big data. As the US Tier-1 center for the Large Hadron Collider's (LHC) Compact Muon Solenoid (CMS) experiment and the central data center for several other large-scale research collaborations, Fermilab constantly deals with the scaling and wide-area distribution challenges of big data. In this paper, we describe some of the challenges involved in moving big data over 100GE infrastructure and the research activities at Fermilab to address them.

  11. ELM Meets Urban Big Data Analysis: Case Studies

    PubMed Central

    Chen, Huajun; Chen, Jiaoyan

    2016-01-01

    In recent years, the rapid progress of urban computing has created both opportunities and challenges. The heterogeneity and sheer volume of the data, and the large gap between the physical and virtual worlds, make it difficult to solve practical problems in urban computing quickly. In this paper, we propose a general application framework of ELM (extreme learning machine) for urban computing. We present several real case studies of the framework, such as smog-related health hazard prediction and optimal retail store placement. Experiments involving urban data in China show the efficiency, accuracy, and flexibility of our proposed framework. PMID:27656203
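    The core ELM idea the paper builds on is a random, untrained hidden layer followed by a linear readout fitted in closed form. The sketch below is a minimal pure-Python version under toy assumptions (tanh activations, ridge-regularized normal equations, an invented regression target); it is not the paper's implementation.

```python
# Minimal extreme learning machine (ELM) sketch: random hidden layer,
# closed-form linear readout. All sizes and data are toy values.
import math
import random

def elm_fit(X, y, hidden=20, ridge=1e-3, seed=0):
    rng = random.Random(seed)
    d = len(X[0])
    # Random, fixed input weights and biases (never trained).
    W = [[rng.uniform(-1, 1) for _ in range(hidden)] for _ in range(d)]
    b = [rng.uniform(-1, 1) for _ in range(hidden)]
    H = [[math.tanh(sum(x[i] * W[i][j] for i in range(d)) + b[j])
          for j in range(hidden)] for x in X]
    # Solve (H^T H + ridge*I) beta = H^T y by Gaussian elimination.
    A = [[sum(H[n][i] * H[n][j] for n in range(len(H)))
          + (ridge if i == j else 0.0) for j in range(hidden)]
         for i in range(hidden)]
    rhs = [sum(H[n][i] * y[n] for n in range(len(H))) for i in range(hidden)]
    for col in range(hidden):
        piv = max(range(col, hidden), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, hidden):
            f = A[r][col] / A[col][col]
            for c in range(col, hidden):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    beta = [0.0] * hidden
    for i in reversed(range(hidden)):
        beta[i] = (rhs[i] - sum(A[i][j] * beta[j]
                                for j in range(i + 1, hidden))) / A[i][i]
    return W, b, beta

def elm_predict(model, x):
    W, b, beta = model
    h = [math.tanh(sum(x[i] * W[i][j] for i in range(len(x))) + b[j])
         for j in range(len(b))]
    return sum(hj * bj for hj, bj in zip(h, beta))

# Toy regression target: y = x0 + x1 on random inputs.
rng = random.Random(1)
X = [[rng.uniform(-1, 1), rng.uniform(-1, 1)] for _ in range(80)]
y = [x[0] + x[1] for x in X]
model = elm_fit(X, y)
mse = sum((elm_predict(model, x) - t) ** 2 for x, t in zip(X, y)) / len(X)
print(mse)
```

    Because only the readout is fitted, training reduces to one linear solve, which is the speed advantage ELM frameworks like this one exploit.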

  12. [Medical big data and precision medicine: prospects of epidemiology].

    PubMed

    Song, J; Hu, Y H

    2016-08-10

    With the development of high-throughput technologies, electronic medical record systems, and big data technology, the value of medical data has attracted growing attention. At the same time, the Precision Medicine Initiative opens up new prospects for medical big data. As a tool-oriented discipline, epidemiology focuses on exploiting existing big data resources and promoting the integration of translational research and knowledge to fully unlock the "black box" of the exposure-disease continuum, and it seeks to accelerate realization of the ultimate goal of precision medicine. The overall purpose, however, is to translate evidence from scientific research into better health for the people.

  13. Customizing computational methods for visual analytics with big data.

    PubMed

    Choo, Jaegul; Park, Haesun

    2013-01-01

    The volume of available data has been growing exponentially, increasing the complexity and obscurity of data problems. In response, visual analytics (VA) has gained attention, yet its solutions haven't scaled well for big data. Computational methods can improve VA's scalability by giving users compact, meaningful information about the input data. However, the significant computation time these methods require hinders real-time interactive visualization of big data. By addressing crucial discrepancies between these methods and VA regarding precision and convergence, researchers have proposed ways to customize them for VA. These approaches, which include low-precision computation and iteration-level interactive visualization, ensure real-time interactive VA for big data.
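    The "iteration-level interactive visualization" idea can be shown with any iterative method that exposes its partial results: instead of waiting for convergence, each iteration hands its current estimate to the display. Below, power iteration for a leading principal direction stands in for the heavier computational methods the abstract discusses; the callback and the 2x2 covariance matrix are illustrative assumptions.

```python
# Iteration-level reporting: emit the intermediate estimate after every
# step so a view layer could refresh mid-computation.
import math

def power_iteration(cov, steps, on_iteration):
    """Estimate the dominant eigenvector of a small covariance matrix,
    reporting the current estimate after every step."""
    v = [1.0] * len(cov)
    for step in range(steps):
        w = [sum(row[j] * v[j] for j in range(len(v))) for row in cov]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
        on_iteration(step, v)  # hand the partial result to the "display"
    return v

snapshots = []
cov = [[4.0, 1.0], [1.0, 2.0]]  # toy 2x2 covariance matrix
v = power_iteration(cov, 10, lambda s, vec: snapshots.append(vec))
print(len(snapshots), [round(x, 3) for x in v])
```

    A real VA system would redraw a scatterplot projection from each snapshot; early, low-precision estimates are often already good enough to orient the analyst.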

  15. Big bang photosynthesis and pregalactic nucleosynthesis of light elements

    NASA Technical Reports Server (NTRS)

    Audouze, J.; Lindley, D.; Silk, J.

    1985-01-01

    Two nonstandard scenarios for pregalactic synthesis of the light elements (H-2, He-3, He-4, and Li-7) are developed. Big bang photosynthesis occurs if energetic photons, produced by the decay of massive neutrinos or gravitinos, partially photodisintegrate He-4 (formed in the standard hot big bang) to produce H-2 and He-3. In this case, primordial nucleosynthesis no longer constrains the baryon density of the universe, or the number of neutrino species. Alternatively, one may dispense partially or completely with the hot big bang and produce the light elements by bombardment of primordial gas, provided that He-4 is synthesized by a later generation of massive stars.

  16. Participation-Based Services: Promoting Children's Participation in Natural Settings

    ERIC Educational Resources Information Center

    Campbell, Philippa

    2004-01-01

    When children are young, the activities and routines in which they participate are influenced by family decisions as well as by opportunities for participation. Families report that finding community opportunities for their young children with disabilities can be difficult. Furthermore, ensuring their children's success in these settings requires…

  17. Big, Dark Dunes Northeast of Syrtis Major

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Big sand dunes! Mars is home to some very large, windblown dunes. The dunes shown here rise to almost 100 meters (275 feet) at their crests. Unlike dunes on Earth, the larger dunes of Mars are composed of dark, rather than light grains. This is probably related to the composition of the sand, since different materials will have different brightnesses. For example, beaches on the island of Oahu in Hawaii are light colored because they consist of ground-up particles of seashells, while beaches in the southern shores of the island of Hawaii (the 'Big Island' in the Hawaiian island chain) are dark because they consist of sand derived from dark lava rock.

    The dunes in this picture taken by the Mars Orbiter Camera (MOC) are located on the floor of an old, 72 km-(45 mi)-diameter crater located northeast of Syrtis Major. The sand is being blown from the upper right toward the lower left. The surface that the dunes have been travelling across is pitted and cratered. The substrate is also hard and bright--i.e., it is composed of a material of different composition than the sand in the dunes. The dark streaks on the dune surfaces are a puzzle...at first glance one might conclude they are the result of holiday visitors with off-road vehicles. However, the streaks more likely result from passing dust devils or wind gusts that disturb the sand surface just enough to leave a streak. The image shown here covers an area approximately 2.6 km (1.6 mi) wide, and is illuminated from the lower right.

    Malin Space Science Systems and the California Institute of Technology built the MOC using spare hardware from the Mars Observer mission. MSSS operates the camera from its facilities in San Diego, CA. The Jet Propulsion Laboratory's Mars Surveyor Operations Project operates the Mars Global Surveyor spacecraft with its industrial partner, Lockheed Martin Astronautics, from facilities in Pasadena, CA and Denver, CO.

  18. Where Big Data and Prediction Meet

    SciTech Connect

    Ahrens, James; Brase, Jim M.; Hart, Bill; Kusnezov, Dimitri; Shalf, John

    2014-09-11

    Our ability to assemble and analyze massive data sets, often referred to under the title of “big data”, is an increasingly important tool for shaping national policy. This in turn has introduced issues from privacy concerns to cyber security. But as IBM’s John Kelly emphasized in the last Innovation, making sense of the vast arrays of data will require radically new computing tools. In the past, technologies and tools for analysis of big data were viewed as quite different from the traditional realm of high performance computing (HPC) with its huge models of phenomena such as global climate or supporting the nuclear test moratorium. Looking ahead, this will change with very positive benefits for both worlds. Societal issues such as global security, economic planning and genetic analysis demand increased understanding that goes beyond existing data analysis and reduction. The modeling world often produces simulations that are complex compositions of mathematical models and experimental data. This has resulted in outstanding successes such as the annual assessment of the state of the US nuclear weapons stockpile without underground nuclear testing. Ironically, while many tests were historically conducted, this body of data provides only modest insight into the underlying physics of the system. A great deal of emphasis was thus placed on the level of confidence we can develop for the predictions. As data analytics and simulation come together, there is a growing need to assess the confidence levels in both the data being gathered and the complex models used to make predictions. An example of this is assuring the security or optimizing the performance of critical infrastructure systems such as the power grid. If one wants to understand the vulnerabilities of the system or the impacts of predicted threats, full-scale tests of the grid against threat scenarios are unlikely. Preventive measures would need to be predicated on well-defined margins of confidence in order

  19. Interoperability Outlook in the Big Data Future

    NASA Astrophysics Data System (ADS)

    Kuo, K. S.; Ramachandran, R.

    2015-12-01

    The establishment of distributed active archive centers (DAACs) as data warehouses and the standardization of file formats by NASA's Earth Observing System Data Information System (EOSDIS) doubtlessly propelled interoperability of NASA Earth science data to unprecedented heights in the 1990s. Yet two decades later, interoperability still falls short of what we need. We believe the inadequate interoperability we experience is a result of the current practice in which data are first packaged into files before distribution, and only the metadata of these files are cataloged into databases and become searchable. Data therefore cannot be efficiently filtered. Any extensive study thus requires downloading large volumes of data files to a local system for processing and analysis. The need to download data not only creates duplication and inefficiency but also further impedes interoperability, because the analysis has to be performed locally by individual researchers in individual institutions. Each institution or researcher often has their own preferences in data management practice and in programming languages. Analysis results (derived data) so produced are thus subject to the differences among these practices, which later form formidable barriers to interoperability. A number of Big Data technologies are currently being examined and tested to address Big Earth Data issues. These technologies share one common characteristic: exploiting compute and storage affinity to analyze large volumes and great varieties of data more efficiently. Distributed active "archive" centers are likely to evolve into distributed active "analysis" centers, which not only archive data but also provide analysis services right where the data reside. "Analysis" will become the more visible function of these centers. It is thus reasonable to expect interoperability to improve because analysis, in addition to data, becomes more centralized. Within a "distributed active analysis center

  20. Astronaut Health Participant Summary Application

    NASA Technical Reports Server (NTRS)

    Johnson, Kathy; Krog, Ralph; Rodriguez, Seth; Wear, Mary; Volpe, Robert; Trevino, Gina; Eudy, Deborah; Parisian, Diane

    2011-01-01

    The Longitudinal Study of Astronaut Health (LSAH) Participant Summary software captures data based on a custom information model designed to gather all relevant, discrete medical events for its study participants. This software provides a summarized view of the study participant's entire medical record. The manual collapsing of all the data in a participant's medical record into a summarized form eliminates redundancy, and allows for the capture of entire medical events. The coding tool could be incorporated into commercial electronic medical record software for use in areas like public health surveillance, hospital systems, clinics, and medical research programs.