Sample records for big heads small

  1. Data: Big and Small.

    PubMed

    Jones-Schenk, Jan

    2017-02-01

    Big data is a big topic in all leadership circles. Leaders in professional development must develop an understanding of what data are available across the organization that can inform effective planning for forecasting. Collaborating with others to integrate data sets can increase the power of prediction. Big data alone is insufficient to make big decisions. Leaders must find ways to access small data and triangulate multiple types of data to ensure the best decision making. J Contin Educ Nurs. 2017;48(2):60-61. Copyright 2017, SLACK Incorporated.

  2. "Small Steps, Big Rewards": Preventing Type 2 Diabetes

    MedlinePlus

    ... Feature: Diabetes "Small Steps, Big Rewards": Preventing Type 2 Diabetes ... These are the plain facts in "Small Steps. Big Rewards: Prevent Type 2 Diabetes," an education campaign ...

  3. Big for small: Validating brain injury guidelines in pediatric traumatic brain injury.

    PubMed

    Azim, Asad; Jehan, Faisal S; Rhee, Peter; O'Keeffe, Terence; Tang, Andrew; Vercruysse, Gary; Kulvatunyou, Narong; Latifi, Rifat; Joseph, Bellal

    2017-12-01

    Brain injury guidelines (BIG) were developed to reduce overutilization of neurosurgical consultation (NC) as well as computed tomography (CT) imaging. Currently, BIG have been successfully applied to adult populations, but the value of implementing these guidelines among pediatric patients remains unassessed. Therefore, the aim of this study was to evaluate the established BIG (BIG-1 category) for managing pediatric traumatic brain injury (TBI) patients with intracranial hemorrhage (ICH) without NC (no-NC). We prospectively implemented the BIG-1 category (normal neurologic examination, ICH ≤ 4 mm limited to one location, no skull fracture) to identify pediatric TBI patients (age, ≤ 21 years) who were to be managed no-NC. Propensity score matching was performed to match these no-NC patients to a similar cohort of patients managed with NC before the implementation of BIG in a 1:1 ratio for demographics, severity of injury, and type as well as size of ICH. Our primary outcome measure was need for neurosurgical intervention. A total of 405 pediatric TBI patients were enrolled, of which 160 (NC, 80; no-NC, 80) were propensity score matched. The mean age was 9.03 ± 7.47 years, 62.1% (n = 85) were male, the median Glasgow Coma Scale score was 15 (13-15), and the median head Abbreviated Injury Scale score was 2 (2-3). A subanalysis based on stratifying patients by age groups showed a decrease in the use of repeat head CT (p = 0.02) in the no-NC group, with no difference in progression (p = 0.34) and the need for neurosurgical intervention (p = 0.9) compared with the NC group. The BIG can be safely and effectively implemented in pediatric TBI patients. Reducing repeat head CT in pediatric patients spares them the long-term sequelae of radiation exposure, and adhering to the guidelines helps reduce radiation exposure across all age groups. Therapeutic/care management, level III.
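
    A minimal, hypothetical sketch of the triage logic implied by the BIG-1 criteria quoted above (normal neurologic examination, ICH ≤ 4 mm limited to one location, no skull fracture). It is not the authors' implementation, and it omits the BIG-2/BIG-3 criteria, which this abstract does not spell out:

      from dataclasses import dataclass

      @dataclass
      class HeadCTFindings:
          """Hypothetical container for the findings used by the BIG-1 check."""
          normal_neuro_exam: bool   # normal neurologic examination
          ich_size_mm: float        # largest intracranial hemorrhage, in millimetres
          ich_locations: int        # number of distinct ICH locations
          skull_fracture: bool      # any skull fracture on CT

      def meets_big1(f: HeadCTFindings) -> bool:
          """BIG-1 as quoted in the abstract: normal neurologic exam,
          ICH <= 4 mm confined to one location, and no skull fracture."""
          return (f.normal_neuro_exam
                  and f.ich_size_mm <= 4.0
                  and f.ich_locations == 1
                  and not f.skull_fracture)

      # Example: a 3 mm single-site ICH, intact exam, no skull fracture
      print(meets_big1(HeadCTFindings(True, 3.0, 1, False)))  # True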

  4. "Small Steps, Big Rewards": You Can Prevent Type 2 Diabetes

    MedlinePlus

    ... "Small Steps, Big Rewards": You Can Prevent Type 2 Diabetes ... onset. Those are the basic facts of "Small Steps. Big Rewards: Prevent Type 2 Diabetes," created by ...

  5. Small queens and big-headed workers in a monomorphic ponerine ant

    NASA Astrophysics Data System (ADS)

    Kikuchi, Tomonori; Miyazaki, Satoshi; Ohnishi, Hitoshi; Takahashi, Junichi; Nakajima, Yumiko; Tsuji, Kazuki

    2008-10-01

    Evolution of caste is a central issue in the biology of social insects. Comparative studies on their morphology so far suggest the following three patterns: (1) a positive correlation between queen-worker size dimorphism and the divergence in reproductive ability between castes, (2) a negative correlation among workers between morphological diversity and reproductive ability, and (3) a positive correlation between queen-worker body shape difference and the diversity in worker morphology. We conducted morphological comparisons between castes in Pachycondyla luteipes, whose workers are monomorphic and lack reproductive ability. Although the size distribution broadly overlapped, mean head width, head length, and scape length were significantly different between queens and workers. Conversely, in eye length, petiole width, and Weber’s length, the size differences were reversed. The allometries (head length/head width, scape length/head width, and Weber’s length/head width) were also significantly different between queens and workers. Morphological examinations showed that the body shape was different between queens and workers, and the head part of workers was disproportionately larger than that of queens. This pattern of queen-worker dimorphism is novel in ants with monomorphic workers and a clear exception to the last pattern. This study suggests that the loss of individual-level selection, i.e. the workers' lack of reproductive ability, may influence morphological modification in ants.

  6. Big World of Small Neutrinos

    Science.gov Websites

    ... electron neutrino, muon neutrino, or tau neutrino. The three different neutrinos are complemented by anti ... of the neutrinos we detect will look different (have a different flavor) compared to the time they ... Big World of Small Neutrinos: Neutrinos will find you! Fig 1: Hubble image of the deep field ...

  7. Small Public Libraries Can Serve Big. ERIC Digest.

    ERIC Educational Resources Information Center

    Parry, Norm

    Small public libraries can deliver service like big libraries, without sacrificing hometown warmth and charm. By borrowing strategies used by successful small businesses in the private sector, defining goals and exploiting low cost technologies, small public libraries can serve customer wants as well as much larger institutions. Responding to just…

  8. Funding big research with small money.

    PubMed

    Hickey, Joanne V; Koithan, Mary; Unruh, Lynn; Lundmark, Vicki

    2014-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the guest authors introduce crowdsourcing as a strategy for funding big research with small money.

  9. Evaluation of lamprey larvicides in the Big Garlic River and Saux Head Lake

    USGS Publications Warehouse

    Manion, Patrick J.

    1969-01-01

    Bayluscide (5,2'-dichloro-4'-nitrosalicylanilide) and TFM (3-trifluoromethyl-4-nitrophenol) were evaluated as selective larvicides for control of the sea lamprey, Petromyzon marinus, in the Big Garlic River and Saux Head Lake in Marquette County, Michigan. Population estimates and movement of ammocetes were determined from the recapture of marked ammocetes released before chemical treatment. In 1966 the estimated population of 3136 ammocetes off the stream mouth in Saux Head Lake was reduced 89% by treatment with granular Bayluscide; this percentage was supported by a population estimate of 120 ammocetes in 1967, an indicated reduction of 96% from 1966. Post-marking movement of ammocetes was greater upstream than downstream.

  10. "small problems, Big Trouble": An Art and Science Collaborative Exhibition Reflecting Seemingly small problems Leading to Big Threats

    NASA Astrophysics Data System (ADS)

    Waller, J. L.; Brey, J. A.

    2014-12-01

    "small problems, Big Trouble" (spBT) is an exhibition of artist Judith Waller's paintings accompanied by text panels written by Earth scientist Dr. James A. Brey and several science researchers and educators. The text panels' message is as much the focus of the show as the art--true interdisciplinarity! Waller and Brey's history of art and earth science collaborations include the successful exhibition "Layers: Places in Peril". New in spBT is extended collaboration with other scientists in order to create awareness of geoscience and other subjects (i.e. soil, parasites, dust, pollutants, invasive species, carbon, ground water contaminants, solar wind) small in scale which pose significant threats. The paintings are the size of a mirror, a symbol suggesting the problems depicted are those we increasingly need to face, noting our collective reflections of shared current and future reality. Naturalistic rendering and abstract form in the art helps reach a broad audience including those familiar with art and those familiar with science. The goal is that gallery visitors gain greater appreciation and understanding of both—and of the sober content of the show as a whole. "small problems, Big Trouble" premiers in Wisconsin April, 2015. As in previous collaborations, Waller and Brey actively utilize art and science (specifically geoscience) as an educational vehicle for active student learning. Planned are interdisciplinary university and area high school activities linked through spBT. The exhibition in a public gallery offers a means to enhance community awareness of and action on scientific issues through art's power to engage people on an emotional level. This AGU presentation includes a description of past Waller and Brey activities: incorporating art and earth science in lab and studio classrooms, producing gallery and museum exhibitions and delivering workshops and other presentations. They also describe how walking the paths of several past earth science

  11. Comparing return to sport activities after short metaphyseal femoral arthroplasty with resurfacing and big femoral head arthroplasties.

    PubMed

    Karampinas, Panagiotis K; Papadelis, Eustratios G; Vlamis, John A; Basiliadis, Hlias; Pneumaticos, Spiros G

    2017-07-01

    Young patients feel that maintaining sport activities after total hip arthroplasty constitutes an important part of their quality of life. The majority of hip surgeons allow patients to return to low-impact activities, but significant caution is advised about taking part in high-impact activities. The purpose of this study is to compare and evaluate the post-operative return to daily living habits and sport activities following short-metaphyseal hip and high functional total hip arthroplasties (resurfacing and big femoral head arthroplasties). In this study, 48 patients (55 hips) were enrolled in three comparative groups: one with short-metaphyseal arthroplasties, a second with high functional resurfacing arthroplasties and a third with big femoral head arthroplasties. Each patient underwent a clinical examination, was evaluated with Harris Hip Score, WOMAC, SF-36, UCLA activity score, satisfaction VAS, and anteroposterior and lateral X-rays of the hip, and was followed in an outpatient setting for 2 years. Statistical analysis revealed no notable differences between the three groups regarding their demographic data; however, significant differences were found between the preoperative and postoperative clinical scores of each group. We also failed to find any significant differences when comparing the clinical scores of all three groups at the final 2-year postoperative review. The overall outcome of all three groups was similar; all the patients were satisfied and returned to their previous level of sport activities. Short metaphyseal hip arthroplasties allow young patients to return to previous and even high-impact sport activities, similarly to high functional resurfacing and big femoral head arthroplasties. Short stems with hard on hard bearing surfaces might become an alternative to standard stems and hip resurfacing.

  12. Optimization of deflection of a big NEO through impact with a small one.

    PubMed

    Zhu, Kaijian; Huang, Weiping; Wang, Yuncai; Niu, Wei; Wu, Gongyou

    2014-01-01

    Using a small near-Earth object (NEO) to impact a larger and potentially threatening NEO has been suggested as an effective method to avert a collision with Earth. This paper develops a procedure for analysis of the technique for specific NEOs. First, an optimization method is used to select a proper small body from the database. Some principles of optimality are achieved with the optimization process. Then, the orbit of the small body is changed to guarantee that it flies toward and impacts the big threatening NEO. Kinetic impact by a spacecraft is chosen as the strategy of deflecting the small body. The efficiency of this method is compared with that of a direct kinetic impact to the big NEO by a spacecraft. Finally, a case study is performed for the deflection of the Apophis NEO, and the efficiency of the method is assessed.
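
    The kinetic-impact strategy in this abstract rests on simple momentum transfer. The sketch below is an order-of-magnitude illustration only: the relation delta_v = beta * m * v_rel / M is the standard kinetic-impactor formula (beta is the momentum-enhancement factor), and the masses and speed are invented, not taken from the paper:

      def deflection_delta_v(impactor_mass_kg: float, rel_speed_m_s: float,
                             target_mass_kg: float, beta: float = 1.0) -> float:
          """Velocity change imparted to the target NEO by a kinetic impact:
          delta_v = beta * m * v_rel / M, with beta the momentum-enhancement factor."""
          return beta * impactor_mass_kg * rel_speed_m_s / target_mass_kg

      # Illustrative numbers only: a ~5e6 kg small body striking a ~6e10 kg target
      # (roughly the published mass estimate for Apophis) at 10 km/s.
      print(f"{deflection_delta_v(5e6, 10_000.0, 6e10):.3f} m/s")   # ~0.833 m/s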

  13. Optimization of Deflection of a Big NEO through Impact with a Small One

    PubMed Central

    Zhu, Kaijian; Huang, Weiping; Wang, Yuncai; Niu, Wei; Wu, Gongyou

    2014-01-01

    Using a small near-Earth object (NEO) to impact a larger and potentially threatening NEO has been suggested as an effective method to avert a collision with Earth. This paper develops a procedure for analysis of the technique for specific NEOs. First, an optimization method is used to select a proper small body from the database. Some principles of optimality are achieved with the optimization process. Then, the orbit of the small body is changed to guarantee that it flies toward and impacts the big threatening NEO. Kinetic impact by a spacecraft is chosen as the strategy of deflecting the small body. The efficiency of this method is compared with that of a direct kinetic impact to the big NEO by a spacecraft. Finally, a case study is performed for the deflection of the Apophis NEO, and the efficiency of the method is assessed. PMID:25525627

  14. Modified duval procedure for small-duct chronic pancreatitis without head dominance.

    PubMed

    Oida, Takatsugu; Aramaki, Osamu; Kano, Hisao; Mimatsu, Kenji; Kawasaki, Atsushi; Kuboi, Youichi; Fukino, Nobutada; Kida, Kazutoshi; Amano, Sadao

    2011-01-01

    In the case of small-duct chronic pancreatitis, surgery for pain relief is broadly divided into resection and drainage procedures. These procedures should be selected according to the location of the dominant lesion, the diameter of the pancreatic duct and the extent of the disease. The appropriate procedure for the treatment of small-duct chronic pancreatitis, especially small-duct chronic pancreatitis without head dominance, remains controversial. We developed the modified Duval procedure for the treatment of small-duct chronic pancreatitis without head dominance and determined the efficacy of this procedure. We retrospectively studied 14 patients who underwent surgical drainage with or without pancreatic resection for chronic pancreatitis with a small pancreatic duct (<7 mm) without head dominance. These patients were divided into 2 groups: the modified Puestow procedure group and the modified Duval procedure group. No complications occurred in the modified Duval group. In the modified Puestow procedure group, complete and partial pain relief were observed in 62.5% and 37.5% of patients, respectively. In contrast, complete pain relief was observed in all the patients in the modified Duval procedure group. Our modified Duval procedure is useful and should be considered the appropriate surgical technique for the treatment of small-duct chronic pancreatitis without head dominance.

  15. Small values in big data: The continuing need for appropriate metadata

    USGS Publications Warehouse

    Stow, Craig A.; Webster, Katherine E.; Wagner, Tyler; Lottig, Noah R.; Soranno, Patricia A.; Cha, YoonKyung

    2018-01-01

    Compiling data from disparate sources to address pressing ecological issues is increasingly common. Many ecological datasets contain left-censored data – observations below an analytical detection limit. Studies from single and typically small datasets show that common approaches for handling censored data — e.g., deletion or substituting fixed values — result in systematic biases. However, no studies have explored the degree to which the documentation and presence of censored data influence outcomes from large, multi-sourced datasets. We describe left-censored data in a lake water quality database assembled from 74 sources and illustrate the challenges of dealing with small values in big data, including detection limits that are absent, range widely, and show trends over time. We show that substitutions of censored data can also bias analyses using ‘big data’ datasets, that censored data can be effectively handled with modern quantitative approaches, but that such approaches rely on accurate metadata that describe treatment of censored data from each source.
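
    A minimal sketch of one of the "modern quantitative approaches" alluded to above: treating values below a detection limit as left-censored in a maximum-likelihood fit rather than substituting a fixed value such as DL/2. The lognormal assumption, the detection limit and the simulated data are illustrative assumptions, not material from the paper:

      import numpy as np
      from scipy import optimize, stats

      rng = np.random.default_rng(42)

      # Simulated lognormal "concentrations" with a single detection limit (DL).
      true_mu, true_sigma, dl = 0.0, 1.0, 0.8
      x = rng.lognormal(true_mu, true_sigma, size=500)
      censored = x < dl                        # values reported only as "< DL"

      # Common shortcut: substitute DL/2 for censored values.
      x_sub = np.where(censored, dl / 2, x)

      # Censored maximum likelihood: detected values contribute the log pdf,
      # censored values contribute the log CDF evaluated at the detection limit.
      def neg_loglik(params):
          mu, log_sigma = params
          sigma = np.exp(log_sigma)
          ll = stats.lognorm.logpdf(x[~censored], s=sigma, scale=np.exp(mu)).sum()
          ll += censored.sum() * stats.lognorm.logcdf(dl, s=sigma, scale=np.exp(mu))
          return -ll

      fit = optimize.minimize(neg_loglik, x0=[0.0, 0.0])
      mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])

      print(f"true mean          {np.exp(true_mu + true_sigma**2 / 2):.3f}")
      print(f"DL/2 substitution  {x_sub.mean():.3f}")
      print(f"censored MLE       {np.exp(mu_hat + sigma_hat**2 / 2):.3f}")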

  16. Head or tail: the orientation of the small bowel capsule endoscope movement in the small bowel.

    PubMed

    Kopylov, Uri; Papageorgiou, Neofytos P; Nadler, Moshe; Eliakim, Rami; Ben-Horin, Shomron

    2012-03-01

    The diagnostic accuracy of capsule endoscopy has been suggested to be influenced by the direction of the passage in the intestine. It is currently unknown whether head-first and tail-first orientations are equally common during the descent through the small bowel. The aim of the study was to identify the orientation of the capsule along the migration through the small bowel. Thirty capsule endoscopies were reviewed by an experienced observer. The direction of the passage through the pylorus and the ileocecal valve was recorded for all the examinations. In addition, detailed review of the passage of the capsule in different segments of the small bowel was undertaken for all the capsules. The capsule was significantly more likely to pass the pylorus head-first compared to tail-first (25 and 5 out of 30, respectively, OR 5, 95% CI 65-94%, P < 0.001). In 28/30 studies, the capsule exited the ileocecal valve head-first (OR 14, 95% CI 77-99%, P < 0.001). In an immersion experiment, uneven distribution of weight of the capsule body was demonstrated with the head part (camera tip) being lighter than the tail part. The capsule endoscope usually passes through the pylorus and subsequent segments of the small bowel head-first. This observation suggests that the intestinal peristaltic physiology drives symmetrical bodies with their light part first. The principle of intestinal orientation by weight distribution may bear implications for capsules' design in the future.

  17. Small and big quality in health care.

    PubMed

    Lillrank, Paul Martin

    2015-01-01

    The purpose of this paper is to clarify healthcare quality's ontological and epistemological foundations; and examine how these lead to different measurements and technologies. Conceptual analysis. Small quality denotes conformance to ex ante requirements. Big quality includes product and service design, based on customer requirements and expectations. Healthcare quality can be divided into three areas: clinical decision making; patient safety; and patient experience, each with distinct measurement and improvement technologies. The conceptual model is expected to bring clarity to constructing specific definitions, measures, objectives and technologies for improving healthcare. This paper claims that before healthcare quality can be defined, measured and integrated into systems, it needs to be clearly separated into ontologically and epistemologically different parts.

  18. "Small" data in a big data world: archiving terrestrial ecology data at ORNL DAAC

    NASA Astrophysics Data System (ADS)

    Santhana Vannan, S. K.; Beaty, T.; Boyer, A.; Deb, D.; Hook, L.; Shrestha, R.; Thornton, M.; Virdi, M.; Wei, Y.; Wright, D.

    2016-12-01

    The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC http://daac.ornl.gov), a NASA-funded data center, archives a diverse collection of terrestrial biogeochemistry and ecological dynamics observations and models in support of NASA's Earth Science program. The ORNL DAAC has been addressing the increasing challenge of publishing diverse small data products into an online archive while dealing with the enhanced need for integration and availability of these data to address big science questions. This paper will show examples of "small" diverse data holdings - ranging from the Daymet model output data to site-based soil moisture observation data. We define "small" by the data volume of these data products compared to petabyte scale observations. We will highlight the use of tools and services for visualizing diverse data holdings and subsetting services such as the MODIS land products subsets tool (at ORNL DAAC) that provides big MODIS data in small chunks. Digital Object Identifiers (DOI) and data citations have enhanced the availability of data. The challenge faced by data publishers now is to deal with the increased number of publishable data products and most importantly the difficulties of publishing small diverse data products into an online archive. This paper will also present our experiences designing a data curation system for these types of data. The characteristics of these data will be examined and their scientific value will be demonstrated via data citation metrics. We will present case studies of leveraging specialized tools and services that have enabled small data sets to realize their "big" scientific potential. Overall, we will provide a holistic view of the challenges and potential of small diverse terrestrial ecology data sets from data curation to distribution.

  19. Think Big, Bigger ... and Smaller

    ERIC Educational Resources Information Center

    Nisbett, Richard E.

    2010-01-01

    One important principle of social psychology, writes Nisbett, is that some big-seeming interventions have little or no effect. This article discusses a number of cases from the field of education that confirm this principle. For example, Head Start seems like a big intervention, but research has indicated that its effects on academic achievement…

  20. Small female head and neck interaction with a deploying side airbag.

    PubMed

    Duma, Stefan M; Crandall, Jeff R; Rudd, Rodney W; Kent, Richard W

    2003-09-01

    This paper presents dummy and cadaver experiments designed to investigate the injury potential of an out-of-position small female head and neck from a deploying side airbag. Seat-mounted, thoracic-type, side airbags were selected for this study to represent those currently available on selected luxury automobiles. A computer simulation program was used to identify the worst case loading position for the small female head and neck. Once the initial position was identified, experiments were performed with the Hybrid III 5th percentile dummy and three small female cadavers, using three different inflators. Peak head center of gravity (CG) accelerations for the dummy ranged from 71 g to 154 g, and were greater than cadaver values, which ranged from 68 g to 103 g. Peak neck tension as measured at the upper load cell of the dummy increased with inflator aggressivity from 992 N to 1670 N. A conservative modification of the US National Highway Traffic Safety Administration's (NHTSA's) proposed N(ij) neck injury criterion, which combines neck tension and bending, was used. All values were well below the 1.0 injury threshold for the dummy and suggested a very low possibility of neck injury. In agreement with this prediction, no injuries were observed. Even in a worst case position, small females are at low risk of head or neck injuries under loading from these thoracic-type airbags; however, injury risk increases with increasing inflator aggressivity.

  1. Evaluation research of small and medium-sized enterprise informatization on big data

    NASA Astrophysics Data System (ADS)

    Yang, Na

    2017-09-01

    Under the background of big data, improving the informatization level of small and medium-sized enterprises is a key task; information construction is costly, but the investment can bring benefits to these enterprises. This paper established a small and medium-sized enterprise informatization evaluation system covering hardware and software security, information organization, information technology application and profit, and information ability levels. Rough set theory was used to reduce the indexes, and evaluation was then carried out with a support vector machine (SVM) model. Finally, examples were used to verify the approach and demonstrate the effectiveness of the method.
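
    The abstract describes reducing the evaluation indexes with rough set theory and then scoring enterprises with a support vector machine. The sketch below illustrates only the SVM step, using scikit-learn on synthetic indicator data; the indicators, labels and kernel settings are assumptions for illustration, not the paper's model:

      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)

      # Synthetic "informatization indicators" for 200 hypothetical SMEs
      # (e.g., hardware/security, organization, application, and benefit levels).
      X = rng.normal(size=(200, 4))
      # Hypothetical evaluation grade (0 = low, 1 = high), driven by a weighted sum.
      y = (X @ np.array([0.5, 0.3, 0.8, 0.4]) + 0.3 * rng.normal(size=200) > 0).astype(int)

      # RBF-kernel SVM, the classifier family named in the abstract.
      model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
      scores = cross_val_score(model, X, y, cv=5)
      print(f"5-fold accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")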

  2. Use of a "small-bubble technique" to increase the success of Anwar's "big-bubble technique" for deep lamellar keratoplasty with complete baring of Descemet's membrane.

    PubMed

    Parthasarathy, Anand; Por, Yong Ming; Tan, Donald T H

    2007-10-01

    To describe a quick and simple "small-bubble" technique to immediately determine the success of attaining complete Descemet's membrane (DM) separation from corneal stroma through Anwar's "big-bubble" technique of deep anterior lamellar keratoplasty (DALK) for complete stromal removal. A partial trephination was followed by a lamellar dissection of the anterior stroma. Deep stromal air injection was then attempted to achieve the big bubble to help separate the stroma from the DM. To confirm that a big bubble had been achieved, a small air bubble was injected into the anterior chamber (AC) through a limbal paracentesis. If the small bubble is then seen at the corneal periphery, it confirms that the big-bubble separation of DM was successful because the convex nature of the bubble will cause it to protrude posteriorly, forcing the small AC bubble to the periphery. If the small AC bubble is not seen in the corneal periphery, this means that it is present in the centre, beneath the opaque corneal stroma, and therefore the big bubble has not been achieved. We used the small-bubble technique to confirm the presence of the big bubble in three (one keratoconus, one interstitial keratitis and one dense corneal scar) out of 41 patients who underwent DALK. The small-bubble technique confirmed that the big bubble was achieved in the eye of all three patients. Complete stromal removal with baring of the DM was achieved, and postoperatively all three eyes achieved best corrected vision of 6/6. The small-bubble technique can be a useful surgical tool for corneal surgeons attempting lamellar keratoplasty using the big-bubble technique. It helps in confirming the separation of DM from the deep stroma, which is important in achieving total stromal replacement. It will help to make the transition to lamellar keratoplasty smoother, enhance corneal graft success and improve visual outcomes in patients.

  3. Small Is Too Big: Achieving a Critical Anti-Mass in the High School.

    ERIC Educational Resources Information Center

    Gregory, Tom

    Developing more effective conceptions of the high school may require radically reducing its size. In today's big high schools, size ensures that control of students is a primary concern and prevents the development of a collegial atmosphere among teachers. Although research provides ample evidence of the superior social climates of small informal…

  4. Big data from small data: data-sharing in the ‘long tail’ of neuroscience

    PubMed Central

    Ferguson, Adam R; Nielson, Jessica L; Cragin, Melissa H; Bandrowski, Anita E; Martone, Maryann E

    2016-01-01

    The launch of the US BRAIN and European Human Brain Projects coincides with growing international efforts toward transparency and increased access to publicly funded research in the neurosciences. The need for data-sharing standards and neuroinformatics infrastructure is more pressing than ever. However, ‘big science’ efforts are not the only drivers of data-sharing needs, as neuroscientists across the full spectrum of research grapple with the overwhelming volume of data being generated daily and a scientific environment that is increasingly focused on collaboration. In this commentary, we consider the issue of sharing of the richly diverse and heterogeneous small data sets produced by individual neuroscientists, so-called long-tail data. We consider the utility of these data, the diversity of repositories and options available for sharing such data, and emerging best practices. We provide use cases in which aggregating and mining diverse long-tail data convert numerous small data sources into big data for improved knowledge about neuroscience-related disorders. PMID:25349910

  5. jsc2018m000274_Alpha-Space-Small-Business-Makes-Big-Strides_MP4

    NASA Image and Video Library

    2018-03-30

    The path to discovery and exploration is paved with determination, innovation, and most of all, big ideas. The International Space Station is home to many of those ideas and is creating new ways for small businesses, entrepreneurs and researchers to test their science and technology in space every day. Formed in 2015 in response to the need for a commercial payload that would be available to private companies aboard the space station, Alpha Space is a woman- and minority-owned small business responsible for developing the Materials International Space Station Experiment Flight Facility (MISSE-FF).

  6. The BIG (brain injury guidelines) project: defining the management of traumatic brain injury by acute care surgeons.

    PubMed

    Joseph, Bellal; Friese, Randall S; Sadoun, Moutamn; Aziz, Hassan; Kulvatunyou, Narong; Pandit, Viraj; Wynne, Julie; Tang, Andrew; O'Keeffe, Terence; Rhee, Peter

    2014-04-01

    It is becoming a standard practice that any "positive" identification of a radiographic intracranial injury requires transfer of the patient to a trauma center for observation and repeat head computed tomography (RHCT). The purpose of this study was to define guidelines, based on each patient's history, physical examination, and initial head CT findings, regarding which patients require a period of observation, RHCT, or neurosurgical consultation. In our retrospective cohort analysis, we reviewed the records of 3,803 blunt traumatic brain injury patients during a 4-year period. We classified patients according to neurologic examination results, use of intoxicants, anticoagulation status, and initial head CT findings. We then developed brain injury guidelines (BIG) based on the individual patient's need for observation or hospitalization, RHCT, or neurosurgical consultation. A total of 1,232 patients had an abnormal head CT finding. In the BIG 1 category, no patients worsened clinically or radiographically or required any intervention. The BIG 2 category had radiographic worsening in 2.6% of the patients. All patients who required neurosurgical intervention (13%) were in BIG 3. There was excellent agreement between assigned BIG and verified BIG (κ statistic 0.98). We have proposed the BIG based on the patient's history, neurologic examination, and findings of the initial head CT scan. These guidelines must be used as a supplement to good clinical examination while managing patients with traumatic brain injury. Prospective validation of the BIG is warranted before its widespread implementation. Epidemiologic study, level III.

  7. Small head circumference at birth and early age at adiposity rebound.

    PubMed

    Eriksson, J G; Kajantie, E; Lampl, M; Osmond, C; Barker, D J P

    2014-01-01

    The adiposity rebound is the age in childhood when body mass index is at a minimum before increasing again. The age at rebound is highly variable. An early age is associated with increased obesity in later childhood and adult life. We have reported that an early rebound is predicted by low weight gain between birth and 1 year of age and resulting low body mass index at 1 year. Here, we examine whether age at adiposity rebound is determined by influences during infancy or is a consequence of foetal growth. Our hypothesis was that measurements of body size at birth are related to age at adiposity rebound. Longitudinal study of 2877 children born in Helsinki, Finland, during 1934-1944. Early age at adiposity rebound was associated with small head circumference and biparietal diameter at birth, but not with other measurements of body size at birth. The mean age at adiposity rebound rose from 5.8 years in babies with a head circumference of ≤33 cm to 6.2 in babies with a head circumference of >36 cm (P for trend = 0.007). The association between thinness in infancy and early rebound became apparent at 6 months of age. It was not associated with adverse living conditions. In a simultaneous regression, small head circumference at birth, high mother's body mass index and tall maternal stature each had statistically significant trends with early adiposity rebound (P = 0.002, <0.001, 0.004). We hypothesize that the small head size at birth that preceded an early adiposity rebound was the result of inability to sustain a rapid intra-uterine growth trajectory initiated in association with large maternal body size. This was followed by catch-up growth in infancy, and we hypothesize that this depleted the infant's fat stores. © 2013 Scandinavian Physiological Society. Published by John Wiley & Sons Ltd.

  8. Big Data, Big Problems: A Healthcare Perspective.

    PubMed

    Househ, Mowafa S; Aldosari, Bakheet; Alanazi, Abdullah; Kushniruk, Andre W; Borycki, Elizabeth M

    2017-01-01

    Much has been written on the benefits of big data for healthcare such as improving patient outcomes, public health surveillance, and healthcare policy decisions. Over the past five years, Big Data, and the data sciences field in general, has been hyped as the "Holy Grail" for the healthcare industry, promising a more efficient healthcare system and improved healthcare outcomes. However, more recently, healthcare researchers have been exposing the potentially harmful effects Big Data can have on patient care, associating it with increased medical costs, patient mortality, and misguided decision making by clinicians and healthcare policy makers. In this paper, we review the current Big Data trends with a specific focus on the inadvertent negative impacts that Big Data could have on healthcare, in general, and specifically, as it relates to patient and clinical care. Our study results show that although Big Data is built up to be the "Holy Grail" for healthcare, small data techniques using traditional statistical methods are, in many cases, more accurate and can lead to better healthcare outcomes than Big Data methods. In sum, Big Data for healthcare may cause more problems for the healthcare industry than solutions, and in short, when it comes to the use of data in healthcare, "size isn't everything."

  9. Small Molecules-Big Data.

    PubMed

    Császár, Attila G; Furtenbacher, Tibor; Árendás, Péter

    2016-11-17

    Quantum mechanics builds large-scale graphs (networks): the vertices are the discrete energy levels the quantum system possesses, and the edges are the (quantum-mechanically allowed) transitions. Parts of the complete quantum mechanical networks can be probed experimentally via high-resolution, energy-resolved spectroscopic techniques. The complete rovibronic line list information for a given molecule can only be obtained through sophisticated quantum-chemical computations. Experiments as well as computations yield what we call spectroscopic networks (SN). First-principles SNs of even small, three to five atomic molecules can be huge, qualifying for the big data description. Besides helping to interpret high-resolution spectra, the network-theoretical view offers several ideas for improving the accuracy and robustness of the increasingly important information systems containing line-by-line spectroscopic data. For example, the smallest number of measurements that must be performed to obtain the complete list of energy levels is given by the minimum-weight spanning tree of the SN, and network clustering studies may call attention to "weakest links" of a spectroscopic database. A present-day application of spectroscopic networks is within the MARVEL (Measured Active Rotational-Vibrational Energy Levels) approach, whereby the transitions information on a measured SN is turned into experimental energy levels via a weighted linear least-squares refinement. MARVEL has been used successfully for 15 molecules, allowing most of the measured transitions to be validated and yielding energy levels with well-defined and realistic uncertainties. Accurate knowledge of the energy levels with computed transition intensities allows the realistic prediction of spectra under many different circumstances, e.g., for widely different temperatures. Detailed knowledge of the energy level structure of a molecule coming from a MARVEL analysis is important for a considerable number of modeling
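
    The point about the minimum-weight spanning tree can be made concrete on a toy spectroscopic network. The sketch below uses networkx; the energy levels, transitions and weights are invented for illustration, not data from MARVEL:

      import networkx as nx

      # Toy spectroscopic network: vertices are energy levels, edges are measured
      # transitions, and edge weights stand in for (e.g.) measurement uncertainties.
      G = nx.Graph()
      transitions = [
          ("E0", "E1", 0.002), ("E0", "E2", 0.004), ("E1", "E2", 0.003),
          ("E1", "E3", 0.005), ("E2", "E3", 0.001), ("E2", "E4", 0.006),
          ("E3", "E4", 0.002),
      ]
      G.add_weighted_edges_from(transitions)

      # The minimum-weight spanning tree is the smallest set of transitions that
      # still connects every level, which is the abstract's point about the minimum
      # number of measurements needed to reach the complete list of energy levels.
      mst = nx.minimum_spanning_tree(G, weight="weight")
      print(sorted(mst.edges(data="weight")))
      print(f"{mst.number_of_edges()} transitions connect {G.number_of_nodes()} levels")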

  10. The BIG Score and Prediction of Mortality in Pediatric Blunt Trauma.

    PubMed

    Davis, Adrienne L; Wales, Paul W; Malik, Tahira; Stephens, Derek; Razik, Fathima; Schuh, Suzanne

    2015-09-01

    To examine the association between in-hospital mortality and the BIG (composed of the base deficit [B], International normalized ratio [I], Glasgow Coma Scale [G]) score measured on arrival to the emergency department in pediatric blunt trauma patients, adjusted for pre-hospital intubation, volume administration, and presence of hypotension and head injury. We also examined the association between the BIG score and mortality in patients requiring admission to the intensive care unit (ICU). A retrospective 2001-2012 trauma database review of patients with blunt trauma ≤ 17 years old with an Injury Severity score ≥ 12. Charts were reviewed for in-hospital mortality, components of the BIG score upon arrival to the emergency department, prehospital intubation, crystalloids ≥ 20 mL/kg, presence of hypotension, head injury, and disposition. 50/621 (8%) of the study patients died. Independent mortality predictors were the BIG score (OR 11, 95% CI 6-25), prior fluid bolus (OR 3, 95% CI 1.3-9), and prior intubation (OR 8, 95% CI 2-40). The area under the receiver operating characteristic curve was 0.95 (CI 0.93-0.98), with the optimal BIG cutoff of 16. With BIG <16, death rate was 3/496 (0.006, 95% CI 0.001-0.007) vs 47/125 (0.38, 95% CI 0.15-0.7) with BIG ≥ 16, (P < .0001). In patients requiring admission to the ICU, the BIG score remained predictive of mortality (OR 14.3, 95% CI 7.3-32, P < .0001). The BIG score accurately predicts mortality in a population of North American pediatric patients with blunt trauma independent of pre-hospital interventions, presence of head injury, and hypotension, and identifies children with a high probability of survival (BIG <16). The BIG score is also associated with mortality in pediatric patients with trauma requiring admission to the ICU. Copyright © 2015 Elsevier Inc. All rights reserved.
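
    For readers who want the arithmetic behind the cutoff of 16, the sketch below computes a BIG score from its three components. The weighting, base deficit + 2.5 × INR + (15 − GCS), follows the commonly cited published form of the score; the abstract only names the components, so treat the exact weights as an assumption:

      def big_score(base_deficit: float, inr: float, gcs: int) -> float:
          """Pediatric BIG score. The weighting (base deficit + 2.5*INR + (15 - GCS))
          follows the commonly cited published form of the score; the abstract above
          only names the three components, so the exact weights are an assumption."""
          return base_deficit + 2.5 * inr + (15 - gcs)

      def high_mortality_risk(base_deficit: float, inr: float, gcs: int) -> bool:
          """Apply the study's optimal mortality cutoff of BIG >= 16."""
          return big_score(base_deficit, inr, gcs) >= 16

      # Illustrative values only: base deficit 8 mmol/L, INR 1.6, GCS 7.
      print(big_score(8, 1.6, 7), high_mortality_risk(8, 1.6, 7))  # 20.0 True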

  11. The Changing Roles of Online Deans and Department Heads in Small Private Universities

    ERIC Educational Resources Information Center

    Halupa, Colleen M.

    2016-01-01

    This paper provides an overview of best practices and challenges for deans and department heads of online programmes in the ever-changing world of higher education. It concentrates on the challenges for small private universities and tertiary education institutions in the United States, Australia, and New Zealand. Department heads must consider…

  12. Transforming fragments into candidates: small becomes big in medicinal chemistry.

    PubMed

    de Kloe, Gerdien E; Bailey, David; Leurs, Rob; de Esch, Iwan J P

    2009-07-01

    Fragment-based drug discovery (FBDD) represents a logical and efficient approach to lead discovery and optimisation. It can draw on structural, biophysical and biochemical data, incorporating a wide range of inputs, from precise mode-of-binding information on specific fragments to wider ranging pharmacophoric screening surveys using traditional HTS approaches. It is truly an enabling technology for the imaginative medicinal chemist. In this review, we analyse a representative set of 23 published FBDD studies that describe how low molecular weight fragments are being identified and efficiently transformed into higher molecular weight drug candidates. FBDD is now becoming warmly endorsed by industry as well as academia and the focus on small interacting molecules is making a big scientific impact.

  13. Big Policies and a Small World: An Analysis of Policy Problems and Solutions in Physical Education

    ERIC Educational Resources Information Center

    Penney, Dawn

    2017-01-01

    This paper uses Ball's [1998. Big policies/small world: An introduction to international perspectives in education policy. "Comparative Education," 34(2), 119-130] policy analysis and Bernstein's [1990. "The structuring of pedagogic discourse. Volume IV class, codes and control". London: Routledge; 2000, "Pedagogy,…

  14. [Cultivation strategy and path analysis on big brand Chinese medicine for small and medium-sized enterprises].

    PubMed

    Wang, Yong-Yan; Yang, Hong-Jun

    2014-03-01

    Small and medium-sized enterprises (SMEs) are important components of the Chinese medicine industry. However, the lack of a big brand is becoming an urgent problem that is critical to the survival of SMEs. This article discusses the concept and traits of big-brand Chinese medicine from three aspects: clinical value, scientific value and market value. Guided by market value, highlighting clinical value and aiming at improving scientific value during big-brand cultivation, we put forward the key points of cultivation: obtaining branded Chinese medicines with widely recognized efficacy, a good quality control system and a well-explained mechanism, which at the same time can bring innovative improvement to the theory of Chinese medicine. Considering the characteristics of SMEs, we hold the view that building a multidisciplinary research union is the basic path, and we probe the implementation strategy in three stages: top-level design, skill upgrading and application.

  15. New to Teaching: Small Changes Can Produce Big Results!

    ERIC Educational Resources Information Center

    Shenton, Megan

    2017-01-01

    In this article, Megan Shenton, a final-year trainee teacher at Nottingham Trent University, describes using "The Big Question" in her science teaching in a move away from objectives. The Big Question is an innovative pedagogical choice, where instead of implementing a learning objective, a question is posed at the start of the session…

  16. Simulation Experiments: Better Data, Not Just Big Data

    DTIC Science & Technology

    2014-12-01

    Modeling and Computer Simulation 22 (4): 20:1–20:17. Hogan, Joe. 2014, June 9. "So Far, Big Data is Small Potatoes". Scientific American Blog Network. Available via http://blogs.scientificamerican.com/cross-check/2014/06/09/so-far-big-data-is-small-potatoes/. IBM. 2014. "Big Data at the Speed of Business

  17. Partnership between small biotech and big pharma.

    PubMed

    Wiederrecht, Gregory J; Hill, Raymond G; Beer, Margaret S

    2006-08-01

    The process involved in the identification and development of novel breakthrough medicines at big pharma has recently undergone significant changes, in part because of the extraordinary complexity that is associated with tackling diseases of high unmet need, and also because of the increasingly demanding requirements that have been placed on the pharmaceutical industry by investors and regulatory authorities. In addition, big pharma no longer have a monopoly on the tools and enabling technologies that are required to identify and discover new drugs, as many biotech companies now also have these capabilities. As a result, researchers at biotech companies are able to identify credible drug leads, as well as compounds that have the potential to become marketed medicinal products. This diversification of companies that are involved in drug discovery and development has in turn led to increased partnering interactions between the biotech sector and big pharma. This article examines how Merck and Co Inc, which has historically relied on a combination of internal scientific research and licensed products, has poised itself to become further engaged in partnering with biotech companies, as well as academic institutions, to increase the probability of success associated with identifying novel medicines to treat unmet medical needs--particularly in areas such as central nervous system disorders, obesity/metabolic diseases, atheroma and cancer, and also to cultivate its cardiovascular, respiratory, arthritis, bone, ophthalmology and infectious disease franchises.

  18. Demonstration of variable speed permanent magnet generator at small, low-head hydro site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown Kinloch, David

    Small hydro developers face a limited set of bad choices when choosing a generator for a small low-head hydro site. Direct drive synchronous generators are expensive and technically complex to install. Simpler induction generators are higher speed, requiring a speed increaser, which results in inefficiencies and maintenance problems. In addition, both induction and synchronous generators turn at a fixed speed, causing the turbine to run off its peak efficiency curve whenever the available head is different than the designed optimum head. The solution to these problems is the variable speed Permanent Magnet Generator (PMG). At the Weisenberger Mill in Midway, KY, a variable speed Permanent Magnet Generator has been installed and demonstrated. This new PMG system replaced an existing induction generator that had a HTD belt drive speed increaser system. Data was taken from the old generator before it was removed and compared to data collected after the PMG system was installed. The new variable speed PMG system is calculated to produce over 96% more energy than the old induction generator system during an average year. This significant increase was primarily due to the PMG generator operating at the correct speed at the maximum head, and the ability for the PMG generator to reduce its speed to lower optimum speeds as the stream flow increased and the net head decreased. This demonstration showed the importance of being able to adjust the speed of fixed blade turbines. All fixed blade turbines with varying net heads could achieve higher efficiencies if the speed can be matched to the optimum speed as the head changes. In addition, this demonstration showed that there are many potential efficiencies that could be realized with variable speed technology at hydro sites where mismatched turbine and generator speeds result in lower power output, even at maximum head. Funding for this project came from the US Dept. of Energy, through Award Number DE-EE0005429.
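
    The efficiency argument above follows from turbine similarity: for a fixed-blade runner, holding the unit speed N·D/√H constant implies that the best-efficiency rotational speed scales with the square root of the net head. The sketch below illustrates that textbook relation with invented numbers; none of them come from the demonstration report:

      import math

      def optimal_speed_rpm(design_speed_rpm: float, design_head_m: float, head_m: float) -> float:
          """Best-efficiency speed of a fixed-blade runner under the similarity rule
          that unit speed N*D/sqrt(H) stays constant, so N scales with sqrt(head)."""
          return design_speed_rpm * math.sqrt(head_m / design_head_m)

      # Illustrative only: a runner designed for 300 rpm at 3.0 m net head.
      for head in (3.0, 2.5, 2.0):
          print(f"net head {head:.1f} m -> optimal speed ~ {optimal_speed_rpm(300, 3.0, head):.0f} rpm")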

  19. Penguin head movement detected using small accelerometers: a proxy of prey encounter rate.

    PubMed

    Kokubun, Nobuo; Kim, Jeong-Hoon; Shin, Hyoung-Chul; Naito, Yasuhiko; Takahashi, Akinori

    2011-11-15

    Determining temporal and spatial variation in feeding rates is essential for understanding the relationship between habitat features and the foraging behavior of top predators. In this study we examined the utility of head movement as a proxy of prey encounter rates in medium-sized Antarctic penguins, under the presumption that the birds should move their heads actively when they encounter and peck prey. A field study of free-ranging chinstrap and gentoo penguins was conducted at King George Island, Antarctica. Head movement was recorded using small accelerometers attached to the head, with simultaneous monitoring for prey encounter or body angle. The main prey was Antarctic krill (>99% in wet mass) for both species. Penguin head movement coincided with a slow change in body angle during dives. Active head movements were extracted using a high-pass filter (5 Hz acceleration signals) and the remaining acceleration peaks (higher than a threshold acceleration of 1.0 g) were counted. The timing of head movements coincided well with images of prey taken from the back-mounted cameras: head movement was recorded within ±2.5 s of a prey image on 89.1±16.1% (N=7 trips) of images. The number of head movements varied largely among dive bouts, suggesting large temporal variations in prey encounter rates. Our results show that head movement is an effective proxy of prey encounter, and we suggest that the method will be widely applicable for a variety of predators.
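
    The processing chain described above (high-pass filter the head acceleration at 5 Hz, then count peaks above 1.0 g) is easy to prototype. The sketch below applies it to a synthetic trace; the sampling rate, filter order, minimum peak spacing and the simulated signal are assumptions, not the authors' code:

      import numpy as np
      from scipy.signal import butter, filtfilt, find_peaks

      def count_head_movements(accel_g: np.ndarray, fs_hz: float,
                               cutoff_hz: float = 5.0, threshold_g: float = 1.0) -> int:
          """High-pass filter the head acceleration at cutoff_hz, then count peaks
          exceeding threshold_g, following the processing described in the abstract."""
          b, a = butter(N=4, Wn=cutoff_hz / (fs_hz / 2), btype="highpass")
          filtered = filtfilt(b, a, accel_g)
          peaks, _ = find_peaks(filtered, height=threshold_g, distance=int(0.5 * fs_hz))
          return len(peaks)

      # Synthetic 60 s trace at 32 Hz: a slow body-angle component plus three sharp "pecks".
      fs = 32.0
      t = np.arange(0, 60, 1 / fs)
      trace = 0.3 * np.sin(2 * np.pi * 0.05 * t)
      for t0 in (10.0, 25.0, 40.0):
          trace += 3.0 * np.exp(-((t - t0) ** 2) / 0.001)
      print(count_head_movements(trace, fs))   # should report 3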

  20. Small Core, Big Network: A Comprehensive Approach to GIS Teaching Practice Based on Digital Three-Dimensional Campus Reconstruction

    ERIC Educational Resources Information Center

    Cheng, Liang; Zhang, Wen; Wang, Jiechen; Li, Manchun; Zhong, Lishan

    2014-01-01

    Geographic information science (GIS) features a wide range of disciplines and has broad applicability. Challenges associated with rapidly developing GIS technology and the currently limited teaching and practice materials hinder universities from cultivating highly skilled GIS graduates. Based on the idea of "small core, big network," a…

  1. Big strokes in small persons.

    PubMed

    Adams, Robert J

    2007-11-01

    in all children with SCD is approximately 0.5% to 1.0% per year. On the basis of STOP, if the patient meets the high-risk TCD criteria, regular blood transfusions are recommended. A second study was performed (2000-2005) to attempt withdrawal of transfusion in selected children in a randomized controlled study. Children with initially abnormal TCD velocities (≥200 cm/s) treated with regular blood transfusion for 30 months or more, which resulted in reduction of the TCD to less than 170 cm/s, were eligible for randomization into STOP II. Half continued transfusion and half had cessation of transfusion. This trial was halted early for safety reasons. There was an unacceptably high rate of TCD reversion back to high risk (≥200 cm/s), as well as 2 strokes in children who discontinued transfusion. There are no evidence-based guidelines for the discontinuation of transfusion in children once they have been identified as having high risk based on TCD. The current situation is undesirable because of the long-term effects of transfusion, including iron overload. Iron overload has recently become easier to manage with the introduction of an oral iron chelator. The inflammatory environment known to exist in SCD and the known effect of plasma free hemoglobin, released by hemolysis, of reducing available nitric oxide may contribute to the development of cerebrovascular disease. Further research may lead to more targeted therapies. We can reduce many of the big strokes that occur in these small persons by aggressively screening patients at a young age (and periodically throughout the childhood risk period) and interrupting the process with regular blood transfusions.

  2. ATLAS: Big Data in a Small Package

    NASA Astrophysics Data System (ADS)

    Denneau, Larry; Tonry, John

    2015-08-01

    For even small telescope projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (ATLAS; Tonry 2011) will robotically survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids (NEAs) on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards -- two 0.5 m F/2.0 telescopes -- each year the ATLAS system will obtain ~10^3 measurements of 10^9 astronomical sources to a photometric accuracy of <5%. This ever-growing dataset must be searched in real-time for moving objects then archived for further analysis, and alerts for newly discovered near-Earth NEAs disseminated within tens of minutes from detection. ATLAS's all-sky coverage ensures it will discover many "rifle shot" near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly-trailed sources and discriminate them from the thousands of satellites and pieces of space junk that ATLAS will see each night. Additional interrogation will identify interesting phenomena from beyond the solar system occurring over millions of transient sources per night. The data processing and storage requirements for ATLAS demand a "big data" approach typical of commercial Internet enterprises. We describe our approach to deploying a nimble, scalable and reliable data processing infrastructure, and promote ATLAS as steppingstone to eventual processing scales in the era of LSST.
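
    A quick back-of-envelope check of the data rate quoted above; the source and measurement counts come from the abstract, while the bytes-per-detection figure is an assumption:

      sources = 1e9           # astronomical sources monitored (from the abstract)
      meas_per_year = 1e3     # measurements per source per year (from the abstract)
      bytes_per_row = 100     # assumed size of one catalog row (position, flux, errors, ...)

      rows = sources * meas_per_year
      terabytes = rows * bytes_per_row / 1e12
      print(f"{rows:.0e} detections/year, ~{terabytes:.0f} TB/year of catalog data")
      # Raw images and multi-year archiving push the total toward the petabyte scale.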

  3. ATLAS: Big Data in a Small Package?

    NASA Astrophysics Data System (ADS)

    Denneau, Larry

    2016-01-01

    For even small astronomy projects, the petabyte scale is now upon us. The Asteroid Terrestrial-impact Last Alert System (Tonry 2011) will survey the entire visible sky from Hawaii multiple times per night to search for near-Earth asteroids on impact trajectories. While the ATLAS optical system is modest by modern astronomical standards - two 0.5 m F/2.0 telescopes - each night the ATLAS system will measure nearly 10^9 astronomical sources to a photometric accuracy of <5%, totaling 10^12 individual observations over its initial 3-year mission. This ever-growing dataset must be searched in real-time for moving objects and transients then archived for further analysis, and alerts for newly discovered near-Earth asteroids (NEAs) disseminated within tens of minutes from detection. ATLAS's all-sky coverage ensures it will discover many "rifle shot" near-misses moving rapidly on the sky as they shoot past the Earth, so the system will need software to automatically detect highly-trailed sources and discriminate them from the thousands of low-Earth orbit (LEO) and geosynchronous orbit (GEO) satellites ATLAS will see each night. Additional interrogation will identify interesting phenomena from millions of transient sources per night beyond the solar system. The data processing and storage requirements for ATLAS demand a "big data" approach typical of commercial internet enterprises. We describe our experience in deploying a nimble, scalable and reliable data processing infrastructure, and suggest ATLAS as steppingstone to data processing capability needed as we enter the era of LSST.

  4. Small Colleges, Big Missions.

    ERIC Educational Resources Information Center

    Griffin, W. A., Jr., Ed.

    This monograph by the members of the American Association of Community Colleges' Commission on Small and/or Rural Community Colleges shares small and rural community college experiences. In "Leaders through Community Service," Jacqueline D. Taylor provides a model for how small and rural community colleges can be involved in building leaders…

  5. Big Data: the challenge for small research groups in the era of cancer genomics

    PubMed Central

    Noor, Aisyah Mohd; Holmberg, Lars; Gillett, Cheryl; Grigoriadis, Anita

    2015-01-01

    In the past decade, cancer research has seen an increasing trend towards high-throughput techniques and translational approaches. The increasing availability of assays that utilise smaller quantities of source material and produce higher volumes of data output have resulted in the necessity for data storage solutions beyond those previously used. Multifactorial data, both large in sample size and heterogeneous in context, needs to be integrated in a standardised, cost-effective and secure manner. This requires technical solutions and administrative support not normally financially accounted for in small- to moderate-sized research groups. In this review, we highlight the Big Data challenges faced by translational research groups in the precision medicine era; an era in which the genomes of over 75 000 patients will be sequenced by the National Health Service over the next 3 years to advance healthcare. In particular, we have looked at three main themes of data management in relation to cancer research, namely (1) cancer ontology management, (2) IT infrastructures that have been developed to support data management and (3) the unique ethical challenges introduced by utilising Big Data in research. PMID:26492224

  6. The Astronaut Glove Challenge: Big Innovation from a (Very) Small Team

    NASA Technical Reports Server (NTRS)

    Homer, Peter

    2008-01-01

    Many measurements were taken by test engineers from Hamilton Sundstrand, the prime contractor for the current EVA suit. Because the raw measurements needed to be converted to torques and combined into a final score, it was impossible to keep track of who was ahead in this phase. The final comfort and dexterity test was performed in a depressurized glove box to simulate real on-orbit conditions. Each competitor was required to exercise the glove through a defined set of finger, thumb, and wrist motions without any sign of abrasion or bruising of the competitor's hand. I learned a lot about arm fatigue! This was a pass-fail event, and both of the remaining competitors came through intact. After taking what seemed like an eternity to tally the final scores, the judges announced that I had won the competition. My glove was the only one to have achieved lower finger-bending torques than the Phase VI glove. Looking back, I see three sources of the success of this project that I believe also operate in other programs where small teams have broken new ground in aerospace technologies. These are awareness, failure, and trust. By remaining aware of the big picture, continuously asking myself, "Am I converging on a solution?" and "Am I converging fast enough?" I was able to see that my original design was not going to succeed, leading to the decision to start over. I was also aware that, had I lingered over this choice or taken time to analyze it, I would not have been ready on the first day of competition. Failure forced me to look outside conventional thinking and opened the door to innovation. Choosing to make incremental failures enabled me to rapidly climb the learning curve. Trusting my "gut" feelings-which are really an internalized accumulation of experiences-and my newly acquired skills allowed me to devise new technologies rapidly and complete both gloves just in time. Awareness, failure, and trust are intertwined: failure provides experiences that inform awareness

  7. Structuring the Curriculum around Big Ideas

    ERIC Educational Resources Information Center

    Alleman, Janet; Knighton, Barbara; Brophy, Jere

    2010-01-01

    This article provides an inside look at Barbara Knighton's classroom teaching. She uses big ideas to guide her planning and instruction and gives other teachers suggestions for adopting the big idea approach and ways for making the approach easier. This article also represents a "small slice" of a dozen years of collaborative research,…

  8. Challenges of Big Data Analysis.

    PubMed

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-06-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity; when violated, they can lead to wrong statistical inferences and, consequently, wrong scientific conclusions.
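
    The "spurious correlation" challenge named above is easy to reproduce: when the number of features far exceeds the sample size, some feature will correlate strongly with an unrelated response purely by chance. The following minimal sketch uses synthetic data with arbitrary dimensions chosen only for illustration.

```python
# Spurious correlation in high dimensions: with n samples and d >> n
# independent features, the maximum sample correlation with an unrelated
# response is large even though every true correlation is zero.
import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 10_000                      # sample size and dimensionality (arbitrary)
X = rng.standard_normal((n, d))         # features, independent of y by construction
y = rng.standard_normal(n)              # response, independent of X

# Pearson correlation of each feature with y.
Xc = (X - X.mean(axis=0)) / X.std(axis=0)
yc = (y - y.mean()) / y.std()
corr = Xc.T @ yc / n

print(f"max |correlation| across {d} null features: {np.abs(corr).max():.2f}")
# Typically around 0.4 here, despite zero true correlation.
```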

  9. Challenges of Big Data Analysis

    PubMed Central

    Fan, Jianqing; Han, Fang; Liu, Han

    2014-01-01

    Big Data bring new opportunities to modern society and challenges to data scientists. On one hand, Big Data hold great promises for discovering subtle population patterns and heterogeneities that are not possible with small-scale data. On the other hand, the massive sample size and high dimensionality of Big Data introduce unique computational and statistical challenges, including scalability and storage bottlenecks, noise accumulation, spurious correlation, incidental endogeneity, and measurement errors. These challenges are distinctive and require a new computational and statistical paradigm. This article gives an overview of the salient features of Big Data and how these features drive a paradigm change in statistical and computational methods as well as computing architectures. We also provide various new perspectives on Big Data analysis and computation. In particular, we emphasize the viability of the sparsest solution in high-confidence sets and point out that exogeneity assumptions in most statistical methods for Big Data cannot be validated due to incidental endogeneity; when violated, they can lead to wrong statistical inferences and, consequently, wrong scientific conclusions. PMID:25419469

  10. Transition by head-on collision: mechanically mediated manoeuvres in cockroaches and small robots.

    PubMed

    Jayaram, Kaushik; Mongeau, Jean-Michel; Mohapatra, Anand; Birkmeyer, Paul; Fearing, Ronald S; Full, Robert J

    2018-02-01

    Exceptional performance is often considered to be elegant and free of 'errors' or missteps. During the most extreme escape behaviours, neural control can approach or exceed its operating limits in response time and bandwidth. Here we show that small, rapidly running cockroaches with robust exoskeletons select head-on collisions with obstacles to maintain the fastest escape speeds possible to transition up a vertical wall. Instead of avoidance, animals use their passive body shape and compliance to negotiate challenging environments. Cockroaches running at over 1 m or 50 body lengths per second transition from the floor to a vertical wall within 75 ms by using their head like an automobile bumper, mechanically mediating the manoeuvre. Inspired by the animal's behaviour, we demonstrate passive, high-speed, mechanically mediated vertical transitions with a small, palm-sized legged robot. By creating a collision model for animal and human materials, we suggest a size dependence favouring mechanical mediation below 1 kg that we term the 'Haldane limit'. Relying on the mechanical control offered by soft exoskeletons represents a paradigm shift for understanding the control of small animals and the next generation of running, climbing and flying robots where the use of the body can off-load the demand for rapid sensing and actuation. © 2018 The Authors.

  11. Transition by head-on collision: mechanically mediated manoeuvres in cockroaches and small robots

    PubMed Central

    Mongeau, Jean-Michel; Mohapatra, Anand; Birkmeyer, Paul; Fearing, Ronald S.; Full, Robert J.

    2018-01-01

    Exceptional performance is often considered to be elegant and free of ‘errors’ or missteps. During the most extreme escape behaviours, neural control can approach or exceed its operating limits in response time and bandwidth. Here we show that small, rapidly running cockroaches with robust exoskeletons select head-on collisions with obstacles to maintain the fastest escape speeds possible to transition up a vertical wall. Instead of avoidance, animals use their passive body shape and compliance to negotiate challenging environments. Cockroaches running at over 1 m or 50 body lengths per second transition from the floor to a vertical wall within 75 ms by using their head like an automobile bumper, mechanically mediating the manoeuvre. Inspired by the animal’s behaviour, we demonstrate passive, high-speed, mechanically mediated vertical transitions with a small, palm-sized legged robot. By creating a collision model for animal and human materials, we suggest a size dependence favouring mechanical mediation below 1 kg that we term the ‘Haldane limit’. Relying on the mechanical control offered by soft exoskeletons represents a paradigm shift for understanding the control of small animals and the next generation of running, climbing and flying robots where the use of the body can off-load the demand for rapid sensing and actuation. PMID:29445036

  12. Longitudinal growth of head circumference in term symmetric and asymmetric small for gestational age infants.

    PubMed

    Kaur, Harvinder; Bhalla, A K; Kumar, Praveen

    2012-07-01

    To study the longitudinal growth pattern of head circumference in full-term symmetric and asymmetric small for gestational age (SGA) infants of both sexes during the first year of life, a mixed-longitudinal growth research design was used. Head circumference in 100 full-term symmetric SGA, 100 asymmetric SGA and 100 appropriate for gestational age (AGA) infants was measured at birth and at 1, 3, 6, 9 and 12 months of age using a standardized technique and instrument. The mean head circumference of male symmetric SGA infants was significantly (p≤0.001) smaller than that of asymmetric SGA infants, while in female symmetric SGA infants it was smaller only beyond 6 months. Compared with AGA infants, head circumference in both symmetric and asymmetric SGA infants was significantly smaller. Head circumference growth velocity did not differ significantly between symmetric and asymmetric SGA male infants. The rate of head circumference growth remained significantly higher in female asymmetric SGA infants than in the symmetric ones between 3 and 6 months, while a reversal of this trend was observed between 9 and 12 months. The better head circumference growth attained by male and female asymmetric SGA infants compared with their symmetric SGA counterparts during the first postnatal year of life may be attributed to the continued influence of the "head sparing" experienced by asymmetric SGA babies during prenatal life. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. The BigBOSS spectrograph

    NASA Astrophysics Data System (ADS)

    Jelinsky, Patrick; Bebek, Chris; Besuner, Robert; Carton, Pierre-Henri; Edelstein, Jerry; Lampton, Michael; Levi, Michael E.; Poppett, Claire; Prieto, Eric; Schlegel, David; Sholl, Michael

    2012-09-01

    BigBOSS is a proposed ground-based dark energy experiment to study baryon acoustic oscillations (BAO) and the growth of structure with a 14,000 square degree galaxy and quasi-stellar object redshift survey. It consists of a 5,000-fiber-positioner focal plane feeding the spectrographs. The optical fibers are separated into ten 500-fiber slit heads at the entrance of ten identical spectrographs in a thermally insulated room. Each of the ten spectrographs has a spectral resolution (λ/Δλ) between 1500 and 4000 over a wavelength range from 360 to 980 nm. Each spectrograph uses two dichroic beam splitters to separate the spectrograph into three arms. It uses volume phase holographic (VPH) gratings for high efficiency and compactness. Each arm uses a 4096 x 4096, 15 μm pixel charge-coupled device (CCD) as the detector. We describe the requirements and current design of the BigBOSS spectrograph. Design trades (e.g. refractive versus reflective) and manufacturability are also discussed.
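
    Since spectral resolution is defined here as R = λ/Δλ, a short calculation shows the size of a resolution element in wavelength units at the quoted limits. The pairings of wavelength and resolution below are illustrative only and are not a statement of the per-arm design.

```python
# Size of a spectral resolution element, delta_lambda = lambda / R, at the
# quoted wavelength and resolution limits of the BigBOSS spectrograph.
for wavelength_nm, R in [(360, 1500), (360, 4000), (980, 1500), (980, 4000)]:
    print(f"lambda = {wavelength_nm} nm, R = {R}: "
          f"resolution element = {wavelength_nm / R:.3f} nm")
```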

  14. Small Schools, Big Future

    ERIC Educational Resources Information Center

    Halsey, R. John

    2011-01-01

    Historically, small schools have played a very important role in the provision of schooling in Australia. Numerically, using an enrollment of 200 or less, small schools represent approximately 45% of the schools in Australia. Population growth and the consequences of this, in particular for food production, water and energy, mean that the…

  15. Big Project, Small Leaders

    ERIC Educational Resources Information Center

    Schon, Jennifer A.; Eitel, Karla B.; Bingaman, Deirdre; Miller, Brant G.; Rittenburg, Rebecca A.

    2014-01-01

    Donnelly, Idaho, is a small town surrounded by private ranches and Forest Service property. Through the center of Donnelly runs Boulder Creek, a small tributary feeding into Cascade Lake Reservoir. Boulder Creek originates from a mountain lake north of Donnelly. Since 1994 it has been listed as "impaired" by the Environmental Protection…

  16. Big Obscures Small

    NASA Image and Video Library

    2010-02-17

    NASA's Cassini spacecraft captured a mutual event between Titan and Mimas against a backdrop of the planet's rings. This image was snapped shortly before Saturn's largest moon passed in front of and occulted the small moon Mimas.

  17. Big Data and medicine: a big deal?

    PubMed

    Mayer-Schönberger, V; Ingelsson, E

    2018-05-01

    Big Data promises huge benefits for medical research. Looking beyond superficial increases in the amount of data collected, we identify three key areas where Big Data differs from conventional analyses of data samples: (i) data are captured more comprehensively relative to the phenomenon under study; this reduces some bias but surfaces important trade-offs, such as between data quantity and data quality; (ii) data are often analysed using machine learning tools, such as neural networks, rather than conventional statistical methods, resulting in systems that over time capture insights implicit in data but remain black boxes, rarely revealing causal connections; and (iii) the purpose of the analyses of data is no longer simply answering existing questions, but hinting at novel ones and generating promising new hypotheses. As a consequence, when performed right, Big Data analyses can accelerate research. Because Big Data approaches differ so fundamentally from small data ones, research structures, processes and mindsets need to adjust. The latent value of data is being reaped through repeated reuse of data, which runs counter to existing practices not only regarding data privacy but data management more generally. Consequently, we suggest a number of adjustments, such as boards reviewing responsible data use, and incentives to facilitate comprehensive data sharing. As data's role changes to that of a resource of insight, we also need to acknowledge the importance of collecting and making data available as a crucial part of our research endeavours, and to reassess our formal processes from career advancement to treatment approval. © 2017 The Association for the Publication of the Journal of Internal Medicine.

  18. Deep sequencing reveals unique small RNA repertoire that is regulated during head regeneration in Hydra magnipapillata.

    PubMed

    Krishna, Srikar; Nair, Aparna; Cheedipudi, Sirisha; Poduval, Deepak; Dhawan, Jyotsna; Palakodeti, Dasaradhi; Ghanekar, Yashoda

    2013-01-07

    Small non-coding RNAs such as miRNAs, piRNAs and endo-siRNAs fine-tune gene expression through post-transcriptional regulation, modulating important processes in development, differentiation, homeostasis and regeneration. Using deep sequencing, we have profiled small non-coding RNAs in Hydra magnipapillata and investigated changes in small RNA expression pattern during head regeneration. Our results reveal a unique repertoire of small RNAs in hydra. We have identified 126 miRNA loci; 123 of these miRNAs are unique to hydra. Less than 50% are conserved across two different strains of Hydra vulgaris tested in this study, indicating a highly diverse nature of hydra miRNAs in contrast to bilaterian miRNAs. We also identified siRNAs derived from precursors with perfect stem-loop structure and that arise from inverted repeats. piRNAs were the most abundant small RNAs in hydra, mapping to transposable elements, the annotated transcriptome and unique non-coding regions on the genome. piRNAs that map to transposable elements and the annotated transcriptome display a ping-pong signature. Further, we have identified several miRNAs and piRNAs whose expression is regulated during hydra head regeneration. Our study defines different classes of small RNAs in this cnidarian model system, which may play a role in orchestrating gene expression essential for hydra regeneration.

  19. Deep sequencing reveals unique small RNA repertoire that is regulated during head regeneration in Hydra magnipapillata

    PubMed Central

    Krishna, Srikar; Nair, Aparna; Cheedipudi, Sirisha; Poduval, Deepak; Dhawan, Jyotsna; Palakodeti, Dasaradhi; Ghanekar, Yashoda

    2013-01-01

    Small non-coding RNAs such as miRNAs, piRNAs and endo-siRNAs fine-tune gene expression through post-transcriptional regulation, modulating important processes in development, differentiation, homeostasis and regeneration. Using deep sequencing, we have profiled small non-coding RNAs in Hydra magnipapillata and investigated changes in small RNA expression pattern during head regeneration. Our results reveal a unique repertoire of small RNAs in hydra. We have identified 126 miRNA loci; 123 of these miRNAs are unique to hydra. Less than 50% are conserved across two different strains of Hydra vulgaris tested in this study, indicating a highly diverse nature of hydra miRNAs in contrast to bilaterian miRNAs. We also identified siRNAs derived from precursors with perfect stem–loop structure and that arise from inverted repeats. piRNAs were the most abundant small RNAs in hydra, mapping to transposable elements, the annotated transcriptome and unique non-coding regions on the genome. piRNAs that map to transposable elements and the annotated transcriptome display a ping–pong signature. Further, we have identified several miRNAs and piRNAs whose expression is regulated during hydra head regeneration. Our study defines different classes of small RNAs in this cnidarian model system, which may play a role in orchestrating gene expression essential for hydra regeneration. PMID:23166307

  20. Big Opportunities and Big Concerns of Big Data in Education

    ERIC Educational Resources Information Center

    Wang, Yinying

    2016-01-01

    Against the backdrop of the ever-increasing influx of big data, this article examines the opportunities and concerns over big data in education. Specifically, this article first introduces big data, followed by delineating the potential opportunities of using big data in education in two areas: learning analytics and educational policy. Then, the…

  1. Recombinant Interleukin-15 in Treating Patients With Advanced Melanoma, Kidney Cancer, Non-small Cell Lung Cancer, or Squamous Cell Head and Neck Cancer

    ClinicalTrials.gov

    2017-09-14

    Head and Neck Squamous Cell Carcinoma; Recurrent Head and Neck Carcinoma; Recurrent Non-Small Cell Lung Carcinoma; Recurrent Renal Cell Carcinoma; Recurrent Skin Carcinoma; Stage III Renal Cell Cancer; Stage IIIA Cutaneous Melanoma AJCC v7; Stage IIIA Non-Small Cell Lung Cancer AJCC v7; Stage IIIB Cutaneous Melanoma AJCC v7; Stage IIIB Non-Small Cell Lung Cancer AJCC v7; Stage IIIC Cutaneous Melanoma AJCC v7; Stage IV Cutaneous Melanoma AJCC v6 and v7; Stage IV Non-Small Cell Lung Cancer AJCC v7; Stage IV Renal Cell Cancer

  2. Factors associated with small head circumference at birth among infants born before the 28th week

    PubMed Central

    McElrath, Thomas F.; Allred, Elizabeth N.; Kuban, Karl; Hecht, Jonathan L.; Onderdonk, Andrew; O’Shea, T. Michael; Paneth, Nigel; Leviton, Alan

    2010-01-01

    OBJECTIVE We sought to identify risk factors for congenital microcephaly in extremely low gestational age newborns. STUDY DESIGN Demographic, clinical, and placental characteristics of 1445 infants born before the 28th week were gathered and evaluated for their relationship with congenital microcephaly. RESULTS Almost 10% of newborns (n = 138), rather than the expected 2.2%, had microcephaly, defined as a head circumference more than 2 SD below the median. In multivariable models, microcephaly was associated with nonwhite race, severe intrauterine growth restriction, delivery for preeclampsia, placental infarction, and being female. The risk factors for a head circumference between 1 and 2 SD below the median were similar to those of microcephaly. CONCLUSION Characteristics associated with fetal growth restriction and preeclampsia are among the strongest correlates of microcephaly among children born at extremely low gestational ages. The elevated risk of a small head among nonwhites and females might reflect the lack of appropriate head circumference standards. PMID:20541727

  3. How big of an effect do small dams have? Using geomorphological footprints to quantify spatial impact of low-head dams and identify patterns of across-dam variation

    USGS Publications Warehouse

    Fencl, Jane S.; Mather, Martha E.; Costigan, Katie H.; Daniels, Melinda D.

    2015-01-01

    Longitudinal connectivity is a fundamental characteristic of rivers that can be disrupted by natural and anthropogenic processes. Dams are significant disruptions to streams. Over 2,000,000 low-head dams (<7.6 m high) fragment United States rivers. Despite potential adverse impacts of these ubiquitous disturbances, the spatial impacts of low-head dams on geomorphology and ecology are largely untested. Progress for research and conservation is impaired by not knowing the magnitude of low-head dam impacts. Based on the geomorphic literature, we refined a methodology that allowed us to quantify the spatial extent of low-head dam impacts (herein dam footprint), assessed variation in dam footprints across low-head dams within a river network, and identified select aspects of the context of this variation. Wetted width, depth, and substrate size distributions upstream and downstream of six low-head dams within the Upper Neosho River, Kansas, United States of America were measured. Total dam footprints averaged 7.9 km (3.0–15.3 km) or 287 wetted widths (136–437 wetted widths). Estimates included both upstream (mean: 6.7 km or 243 wetted widths) and downstream footprints (mean: 1.2 km or 44 wetted widths). Altogether the six low-head dams impacted 47.3 km (about 17%) of the mainstem in the river network. Despite differences in age, size, location, and primary function, the sizes of geomorphic footprints of individual low-head dams in the Upper Neosho river network were relatively similar. The number of upstream dams and distance to upstream dams, but not dam height, affected the spatial extent of dam footprints. In summary, ubiquitous low-head dams individually and cumulatively altered lotic ecosystems. Both characteristics of individual dams and the context of neighboring dams affected low-head dam impacts within the river network. For these reasons, low-head dams require a different, more integrative, approach for research and management than the individualistic
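
    The reported summary statistics can be combined with a little arithmetic. The sketch below simply re-expresses the published means (footprint length, footprint in wetted widths, cumulative mainstem impact); every input number comes from the abstract, and the outputs are only the values they imply.

```python
# Re-express the reported dam-footprint statistics. All inputs are the means
# quoted in the abstract; outputs are implied quantities, not new measurements.

mean_footprint_km = 7.9          # mean total footprint per dam (upstream + downstream)
mean_footprint_widths = 287      # same footprint expressed in wetted widths
implied_wetted_width_m = mean_footprint_km * 1000 / mean_footprint_widths

n_dams = 6
total_impacted_km = 47.3         # cumulative mainstem impact of the six dams
mainstem_fraction = 0.17         # "about 17%" of the mainstem
implied_mainstem_km = total_impacted_km / mainstem_fraction

print(f"implied mean wetted width: {implied_wetted_width_m:.1f} m")
print(f"{n_dams} dams impact {total_impacted_km} km, "
      f"implying a mainstem of ~{implied_mainstem_km:.0f} km")
```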

  4. How Big of an Effect Do Small Dams Have? Using Geomorphological Footprints to Quantify Spatial Impact of Low-Head Dams and Identify Patterns of Across-Dam Variation

    PubMed Central

    Costigan, Katie H.; Daniels, Melinda D.

    2015-01-01

    Longitudinal connectivity is a fundamental characteristic of rivers that can be disrupted by natural and anthropogenic processes. Dams are significant disruptions to streams. Over 2,000,000 low-head dams (<7.6 m high) fragment United States rivers. Despite potential adverse impacts of these ubiquitous disturbances, the spatial impacts of low-head dams on geomorphology and ecology are largely untested. Progress for research and conservation is impaired by not knowing the magnitude of low-head dam impacts. Based on the geomorphic literature, we refined a methodology that allowed us to quantify the spatial extent of low-head dam impacts (herein dam footprint), assessed variation in dam footprints across low-head dams within a river network, and identified select aspects of the context of this variation. Wetted width, depth, and substrate size distributions upstream and downstream of six low-head dams within the Upper Neosho River, Kansas, United States of America were measured. Total dam footprints averaged 7.9 km (3.0–15.3 km) or 287 wetted widths (136–437 wetted widths). Estimates included both upstream (mean: 6.7 km or 243 wetted widths) and downstream footprints (mean: 1.2 km or 44 wetted widths). Altogether the six low-head dams impacted 47.3 km (about 17%) of the mainstem in the river network. Despite differences in age, size, location, and primary function, the sizes of geomorphic footprints of individual low-head dams in the Upper Neosho river network were relatively similar. The number of upstream dams and distance to upstream dams, but not dam height, affected the spatial extent of dam footprints. In summary, ubiquitous low-head dams individually and cumulatively altered lotic ecosystems. Both characteristics of individual dams and the context of neighboring dams affected low-head dam impacts within the river network. For these reasons, low-head dams require a different, more integrative, approach for research and management than the individualistic

  5. Small and big Hodgkin-Reed-Sternberg cells of Hodgkin lymphoma cell lines L-428 and L-1236 lack consistent differences in gene expression profiles and are capable to reconstitute each other.

    PubMed

    Rengstl, Benjamin; Kim, Sooji; Döring, Claudia; Weiser, Christian; Bein, Julia; Bankov, Katrin; Herling, Marco; Newrzela, Sebastian; Hansmann, Martin-Leo; Hartmann, Sylvia

    2017-01-01

    The hallmark of classical Hodgkin lymphoma (cHL) is the presence of giant, mostly multinucleated Hodgkin-Reed-Sternberg (HRS) cells. Whereas it has recently been shown that giant HRS cells evolve from small Hodgkin cells by incomplete cytokinesis and re-fusion of tethered sister cells, it remains unresolved why this phenomenon takes place particularly in this lymphoma and what the differences between these cell types of variable sizes are. The aim of the present study was to characterize microdissected small and giant HRS cells by gene expression profiling and to assess differences in clonal growth behavior as well as susceptibility to cytotoxic intervention between these cell types, to provide more insight into their distinct cellular potential. Applying stringent filter criteria, only two differentially expressed genes between small and giant HRS cells, SHFM1 and LDHB, were identified. With looser filter criteria, 13 genes were identified as differentially overexpressed in small compared to giant HRS cells. These were mainly related to energy metabolism and protein synthesis, further suggesting that small Hodgkin cells resemble the proliferative compartment of cHL. SHFM1, which is known to be involved in the generation of giant cells, was downregulated in giant RS cells at the RNA level. However, reduced mRNA levels of SHFM1, LDHB and HSPA8 did not translate into decreased protein levels in giant HRS cells. In cell culture experiments it was observed that the fractions of small and big HRS cells returned to their basal levels several days after enrichment of these populations via cell sorting, indicating that small and big HRS cells can each reconstitute the full spectrum of cells usually observed in the culture. However, assessment of clonal growth of HRS cells indicated a significantly reduced potential of big HRS cells to form single cell colonies. Taken together, our findings point to strong similarities but also some differences between small and

  6. Small Moon Makes Big Waves

    NASA Image and Video Library

    2012-12-31

    Saturn's small moon Daphnis is caught in the act of raising waves on the edges of the Keeler gap, which is the thin dark band in the left half of the image. Waves like these allow scientists to locate small moons in gaps and measure their masses.

  7. The Study of “big data” to support internal business strategists

    NASA Astrophysics Data System (ADS)

    Ge, Mei

    2018-01-01

    How is big data different from previous data analysis systems? The primary purpose behind traditional small data analytics, which all managers are more or less familiar with, is to support internal business strategies. But big data also offers a promising new dimension: discovering new opportunities to offer customers high-value products and services. This study introduces some of the strategies that big data can support. Business decisions using big data can also draw on several areas of analytics, including customer satisfaction, customer journeys, supply chains, risk management, competitive intelligence, pricing, and discovery and experimentation that facilitate big data discovery.

  8. Making a Big Bang on the small screen

    NASA Astrophysics Data System (ADS)

    Thomas, Nick

    2010-01-01

    While the quality of some TV sitcoms can leave viewers feeling cheated out of 30 minutes of their lives, audiences and critics are raving about the science-themed US comedy The Big Bang Theory. First shown on the CBS network in 2007, the series focuses on two brilliant postdoc physicists, Leonard and Sheldon, who are totally absorbed by science. Adhering to the stereotype, they also share a fanatical interest in science fiction, video-gaming and comic books, but unfortunately lack the social skills required to connect with their 20-something nonacademic contemporaries.

  9. Small Bodies, Big Concepts: Engaging Teachers and Their Students in Visual Analysis of Comets and Asteroids

    NASA Astrophysics Data System (ADS)

    Cobb, W. H.; Buxner, S.; Lebofsky, L. A.; Ristvey, J.; Weeks, S.; Zolensky, M.

    2011-12-01

    Small Bodies, Big Concepts is a multi-disciplinary, professional development project that engages 5th - 8th grade teachers in high end planetary science using a research-based pedagogical framework, Designing Effective Science Instruction (DESI). In addition to developing sound background knowledge with a focus on visual analysis, teachers' awareness of the process of learning new content is heightened, and they use that experience to deepen their science teaching practice. Culling from NASA E/PO educational materials, activities are sequenced to enhance conceptual understanding of big ideas in space science: what do we know, how do we know it, why do we care? Helping teachers develop a picture of the history and evolution of our understanding of the solar system, and honing in on the place of comets and asteroids in helping us answer old questions and discover new ones, teachers see the power and excitement underlying planetary science as human endeavor. Research indicates that science inquiry is powerful in the classroom and mission scientists are real-life models of science inquiry in action. Using guest scientist facilitators from the Planetary Science Institute, NASA Johnson Space Center, Lockheed Martin, and NASA E/PO professionals from McREL and NASA AESP, teachers practice framing scientific questions, using current visual data, and adapting NASA E/PO activities related to current exploration of asteroids and comets in our Solar System. Cross-curricular elements included examining research-based strategies for enhancing English language learners' ability to engage in higher order questions and a professional astronomy artist's insight into how visual analysis requires not just our eyes engaged, but our brains: comparing, synthesizing, questioning, evaluating, and wondering. This summer we pilot tested the SBBC curriculum with thirteen 5th- 10th grade teachers modeling a variety of instructional approaches over eight days. Each teacher developed lesson plans

  10. Big Data and SME financing in China

    NASA Astrophysics Data System (ADS)

    Tian, Z.; Hassan, A. F. S.; Razak, N. H. A.

    2018-05-01

    Big Data has become more and more prevalent in recent years, and it attracts attention from many quarters, including academia, industry, and even government. Big Data can be seen as the next-generation source of power for the economy. Today, Big Data represents a new way to approach information and can help all industries and business fields. The Chinese financial market has long been dominated by state-owned banks; however, these banks provide inefficient support to small and medium-sized enterprises (SMEs) and private businesses. The development of Big Data is changing the financial market, with more and more financial products and services provided by Internet companies in China. Credit rating models and borrower identification make online financial services more efficient than those of conventional banks. These services also challenge the domination of state-owned banks.

  11. Using "Big Ideas" to Enhance Teaching and Student Learning

    ERIC Educational Resources Information Center

    Mitchell, Ian; Keast, Stephen; Panizzon, Debra; Mitchell, Judie

    2017-01-01

    Organising teaching of a topic around a small number of "big ideas" has been argued by many to be important in teaching for deep understanding, with big ideas being able to link different activities and to be framed in ways that provide perceived relevance and routes into engagement. However it is our view that, at present, the…

  12. Big Science and the Large Hadron Collider

    NASA Astrophysics Data System (ADS)

    Giudice, Gian Francesco

    2012-03-01

    The Large Hadron Collider (LHC), the particle accelerator operating at CERN, is probably the most complex and ambitious scientific project ever accomplished by humanity. The sheer size of the enterprise, in terms of financial and human resources, naturally raises the question whether society should support such costly basic-research programs. I address this question by first reviewing the process that led to the emergence of Big Science and the role of large projects in the development of science and technology. I then compare the methodologies of Small and Big Science, emphasizing their mutual linkage. Finally, after examining the cost of Big Science projects, I highlight several general aspects of their beneficial implications for society.

  13. How Big Are "Martin's Big Words"? Thinking Big about the Future.

    ERIC Educational Resources Information Center

    Gardner, Traci

    "Martin's Big Words: The Life of Dr. Martin Luther King, Jr." tells of King's childhood determination to use "big words" through biographical information and quotations. In this lesson, students in grades 3 to 5 explore information on Dr. King to think about his "big" words, then they write about their own…

  14. Multilevel groundwater monitoring of hydraulic head and temperature in the eastern Snake River Plain aquifer, Idaho National Laboratory, Idaho, 2011-13

    USGS Publications Warehouse

    Twining, Brian V.; Fisher, Jason C.

    2015-01-01

    Normalized mean head values were analyzed for all 11 multilevel monitoring system (MLMS) wells for the period of record (2007–13). The mean head values suggest a moderately positive correlation among all boreholes and generally reflect regional fluctuations in water levels in response to seasonal climatic changes. Boreholes within volcanic rift zones and near the southern boundary (USGS 103, USGS 105, USGS 108, USGS 132, USGS 135, USGS 137A) display a temporal correlation that is strongly positive. Boreholes in the Big Lost Trough display some variation in temporal correlations that may result from proximity to the mountain front to the northwest and episodic flow in the Big Lost River drainage system. For example, during June 2012, boreholes MIDDLE 2050A and MIDDLE 2051 showed head buildup within the upper zones when compared to the June 2010 profile event, which correlates with years when surface water was reported in the Big Lost River several months preceding the measurement period. With the exception of borehole USGS 134, temporal correlation between MLMS wells completed within the Big Lost Trough is generally positive. Temporal correlation for borehole USGS 134 shows the least agreement with other MLMS boreholes located within the Big Lost Trough; however, borehole USGS 134 is close to the mountain front, where tributary valley subsurface inflow is suspected.
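
    The temporal correlations described here are pairwise correlations between head time series from different boreholes. A minimal sketch of how such a correlation matrix can be computed is shown below; the CSV file name and column layout are hypothetical and only illustrate the approach.

```python
# Pairwise temporal correlation of hydraulic-head time series between
# multilevel monitoring boreholes. The CSV layout (a date column plus one
# column of normalized mean head per borehole) is assumed for illustration.
import pandas as pd

heads = pd.read_csv("mlms_normalized_heads.csv", parse_dates=["date"],
                    index_col="date")          # columns: USGS 103, USGS 105, ...

corr = heads.corr(method="pearson")            # borehole-by-borehole correlation matrix
print(corr.round(2))

# Boreholes whose head record correlates weakly with the rest (e.g. USGS 134
# in the study) stand out as rows with low mean off-diagonal correlation.
mean_corr = (corr.sum() - 1) / (len(corr) - 1)
print(mean_corr.sort_values())
```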

  15. Multiple Small Diameter Drillings Increase Femoral Neck Stability Compared with Single Large Diameter Femoral Head Core Decompression Technique for Avascular Necrosis of the Femoral Head.

    PubMed

    Brown, Philip J; Mannava, Sandeep; Seyler, Thorsten M; Plate, Johannes F; Van Sikes, Charles; Stitzel, Joel D; Lang, Jason E

    2016-10-26

    Femoral head core decompression is an efficacious joint-preserving procedure for treatment of early stage avascular necrosis. However, postoperative fractures have been described which may be related to the decompression technique used. Femoral head decompressions were performed on 12 matched human cadaveric femora, comparing a large 8 mm single-bore technique with a multiple 3 mm small-drilling technique. Ultimate failure strength of the femora was tested using a servo-hydraulic material testing system. Ultimate load to failure was compared between the different decompression techniques using two paired ANCOVA linear regression models. Prior to biomechanical testing and after the intervention, volumetric bone mineral density was determined using quantitative computed tomography to account for variation between cadaveric samples and to assess the amount of bone disruption by the core decompression. Core decompression using the small-diameter, multiple-drilling technique withstood significantly greater load prior to failure compared with the single large-bore technique after adjustment for bone mineral density (p < 0.05). The 8 mm single-bore technique removed a significantly larger volume of bone compared to the 3 mm multiple-drilling technique (p < 0.001). However, total fracture energy was similar between the two core decompression techniques. When considering core decompression for the treatment of early stage avascular necrosis, the multiple small-bore technique removed less bone volume, thereby potentially leading to higher load to failure.
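
    The comparison described is an ANCOVA-style model of load to failure with technique as the factor and bone mineral density as the covariate. The sketch below is a simplified, unpaired version with a hypothetical data layout; the study itself used paired models on matched femora.

```python
# Simplified ANCOVA-style model: ultimate load to failure as a function of
# decompression technique, adjusting for volumetric bone mineral density.
# Column names, file name and the unpaired formulation are illustrative
# assumptions, not the study's exact specification.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("core_decompression.csv")      # columns: load_N, technique, vBMD
# technique: "single_8mm" or "multi_3mm"

model = smf.ols("load_N ~ C(technique) + vBMD", data=df).fit()
print(model.summary())                           # technique effect adjusted for vBMD
```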

  16. A Big Bang versus a Small Bang Approach: A Case Study of the Expeditionary Combat Support System (ECSS) and the Maintenance, Repair, and Overhaul Initiative (MROi)

    DTIC Science & Technology

    resource planning (ERP) solution called the Expeditionary Combat Support System (ECSS), a big-bang approach. In early 2012, the ECSS program was cancelled... Repair, and Overhaul initiative (MROi), a small-bang approach, to increase enterprise visibility and efficiency across all three Air Logistics

  17. Metal atom dynamics in superbulky metallocenes: a comparison of (Cp(BIG))2Sn and (Cp(BIG))2Eu.

    PubMed

    Harder, Sjoerd; Naglav, Dominik; Schwerdtfeger, Peter; Nowik, Israel; Herber, Rolfe H

    2014-02-17

    Cp(BIG)2Sn (Cp(BIG) = (4-n-Bu-C6H4)5cyclopentadienyl), prepared by reaction of 2 equiv of Cp(BIG)Na with SnCl2, crystallized isomorphously with other known metallocenes of this ligand (Ca, Sr, Ba, Sm, Eu, Yb). Similarly, it shows perfect linearity, C-H···C(π) bonding between the Cp(BIG) rings and out-of-plane bending of the aryl substituents toward the metal. Whereas all other Cp(BIG)2M complexes show large disorder in the metal position, the Sn atom in Cp(BIG)2Sn is perfectly ordered. In contrast, (119)Sn and (151)Eu Mössbauer investigations on the corresponding Cp(BIG)2M metallocenes show that Sn(II) is more dynamic and loosely bound than Eu(II). The large displacement factors in the group 2 and especially in the lanthanide(II) metallocenes Cp(BIG)2M can be explained by static metal disorder in a plane parallel to the Cp(BIG) rings. Despite parallel Cp(BIG) rings, these metallocenes have a nonlinear Cpcenter-M-Cpcenter geometry. This is explained by an ionic model in which metal atoms are polarized by the negatively charged Cp rings. The extent of nonlinearity is in line with trends found in M(2+) ion polarizabilities. The range of known calculated dipole polarizabilities at the Douglas-Kroll CCSD(T) level was extended with values (atomic units) for Sn(2+) 15.35, Sm(2+)(4f(6) (7)F) 9.82, Eu(2+)(4f(7) (8)S) 8.99, and Yb(2+)(4f(14) (1)S) 6.55. This polarizability model cannot be applied to the predominantly covalently bound Cp(BIG)2Sn, which shows a perfectly ordered structure. The bent geometry of Cp*2Sn should therefore not be explained by metal polarizability but is due to van der Waals Cp*···Cp* attraction and (to some extent) to a small p-character component in the Sn lone pair.

  18. 49 CFR 572.192 - Head assembly.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 7 2014-10-01 2014-10-01 false Head assembly. 572.192 Section 572.192... Test Dummy, Small Adult Female § 572.192 Head assembly. (a) The head assembly consists of the head (180...) of this section, the head assembly shall meet performance requirements specified in paragraph (c) of...

  19. 49 CFR 572.192 - Head assembly.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 7 2011-10-01 2011-10-01 false Head assembly. 572.192 Section 572.192... Dummy, Small Adult Female § 572.192 Head assembly. (a) The head assembly consists of the head (180-1000...) of this section, the head assembly shall meet performance requirements specified in paragraph (c) of...

  20. 49 CFR 572.192 - Head assembly.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 7 2013-10-01 2013-10-01 false Head assembly. 572.192 Section 572.192... Test Dummy, Small Adult Female § 572.192 Head assembly. (a) The head assembly consists of the head (180...) of this section, the head assembly shall meet performance requirements specified in paragraph (c) of...

  1. 49 CFR 572.192 - Head assembly.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 7 2012-10-01 2012-10-01 false Head assembly. 572.192 Section 572.192... Dummy, Small Adult Female § 572.192 Head assembly. (a) The head assembly consists of the head (180-1000...) of this section, the head assembly shall meet performance requirements specified in paragraph (c) of...

  2. 49 CFR 572.192 - Head assembly.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 7 2010-10-01 2010-10-01 false Head assembly. 572.192 Section 572.192... Dummy, Small Adult Female § 572.192 Head assembly. (a) The head assembly consists of the head (180-1000...) of this section, the head assembly shall meet performance requirements specified in paragraph (c) of...

  3. Big Data in Science and Healthcare: A Review of Recent Literature and Perspectives

    PubMed Central

    Miron-Shatz, T.; Lau, A. Y. S.; Paton, C.

    2014-01-01

    Objectives: As technology continues to evolve and rise in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives on the evolving use of Big Data in science and healthcare and to examine some of the opportunities and challenges. Methods: A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and small scale. Results: Scientists and health-care providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals to recognize and synthesize how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding Thoughts: The concept of Big Data and the associated analytics are to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and health-care. Future exploration of issues surrounding data privacy, confidentiality, and education is needed. A greater focus on data from social media, the quantified self-movement, and the application of analytics to “small data” would also be useful. PMID:25123717

  4. Interleaved 3D-CNNs for joint segmentation of small-volume structures in head and neck CT images.

    PubMed

    Ren, Xuhua; Xiang, Lei; Nie, Dong; Shao, Yeqin; Zhang, Huan; Shen, Dinggang; Wang, Qian

    2018-05-01

    Accurate 3D image segmentation is a crucial step in radiation therapy planning of head and neck tumors. These segmentation results are currently obtained by manual outlining of tissues, which is a tedious and time-consuming procedure. Automatic segmentation provides an alternative solution, which, however, is often difficult for small tissues (e.g., the optic chiasm and optic nerves in head and neck CT images) because of their small volumes and highly diverse appearance/shape information. In this work, we propose to interleave multiple 3D Convolutional Neural Networks (3D-CNNs) to attain automatic segmentation of small tissues in head and neck CT images. A 3D-CNN was designed to segment each structure of interest. To make full use of the image appearance information, multiscale patches are extracted to describe the center voxel under consideration and then input to the CNN architecture. Next, as neighboring tissues are often highly related from physiological and anatomical perspectives, we interleave the CNNs designated for the individual tissues. In this way, the tentative segmentation result of a specific tissue can contribute to refining the segmentations of other neighboring tissues. Finally, as more CNNs are interleaved and cascaded, a complex network of CNNs can be derived, such that all tissues can be jointly segmented and iteratively refined. Our method was validated on a set of 48 CT images, obtained from the Medical Image Computing and Computer Assisted Intervention (MICCAI) Challenge 2015. The Dice coefficient (DC) and the 95% Hausdorff Distance (95HD) are computed to measure the accuracy of the segmentation results. The proposed method achieves higher segmentation accuracy (with the average DC: 0.58 ± 0.17 for optic chiasm, and 0.71 ± 0.08 for optic nerve; 95HD: 2.81 ± 1.56 mm for optic chiasm, and 2.23 ± 0.90 mm for optic nerve) than the MICCAI challenge winner (with the average DC: 0.38 for optic chiasm, and 0.68 for optic nerve; 95HD: 3.48 for
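
    The interleaving idea can be sketched in a few lines: each structure's network receives the CT patch plus the other structure's current probability map, and the two are refined alternately. The toy networks, patch size, and number of refinement passes below are illustrative assumptions, not the authors' architecture.

```python
# Minimal sketch of interleaved 3D CNNs: each tissue's network conditions on
# the other tissue's current probability map, and both are refined in turns.
import torch
import torch.nn as nn

class SmallTissueCNN(nn.Module):
    """Tiny 3D CNN mapping an image patch plus context channels to a probability map."""
    def __init__(self, in_channels):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(in_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(16, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(16, 1, kernel_size=1),
        )

    def forward(self, x):
        return torch.sigmoid(self.net(x))

chiasm_net = SmallTissueCNN(in_channels=2)    # CT patch + optic-nerve probability
nerve_net = SmallTissueCNN(in_channels=2)     # CT patch + chiasm probability

ct = torch.randn(1, 1, 32, 32, 32)            # toy CT patch: (batch, channel, D, H, W)
p_chiasm = torch.zeros_like(ct)               # initial (empty) probability maps
p_nerve = torch.zeros_like(ct)

for _ in range(3):                            # a few alternating refinement passes
    p_chiasm = chiasm_net(torch.cat([ct, p_nerve], dim=1))
    p_nerve = nerve_net(torch.cat([ct, p_chiasm], dim=1))
```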

  5. Determination of material distribution in heading process of small bimetallic bar

    NASA Astrophysics Data System (ADS)

    Presz, Wojciech; Cacko, Robert

    2018-05-01

    Electrical connectors mostly have silver contacts joined by riveting. In order to reduce costs, the core of the contact rivet can be replaced with a cheaper material, e.g. copper. There is a wide range of commercially available bimetallic (silver-copper) rivets on the market for the production of contacts. This creates new conditions in the riveting process, because the riveted object is bimetallic. In the analyzed example it is also a small object, placed at the border of microforming. Based on FEM modeling of the loading process of bimetallic rivets with different material distributions, the desired distribution was chosen and the choice justified. Possible material distributions were parameterized with two parameters referring to the desired distribution characteristics. A parameter, the Coefficient of Mutual Interactions of Plastic Deformations, and a method for its determination are proposed. The parameter is determined from two-parameter stress-strain curves and is a function of these parameters and the range of equivalent strains occurring in the analyzed process. The proposed method was applied to the upsetting process of the bimetallic head of an electrical contact. A nomogram was established to predict the distribution of materials in the head of the rivet and to support the selection of a pair of materials that achieves the desired distribution.

  6. The hydrodynamics of the Big Horn Basin: a study of the role of faults

    USGS Publications Warehouse

    Bredehoeft, J.D.; Belitz, K.; Sharp-Hansen, S.

    1992-01-01

    A three-dimensional mathematical model simulates groundwater flow in the Big Horn basin, Wyoming. The hydraulic head at depth over much of the Big Horn basin is near the land surface elevation, a condition usually defined as hydrostatic. This condition indicates a high, regional-scale, vertical conductivity for the sediments in the basin. Our hypothesis to explain the high conductivity is that the faults act as vertical conduits for fluid flow. These same faults can act as either horizontal barriers to flow or nonbarriers, depending upon whether the fault zones are more permeable or less permeable than the adjoining aquifers. -from Authors

  7. Gender Differences in Personality across the Ten Aspects of the Big Five.

    PubMed

    Weisberg, Yanna J; Deyoung, Colin G; Hirsh, Jacob B

    2011-01-01

    This paper investigates gender differences in personality traits, both at the level of the Big Five and at the sublevel of two aspects within each Big Five domain. Replicating previous findings, women reported higher Big Five Extraversion, Agreeableness, and Neuroticism scores than men. However, more extensive gender differences were found at the level of the aspects, with significant gender differences appearing in both aspects of every Big Five trait. For Extraversion, Openness, and Conscientiousness, the gender differences were found to diverge at the aspect level, rendering them either small or undetectable at the Big Five level. These findings clarify the nature of gender differences in personality and highlight the utility of measuring personality at the aspect level.
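
    The central observation, that opposite-signed differences at the aspect level can cancel at the Big Five domain level, can be illustrated with a small simulation; all means, standard deviations, and the equal weighting of aspects below are invented purely for the demonstration.

```python
# Illustration: two aspects of one Big Five domain with opposite-signed
# gender differences produce a near-zero difference at the domain level.
# All distribution parameters and the equal aspect weighting are invented.
import numpy as np

rng = np.random.default_rng(1)
n = 1000

def cohens_d(a, b):
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return (a.mean() - b.mean()) / pooled_sd

# Aspect 1: women score higher on average; Aspect 2: men score higher.
women_a1, men_a1 = rng.normal(0.2, 1, n), rng.normal(-0.2, 1, n)
women_a2, men_a2 = rng.normal(-0.2, 1, n), rng.normal(0.2, 1, n)

# Domain score taken as the mean of the two aspects.
women_dom = (women_a1 + women_a2) / 2
men_dom = (men_a1 + men_a2) / 2

print("d aspect 1:", round(cohens_d(women_a1, men_a1), 2))    # roughly +0.4
print("d aspect 2:", round(cohens_d(women_a2, men_a2), 2))    # roughly -0.4
print("d domain  :", round(cohens_d(women_dom, men_dom), 2))  # close to 0
```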

  8. Gender Differences in Personality across the Ten Aspects of the Big Five

    PubMed Central

    Weisberg, Yanna J.; DeYoung, Colin G.; Hirsh, Jacob B.

    2011-01-01

    This paper investigates gender differences in personality traits, both at the level of the Big Five and at the sublevel of two aspects within each Big Five domain. Replicating previous findings, women reported higher Big Five Extraversion, Agreeableness, and Neuroticism scores than men. However, more extensive gender differences were found at the level of the aspects, with significant gender differences appearing in both aspects of every Big Five trait. For Extraversion, Openness, and Conscientiousness, the gender differences were found to diverge at the aspect level, rendering them either small or undetectable at the Big Five level. These findings clarify the nature of gender differences in personality and highlight the utility of measuring personality at the aspect level. PMID:21866227

  9. The Problem with Big Data: Operating on Smaller Datasets to Bridge the Implementation Gap.

    PubMed

    Mann, Richard P; Mushtaq, Faisal; White, Alan D; Mata-Cervantes, Gabriel; Pike, Tom; Coker, Dalton; Murdoch, Stuart; Hiles, Tim; Smith, Clare; Berridge, David; Hinchliffe, Suzanne; Hall, Geoff; Smye, Stephen; Wilkie, Richard M; Lodge, J Peter A; Mon-Williams, Mark

    2016-01-01

    Big datasets have the potential to revolutionize public health. However, there is a mismatch between the political and scientific optimism surrounding big data and the public's perception of its benefit. We suggest a systematic and concerted emphasis on developing models derived from smaller datasets to illustrate to the public how big data can produce tangible benefits in the long term. In order to highlight the immediate value of a small data approach, we produced a proof-of-concept model predicting hospital length of stay. The results demonstrate that existing small datasets can be used to create models that generate a reasonable prediction, facilitating health-care delivery. We propose that greater attention (and funding) needs to be directed toward the utilization of existing information resources in parallel with current efforts to create and exploit "big data."
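
    A minimal proof-of-concept in the spirit described, predicting hospital length of stay from a small tabular dataset, might look like the sketch below; the file name, feature set, and model choice are illustrative assumptions, not the authors' specification.

```python
# Minimal length-of-stay prediction from a small tabular dataset. The data
# file, features and model are illustrative placeholders for the kind of
# proof-of-concept model described in the text.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

df = pd.read_csv("admissions_small.csv")        # e.g. a few thousand admissions
X = df[["age", "n_comorbidities", "admission_type", "prior_admissions"]]
X = pd.get_dummies(X, columns=["admission_type"])   # encode the categorical field
y = df["length_of_stay_days"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("MAE (days):", round(mean_absolute_error(y_test, model.predict(X_test)), 2))
```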

  10. How Big Is Too Big?

    ERIC Educational Resources Information Center

    Cibes, Margaret; Greenwood, James

    2016-01-01

    Media Clips appears in every issue of Mathematics Teacher, offering readers contemporary, authentic applications of quantitative reasoning based on print or electronic media. This issue features "How Big is Too Big?" (Margaret Cibes and James Greenwood) in which students are asked to analyze the data and tables provided and answer a…

  11. Habitat use of woodpeckers in the Big Woods of eastern Arkansas

    USGS Publications Warehouse

    Krementz, David G.; Lehnen, Sarah E.; Luscier, J.D.

    2012-01-01

    The Big Woods of eastern Arkansas contain some of the highest densities of woodpeckers recorded within bottomland hardwood forests of the southeastern United States. A better understanding of habitat use patterns by these woodpeckers is a priority for conservationists seeking to maintain these high densities in the Big Woods and the Lower Mississippi Alluvial Valley as a whole. Hence, we used linear mixed-effects and linear models to estimate the importance of habitat characteristics to woodpecker density in the Big Woods during the breeding seasons of 2006 and 2007 and the winter of 2007. Northern flicker Colaptes auratus density was negatively related to tree density both for moderate (≥25 cm diameter at breast height) and larger trees (>61 cm diameter at breast height). Red-headed woodpeckers Melanerpes erythrocephalus also had a negative relationship with density of large (>61 cm diameter at breast height) trees. Bark disfiguration (an index of tree health) was negatively related to red-bellied woodpecker Melanerpes carolinus and yellow-bellied sapsucker Sphyrapicus varius densities. No measured habitat variables explained pileated woodpecker Dryocopus pileatus density. Overall, the high densities of woodpeckers observed in our study suggest that the current forest management of the Big Woods of Arkansas is meeting the nesting, roosting, and foraging requirements for these birds.
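
    The analysis described is a linear mixed-effects model of woodpecker density against habitat covariates. A minimal sketch of such a model is shown below; the column names and the use of survey site as the grouping factor are hypothetical, not the study's exact specification.

```python
# Linear mixed-effects model of woodpecker density versus habitat covariates,
# with survey site as a random effect. The file name, column names and the
# grouping factor are illustrative assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("woodpecker_points.csv")
# assumed columns: density, trees_ge25cm, trees_gt61cm, bark_disfiguration, site

m = smf.mixedlm("density ~ trees_ge25cm + trees_gt61cm + bark_disfiguration",
                data=df, groups=df["site"]).fit()
print(m.summary())
```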

  12. BigDog

    NASA Astrophysics Data System (ADS)

    Playter, R.; Buehler, M.; Raibert, M.

    2006-05-01

    BigDog's goal is to be the world's most advanced quadruped robot for outdoor applications. BigDog is aimed at the mission of a mechanical mule - a category with few competitors to date: power autonomous quadrupeds capable of carrying significant payloads, operating outdoors, with static and dynamic mobility, and fully integrated sensing. BigDog is about 1 m tall, 1 m long and 0.3 m wide, and weighs about 90 kg. BigDog has demonstrated walking and trotting gaits, as well as standing up and sitting down. Since its creation in the fall of 2004, BigDog has logged tens of hours of walking, climbing and running time. It has walked up and down 25 & 35 degree inclines and trotted at speeds up to 1.8 m/s. BigDog has walked at 0.7 m/s over loose rock beds and carried over 50 kg of payload. We are currently working to expand BigDog's rough terrain mobility through the creation of robust locomotion strategies and terrain sensing capabilities.

  13. Biosignal-based relaxation evaluation of head-care robot.

    PubMed

    Ando, Takeshi; Takeda, Maki; Maruyama, Tomomi; Susuki, Yuto; Hirose, Toshinori; Fujioka, Soichiro; Mizuno, Osamu; Yamada, Kenji; Ohno, Yuko; Yukio, Honda

    2013-01-01

    Such popular head care procedures as shampooing and scalp massages provide physical and mental relaxation. However, they place a heavy burden, such as chapped hands, on beauticians and other practitioners. Based on our robot hand technology, we have been developing a head care robot. In this paper, we quantitatively evaluated its relaxation effect using the following biosignals: accelerated plethysmography (SDNN, HF/TP, LF/HF), heart rate (HR), blood pressure, salivary amylase (sAA) and peripheral skin temperature (PST). We compared the relaxation produced by our head care robot with that of head care provided by nurses. In our experiment with 54 subjects, autonomic nervous system activity changed from before to after head care performed both by a human nurse and by the proposed robot. In particular, for the proposed head care robot we confirmed significant differences in five indexes: HF/TP, LF/HF, HR, sAA, and PST. Sympathetic nervous system activity decreased, as reflected by significant decreases in LF/HF, HR, and sAA. Parasympathetic nervous system activity increased, as reflected by increases in HF/TP and PST. Our developed head care robot provided satisfactory relaxation in just five minutes of use.
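
    The frequency-domain indexes reported (HF/TP, LF/HF) are ratios of band powers in the heart-rate-variability spectrum. The sketch below computes them from a series of RR intervals using the common resample-then-Welch approach; the toy RR series, band limits and processing details are generic illustrations, not the paper's exact pipeline.

```python
# Frequency-domain heart-rate-variability indexes (HF/TP, LF/HF) from a
# sequence of RR intervals, using a standard resample-then-Welch approach.
import numpy as np
from scipy.signal import welch

def hrv_ratios(rr_s, fs=4.0):
    """rr_s: RR intervals in seconds. Returns (HF/TP, LF/HF)."""
    t = np.cumsum(rr_s)                          # beat times
    grid = np.arange(t[0], t[-1], 1.0 / fs)      # evenly sampled time grid
    rr_i = np.interp(grid, t, rr_s)              # interpolated RR tachogram
    f, pxx = welch(rr_i - rr_i.mean(), fs=fs, nperseg=min(256, len(rr_i)))

    def band(lo, hi):                            # power in a frequency band
        m = (f >= lo) & (f < hi)
        return np.trapz(pxx[m], f[m])

    lf, hf = band(0.04, 0.15), band(0.15, 0.40)  # conventional LF and HF bands
    tp = band(0.0033, 0.40)                      # total power
    return hf / tp, lf / hf

# Toy example: ~5 minutes of RR intervals around 0.8 s with mild variability.
rng = np.random.default_rng(0)
beats = np.arange(375)
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * beats * 0.8) + 0.01 * rng.standard_normal(375)
print("HF/TP = %.2f, LF/HF = %.2f" % hrv_ratios(rr))
```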

  14. Small Bodies, Big Discoveries: NASA's Small Bodies Education Program

    NASA Astrophysics Data System (ADS)

    Mayo, L.; Erickson, K. J.

    2014-12-01

    2014 is turning out to be a watershed year for celestial events involving the solar system's unsung heroes, small bodies. This includes the close flyby of comet C/2013 A1 (Siding Spring) past Mars in October and the historic Rosetta mission with its Philae lander to comet 67P/Churyumov-Gerasimenko. Beyond 2014, the much anticipated 2015 Pluto flyby by New Horizons and the February arrival of the Dawn mission at Ceres will take center stage. To deliver the excitement and wonder of our solar system's small bodies to worldwide audiences, NASA's JPL and GSFC education teams in partnership with NASA EDGE will reach out to the public through multiple venues including broadcast media, social media, science and math focused educational activities, observing challenges, interactive visualization tools like "Eyes on the Solar System" and more. This talk will highlight NASA's focused education effort to engage the public in small bodies mission science and the role these objects play in our understanding of the formation and evolution of the solar system.

  15. Nursing Needs Big Data and Big Data Needs Nursing.

    PubMed

    Brennan, Patricia Flatley; Bakken, Suzanne

    2015-09-01

    Contemporary big data initiatives in health care will benefit from greater integration with nursing science and nursing practice; in turn, nursing science and nursing practice have much to gain from the data science initiatives. Big data arises secondary to scholarly inquiry (e.g., -omics) and everyday observations like cardiac flow sensors or Twitter feeds. Emerging data science methods ensure that these data can be leveraged to improve patient care. Big data encompasses data that exceed human comprehension, that exist at a volume unmanageable by standard computer systems, that arrive at a velocity not under the control of the investigator, and that possess a level of imprecision not found in traditional inquiry. Data science methods are emerging to manage and gain insights from big data. The primary methods included investigation of emerging federal big data initiatives, and exploration of exemplars from nursing informatics research to benchmark where nursing is already poised to participate in the big data revolution. We provide observations and reflections on experiences in the emerging big data initiatives. Existing approaches to large data set analysis provide a necessary but not sufficient foundation for nursing to participate in the big data revolution. Nursing's Social Policy Statement guides a principled, ethical perspective on big data and data science. There are implications for basic and advanced practice clinical nurses in practice, for the nurse scientist who collaborates with data scientists, and for the nurse data scientist. Big data and data science have the potential to provide greater richness in understanding patient phenomena and in tailoring interventional strategies that are personalized to the patient. © 2015 Sigma Theta Tau International.

  16. What are Head Cavities? - A History of Studies on Vertebrate Head Segmentation.

    PubMed

    Kuratani, Shigeru; Adachi, Noritaka

    2016-06-01

    Motivated by the discovery of segmental epithelial coeloms, or "head cavities," in elasmobranch embryos toward the end of the 19th century, the debate over the presence of mesodermal segments in the vertebrate head became a central problem in comparative embryology. The classical segmental view assumed only one type of metamerism in the vertebrate head, in which each metamere was thought to contain one head somite and one pharyngeal arch, innervated by a set of cranial nerves serially homologous to dorsal and ventral roots of spinal nerves. The non-segmental view, on the other hand, rejected the somite-like properties of head cavities. A series of small mesodermal cysts in early Torpedo embryos, which were thought to represent true somite homologs, provided a third possible view on the nature of the vertebrate head. Recent molecular developmental data have shed new light on the vertebrate head problem, explaining that head mesoderm evolved, not by the modification of rostral somites of an amphioxus-like ancestor, but through the polarization of unspecified paraxial mesoderm into head mesoderm anteriorly and trunk somites posteriorly.

  17. Second BRITE-Constellation Science Conference: Small satellites—big science, Proceedings of the Polish Astronomical Society volume 5

    NASA Astrophysics Data System (ADS)

    Zwintz, Konstanze; Poretti, Ennio

    2017-09-01

    In 2016 the BRITE-Constellation mission had been operational for more than two years. At that time, several hundred bright stars of various types had been observed successfully in the two BRITE filters and astonishing new discoveries had been made. Therefore, the time was ripe to host the Second BRITE-Constellation Science Conference, "Small satellites - big science," from August 22 to 26, 2016, in the beautiful Madonnensaal of the University of Innsbruck, Austria. With this conference, we brought together the scientific community interested in BRITE-Constellation, provided an update on the status of the mission, presented and discussed the latest scientific results, shared our experiences with the data, illustrated successful cooperation between professional and amateur ground-based observers and BRITE scientists, and explored new ideas for future BRITE-Constellation observations.

  18. Start small, dream big: Experiences of physical activity in public spaces in Colombia.

    PubMed

    Díaz Del Castillo, Adriana; González, Silvia Alejandra; Ríos, Ana Paola; Páez, Diana C; Torres, Andrea; Díaz, María Paula; Pratt, Michael; Sarmiento, Olga L

    2017-10-01

    Multi-sectoral strategies to promote active recreation and physical activity in public spaces are crucial to building a "culture of health". However, studies on the sustainability and scalability of these strategies are limited. This paper identifies the factors related to the sustainability and scaling up of two community-based programs offering physical activity classes in public spaces in Colombia: Bogotá's Recreovía and Colombia's "Healthy Habits and Lifestyles Program-HEVS". Both programs have been sustained for more than 10 years and have benefited 1455 communities. We used a mixed-methods approach including semi-structured interviews, document review and an analysis of data regarding the programs' history, characteristics, funding, capacity building and challenges. Interviews were conducted between May and October 2015. Based on the sustainability frameworks of Shediac-Rizkallah and Bone and of Scheirer, we developed categories to independently code each interview. All information was independently analyzed by four of the authors and cross-compared between programs. Findings showed that these programs underwent adaptation processes to address the challenges that threatened their continuation and growth. The primary strategies included flexibility/adaptability, investing in the working conditions and training of instructors, allocating public funds and requesting accountability, diversifying resources, having community support and champions at different levels and positions, and carrying out continuous advocacy to include physical activity in public policies. Recreovía and HEVS illustrate sustainability as an incremental, multi-level process. Lessons learned for similar initiatives include the importance of individual actions and small events, a willingness to start small while dreaming big, being flexible, and prioritizing the human factor. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Big Trouble for Small Schools

    ERIC Educational Resources Information Center

    Bailey, Jon; Preston, Kim

    2007-01-01

    An analysis of the proposed changes to Nebraska's school finance formula and school structure shows that many of Nebraska's rural schools could suffer from imposition of a "small by choice" factor. Research has consistently shown that smaller schools have some advantages over their larger counterparts. The 2005 session of the Nebraska…

  20. Big Ideas and Small Solutions

    ERIC Educational Resources Information Center

    Tennant, Roy

    2004-01-01

    Small solutions solve discrete, well-bounded problems and can be pieces of larger solutions. They can move things forward by mixing and matching available components in new and previously unimagined ways. A number of innovations, which at first glance are completely unrelated, can come together and create important synergies. This article…

  1. Crowd-funded micro-grants for genomics and "big data": an actionable idea connecting small (artisan) science, infrastructure science, and citizen philanthropy.

    PubMed

    Özdemir, Vural; Badr, Kamal F; Dove, Edward S; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N; Sabra, Ramzi; Sarkissian, Christineh N; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K; Kickbusch, Ilona

    2013-04-01

    Biomedical science in the 21st century is embedded in, and draws from, a digital commons and "Big Data" created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., "the lone genius" or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21st century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists-only if some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the "bottom one billion"-the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein. Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while sharing similar disease

  2. Population and harvest trends of big game and small game species: a technical document supporting the USDA Forest Service Interim Update of the 2000 RPA Assessment

    Treesearch

    Curtis H. Flather; Michael S. Knowles; Stephen J. Brady

    2009-01-01

    This technical document supports the Forest Service's requirement to assess the status of renewable natural resources as mandated by the Forest and Rangeland Renewable Resources Planning Act of 1974 (RPA). It updates past reports on national and regional trends in population and harvest estimates for species classified as big game and small game. The trends...

  3. Small cause - big effect: improvement in interface design results in improved data quality - a multicenter crossover study.

    PubMed

    Ahlbrandt, Janko; Henrich, Michael; Hartmann, Bernd A; Bundschuh, Bettina B; Schwarz, Julia; Klasen, Joachim; Röhrig, Rainer

    2012-01-01

    In Germany the core data set for anesthesia version 3.0 was recently introduced for external quality assurance, which includes five surgical tracer procedures. We found a low rate of correctly documented tracers when compared to procedure data (OPS codes) documented separately. Examination revealed that the graphical user interface (GUI) contravened the dialogue principles as defined in EN ISO 9241-110. We worked with the manufacturer to implement small improvements and roll out the software. A crossover study was conducted at a university hospital and a municipal hospital chain with five hospitals. Across all study sites and surgical tracer procedures combined, the proportion of correctly documented anesthesia cases improved from 42% to 65% (p<0.001; N=34,610). We also saw improvements for most of the observed surgical tracer procedures at all hospitals. Our results show the big effect that small changes to the GUI can have on data quality. They also raise the question of whether highly flexible, parameterized clinical documentation systems are suited to achieving high usability. Finding the right balance between GUIs designed by usability experts and the flexibility of parameterization by administrators will be a difficult task for the future and subject to further research.
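
    The headline result is a before/after comparison of documentation rates (42% vs. 65%, p < 0.001, N = 34,610). A rough sketch of that kind of comparison is a chi-square test on a 2x2 table of correct/incorrect counts; the cell counts below are illustrative placeholders consistent with the reported percentages, not the study's actual numbers.

        # Sketch only: chi-square test on hypothetical before/after documentation counts.
        from scipy.stats import chi2_contingency

        before_correct, before_total = 7_300, 17_300    # ~42% correct (placeholder split)
        after_correct,  after_total  = 11_250, 17_310   # ~65% correct (placeholder split)

        table = [
            [before_correct, before_total - before_correct],
            [after_correct,  after_total - after_correct],
        ]
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.1f}, p = {p:.2e}")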

  4. Encoding of head direction by hippocampal place cells in bats.

    PubMed

    Rubin, Alon; Yartsev, Michael M; Ulanovsky, Nachum

    2014-01-15

    Most theories of navigation rely on the concept of a mental map and compass. Hippocampal place cells are neurons thought to be important for representing the mental map; these neurons become active when the animal traverses a specific location in the environment (the "place field"). Head-direction cells are found outside the hippocampus, and encode the animal's head orientation, thus implementing a neural compass. The prevailing view is that the activity of head-direction cells is not tuned to a single place, while place cells do not encode head direction. However, little work has been done to investigate in detail the possible head-directional tuning of hippocampal place cells across species. Here we addressed this by recording the activity of single neurons in the hippocampus of two evolutionarily distant bat species, Egyptian fruit bat and big brown bat, which crawled randomly in three different open-field arenas. We found that a large fraction of hippocampal neurons, in both bat species, showed conjunctive sensitivity to the animal's spatial position (place field) and to its head direction. We introduced analytical methods to demonstrate that the head-direction tuning was significant even after controlling for the behavioral coupling between position and head direction. Surprisingly, some hippocampal neurons preserved their head direction tuning even outside the neuron's place field, suggesting that "spontaneous" extra-field spikes are not noise, but in fact carry head-direction information. Overall, these findings suggest that bat hippocampal neurons can convey both map information and compass information.

  5. Envisioning the future of 'big data' biomedicine.

    PubMed

    Bui, Alex A T; Van Horn, John Darrell

    2017-05-01

    Through the increasing availability of more efficient data collection procedures, biomedical scientists are now confronting ever larger sets of data, often finding themselves struggling to process and interpret what they have gathered, even as still more data continue to accumulate. This torrent of biomedical information necessitates creative thinking about how the data are being generated, how they might be best managed and analyzed, and eventually how they can be transformed into further scientific understanding for improving patient care. Recognizing this as a major challenge, the National Institutes of Health (NIH) has spearheaded the "Big Data to Knowledge" (BD2K) program - the agency's most ambitious biomedical informatics effort to date. In this commentary, we describe how the NIH has taken on "big data" science head-on, how a consortium of leading research centers is developing the means for handling large-scale data, and how such activities are being marshalled for the training of a new generation of biomedical data scientists. All in all, the NIH BD2K program seeks to position data science at the heart of 21st Century biomedical research. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. BigBWA: approaching the Burrows-Wheeler aligner to Big Data technologies.

    PubMed

    Abuín, José M; Pichel, Juan C; Pena, Tomás F; Amigo, Jorge

    2015-12-15

    BigBWA is a new tool that uses the Big Data technology Hadoop to boost the performance of the Burrows-Wheeler aligner (BWA). Important reductions in the execution times were observed when using this tool. In addition, BigBWA is fault tolerant and it does not require any modification of the original BWA source code. BigBWA is available at the project GitHub repository: https://github.com/citiususc/BigBWA. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
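
    BigBWA's actual invocation and Hadoop integration are documented in the linked repository; the sketch below only illustrates the underlying idea of data-parallel alignment (splitting the reads into chunks and running 'bwa mem' on each chunk independently), using local processes instead of Hadoop. The reference and read file names are placeholders.

        # Conceptual sketch of chunked, parallel BWA alignment; not BigBWA's API.
        import subprocess
        from concurrent.futures import ProcessPoolExecutor

        def align_chunk(chunk_path, ref="reference.fa"):
            """Align one FASTQ chunk with 'bwa mem'; each chunk is independent."""
            sam_path = chunk_path + ".sam"
            with open(sam_path, "w") as out:
                subprocess.run(["bwa", "mem", ref, chunk_path], stdout=out, check=True)
            return sam_path

        if __name__ == "__main__":
            chunks = ["reads_part00.fq", "reads_part01.fq", "reads_part02.fq"]  # pre-split reads
            with ProcessPoolExecutor() as pool:
                sam_files = list(pool.map(align_chunk, chunks))
            print(sam_files)  # partial alignments would be merged downstream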

  7. Description of the tadpoles of two endemic frogs: the Phu Luang cascade frog Odorrana aureola (Anura: Ranidae) and the Isan big-headed frog Limnonectes isanensis (Anura: Dicroglossidae) from northeastern Thailand.

    PubMed

    Ampai, Natee; Rujirawan, Attapol; Arkajag, Jirachai; Mcleod, David S; Aowphol, Anchalee

    2015-07-07

    We describe the external morphology of the tadpoles of two frogs endemic to Thailand: the Phu Luang cascade frog (Odorrana aureola) and the Isan big-headed frog (Limnonectes isanensis) from the type localities in the Phu Luang Wildlife Sanctuary, Loei Province, northeastern Thailand. Morphological and genetic characters (16S rRNA) were used to identify specimens and match tadpoles to the adults. Detailed descriptions of external morphology and coloration in life are provided for both species. We provide a brief discussion of the ecology of these tadpoles and a comparison to previously published data from tadpoles of closely related taxa. Additionally, we provide evidence for the utility of larval morphology in resolving the taxonomic puzzles presented by cryptic species complexes.

  8. Head growth and neurocognitive outcomes.

    PubMed

    Wright, Charlotte M; Emond, Alan

    2015-06-01

    There is a lack of evidence on the value of head circumference (HC) as a screening measure. We aimed to describe the incidence of head centile shifting and the relationship between extremes of head size and later neurodevelopmental problems in the Avon Longitudinal Study of Parents and Children. HC was measured routinely at 2, 9, and 18 or 24 months and by researchers at ages 4, 8, 12, and 18 months. IQ according to the Wechsler Intelligence Scale for Children was measured in research clinics at age 8 for all. Neurocognitive disorders (NCDs) were identified from chart review. There were 10,851 children with ≥2 head measurements. At each age, 2% to 3% of children had HC SD scores below -2 or above +2, but for most children this was found at only 1 age. More than 15% of children showed centile shifts, but less than one-third of these were sustained at subsequent measurements. Only 0.5% showed a sustained shift beyond the normal range. Children with consistently small heads were up to 7 times more likely to have an NCD, but 85% of children with small heads had no NCDs, and 93% of children with NCDs had head SD scores within the normal range. Centile shifts within the normal range occur commonly and seem mainly to reflect measurement error. This finding makes robust assessment of the head trajectory difficult and may result in many children being investigated unnecessarily. Extreme head size is neither specific nor sensitive for detecting NCDs, suggesting that routine measurement of HC is unhelpful. Copyright © 2015 by the American Academy of Pediatrics.
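
    The analysis above rests on converting HC measurements to SD (z) scores against age-specific reference values and on flagging centile shifts between ages. A minimal sketch of those two steps is shown below; the reference means/SDs and the shift threshold are made-up placeholders, not ALSPAC or growth-chart values.

        # Sketch of HC SD scores and a crude centile-shift flag; all numbers are placeholders.
        REFERENCE = {  # age_months -> (mean_cm, sd_cm); hypothetical values
            2: (39.5, 1.2),
            9: (45.0, 1.3),
            18: (47.3, 1.4),
        }

        def sd_score(hc_cm, age_months):
            mean, sd = REFERENCE[age_months]
            return (hc_cm - mean) / sd

        def centile_shift(measurements, threshold=0.67):
            """True if the SD score changes by more than `threshold` (roughly one
            centile band on a nine-centile chart) between consecutive ages."""
            ages = sorted(measurements)
            scores = [sd_score(measurements[a], a) for a in ages]
            return any(abs(b - a) > threshold for a, b in zip(scores, scores[1:]))

        print(sd_score(47.5, 18))                           # SD score of a single measurement
        print(centile_shift({2: 39.0, 9: 46.8, 18: 47.5}))  # True: large shift between ages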

  9. Crowd-Funded Micro-Grants for Genomics and “Big Data”: An Actionable Idea Connecting Small (Artisan) Science, Infrastructure Science, and Citizen Philanthropy

    PubMed Central

    Badr, Kamal F.; Dove, Edward S.; Endrenyi, Laszlo; Geraci, Christy Jo; Hotez, Peter J.; Milius, Djims; Neves-Pereira, Maria; Pang, Tikki; Rotimi, Charles N.; Sabra, Ramzi; Sarkissian, Christineh N.; Srivastava, Sanjeeva; Tims, Hesther; Zgheib, Nathalie K.; Kickbusch, Ilona

    2013-01-01

    Abstract Biomedical science in the 21st century is embedded in, and draws from, a digital commons and “Big Data” created by high-throughput Omics technologies such as genomics. Classic Edisonian metaphors of science and scientists (i.e., “the lone genius” or other narrow definitions of expertise) are ill equipped to harness the vast promises of the 21st century digital commons. Moreover, in medicine and life sciences, experts often under-appreciate the important contributions made by citizen scholars and lead users of innovations to design innovative products and co-create new knowledge. We believe there are a large number of users waiting to be mobilized so as to engage with Big Data as citizen scientists—only if some funding were available. Yet many of these scholars may not meet the meta-criteria used to judge expertise, such as a track record in obtaining large research grants or a traditional academic curriculum vitae. This innovation research article describes a novel idea and action framework: micro-grants, each worth $1000, for genomics and Big Data. Though a relatively small amount at first glance, this far exceeds the annual income of the “bottom one billion”—the 1.4 billion people living below the extreme poverty level defined by the World Bank ($1.25/day). We describe two types of micro-grants. Type 1 micro-grants can be awarded through established funding agencies and philanthropies that create micro-granting programs to fund a broad and highly diverse array of small artisan labs and citizen scholars to connect genomics and Big Data with new models of discovery such as open user innovation. Type 2 micro-grants can be funded by existing or new science observatories and citizen think tanks through crowd-funding mechanisms described herein. Type 2 micro-grants would also facilitate global health diplomacy by co-creating crowd-funded micro-granting programs across nation-states in regions facing political and financial instability, while

  10. A Big Year for Small Bodies

    NASA Astrophysics Data System (ADS)

    Mayo, Louis; Erickson, K.

    2013-10-01

    2013 is a watershed year for celestial events involving the solar system's unsung heroes, small bodies. Highlights include the Cosmic Valentine of asteroid 2012 DA14, which passed within ~3.5 Earth radii of the Earth's surface (February 15, 2013), Comet C/2011 L4 (PANSTARRS), and the Thanksgiving 2013 pass of Comet ISON, which will pass less than 0.012 AU (1.8 million km) from the solar surface and could be visible during the day. All this, in addition to Comet Lemmon and a host of meteor showers, makes 2013 a landmark year to deliver the excitement of planetary science to audiences worldwide. To deliver the excitement and wonder of our solar system's small bodies to worldwide audiences, NASA's JPL and GSFC education teams in partnership with NASA EDGE will reach out to the public through multiple venues including broadcast media, social media, science and math focused educational activities, observing challenges, interactive visualization tools like "Eyes on the Solar System" and more, culminating in the Thanksgiving Day Comet ISON perihelion passage. This talk will highlight NASA's focused education effort to engage the public in small bodies science and the role these objects play in our understanding of the formation and evolution of the solar system.

  11. Measurements of characteristic parameters of extremely small cogged wheels with low module by means of low-coherence interferometry

    NASA Astrophysics Data System (ADS)

    Pakula, Anna; Tomczewski, Slawomir; Skalski, Andrzej; Biało, Dionizy; Salbut, Leszek

    2010-05-01

    This paper presents a novel application of low-coherence interferometry (LCI) to the measurement of characteristic parameters such as circular pitch, foot diameter, and head diameter in extremely small cogged wheels (wheel diameter below 3 mm and module m = 0.15) produced from metal and ceramics. The most interesting issue concerning small-diameter cogged wheels arises during their production: the characteristic parameters of a wheel depend strongly on the manufacturing process, and when inspecting small-diameter wheels the shrinkage during casting varies with slight changes in the fabrication process. The paper describes a Twyman-Green LCI setup with a pigtailed high-power light-emitting diode for cogged wheel measurement. Owing to its relatively large field of view, the whole wheel can be examined in one measurement, without the need for numerical stitching. A dedicated binarization algorithm was developed and successfully applied for measuring the characteristic parameters of small cogged wheels. Finally, the head and foot diameters of two cogged wheels measured with the proposed LCI setup are presented and compared with the results obtained by a commercial optical profiler. The results of examining the injection moulds used to fabricate the measured cogged wheels are also presented, and the shrinkage of the cogged wheels is calculated from the obtained results. The proposed method is suitable for complex measurements of small-diameter cogged wheels with low module, especially when there are no measurement standards for such objects.
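
    The abstract mentions a dedicated binarization step for extracting head and foot diameters from the interferometric data, but the algorithm itself is not given. The sketch below is therefore only a generic illustration of the idea, assuming a simple global threshold: binarize the measured map, then estimate head and foot diameters from the radial extent of the binary silhouette. All values and the demo image are synthetic.

        # Generic silhouette-based diameter estimate; not the authors' algorithm.
        import numpy as np

        def head_and_foot_diameter(image, pixel_size_mm, thresh):
            mask = image > thresh                       # simple global binarization
            ys, xs = np.nonzero(mask)
            cy, cx = ys.mean(), xs.mean()               # centroid of the silhouette
            r = np.hypot(ys - cy, xs - cx)
            theta = np.arctan2(ys - cy, xs - cx)
            bins = np.digitize(theta, np.linspace(-np.pi, np.pi, 361))
            outline = np.array([r[bins == b].max() for b in np.unique(bins)])
            head_d = 2 * outline.max() * pixel_size_mm  # tooth tips -> head diameter
            foot_d = 2 * outline.min() * pixel_size_mm  # tooth roots -> foot diameter
            return head_d, foot_d

        # Synthetic disk as a stand-in for a measured wheel silhouette.
        yy, xx = np.mgrid[:400, :400]
        demo = (np.hypot(yy - 200, xx - 200) < 150).astype(float)
        print(head_and_foot_diameter(demo, pixel_size_mm=0.01, thresh=0.5))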

  12. Buckling analysis of Big Dee Vacuum Vessel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lightner, S.; Gallix, R.

    1983-12-01

    A simplified three-dimensional shell buckling analysis of the GA Technologies Inc. Big Dee Vacuum Vessel (V/V) was performed using the finite element program TRICO. A coarse-mesh linear elastic model, which accommodated the support boundary conditions, was used to determine the buckling mode shape under a uniform external pressure. Using this buckling mode shape, refined models were used to calculate the linear buckling load (P_crit) more accurately. Several different designs of the Big Dee V/V were considered in this analysis. The supports for the V/V were equally-spaced radial pins at the outer diameter of the mid-plane. For all the cases considered, the buckling mode was axisymmetric in the toroidal direction. Therefore, it was possible to use only a small angular sector of a toric shell for the refined analysis. P_crit for the Big Dee is about 60 atm for a uniform external pressure. Also investigated in this analysis were the effects of geometrical imperfections and non-uniform pressure distributions.

  13. Thinking Big about Getting Small: An Ideological Genealogy of Small-School Reform

    ERIC Educational Resources Information Center

    Kafka, Judith

    2008-01-01

    Background: Support for small schools, and specifically for the creation of small, autonomous schools of choice, has grown considerably in the past decade--particularly in the context of urban schooling. Funded by private and public monies, small-school initiatives have been implemented in most of the nation's city school districts and have become…

  14. Benchmarking Big Data Systems and the BigData Top100 List.

    PubMed

    Baru, Chaitanya; Bhandarkar, Milind; Nambiar, Raghunath; Poess, Meikel; Rabl, Tilmann

    2013-03-01

    "Big data" has become a major force of innovation across enterprises of all sizes. New platforms with increasingly more features for managing big datasets are being announced almost on a weekly basis. Yet, there is currently a lack of any means of comparability among such platforms. While the performance of traditional database systems is well understood and measured by long-established institutions such as the Transaction Processing Performance Council (TCP), there is neither a clear definition of the performance of big data systems nor a generally agreed upon metric for comparing these systems. In this article, we describe a community-based effort for defining a big data benchmark. Over the past year, a Big Data Benchmarking Community has become established in order to fill this void. The effort focuses on defining an end-to-end application-layer benchmark for measuring the performance of big data applications, with the ability to easily adapt the benchmark specification to evolving challenges in the big data space. This article describes the efforts that have been undertaken thus far toward the definition of a BigData Top100 List. While highlighting the major technical as well as organizational challenges, through this article, we also solicit community input into this process.

  15. Big data, big knowledge: big data for personalized healthcare.

    PubMed

    Viceconti, Marco; Hunter, Peter; Hose, Rod

    2015-07-01

    The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ, and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority.

  16. Genetic structure, nestmate recognition and behaviour of two cryptic species of the invasive big-headed ant Pheidole megacephala.

    PubMed

    Fournier, Denis; Tindo, Maurice; Kenne, Martin; Mbenoun Masse, Paul Serge; Van Bossche, Vanessa; De Coninck, Eliane; Aron, Serge

    2012-01-01

    Biological invasions are recognized as a major cause of biodiversity decline and have considerable impact on the economy and human health. The African big-headed ant Pheidole megacephala is considered one of the world's most harmful invasive species. To better understand its ecological and demographic features, we combined behavioural (aggression tests), chemical (quantitative and qualitative analyses of cuticular lipids) and genetic (mitochondrial divergence and polymorphism of DNA microsatellite markers) data obtained for eight populations in Cameroon. Molecular data revealed two cryptic species of P. megacephala, one inhabiting urban areas and the other rainforests. Urban populations belong to the same phylogenetic group as those introduced in Australia and in other parts of the world. Behavioural analyses show that the eight populations sampled make up four mutually aggressive supercolonies. The maximum distance between nests from the same supercolony was 49 km and the closest distance between two nests belonging to two different supercolonies was 46 m. The genetic data and chemical analyses confirmed the behavioural tests, as all of the nests were correctly assigned to their supercolony. Genetic diversity appears significantly greater in Africa than in introduced populations in Australia; by contrast, urban and Australian populations are characterized by a higher chemical diversity than rainforest ones. Overall, our study shows that populations of P. megacephala in Cameroon adopt a unicolonial social structure, like invasive populations in Australia. However, the size of the supercolonies appears several orders of magnitude smaller in Africa. This implies competition between African supercolonies and explains why they persist over evolutionary time scales.

  17. Research on the magnetorheological finishing (MRF) technology with dual polishing heads

    NASA Astrophysics Data System (ADS)

    Huang, Wen; Zhang, Yunfei; He, Jianguo; Zheng, Yongcheng; Luo, Qing; Hou, Jing; Yuan, Zhigang

    2014-08-01

    Magnetorheological finishing (MRF) is a key polishing technique capable of rapidly converging to the required surface figure. To address the limitations of conventional single-polishing-head MRF, a dual-polishing-head MRF technology was studied and an 8-axis dual-polishing-head MRF machine was developed. The machine is able to manufacture large-aperture optics with high figure accuracy. The large polishing head is suitable for polishing large-aperture optics, controlling long-spatial-wavelength structures, and correcting low-to-medium-frequency errors with high removal rates, while the small polishing head has advantages in manufacturing small-aperture optics, controlling short-spatial-wavelength structures, correcting mid-to-high-frequency errors, and removing nanoscale amounts of material. The material removal characteristics and figure correction ability of both the large and the small polishing head were studied. Each polishing head yielded a stable, valid removal function and an ultra-precision flat sample. After a single polishing iteration using the small polishing head, the figure error over the central 45 mm of a 50 mm diameter plano optic was significantly improved from 0.21λ to 0.08λ PV (RMS from 0.053λ to 0.015λ). After three polishing iterations using the large polishing head, the figure error over the central 410 mm × 410 mm of a 430 mm × 430 mm large plano optic was significantly improved from 0.40λ to 0.10λ PV (RMS from 0.068λ to 0.013λ). These results show that the dual-polishing-head MRF machine has not only good material removal stability but also excellent figure correction capability.
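
    The figure-error numbers quoted above are peak-to-valley (PV) and root-mean-square (RMS) values of the surface error map, expressed in waves. A small sketch of both metrics, evaluated on a synthetic map rather than interferometric data, is shown below.

        # PV and RMS of a surface error map in waves; the demo map is synthetic.
        import numpy as np

        def pv_and_rms(error_map_waves):
            valid = error_map_waves[np.isfinite(error_map_waves)]
            pv = valid.max() - valid.min()                        # peak-to-valley
            rms = np.sqrt(np.mean((valid - valid.mean()) ** 2))   # RMS about the mean
            return pv, rms

        demo = 0.04 * np.random.default_rng(0).standard_normal((128, 128))
        print(pv_and_rms(demo))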

  18. Experimental investigation of head resistance reduction in bubbly Couette-Taylor flow

    NASA Astrophysics Data System (ADS)

    Maryami, R.; Javadpoor, M.; Farahat, S.

    2016-12-01

    Small-bubble experiments are carried out in a circulating vertical Couette-Taylor flow system to investigate the effect of air bubbles on head resistance. In this system, with an inner rotating cylinder and a circulating flow, the flow combines circumferential and axial components. The rotational Reynolds number is varied over the range 7 × 10³ ≤ Re_ω ≤ 70 × 10³, and small bubbles are dispersed into fully turbulent flow consisting of Taylor vortices. The modification of head resistance is examined by measuring the pressure difference between two fixed holes along the cylinder axis. The results show that head resistance is decreased in the presence of small bubbles, and a head resistance reduction greater than 60% is achieved at low Re_ω and at all Re_ax values ranging from 299.15 to 396.27. The effect of air bubbles on the vortices could be a possible reason for the head resistance reduction: since Taylor vortices are stable in this regime, bubbles decrease the momentum transfer by elongating the vortices along the cylinder axis and decreasing their number. The positive effect of air bubbles on head resistance reduction is diminished when Re_ω is increased. Moreover, in certain ranges of Re_ω, small bubbles enhance head resistance when Re_ax is increased. It is predicted that this negative effect of small bubbles on head resistance reduction is due to enhanced flow turbulence when Re_ω and Re_ax are increased.
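
    For reference, the rotational Reynolds number quoted above is conventionally defined for a Couette-Taylor cell as Re_ω = ω r_i d / ν, with r_i the inner-cylinder radius, d the gap width, and ν the kinematic viscosity. The sketch below evaluates this standard definition with placeholder geometry and fluid values, not the parameters of the apparatus described here.

        # Conventional rotational Reynolds number for a Couette-Taylor cell (placeholder values).
        def rotational_reynolds(omega_rad_s, inner_radius_m, gap_m, nu_m2_s=1.0e-6):
            return omega_rad_s * inner_radius_m * gap_m / nu_m2_s

        # Example: 50 rad/s, 40 mm inner radius, 10 mm gap, water-like viscosity -> Re_w = 2e4,
        # which lies inside the 7e3 to 7e4 range reported in the abstract.
        print(rotational_reynolds(omega_rad_s=50.0, inner_radius_m=0.04, gap_m=0.01))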

  19. Core decompression of the femoral head for osteonecrosis using percutaneous multiple small-diameter drilling.

    PubMed

    Mont, Michael A; Ragland, Phillip S; Etienne, Gracia

    2004-12-01

    Osteonecrosis is a disease with a wide-ranging etiology and poorly understood pathogenesis, seen commonly in young patients. Core decompression has historically been used in patients with small- or medium-sized precollapse lesions in an attempt to forestall disease progression. Typically, an 8-10 mm wide cannula trephine is used for this procedure. The authors report on a new technique using multiple small drillings with a 3-mm Steinmann pin to effectuate the core decompression. In this report, 32 of 45 hips (71%; 35 patients) had a successful clinical result at a mean follow-up of 2 years (range, 20-39 months). Twenty-four of 30 Stage I hips (80%; 23 patients) had successful outcomes, compared with 8 of 15 Stage II hips (57%; 12 patients), with no surgical complications occurring with this technique. This procedure is technically straightforward and led to minimal morbidity. It may be effective in delaying the need for total hip arthroplasty in young patients with early (precollapse) stages of femoral head osteonecrosis.

  20. Big-data-based edge biomarkers: study on dynamical drug sensitivity and resistance in individuals.

    PubMed

    Zeng, Tao; Zhang, Wanwei; Yu, Xiangtian; Liu, Xiaoping; Li, Meiyi; Chen, Luonan

    2016-07-01

    Big-data-based edge biomarkers are a new concept for characterizing disease features based on biomedical big data in a dynamical and network manner, which also provides alternative strategies for indicating disease status in single samples. This article gives a comprehensive review of big-data-based edge biomarkers for complex diseases in an individual patient, defined as biomarkers based on network information and high-dimensional data. Specifically, we first introduce the sources and structures of publicly accessible biomedical big data for edge biomarker and disease study. We show that biomedical big data are typically 'small-sample size in high-dimension space', i.e. small samples but with high dimensions of features (e.g. omics data) for each individual, in contrast to traditional big data in many other fields, characterized as 'large-sample size in low-dimension space', i.e. big samples but with low dimensions of features. Then, we demonstrate the concept, model and algorithm for edge biomarkers and, further, big-data-based edge biomarkers. Unlike conventional biomarkers, edge biomarkers, e.g. module biomarkers in module network rewiring-analysis, are able to predict the disease state by learning differential associations between molecules rather than differential expressions of molecules during disease progression or treatment in individual patients. In particular, in contrast to using information on the common molecules or edges (i.e. molecule pairs) across a population, as in traditional biomarkers including network and edge biomarkers, big-data-based edge biomarkers are specific to each individual and thus can accurately evaluate the disease state by considering the individual heterogeneity. Therefore, the measurement of big data in a high-dimensional space is required not only in the learning process but also in the diagnosing or predicting process for the tested individual. Finally, we provide a case study on analyzing the temporal expression
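
    The core idea above, scoring a molecule pair (an "edge") by how its association changes between conditions rather than by differential expression of either molecule alone, can be illustrated with a toy computation. The data and the particular score (absolute difference of Pearson correlations) below are illustrative assumptions, not the authors' algorithm.

        # Toy edge score: change in pairwise correlation between two phenotype groups.
        import numpy as np

        def edge_score(expr_a_g1, expr_b_g1, expr_a_g2, expr_b_g2):
            """Absolute difference in Pearson correlation of the pair between groups."""
            r1 = np.corrcoef(expr_a_g1, expr_b_g1)[0, 1]
            r2 = np.corrcoef(expr_a_g2, expr_b_g2)[0, 1]
            return abs(r1 - r2)

        rng = np.random.default_rng(1)
        a_healthy = rng.normal(size=50)
        b_healthy = a_healthy + 0.2 * rng.normal(size=50)   # strongly co-expressed pair
        a_disease = rng.normal(size=50)
        b_disease = rng.normal(size=50)                     # association lost in disease
        print(edge_score(a_healthy, b_healthy, a_disease, b_disease))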

  1. Big Ozone Holes Headed For Extinction By 2040

    NASA Image and Video Library

    2015-05-06

    Caption: This is a conceptual animation showing ozone-depleting chemicals moving from the equator to the poles. The chemicals become trapped by the winds of the polar vortex, a ring of fast moving air that circles the South Pole. Watch full video: youtu.be/7n2km69jZu8 -- The next three decades will see an end of the era of big ozone holes. In a new study, scientists from NASA Goddard Space Flight Center say that the ozone hole will be consistently smaller than 12 million square miles by the year 2040. Ozone-depleting chemicals in the atmosphere cause an ozone hole to form over Antarctica during the winter months in the Southern Hemisphere. Since the Montreal Protocol agreement in 1987, emissions have been regulated and chemical levels have been declining. However, the ozone hole has still remained bigger than 12 million square miles since the early 1990s, with exact sizes varying from year to year. The size of the ozone hole varies due to both temperature and levels of ozone-depleting chemicals in the atmosphere. In order to get a more accurate picture of the future size of the ozone hole, scientists used NASA’s AURA satellite to determine how much the levels of these chemicals in the atmosphere varied each year. With this new knowledge, scientists can confidently say that the ozone hole will be consistently smaller than 12 million square miles by the year 2040. Scientists will continue to use satellites to monitor the recovery of the ozone hole and they hope to see its full recovery by the end of the century. Research: Inorganic chlorine variability in the Antarctic vortex and implications for ozone recovery. Journal: Geophysical Research: Atmospheres, December 18, 2014. Link to paper: onlinelibrary.wiley.com/doi/10.1002/2014JD022295/abstract.

  2. Hubble Spies Big Bang Frontiers

    NASA Image and Video Library

    2017-12-08

    Observations by the NASA/ESA Hubble Space Telescope have taken advantage of gravitational lensing to reveal the largest sample of the faintest and earliest known galaxies in the universe. Some of these galaxies formed just 600 million years after the big bang and are fainter than any other galaxy yet uncovered by Hubble. The team has determined for the first time with some confidence that these small galaxies were vital to creating the universe that we see today. An international team of astronomers, led by Hakim Atek of the Ecole Polytechnique Fédérale de Lausanne, Switzerland, has discovered over 250 tiny galaxies that existed only 600-900 million years after the big bang, one of the largest samples of dwarf galaxies yet to be discovered at these epochs. The light from these galaxies took over 12 billion years to reach the telescope, allowing the astronomers to look back in time when the universe was still very young. Read more: www.nasa.gov/feature/goddard/hubble-spies-big-bang-frontiers Credit: NASA/ESA

  3. How Small Is a Billionth?

    ERIC Educational Resources Information Center

    Gough, John

    2007-01-01

    Children's natural curiosity about numbers, big and small, can lead to exploring place-value ideas. But how can these abstract concepts be experienced more concretely? This article presents some practical approaches for conceptualising very small numbers using linear models, area models, volume models, and diagrams.

  4. Big data uncertainties.

    PubMed

    Maugis, Pierre-André G

    2018-07-01

    Big data, the idea that an ever-larger volume of information is constantly being recorded, suggests that new problems can now be subjected to scientific scrutiny. However, can classical statistical methods be used directly on big data? We analyze the problem by looking at two known pitfalls of big datasets. First, that they are biased, in the sense that they do not offer a complete view of the populations under consideration. Second, that they present a weak but pervasive level of dependence between all their components. In both cases we observe that the uncertainty of the conclusions obtained by statistical methods is increased when they are used on big data, either because of a systematic error (bias) or because of a larger degree of randomness (increased variance). We argue that the key challenge raised by big data is not only how to use big data to tackle new problems, but how to develop tools and methods able to rigorously articulate the new risks therein. Copyright © 2016. Published by Elsevier Ltd.
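
    The second pitfall, weak but pervasive dependence, can be illustrated numerically: even a pairwise correlation of 0.01 inflates the standard deviation of a sample mean far above its independent-sample value. The simulation below is a purely synthetic illustration of that point, not material from the paper.

        # Synthetic demonstration: dependence inflates the variance of a sample mean.
        import numpy as np

        rng = np.random.default_rng(0)
        n, rho, reps = 10_000, 0.01, 200                 # weak pairwise correlation

        def correlated_sample():
            shared = rng.standard_normal()               # common factor inducing correlation rho
            return np.sqrt(rho) * shared + np.sqrt(1 - rho) * rng.standard_normal(n)

        iid_means = [rng.standard_normal(n).mean() for _ in range(reps)]
        dep_means = [correlated_sample().mean() for _ in range(reps)]
        print("sd of mean, i.i.d.:     ", np.std(iid_means))  # about 1/sqrt(n) = 0.01
        print("sd of mean, correlated: ", np.std(dep_means))  # about sqrt(rho) = 0.1, roughly 10x larger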

  5. What Can Big Data Offer the Pharmacovigilance of Orphan Drugs?

    PubMed

    Price, John

    2016-12-01

    The pharmacovigilance of drugs for orphan diseases presents problems related to the small patient population. Obtaining high-quality information on individual reports of suspected adverse reactions is of particular importance for the pharmacovigilance of orphan drugs. The possibility of mining "big data" to detect suspected adverse reactions is being explored in pharmacovigilance generally but may have limited application to orphan drugs. Sources of big data such as social media may be infrequently used as communication channels by patients with rare disease or their caregivers or by health care providers; any adverse reactions identified are likely to reflect what is already known about the safety of the drug from the network of support that grows up around these patients. Opportunities related to potential future big data sources are discussed. Copyright © 2016 Elsevier HS Journals, Inc. All rights reserved.

  6. Five Big Ideas

    ERIC Educational Resources Information Center

    Morgan, Debbie

    2012-01-01

    Designing quality continuing professional development (CPD) for those teaching mathematics in primary schools is a challenge. If the CPD is to be built on the scaffold of five big ideas in mathematics, what might be these five big ideas? Might it just be a case of, if you tell me your five big ideas, then I'll tell you mine? Here, there is…

  7. [Big data in imaging].

    PubMed

    Sewerin, Philipp; Ostendorf, Benedikt; Hueber, Axel J; Kleyer, Arnd

    2018-04-01

    Until now, most major medical advancements have been achieved through hypothesis-driven research within the scope of clinical trials. However, due to a multitude of variables, only a certain number of research questions could be addressed during a single study, thus rendering these studies expensive and time consuming. Big data acquisition enables a new data-based approach in which large volumes of data can be used to investigate all variables, thus opening new horizons. Due to universal digitalization of the data as well as ever-improving hardware and software solutions, imaging would appear to be predestined for such analyses. Several small studies have already demonstrated that automated analysis algorithms and artificial intelligence can identify pathologies with high precision. Such automated systems would also seem well suited for rheumatology imaging, since a method for individualized risk stratification has long been sought for these patients. However, despite all the promising options, the heterogeneity of the data and the highly complex regulations covering data protection in Germany would still render a big data solution for imaging difficult today. Overcoming these boundaries is challenging, but the enormous potential advances in clinical management and science render pursuit of this goal worthwhile.

  8. Husbandry and propagation of the Chinese big-headed turtle (Platysternon megacephalum) at the Wildlife Conservation Society's Prospect Park Zoo.

    PubMed

    Shelmidine, Nichole; Murphy, Brittany; Massarone, Katelyn

    2016-01-01

    Turtles worldwide are facing increasing pressures on their wild populations and many are listed as endangered or critically endangered. Chinese big-headed turtles (Platysternon megacephalum) are currently listed on IUCN's Red List as endangered and on CITES Appendix II. As part of the Wildlife Conservation Society's initiative on turtle and tortoise conservation, this species became a focus for propagation at Prospect Park Zoo (PPZ) in 2008. PPZ successfully bred and obtained eggs, with successful hatchings in 2013 and 2014. The staff fluctuated water and ambient temperatures along with photoperiod in order to simulate seasonal changes. Each May, the female was placed in the male's enclosure daily for at least 15 min for breeding. Once two confirmed copulations were observed, breeding introductions were discontinued. The female laid her eggs in July and August, and clutch sizes ranged from 5 to 6 eggs. Eggs were successfully incubated in a RCOM Juragon reptile incubator at 23.3°C with 90-95% humidity. The eggs hatched after an average incubation period of 102 days (98-105 days, n = 9). Hatchlings had a mean body mass of 8.84 g (8.11-10 g) and average carapace length × width of 36.17 × 32.20 mm. This article aims to share the team's experiences working with this species as well as build upon previous publications and successes. Our hope is that with continued efforts to increase our knowledge base, a viable, sustainable North American captive population will become a reality for this species. © 2016 Wiley Periodicals, Inc.

  9. Starting Small for Big School Improvement

    ERIC Educational Resources Information Center

    Scharff, Helen A.; DeAngelis, Deirdre A.; Talbert, Joan E.

    2010-01-01

    In the Scaffolded Apprenticeship Model (SAM), a school improvement strategy in place in New York City; Boston, Massachusetts; and Oakland, California, teacher teams improve their schools by studying and closing high-leverage learning gaps for small groups of struggling students as a strategy for systemic change. SAM's goal is for each school to…

  10. Classical stability of sudden and big rip singularities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrow, John D.; Lip, Sean Z. W.

    2009-08-15

    We introduce a general characterization of sudden cosmological singularities and investigate the classical stability of homogeneous and isotropic cosmological solutions of all curvatures containing these singularities to small scalar, vector, and tensor perturbations using gauge-invariant perturbation theory. We establish that sudden singularities at which the scale factor, expansion rate, and density are finite are stable except for a set of special parameter values. We also apply our analysis to the stability of Big Rip singularities and find the conditions for their stability against small scalar, vector, and tensor perturbations.

  11. Waste management in small hospitals: trouble for environment.

    PubMed

    Pant, Deepak

    2012-07-01

    Small hospitals are the grassroots of larger hospital structures, so proper waste management practices need to be initiated there. Small hospitals contribute substantially to health care provision, but their poor waste management practices make them a serious source of biomedical waste pollution. A survey with 13 focus questions was conducted in 100 hospitals in Dehradun. Small hospitals generated more waste per bed per day than big hospitals (178 g compared with 114 g), indicating unskilled waste management practices. Small hospitals do not properly segregate the waste generated in the hospital, and most biomedical wastes were collected without segregation into infectious and noninfectious categories.

  12. Transportation Limitation Access to the Small Islands (Case Study: Banggai Laut Regency)

    NASA Astrophysics Data System (ADS)

    Sunarti, S.

    2018-02-01

    Indonesia is an archipelagic and maritime country; the large number of islands scattered in all directions makes equitable development a challenge for the Government. Development in Indonesia has not spread evenly and tends to focus on the big islands, while the smaller islands lag far behind and receive little government attention. One consequence is a lack of infrastructure, especially access to the small islands. Among the small islands in Indonesia with minimal maritime infrastructure and transportation is Banggai Laut Regency, Central Sulawesi Province. This regency is new, having separated from Banggai Kepulauan Regency about 4 years ago. Access to Banggai Laut Regency remains quite difficult, which hampers its development. Therefore, the aim of this research is to find an infrastructure development strategy to support the development of Banggai Laut Regency. The research method used was the concurrent mixed-methods model. Primary data were collected through field observation and interviews, and secondary data through literature and document review. The analytical techniques used were qualitative description and map overlay using GIS, to describe the characteristics of the study area and the spatial relationships between islands. The results of this research conclude that Banggai Laut Regency requires infrastructure development, particularly maritime transportation, to enhance the accessibility of communities travelling to Banggai Laut Regency or to other islands from it.

  13. Native Perennial Forb Variation Between Mountain Big Sagebrush and Wyoming Big Sagebrush Plant Communities

    NASA Astrophysics Data System (ADS)

    Davies, Kirk W.; Bates, Jon D.

    2010-09-01

    Big sagebrush ( Artemisia tridentata Nutt.) occupies large portions of the western United States and provides valuable wildlife habitat. However, information is lacking quantifying differences in native perennial forb characteristics between mountain big sagebrush [ A. tridentata spp. vaseyana (Rydb.) Beetle] and Wyoming big sagebrush [ A. tridentata spp. wyomingensis (Beetle & A. Young) S.L. Welsh] plant communities. This information is critical to accurately evaluate the quality of habitat and forage that these communities can produce because many wildlife species consume large quantities of native perennial forbs and depend on them for hiding cover. To compare native perennial forb characteristics on sites dominated by these two subspecies of big sagebrush, we sampled 106 intact big sagebrush plant communities. Mountain big sagebrush plant communities produced almost 4.5-fold more native perennial forb biomass and had greater native perennial forb species richness and diversity compared to Wyoming big sagebrush plant communities ( P < 0.001). Nonmetric multidimensional scaling (NMS) and the multiple-response permutation procedure (MRPP) demonstrated that native perennial forb composition varied between these plant communities ( P < 0.001). Native perennial forb composition was more similar within plant communities grouped by big sagebrush subspecies than expected by chance ( A = 0.112) and composition varied between community groups ( P < 0.001). Indicator analysis did not identify any perennial forbs that were completely exclusive and faithful, but did identify several perennial forbs that were relatively good indicators of either mountain big sagebrush or Wyoming big sagebrush plant communities. Our results suggest that management plans and habitat guidelines should recognize differences in native perennial forb characteristics between mountain and Wyoming big sagebrush plant communities.

  14. Big Data in Science and Healthcare: A Review of Recent Literature and Perspectives. Contribution of the IMIA Social Media Working Group.

    PubMed

    Hansen, M M; Miron-Shatz, T; Lau, A Y S; Paton, C

    2014-08-15

    As technology continues to evolve and expand in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives on the evolving use of Big Data in science and healthcare and to examine some of the opportunities and challenges. A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and small scale. Scientists and health-care providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals to recognize and synthesize how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding thoughts: the concept of Big Data and associated analytics are to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and health care. Future exploration of issues surrounding data privacy, confidentiality, and education is needed. A greater focus on data from social media, the quantified-self movement, and the application of analytics to "small data" would also be useful.

  15. Prism-type holographic optical element design and verification for the blue-light small-form-factor optical pickup head.

    PubMed

    Shih, Hsi-Fu; Chiu, Yi; Cheng, Stone; Lee, Yuan-Chin; Lu, Chun-Shin; Chen, Yung-Chih; Chiou, Jin-Chern

    2012-08-20

    This paper presents the prism-type holographic optical element (PT-HOE) design for a small-form-factor (SFF) optical pickup head (OPH). The surface of the PT-HOE was simulated by three steps of optimization and generated by binary optics. Its grating pattern was fabricated on the inclined plane of a microprism by using the standard photolithography and specific dicing procedures. The optical characteristics of the device were verified. Based on the virtual image method, the SFF-OPH with the device was assembled and realized.

  16. Big Data and Neuroimaging.

    PubMed

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  17. Big assumptions for small samples in crop insurance

    Treesearch

    Ashley Elaine Hungerford; Barry Goodwin

    2014-01-01

    The purpose of this paper is to investigate the effects of crop insurance premiums being determined by small samples of yields that are spatially correlated. If spatial autocorrelation and small sample size are not properly accounted for in premium ratings, the premium rates may inaccurately reflect the risk of a loss.
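
    A small simulation can make the paper's concern concrete: when the yields in a small rating sample are spatially correlated, the sampling variability of the estimated premium rate is larger than an independence assumption suggests. The sketch below is hypothetical; the yield distribution, coverage level, and correlation values are assumptions for illustration, not figures from the paper.

```python
# Hypothetical illustration: sampling variability of a crop-insurance premium rate
# estimated from a small sample of spatially correlated yields.
import numpy as np

rng = np.random.default_rng(42)

n_fields = 20                      # small sample of yields from nearby fields
mean_yield, sd_yield = 100.0, 20.0
guarantee = 0.75 * mean_yield      # assumed 75% coverage level

def premium_rate(yields):
    """Empirical expected indemnity per unit of liability."""
    losses = np.maximum(guarantee - yields, 0.0)
    return losses.mean() / guarantee

def rate_estimates(rho, n_sims=2000):
    """Premium-rate estimates when the sampled yields share correlation rho."""
    cov = sd_yield ** 2 * ((1 - rho) * np.eye(n_fields)
                           + rho * np.ones((n_fields, n_fields)))
    mean_vec = np.full(n_fields, mean_yield)
    return np.array([premium_rate(rng.multivariate_normal(mean_vec, cov))
                     for _ in range(n_sims)])

for rho in (0.0, 0.3, 0.6):
    rates = rate_estimates(rho)
    print(f"rho = {rho:.1f}: mean rate = {rates.mean():.4f}, "
          f"sd of estimate = {rates.std():.4f}")
```

    As the assumed correlation rises, the spread of the estimated rates widens even though the sample size is unchanged, which is the sense in which small, spatially correlated samples can make premium rates unreliable.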

  18. Heading-vector navigation based on head-direction cells and path integration.

    PubMed

    Kubie, John L; Fenton, André A

    2009-05-01

    , heading-based navigation is used in small mammals and humans. Copyright 2008 Wiley-Liss, Inc.

  19. The Big Bang and the Search for a Theory of Everything

    NASA Technical Reports Server (NTRS)

    Kogut, Alan

    2010-01-01

    How did the universe begin? Is the gravitational physics that governs the shape and evolution of the cosmos connected in a fundamental way to the sub-atomic physics of particle colliders? Light from the Big Bang still permeates the universe and carries within it faint clues to the physics at the start of space and time. I will describe how current and planned measurements of the cosmic microwave background will observe the Big Bang to provide new insight into a "Theory of Everything" uniting the physics of the very large with the physics of the very small.

  20. Cryptography for Big Data Security

    DTIC Science & Technology

    2015-07-13

    Book chapter "Cryptography for Big Data Security" for Big Data: Storage, Sharing, and Security (3S). Distribution A: Public Release. Authors include Ariel Hamlin, Nabil... (contact: arkady@ll.mit.edu). [Front matter and table of contents omitted.] Chapter 1, "Cryptography for Big Data Security", begins: "With the amount

  1. Small Districts, Big Problems: Making School Everybody's House.

    ERIC Educational Resources Information Center

    Schmuck, Richard A.; Schmuck, Patricia A.

    During a 6-month odyssey over America's back roads, 80 schools in 25 small school districts in 21 states were visited, in hopes of finding effective schools where smallness facilitated participation by all. District size ranged from 450 to 2,000 students. Data collection included observation of classes and meetings; group interviews of classes;…

  2. [Effectiveness of multiple small-diameter drilling decompression combined with hip arthroscopy for early osteonecrosis of the femoral head].

    PubMed

    Li, Ji; Li, Zhongli; Su, Xiangzheng; Liu, Chunhui; Zhang, Hao; Wang, Ketao

    2017-09-01

    To evaluate the effectiveness of multiple small-diameter drilling decompression combined with hip arthroscopy for early osteonecrosis of the femoral head (ONFH). Between March 2010 and December 2013, 91 patients with early ONFH were treated with the operation of multiple small-diameter drilling decompression combined with hip arthroscopy in 39 cases (53 hips, group A) or with drilling decompression alone in 52 cases (74 hips, group B). The patients in 2 groups had obvious hip pain and limited motion before operation. There was no significant difference in gender, age, etiology, affected side, stage of osteonecrosis, and preoperative Harris score between 2 groups (P > 0.05). All operations succeeded and all incisions healed by first intention. The operation time was significantly longer in group A [(73.3±10.6) minutes] than in group B [(41.5±7.2) minutes] (t = 8.726, P = 0.000). Temporary sciatic nerve apraxia occurred after operation in 2 patients of group A, and no complication occurred in other patients. Patients were followed up 24-52 months (mean, 39.3 months) in group A and 24-48 months (mean, 34.6 months) in group B. At last follow-up, the Harris scores were 83.34±8.76 in group A and 76.61±9.22 in group B, showing significant differences when compared between 2 groups (t = -4.247, P = 0.029) and when compared with preoperative values in 2 groups (t = -10.327, P = 0.001; t = -8.216, P = 0.008). X-ray films showed that the collapse of the femoral head was observed in 6 hips (1 hip at stage Ⅰ and 5 hips at stage Ⅱ) in group A, and in 16 hips (4 hips at stage Ⅰ and 12 hips at stage Ⅱ) in group B; and hip arthroplasty was performed. The total effective rates were 88.68% (47/53) in group A and 78.38% (58/74) in group B, respectively, showing significant difference between 2 groups (χ2 = 5.241, P = 0.041). Multiple small-diameter drilling decompression combined with hip arthroscopy is effective in pain relief, improvement of hip function, slowing-down the

  3. Big Data in industry

    NASA Astrophysics Data System (ADS)

    Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.

    2016-08-01

    The amount of data at the global level has grown exponentially. Along with this phenomenon, we need new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. The growth of data has created a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the volume, velocity, and variety of data that is generated continuously. Much of this data is created by the Internet of Things (IoT): cameras, satellites, cars, GPS navigation, and so on. Our challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years. Big Data is also recognized in the business world, and increasingly in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service by presenting a big data analytics service-oriented architecture. The paper also discusses the interrelationship between business intelligence and big data analytics. The proposed approach might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.

  4. Antenatal antecedents of a small head circumference at age 24-months post-term equivalent in a sample of infants born before the 28th post-menstrual week.

    PubMed

    Leviton, Alan; Kuban, Karl; Allred, Elizabeth N; Hecht, Jonathan L; Onderdonk, Andrew; O'Shea, T Michael; McElrath, Thomas; Paneth, Nigel

    2010-08-01

    Little is known about the antecedents of microcephaly in early childhood among children born at extremely low gestational age. To identify some of the antecedents of microcephaly at age two years among children born before the 28th week of gestation. Observational cohort study. 1004 infants born before the 28th week of gestation. Head circumference Z-scores of <−2 and ≥−2, <−1. Risk of microcephaly and a less severely restricted head circumference decreased monotonically with increasing gestational age. After adjusting for gestational age and other potential confounders, the risk of microcephaly at age 2 years was increased if microcephaly was present at birth [odds ratio: 8.8 (95% confidence interval: 3.7, 21)], alpha hemolytic Streptococci were recovered from the placenta parenchyma [2.9 (1.2, 6.9)], the child was a boy [2.8 (1.6, 4.9)], and the child's mother was not married [2.5 (1.5, 4.3)]. Antecedents associated not with microcephaly, but with a less extreme reduction in head circumference were recovery of Propionibacterium sp from the placenta parenchyma [2.9 (1.5, 5.5)], tobacco exposure [2.0 (1.4, 3.0)], and increased syncytial knots in the placenta [2.0 (1.2, 3.2)]. Although microcephaly at birth predicts a small head circumference at 2 years among children born much before term, pregnancy and maternal characteristics provide supplemental information about the risk of a small head circumference years later. Two findings appear to be novel. Tobacco exposure during pregnancy, and organisms recovered from the placenta predict reduced head circumference at age two years. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  5. Antenatal antecedents of a small head circumference at age 24-months post-term equivalent in a sample of infants born before the 28th post-menstrual week

    PubMed Central

    Leviton, Alan; Kuban, Karl; Allred, Elizabeth N.; Hecht, Jonathan L.; Onderdonk, Andrew; O'Shea, T. Michael; McElrath, Thomas; Paneth, Nigel

    2010-01-01

    Background Little is known about the antecedents of microcephaly in early childhood among children born at extremely low gestational age. Aim To identify some of the antecedents of microcephaly at age two years among children born before the 28th week of gestation. Study design Observational cohort study. Subjects 1004 infants born before the 28th week of gestation. Outcome measures Head circumference Z-scores of <−2 and ≥−2, <−1. Results Risk of microcephaly and a less severely restricted head circumference decreased monotonically with increasing gestational age. After adjusting for gestational age and other potential confounders, the risk of microcephaly at age 2 years was increased if microcephaly was present at birth [odds ratio: 8.8 (95% confidence interval: 3.7, 21)], alpha hemolytic Streptococci were recovered from the placenta parenchyma [2.9 (1.2, 6.9)], the child was a boy [2.8 (1.6, 4.9)], and the child's mother was not married [2.5 (1.5, 4.3)]. Antecedents associated not with microcephaly, but with a less extreme reduction in head circumference were recovery of Propionibacterium sp from the placenta parenchyma [2.9 (1.5, 5.5)], tobacco exposure [2.0 (1.4, 3.0)], and increased syncytial knots in the placenta [2.0 (1.2, 3.2)]. Conclusions Although microcephaly at birth predicts a small head circumference at 2 years among children born much before term, pregnancy and maternal characteristics provide supplemental information about the risk of a small head circumference years later. Two findings appear to be novel. Tobacco exposure during pregnancy, and organisms recovered from the placenta predict reduced head circumference at age two years. PMID:20674197

  6. Radial head fracture - aftercare

    MedlinePlus

    Elbow fracture - radial head - aftercare ... to 2 weeks. If you have a small fracture and your bones did not move around much, ... to see a bone doctor (orthopedic surgeon). Some fractures require surgery to: Insert pins and plates to ...

  7. Sense Things in the Big Deep Water Bring the Big Deep Water to Computers so People can understand the Deep Water all the Time without getting wet

    NASA Astrophysics Data System (ADS)

    Pelz, M.; Heesemann, M.; Scherwath, M.; Owens, D.; Hoeberechts, M.; Moran, K.

    2015-12-01

    Senses help us learn stuff about the world. We put sense things in, over, and under the water to help people understand water, ice, rocks, life and changes over time out there in the big water. Sense things are like our eyes and ears. We can use them to look up and down, right and left all of the time. We can also use them on top of or near the water to see wind and waves. As the water gets deep, we can use our sense things to see many a layer of different water that make up the big water. On the big water we watch ice grow and then go away again. We think our sense things will help us know if this is different from normal, because it could be bad for people soon if it is not normal. Our sense things let us hear big water animals talking low (but sometimes high). We can also see animals that live at the bottom of the big water and we take lots of pictures of them. Lots of the animals we see are soft and small or hard and small, but sometimes the really big ones are seen too. We also use our sense things on the bottom and sometimes feel the ground shaking. Sometimes, we get little pockets of bad smelling air going up, too. In other areas of the bottom, we feel hot hot water coming out of the rock making new rocks and we watch some animals even make houses and food out of the hot hot water that turns to rock as it cools. To take care of the sense things we use and control water cars and smaller water cars that can dive deep in the water away from the bigger water car. We like to put new things in the water and take things out of the water that need to be fixed at least once a year. Sense things are very cool because you can use the sense things with your computer too. We share everything for free on our computers, which your computer talks to and gets pictures and sounds for you. Sharing the facts from the sense things is the best part about having the sense things because we can get many new ideas about understanding the big water from anyone with a computer!

  8. Small Talk: A Big Communicative Function in the Organization?

    ERIC Educational Resources Information Center

    Levine, Deborah Clark

    Defining small talk as "superficial talk about matters of little concern," a study examined the role of small talk in the work place. Subjects, 51 white collar workers and clerical employees at three corporations, an Eastern state university, and two small businesses completed a questionnaire concerning the following questions: (1) What…

  9. The big data-big model (BDBM) challenges in ecological research

    NASA Astrophysics Data System (ADS)

    Luo, Y.

    2015-12-01

    The field of ecology has become a big-data science in the past decades due to the development of new sensors used in numerous studies in the ecological community. Many sensor networks have been established to collect data. For example, satellites such as Terra and OCO-2, among others, have collected data relevant to the global carbon cycle. Thousands of field manipulative experiments have been conducted to examine feedbacks of the terrestrial carbon cycle to global changes. Networks of observations, such as FLUXNET, have measured land processes. In particular, the implementation of the National Ecological Observatory Network (NEON), which is designed to network different kinds of sensors at many locations over the nation, will generate large volumes of ecological data every day. The raw data from sensors from those networks offer an unprecedented opportunity for accelerating advances in our knowledge of ecological processes, educating teachers and students, supporting decision-making, testing ecological theory, and forecasting changes in ecosystem services. Currently, ecologists do not have the infrastructure in place to synthesize massive yet heterogeneous data into resources for decision support. It is urgent to develop an ecological forecasting system that can make the best use of multiple sources of data to assess long-term biosphere change and anticipate future states of ecosystem services at regional and continental scales. Forecasting relies on big models that describe major processes that underlie complex system dynamics. Ecological system models, despite great simplification of the real systems, are still complex in order to address real-world problems. For example, the Community Land Model (CLM) incorporates thousands of processes related to energy balance, hydrology, and biogeochemistry. Integration of massive data from multiple big data sources with complex models has to tackle Big Data-Big Model (BDBM) challenges. Those challenges include interoperability of multiple

  10. Big Machines and Big Science: 80 Years of Accelerators at Stanford

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loew, Gregory

    2008-12-16

    Longtime SLAC physicist Greg Loew will present a trip through SLAC's origins, highlighting its scientific achievements, and provide a glimpse of the lab's future in 'Big Machines and Big Science: 80 Years of Accelerators at Stanford.'

  11. Effects of head tilt on visual field testing with a head-mounted perimeter imo

    PubMed Central

    Matsumoto, Chota; Nomoto, Hiroki; Numata, Takuya; Eura, Mariko; Yamashita, Marika; Hashimoto, Shigeki; Okuyama, Sachiko; Kimura, Shinji; Yamanaka, Kenzo; Chiba, Yasutaka; Aihara, Makoto; Shimomura, Yoshikazu

    2017-01-01

    Purpose A newly developed head-mounted perimeter termed “imo” enables visual field (VF) testing without a fixed head position. Because the positional relationship between the subject’s head and the imo is fixed, the effects of head position changes on the test results are small compared with those obtained using a stationary perimeter. However, only ocular counter-roll (OCR) induced by head tilt might affect VF testing. To quantitatively reveal the effects of head tilt and OCR on the VF test results, we investigated the associations among the head-tilt angle, OCR amplitude and VF testing results. Subjects and methods For 20 healthy subjects, we binocularly recorded static OCR (s-OCR) while tilting the subject’s head at an arbitrary angle ranging from 0° to 60° rightward or leftward in 10° increments. By monitoring iris patterns, we evaluated the s-OCR amplitude. We also performed blind spot detection while tilting the subject’s head by an arbitrary angle ranging from 0° to 50° rightward or leftward in 10° increments to calculate the angle by which the blind spot rotates because of head tilt. Results The association between s-OCR amplitude and head-tilt angle showed a sinusoidal relationship. In blind spot detection, the blind spot rotated to the opposite direction of the head tilt, and the association between the rotation angle of the blind spot and the head-tilt angle also showed a sinusoidal relationship. The rotation angle of the blind spot was strongly correlated with the s-OCR amplitude (R2≥0.94, p<0.0001). A head tilt greater than 20° with imo causes interference between adjacent test areas. Conclusions Both the s-OCR amplitude and the rotation angle of the blind spot were correlated with the head-tilt angle by sinusoidal regression. The rotated VF was correlated with the s-OCR amplitude. During perimetry using imo, the change in the subject’s head tilt should be limited to 20°. PMID:28945777
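
    The sinusoidal regressions reported here can be illustrated with a short curve fit. The head-tilt and OCR values in the sketch below are made up; only the functional form (an amplitude times the sine of the tilt angle plus a phase) follows the kind of relationship the abstract describes.

```python
# Illustrative sinusoidal regression of ocular counter-roll (OCR) amplitude on
# head-tilt angle, using invented data points.
import numpy as np
from scipy.optimize import curve_fit

tilt = np.array([-60, -50, -40, -30, -20, -10, 0, 10, 20, 30, 40, 50, 60], dtype=float)
ocr = np.array([-7.9, -7.2, -6.0, -4.6, -3.1, -1.5, 0.0,
                1.6, 3.0, 4.7, 6.1, 7.1, 8.0])

def sinusoid(theta_deg, amplitude, phase_deg):
    """OCR modelled as amplitude * sin(head tilt + phase)."""
    return amplitude * np.sin(np.radians(theta_deg + phase_deg))

params, _ = curve_fit(sinusoid, tilt, ocr, p0=[8.0, 0.0])
residuals = ocr - sinusoid(tilt, *params)
r_squared = 1.0 - residuals.var() / ocr.var()
print(f"amplitude = {params[0]:.2f} deg, phase = {params[1]:.2f} deg, R^2 = {r_squared:.3f}")
```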

  12. Big bang nucleosynthesis: The strong nuclear force meets the weak anthropic principle

    NASA Astrophysics Data System (ADS)

    MacDonald, J.; Mullan, D. J.

    2009-08-01

    A common argument holds that a small increase in the strength of the strong force would bind the diproton and the dineutron, destroy all hydrogen in the big bang, and thereby have a catastrophic impact on life as we know it. Contrary to this argument, we show that, provided the increase in the strong force coupling constant is less than about 50%, substantial amounts of hydrogen remain. The reason is that an increase in strong force strength leads to tighter binding of the deuteron, permitting nucleosynthesis to occur earlier in the big bang at a higher temperature than in the standard big bang. Photodestruction of the less tightly bound diproton and dineutron delays their production to after the bulk of nucleosynthesis is complete. The decay of the diproton can, however, lead to relatively large abundances of deuterium.
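
    A rough, textbook-style estimate (not taken from this paper) shows why a more tightly bound deuteron shifts nucleosynthesis to earlier times and higher temperatures: deuterium survives photodestruction only once the number of photons per baryon above the deuteron binding energy B_D drops to order one.

```latex
% Deuterium-bottleneck estimate (illustrative). eta is the baryon-to-photon
% ratio (about 6 x 10^{-10}); B_D is the deuteron binding energy.
\[
  \eta^{-1}\, e^{-B_D / k T_{\mathrm{nuc}}} \sim 1
  \quad\Longrightarrow\quad
  k T_{\mathrm{nuc}} \sim \frac{B_D}{\ln \eta^{-1}}
  \approx \frac{2.22\ \mathrm{MeV}}{\ln\!\left(1.7\times 10^{9}\right)}
  \approx 0.1\ \mathrm{MeV}.
\]
```

    In this estimate kT_nuc scales linearly with B_D, so a stronger strong force (a more tightly bound deuteron) raises the temperature at which nucleosynthesis can begin, that is, moves it earlier, which is the mechanism the abstract invokes; a full calculation gives a somewhat lower onset temperature.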

  13. Vacuum compatible miniature CCD camera head

    DOEpatents

    Conder, Alan D.

    2000-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser-produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04", for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military, industrial, and medical imaging applications.

  14. Big data need big theory too

    PubMed Central

    Dougherty, Edward R.; Highfield, Roger R.

    2016-01-01

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their ‘depth’ and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote ‘blind’ big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698035

  15. Big data need big theory too.

    PubMed

    Coveney, Peter V; Dougherty, Edward R; Highfield, Roger R

    2016-11-13

    The current interest in big data, machine learning and data analytics has generated the widespread impression that such methods are capable of solving most problems without the need for conventional scientific methods of inquiry. Interest in these methods is intensifying, accelerated by the ease with which digitized data can be acquired in virtually all fields of endeavour, from science, healthcare and cybersecurity to economics, social sciences and the humanities. In multiscale modelling, machine learning appears to provide a shortcut to reveal correlations of arbitrary complexity between processes at the atomic, molecular, meso- and macroscales. Here, we point out the weaknesses of pure big data approaches with particular focus on biology and medicine, which fail to provide conceptual accounts for the processes to which they are applied. No matter their 'depth' and the sophistication of data-driven methods, such as artificial neural nets, in the end they merely fit curves to existing data. Not only do these methods invariably require far larger quantities of data than anticipated by big data aficionados in order to produce statistically reliable results, but they can also fail in circumstances beyond the range of the data used to train them because they are not designed to model the structural characteristics of the underlying system. We argue that it is vital to use theory as a guide to experimental design for maximal efficiency of data collection and to produce reliable predictive models and conceptual knowledge. Rather than continuing to fund, pursue and promote 'blind' big data projects with massive budgets, we call for more funding to be allocated to the elucidation of the multiscale and stochastic processes controlling the behaviour of complex systems, including those of life, medicine and healthcare. This article is part of the themed issue 'Multiscale modelling at the physics-chemistry-biology interface'. © 2015 The Authors.

  16. Increased plasma levels of big-endothelin-2 and big-endothelin-3 in patients with end-stage renal disease.

    PubMed

    Miyauchi, Yumi; Sakai, Satoshi; Maeda, Seiji; Shimojo, Nobutake; Watanabe, Shigeyuki; Honma, Satoshi; Kuga, Keisuke; Aonuma, Kazutaka; Miyauchi, Takashi

    2012-10-15

    Big endothelins (pro-endothelin; inactive-precursor) are converted to biologically active endothelins (ETs). Mammals and humans produce three ET family members: ET-1, ET-2 and ET-3, from three different genes. Although ET-1 is produced by vascular endothelial cells, these cells do not produce ET-3, which is produced by neuronal cells and organs such as the thyroid, salivary gland and the kidney. In patients with end-stage renal disease, abnormal vascular endothelial cell function and elevated plasma ET-1 and big ET-1 levels have been reported. It is unknown whether big ET-2 and big ET-3 plasma levels are altered in these patients. The purpose of the present study was to determine whether endogenous ET-1, ET-2, and ET-3 systems including big ETs are altered in patients with end-stage renal disease. We measured plasma levels of ET-1, ET-3 and big ET-1, big ET-2, and big ET-3 in patients on chronic hemodialysis (n=23) and age-matched healthy subjects (n=17). In patients on hemodialysis, plasma levels (measured just before hemodialysis) of both ET-1 and ET-3 and big ET-1, big ET-2, and big ET-3 were markedly elevated, and the increase was higher for big ETs (Big ET-1, 4-fold; big ET-2, 6-fold; big ET-3: 5-fold) than for ETs (ET-1, 1.7-fold; ET-3, 2-fold). In hemodialysis patients, plasma levels of the inactive precursors big ET-1, big ET-2, and big ET-3 levels are markedly increased, yet there is only a moderate increase in plasma levels of the active products, ET-1 and ET-3. This suggests that the activity of endothelin converting enzyme contributing to circulating levels of ET-1 and ET-3 may be decreased in patients on chronic hemodialysis. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. [Duodenum-preserving total pancreatic head resection and pancreatic head resection with segmental duodenostomy].

    PubMed

    Takada, Tadahiro; Yasuda, Hideki; Nagashima, Ikuo; Amano, Hodaka; Yoshiada, Masahiro; Toyota, Naoyuki

    2003-06-01

    A duodenum-preserving pancreatic head resection (DPPHR) was first reported by Beger et al. in 1980. However, its application has been limited to chronic pancreatitis because it is a subtotal pancreatic head resection. In 1990, we reported duodenum-preserving total pancreatic head resection (DPTPHR) in 26 cases. This opened the way for total pancreatic head resection, expanding the application of this approach to tumorigenic morbidities such as intraductal papillary mucinous tumor (IPMT), other benign tumors, and small pancreatic cancers. On the other hand, Nakao et al. reported pancreatic head resection with segmental duodenectomy (PHRSD) as an alternative pylorus-preserving pancreatoduodenectomy technique in 24 cases. Hirata et al. also reported this technique as a new pylorus-preserving pancreatoduodenostomy with increased vessel preservation. When performing DPTPHR, the surgeon should ensure adequate duodenal blood supply. Avoidance of duodenal ischemia is very important in this operation, and thus it is necessary to maintain blood flow in the posterior pancreatoduodenal artery and to preserve the mesoduodenal vessels. Postoperative pancreatic functional tests reveal that DPTPHR is superior to PPPD, including PHRSD, because the entire duodenum and duodenal integrity are very important for postoperative pancreatic function.

  18. Untapped Potential: Fulfilling the Promise of Big Brothers Big Sisters and the Bigs and Littles They Represent

    ERIC Educational Resources Information Center

    Bridgeland, John M.; Moore, Laura A.

    2010-01-01

    American children represent a great untapped potential in our country. For many young people, choices are limited and the goal of a productive adulthood is a remote one. This report paints a picture of who these children are, shares their insights and reflections about the barriers they face, and offers ways forward for Big Brothers Big Sisters as…

  19. Comparative validity of brief to medium-length Big Five and Big Six Personality Questionnaires.

    PubMed

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-12-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are faced with a variety of options as to inventory length. Furthermore, a 6-factor model has been proposed to extend and update the Big Five model, in part by adding a dimension of Honesty/Humility or Honesty/Propriety. In this study, 3 popular brief to medium-length Big Five measures (NEO Five Factor Inventory, Big Five Inventory [BFI], and International Personality Item Pool), and 3 six-factor measures (HEXACO Personality Inventory, Questionnaire Big Six Scales, and a 6-factor version of the BFI) were placed in competition to best predict important student life outcomes. The effect of test length was investigated by comparing brief versions of most measures (subsets of items) with original versions. Personality questionnaires were administered to undergraduate students (N = 227). Participants' college transcripts and student conduct records were obtained 6-9 months after data was collected. Six-factor inventories demonstrated better predictive ability for life outcomes than did some Big Five inventories. Additional behavioral observations made on participants, including their Facebook profiles and cell-phone text usage, were predicted similarly by Big Five and 6-factor measures. A brief version of the BFI performed surprisingly well; across inventory platforms, increasing test length had little effect on predictive validity. Comparative validity of the models and measures in terms of outcome prediction and parsimony is discussed.

  20. Headed to College: The Effects of New York City's Small High Schools of Choice on Postsecondary Enrollment. Supplementary Tables for the Policy Brief

    ERIC Educational Resources Information Center

    MDRC, 2014

    2014-01-01

    This paper provides a set of four supplementary tables for the policy brief "Headed to College The Effects of New York City's Small High Schools of Choice on Postsecondary Enrollment. Policy Brief". Included are the following table titles: (1) Supplementary Table 1: SSC Effects on Four-Year High School Graduation Rated by Student Cohort,…

  1. Comparative Validity of Brief to Medium-Length Big Five and Big Six Personality Questionnaires

    ERIC Educational Resources Information Center

    Thalmayer, Amber Gayle; Saucier, Gerard; Eigenhuis, Annemarie

    2011-01-01

    A general consensus on the Big Five model of personality attributes has been highly generative for the field of personality psychology. Many important psychological and life outcome correlates with Big Five trait dimensions have been established. But researchers must choose between multiple Big Five inventories when conducting a study and are…

  2. Implementing Big History.

    ERIC Educational Resources Information Center

    Welter, Mark

    2000-01-01

    Contends that world history should be taught as "Big History," a view that includes all space and time beginning with the Big Bang. Discusses five "Cardinal Questions" that serve as a course structure and address the following concepts: perspectives, diversity, change and continuity, interdependence, and causes. (CMK)

  3. Big data for health.

    PubMed

    Andreu-Perez, Javier; Poon, Carmen C Y; Merrifield, Robert D; Wong, Stephen T C; Yang, Guang-Zhong

    2015-07-01

    This paper provides an overview of recent developments in big data in the context of biomedical and health informatics. It outlines the key characteristics of big data and how medical and health informatics, translational bioinformatics, sensor informatics, and imaging informatics will benefit from an integrated approach of piecing together different aspects of personalized information from a diverse range of data sources, both structured and unstructured, covering genomics, proteomics, metabolomics, as well as imaging, clinical diagnosis, and long-term continuous physiological sensing of an individual. It is expected that recent advances in big data will expand our knowledge for testing new hypotheses about disease management from diagnosis to prevention to personalized treatment. The rise of big data, however, also raises challenges in terms of privacy, security, data ownership, data stewardship, and governance. This paper discusses some of the existing activities and future opportunities related to big data for health, outlining some of the key underlying issues that need to be tackled.

  4. Big Crater as Viewed by Pathfinder Lander

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.

    The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft.

    This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be.

    Mars Pathfinder is the second in NASA

  5. [Hepaticojejunostomy after pancreatic head resection - technical aspects for reconstruction of small and fragile bile ducts with T-tube drainage].

    PubMed

    Herzog, T; Belyaev, O; Uhl, W; Seelig, M H; Chromik, A

    2012-12-01

    After pancreatic head resection, the reconstruction of small and fragile bile ducts is technically demanding, resulting in more postoperative bile leaks. One option for the reconstruction is the placement of a T-tube drainage at the site of the anastomosis. Standard reconstruction after pancreatic head resection was an end-to-side hepaticojejunostomy with PDS 5.0, 15-25 cm distally from the pancreaticojejunostomy. For patients with a small bile duct diameter (≤ 5 mm) or a fragile bile duct wall, the reconstruction was performed with PDS 6.0 and a T-tube drainage at the site of the anastomosis. The reconstruction with a T-tube drainage at the site of the anastomosis is technically easy to perform and offers the opportunity for immediate visualisation of the anastomosis in the postoperative period by application of water-soluble contrast medium. If a bile leak occurs, biliary deviation through the T-tube drainage can enable a conservative management without revisional laparotomy in selected patients. Whether or not a conservative management of postoperative bile leaks will lead to more bile duct strictures is a subject for further investigations. A T-tube drainage at the site of the anastomosis can probably not prevent postoperative bile leaks from a difficult hepaticojejunostomy, but in selected patients it offers the opportunity for a conservative management resulting in fewer re-operations. Therefore, we recommend the augmentation of a difficult hepaticojejunostomy with a T-tube drainage. Georg Thieme Verlag KG Stuttgart · New York.

  6. Big Data: Implications for Health System Pharmacy

    PubMed Central

    Stokes, Laura B.; Rogers, Joseph W.; Hertig, John B.; Weber, Robert J.

    2016-01-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services. PMID:27559194

  7. Big Data: Implications for Health System Pharmacy.

    PubMed

    Stokes, Laura B; Rogers, Joseph W; Hertig, John B; Weber, Robert J

    2016-07-01

    Big Data refers to datasets that are so large and complex that traditional methods and hardware for collecting, sharing, and analyzing them are not possible. Big Data that is accurate leads to more confident decision making, improved operational efficiency, and reduced costs. The rapid growth of health care information results in Big Data around health services, treatments, and outcomes, and Big Data can be used to analyze the benefit of health system pharmacy services. The goal of this article is to provide a perspective on how Big Data can be applied to health system pharmacy. It will define Big Data, describe the impact of Big Data on population health, review specific implications of Big Data in health system pharmacy, and describe an approach for pharmacy leaders to effectively use Big Data. A few strategies involved in managing Big Data in health system pharmacy include identifying potential opportunities for Big Data, prioritizing those opportunities, protecting privacy concerns, promoting data transparency, and communicating outcomes. As health care information expands in its content and becomes more integrated, Big Data can enhance the development of patient-centered pharmacy services.

  8. A Quantum Universe Before the Big Bang(s)?

    NASA Astrophysics Data System (ADS)

    Veneziano, Gabriele

    2017-08-01

    The predictions of general relativity have been verified by now in a variety of different situations, setting strong constraints on any alternative theory of gravity. Nonetheless, there are strong indications that general relativity has to be regarded as an approximation of a more complete theory. Indeed, theorists have long been looking for ways to connect general relativity, which describes the cosmos and the infinitely large, to quantum physics, which has been remarkably successful in explaining the infinitely small world of elementary particles. These two worlds, however, come closer and closer to each other as we go back in time all the way up to the big bang. Actually, modern cosmology has completely changed the old big bang paradigm: we now have to talk about (at least) two (big?) bangs. While we know quite a lot about the one closer to us, at the end of inflation, we are much more ignorant about the one that may have preceded inflation and possibly marked the beginning of time. No one doubts that quantum mechanics plays an essential role in answering these questions: unfortunately, a unified theory of gravity and quantum mechanics is still under construction. Finding such a synthesis and confirming it experimentally will no doubt be one of the biggest challenges of this century’s physics.

  9. BigWig and BigBed: enabling browsing of large distributed datasets.

    PubMed

    Kent, W J; Zweig, A S; Barber, G; Hinrichs, A S; Karolchik, D

    2010-09-01

    BigWig and BigBed files are compressed binary indexed files containing data at several resolutions that allow the high-performance display of next-generation sequencing experiment results in the UCSC Genome Browser. The visualization is implemented using a multi-layered software approach that takes advantage of specific capabilities of web-based protocols, Linux and UNIX operating system files, R trees, and various indexing and compression tricks. As a result, only the data needed to support the current browser view is transmitted rather than the entire file, enabling fast remote access to large distributed data sets. Binaries for the BigWig and BigBed creation and parsing utilities may be downloaded at http://hgdownload.cse.ucsc.edu/admin/exe/linux.x86_64/. Source code for the creation and visualization software is freely available for non-commercial use at http://hgdownload.cse.ucsc.edu/admin/jksrc.zip, implemented in C and supported on Linux. The UCSC Genome Browser is available at http://genome.ucsc.edu.
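
    As an illustration of the ranged access this indexing enables, the sketch below reads summary statistics from a hypothetical BigWig file using the third-party pyBigWig library (not one of the UCSC utilities named above); the file name and genomic intervals are invented.

```python
# Ranged access to a BigWig file via the third-party pyBigWig library.
# "signal.bw" and the chr1 coordinates are hypothetical.
import pyBigWig

bw = pyBigWig.open("signal.bw")
print(bw.chroms())  # chromosome names and lengths from the file header

# Only the blocks covering this interval are read, thanks to the file's R-tree index.
mean_cov = bw.stats("chr1", 1_000_000, 1_010_000, type="mean")
max_cov = bw.stats("chr1", 1_000_000, 1_010_000, type="max")
print("mean:", mean_cov, "max:", max_cov)

# The precomputed zoom levels make coarse summaries over a whole chromosome cheap.
coarse = bw.stats("chr1", 0, bw.chroms("chr1"), nBins=100)
bw.close()
```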

  10. Recombination and evolution of duplicate control regions in the mitochondrial genome of the Asian big-headed turtle, Platysternon megacephalum.

    PubMed

    Zheng, Chenfei; Nie, Liuwang; Wang, Jue; Zhou, Huaxing; Hou, Huazhen; Wang, Hao; Liu, Juanjuan

    2013-01-01

    Complete mitochondrial (mt) genome sequences with duplicate control regions (CRs) have been detected in various animal species. In Testudines, duplicate mtCRs have been reported in the mtDNA of the Asian big-headed turtle, Platysternon megacephalum, which has three living subspecies. However, the evolutionary pattern of these CRs remains unclear. In this study, we report the completed sequences of duplicate CRs from 20 individuals belonging to three subspecies of this turtle and discuss the micro-evolutionary analysis of the evolution of duplicate CRs. Genetic distances calculated with MEGA 4.1 using the complete duplicate CR sequences revealed that within turtle subspecies, genetic distances between orthologous copies from different individuals were 0.63% for CR1 and 1.2% for CR2, respectively, and the average distance between paralogous copies of CR1 and CR2 was 4.8%. Phylogenetic relationships were reconstructed from the CR sequences, excluding the variable number of tandem repeats (VNTRs) at the 3' end, using three methods: neighbor-joining, maximum likelihood algorithm, and Bayesian inference. These data show that any two CRs within an individual were more genetically distant from each other than from orthologous genes in different individuals within the same subspecies. This suggests independent evolution of the two mtCRs within each P. megacephalum subspecies. Reconstruction of separate phylogenetic trees using different CR components (TAS, CD, CSB, and VNTRs) suggested the role of recombination in the evolution of duplicate CRs. Consequently, recombination events were detected using RDP software with break points at ≈290 bp and ≈1,080 bp. Based on these results, we hypothesize that duplicate CRs in P. megacephalum originated from heterological ancestral recombination of mtDNA. Subsequent recombination could have resulted in homogenization during independent evolutionary events, thus maintaining the functions of duplicate CRs in the mtDNA of P. megacephalum.
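
    As a toy illustration of what a percentage genetic distance between two control-region copies means, the sketch below computes an uncorrected p-distance on invented aligned sequences; MEGA offers several distance models, and the study's exact settings are not reproduced here.

```python
# Uncorrected p-distance between two aligned sequences (toy example; the
# sequences are invented and alignment gaps are ignored).
def p_distance(seq_a, seq_b):
    """Proportion of compared sites that differ, skipping alignment gaps."""
    assert len(seq_a) == len(seq_b), "sequences must be aligned to equal length"
    compared = differing = 0
    for a, b in zip(seq_a.upper(), seq_b.upper()):
        if a == "-" or b == "-":
            continue
        compared += 1
        if a != b:
            differing += 1
    return differing / compared

cr1_individual_1 = "ACCTAGGCTTACGATTACGGA-TACG"
cr1_individual_2 = "ACCTAGGCTTACGACTACGGAATACG"
print(f"p-distance = {p_distance(cr1_individual_1, cr1_individual_2):.4f}")  # 0.0400
```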

  11. Recombination and Evolution of Duplicate Control Regions in the Mitochondrial Genome of the Asian Big-Headed Turtle, Platysternon megacephalum

    PubMed Central

    Zheng, Chenfei; Nie, Liuwang; Wang, Jue; Zhou, Huaxing; Hou, Huazhen; Wang, Hao; Liu, Juanjuan

    2013-01-01

    Complete mitochondrial (mt) genome sequences with duplicate control regions (CRs) have been detected in various animal species. In Testudines, duplicate mtCRs have been reported in the mtDNA of the Asian big-headed turtle, Platysternon megacephalum, which has three living subspecies. However, the evolutionary pattern of these CRs remains unclear. In this study, we report the completed sequences of duplicate CRs from 20 individuals belonging to three subspecies of this turtle and discuss the micro-evolutionary analysis of the evolution of duplicate CRs. Genetic distances calculated with MEGA 4.1 using the complete duplicate CR sequences revealed that within turtle subspecies, genetic distances between orthologous copies from different individuals were 0.63% for CR1 and 1.2% for CR2, respectively, and the average distance between paralogous copies of CR1 and CR2 was 4.8%. Phylogenetic relationships were reconstructed from the CR sequences, excluding the variable number of tandem repeats (VNTRs) at the 3′ end, using three methods: neighbor-joining, maximum likelihood algorithm, and Bayesian inference. These data show that any two CRs within an individual were more genetically distant from each other than from orthologous genes in different individuals within the same subspecies. This suggests independent evolution of the two mtCRs within each P. megacephalum subspecies. Reconstruction of separate phylogenetic trees using different CR components (TAS, CD, CSB, and VNTRs) suggested the role of recombination in the evolution of duplicate CRs. Consequently, recombination events were detected using RDP software with break points at ≈290 bp and ≈1,080 bp. Based on these results, we hypothesize that duplicate CRs in P. megacephalum originated from heterological ancestral recombination of mtDNA. Subsequent recombination could have resulted in homogenization during independent evolutionary events, thus maintaining the functions of duplicate CRs in the mtDNA of P. megacephalum. PMID

  12. BigDataScript: a scripting language for data pipelines.

    PubMed

    Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu

    2015-01-01

    The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. © The Author 2014. Published by Oxford University Press.
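
    BDS has its own syntax; the sketch below is plain Python rather than BigDataScript and only illustrates the "lazy processing" idea described in this record: a pipeline step runs only when its output is missing or older than its inputs, so a re-run after a failure skips work that already completed. The file names and commands are hypothetical.

```python
# Python sketch (not BigDataScript) of dependency-driven, "lazy" task execution.
import os
import subprocess

def needs_update(output, inputs):
    """True if the output is missing or any input is newer than it."""
    if not os.path.exists(output):
        return True
    out_mtime = os.path.getmtime(output)
    return any(os.path.getmtime(path) > out_mtime for path in inputs)

def task(output, inputs, command):
    """Run a shell command only when the dependency check says it is needed."""
    if needs_update(output, inputs):
        print("running:", command)
        subprocess.run(command, shell=True, check=True)
    else:
        print("up to date:", output)

# Hypothetical two-step pipeline; steps already up to date are skipped on re-run.
task("reads.sorted.bam", ["reads.bam"], "samtools sort -o reads.sorted.bam reads.bam")
task("reads.sorted.bam.bai", ["reads.sorted.bam"], "samtools index reads.sorted.bam")
```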

  13. BigDataScript: a scripting language for data pipelines

    PubMed Central

    Cingolani, Pablo; Sladek, Rob; Blanchette, Mathieu

    2015-01-01

    Motivation: The analysis of large biological datasets often requires complex processing pipelines that run for a long time on large computational infrastructures. We designed and implemented a simple script-like programming language with a clean and minimalist syntax to develop and manage pipeline execution and provide robustness to various types of software and hardware failures as well as portability. Results: We introduce the BigDataScript (BDS) programming language for data processing pipelines, which improves abstraction from hardware resources and assists with robustness. Hardware abstraction allows BDS pipelines to run without modification on a wide range of computer architectures, from a small laptop to multi-core servers, server farms, clusters and clouds. BDS achieves robustness by incorporating the concepts of absolute serialization and lazy processing, thus allowing pipelines to recover from errors. By abstracting pipeline concepts at programming language level, BDS simplifies implementation, execution and management of complex bioinformatics pipelines, resulting in reduced development and debugging cycles as well as cleaner code. Availability and implementation: BigDataScript is available under open-source license at http://pcingola.github.io/BigDataScript. Contact: pablo.e.cingolani@gmail.com PMID:25189778

  14. Interventions for treating osteoarthritis of the big toe joint.

    PubMed

    Zammit, Gerard V; Menz, Hylton B; Munteanu, Shannon E; Landorf, Karl B; Gilheany, Mark F

    2010-09-08

    Osteoarthritis affecting the big toe joint of the foot (hallux limitus or rigidus) is a common and painful condition. Although several treatments have been proposed, few have been adequately evaluated. To identify controlled trials evaluating interventions for osteoarthritis of the big toe joint and to determine the optimum intervention(s). Literature searches were conducted across the following electronic databases: CENTRAL; MEDLINE; EMBASE; CINAHL; and PEDro (to 14th January 2010). No language restrictions were applied. Randomised controlled trials, quasi-randomised trials, or controlled clinical trials that assessed treatment outcomes for osteoarthritis of the big toe joint. Participants of any age or gender with osteoarthritis of the big toe joint (defined either radiographically or clinically) were included. Two authors examined the list of titles and abstracts identified by the literature searches. One content area expert and one methodologist independently applied the pre-determined inclusion and exclusion criteria to the full text of identified trials. To minimise error and reduce potential bias, data were extracted independently by two content experts. Only one trial satisfactorily fulfilled the inclusion criteria and was included in this review. This trial evaluated the effectiveness of two physical therapy programs in 20 individuals with osteoarthritis of the big toe joint. Assessment outcomes included pain levels, big toe joint range of motion and plantar flexion strength of the hallux. Mean differences at four weeks' follow-up were 3.80 points (95% CI 2.74 to 4.86) for self-reported pain, 28.30 degrees (95% CI 21.37 to 35.23) for big toe joint range of motion, and 2.80 kg (95% CI 2.13 to 3.47) for muscle strength. Although differences in outcomes between treatment and control groups were reported, the risk of bias was high. The trial failed to employ appropriate randomisation or adequate allocation concealment, used a relatively small sample and

  15. [Big data, Roemer's law and avoidable hospital admissions].

    PubMed

    van der Horst, H E

    2016-01-01

    From an analysis of data from 23 European countries to determine the impact of primary care on avoidable hospital admissions for uncontrolled diabetes, it appeared that, contrary to expectation, countries with strong primary care did not have a lower rate of avoidable hospital admission. It is clear that Roemer's law, 'a bed built is a bed filled,' still applies. However, the validity of this sort of analysis can be questioned, as these data are highly aggregated, and registration quality differs between countries. It is also questionable whether these datasets can be considered 'big data', as the numbers per country are relatively small. Big data analyses are useful for discerning patterns and formulating hypotheses, but not for proving causality. An unwanted side effect of this kind of analysis might be that policymakers use these not-so-valid results to underpin their policy to their advantage.

  16. Mountain big sagebrush age distribution and relationships on the northern Yellowstone Winter Range

    Treesearch

    Carl L. Wambolt; Trista L. Hoffman

    2001-01-01

    This study was conducted within the Gardiner Basin, an especially critical wintering area for native ungulates utilizing the Northern Yellowstone Winter Range. Mountain big sagebrush plants on 33 sites were classified as large (≥22 cm canopy cover), small (

  17. Fixing the Big Bang Theory's Lithium Problem

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2017-02-01

    How did our universe come into being? The Big Bang theory is a widely accepted and highly successful cosmological model of the universe, but it does introduce one puzzle: the cosmological lithium problem. Have scientists now found a solution? Too Much Lithium: In the Big Bang theory, the universe expanded rapidly from a very high-density and high-temperature state dominated by radiation. This theory has been validated again and again: the discovery of the cosmic microwave background radiation and observations of the large-scale structure of the universe both beautifully support the Big Bang theory, for instance. But one pesky trouble-spot remains: the abundance of lithium. [Figure caption: The arrows show the primary reactions involved in Big Bang nucleosynthesis, and their flux ratios, as predicted by the authors' model, are given on the right. Synthesizing primordial elements is complicated! Hou et al. 2017] According to Big Bang nucleosynthesis theory, primordial nucleosynthesis ran wild during the first half hour of the universe's existence. This produced most of the universe's helium and small amounts of other light nuclides, including deuterium and lithium. But while predictions match the observed primordial deuterium and helium abundances, Big Bang nucleosynthesis theory overpredicts the abundance of primordial lithium by about a factor of three. This inconsistency is known as the cosmological lithium problem, and attempts to resolve it using conventional astrophysics and nuclear physics over the past few decades have not been successful. In a recent publication led by Suqing Hou (Institute of Modern Physics, Chinese Academy of Sciences) and advisor Jianjun He (Institute of Modern Physics National Astronomical Observatories, Chinese Academy of Sciences), however, a team of scientists has proposed an elegant solution to this problem. [Figure caption: Time and temperature evolution of the abundances of primordial light elements during the beginning of the universe. The authors' model (dotted lines

  18. Big Challenges and Big Opportunities: The Power of "Big Ideas" to Change Curriculum and the Culture of Teacher Planning

    ERIC Educational Resources Information Center

    Hurst, Chris

    2014-01-01

    Mathematical knowledge of pre-service teachers is currently "under the microscope" and the subject of research. This paper proposes a different approach to teacher content knowledge based on the "big ideas" of mathematics and the connections that exist within and between them. It is suggested that these "big ideas"…

  19. Countering misinformation concerning big sagebrush

    Treesearch

    Bruce L Welch; Craig Criddle

    2003-01-01

    This paper examines the scientific merits of eight axioms of range or vegetative management pertaining to big sagebrush. These axioms are: (1) Wyoming big sagebrush (Artemisia tridentata ssp. wyomingensis) does not naturally exceed 10 percent canopy cover and mountain big sagebrush (A. t. ssp. vaseyana) does not naturally exceed 20 percent canopy...

  20. BigNeuron dataset V.0.0

    DOE Data Explorer

    Ramanathan, Arvind

    2016-01-01

    The cleaned bench-testing reconstructions for the gold166 datasets have been put online at GitHub: https://github.com/BigNeuron/Events-and-News/wiki/BigNeuron-Events-and-News and https://github.com/BigNeuron/Data/releases/tag/gold166_bt_v1.0. The respective image datasets were released earlier from other sites (the main pointer is also available at GitHub: https://github.com/BigNeuron/Data/releases/tag/Gold166_v1), but since the files were big, the actual downloading was distributed across three continents.

  1. Ocean Networks Canada's "Big Data" Initiative

    NASA Astrophysics Data System (ADS)

    Dewey, R. K.; Hoeberechts, M.; Moran, K.; Pirenne, B.; Owens, D.

    2013-12-01

    Ocean Networks Canada operates two large undersea observatories that collect, archive, and deliver data in real time over the Internet. These data contribute to our understanding of the complex changes taking place on our ocean planet. Ocean Networks Canada's VENUS was the world's first cabled seafloor observatory to enable researchers anywhere to connect in real time to undersea experiments and observations. Its NEPTUNE observatory is the largest cabled ocean observatory, spanning a wide range of ocean environments. Most recently, we installed a new small observatory in the Arctic. Together, these observatories deliver "Big Data" across many disciplines in a cohesive manner using the Oceans 2.0 data management and archiving system that provides national and international users with open access to real-time and archived data while also supporting a collaborative work environment. Ocean Networks Canada operates these observatories to support science, innovation, and learning in four priority areas: the study of the impact of climate change on the ocean; the exploration and understanding of the unique life forms in the extreme environments of the deep ocean and below the seafloor; the exchange of heat, fluids, and gases that move throughout the ocean and atmosphere; and the dynamics of earthquakes, tsunamis, and undersea landslides. To date, the Ocean Networks Canada archive contains over 130 TB (collected over 7 years) and the current rate of data acquisition is ~50 TB per year. This data set is complex and diverse. Making these "Big Data" accessible and attractive to users is our priority. In this presentation, we share our experience as a "Big Data" institution where we deliver simple and multi-dimensional calibrated data cubes to a diverse pool of users. Ocean Networks Canada also conducts extensive user testing. Test results guide future tool design and development of "Big Data" products. We strive to bridge the gap between the raw, archived data and the needs and

  2. Big data - a 21st century science Maginot Line? No-boundary thinking: shifting from the big data paradigm.

    PubMed

    Huang, Xiuzhen; Jennings, Steven F; Bruce, Barry; Buchan, Alison; Cai, Liming; Chen, Pengyin; Cramer, Carole L; Guan, Weihua; Hilgert, Uwe Kk; Jiang, Hongmei; Li, Zenglu; McClure, Gail; McMullen, Donald F; Nanduri, Bindu; Perkins, Andy; Rekepalli, Bhanu; Salem, Saeed; Specker, Jennifer; Walker, Karl; Wunsch, Donald; Xiong, Donghai; Zhang, Shuzhong; Zhang, Yu; Zhao, Zhongming; Moore, Jason H

    2015-01-01

    Whether your interests lie in scientific arenas, the corporate world, or in government, you have certainly heard the praises of big data: Big data will give you new insights, allow you to become more efficient, and/or solve your problems. While big data has had some outstanding successes, many are now beginning to see that it is not the Silver Bullet that it has been touted to be. Here our main concern is the overall impact of big data; the current manifestation of big data is constructing a Maginot Line in science in the 21st century. Big data is not "lots of data" as a phenomenon anymore; the big data paradigm is putting the spirit of the Maginot Line into lots of data. Big data overall is disconnecting researchers from science challenges. We propose No-Boundary Thinking (NBT), applying no-boundary thinking in problem definition to address science challenges.

  3. Big Challenges for a Small City School

    ERIC Educational Resources Information Center

    Principal Leadership, 2007

    2007-01-01

    So well-chronicled are the challenges faced by schools in large urban and metropolitan areas that a lay person may perceive the nation's rural and small-city schools as bucolic settings where educators do not have a care in the world other than keeping the occasional cow from wandering onto the playground during recess. The reality, of course, is…

  4. Big Data and Chemical Education

    ERIC Educational Resources Information Center

    Pence, Harry E.; Williams, Antony J.

    2016-01-01

    The amount of computerized information that organizations collect and process is growing so large that the term Big Data is commonly being used to describe the situation. Accordingly, Big Data is defined by a combination of the Volume, Variety, Velocity, and Veracity of the data being processed. Big Data tools are already having an impact in…

  5. Big data in fashion industry

    NASA Astrophysics Data System (ADS)

    Jain, S.; Bruniaux, J.; Zeng, X.; Bruniaux, P.

    2017-10-01

    Significant work has been done in the field of big data in the last decade. The concept of big data includes analysing voluminous data to extract valuable information. In the fashion world, big data is increasingly playing a part in trend forecasting and in analysing consumer behaviour, preferences, and emotions. The purpose of this paper is to introduce the term fashion data and explain why it can be considered big data. It also gives a broad classification of the types of fashion data and briefly defines them. The methodology and working of a system that will use these data are also briefly described.

  6. IEDA: Making Small Data BIG Through Interdisciplinary Partnerships Among Long-tail Domains

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Carbotte, S. M.; Arko, R. A.; Ferrini, V. L.; Hsu, L.; Song, L.; Ghiorso, M. S.; Walker, D. J.

    2014-12-01

    The Big Data world in the Earth Sciences so far exists primarily for disciplines that generate massive volumes of observational or computed data using large-scale, shared instrumentation such as global sensor networks, satellites, or high-performance computing facilities. These data are typically managed and curated by well-supported community data facilities that also provide the tools for exploring the data through visualization or statistical analysis. In many other domains, especially those where data are primarily acquired by individual investigators or small teams (known as 'Long-tail data'), data are poorly shared and integrated, lacking a community-based data infrastructure that ensures persistent access, quality control, standardization, and integration of data, as well as appropriate tools to fully explore and mine the data within the context of broader Earth Science datasets. IEDA (Integrated Earth Data Applications, www.iedadata.org) is a data facility funded by the US NSF to develop and operate data services that support data stewardship throughout the full life cycle of observational data in the solid earth sciences, with a focus on the data management needs of individual researchers. IEDA builds on a strong foundation of mature disciplinary data systems for marine geology and geophysics, geochemistry, and geochronology. These systems have dramatically advanced data resources in those long-tail Earth science domains. IEDA has strengthened these resources by establishing a consolidated, enterprise-grade infrastructure that is shared by the domain-specific data systems, and implementing joint data curation and data publication services that follow community standards. In recent years, other domain-specific data efforts have partnered with IEDA to take advantage of this infrastructure and improve data services to their respective communities with formal data publication, long-term preservation of data holdings, and better sustainability. IEDA hopes to

  7. The Big6 Collection: The Best of the Big6 Newsletter.

    ERIC Educational Resources Information Center

    Eisenberg, Michael B.; Berkowitz, Robert E.

    The Big6 is a complete approach to implementing meaningful learning and teaching of information and technology skills, essential for 21st century living. Including in-depth articles, practical tips, and explanations, this book offers a varied range of material about students and teachers, the Big6, and curriculum. The book is divided into 10 main…

  8. Big Data Bioinformatics

    PubMed Central

    GREENE, CASEY S.; TAN, JIE; UNG, MATTHEW; MOORE, JASON H.; CHENG, CHAO

    2017-01-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the “big data” era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both “machine learning” algorithms as well as “unsupervised” and “supervised” examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. PMID:27908398

  9. Big Data Bioinformatics

    PubMed Central

    GREENE, CASEY S.; TAN, JIE; UNG, MATTHEW; MOORE, JASON H.; CHENG, CHAO

    2017-01-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the “big data” era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both “machine learning” algorithms as well as “unsupervised” and “supervised” examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. PMID:24799088

  10. Big data bioinformatics.

    PubMed

    Greene, Casey S; Tan, Jie; Ung, Matthew; Moore, Jason H; Cheng, Chao

    2014-12-01

    Recent technological advances allow for high throughput profiling of biological systems in a cost-efficient manner. The low cost of data generation is leading us to the "big data" era. The availability of big data provides unprecedented opportunities but also raises new challenges for data mining and analysis. In this review, we introduce key concepts in the analysis of big data, including both "machine learning" algorithms as well as "unsupervised" and "supervised" examples of each. We note packages for the R programming language that are available to perform machine learning analyses. In addition to programming based solutions, we review webservers that allow users with limited or no programming background to perform these analyses on large data compendia. © 2014 Wiley Periodicals, Inc.
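    A minimal, hypothetical Python/scikit-learn sketch can illustrate the review's unsupervised-versus-supervised distinction (the paper itself points to R packages and webservers; the synthetic data, model choices, and parameters below are assumptions for illustration only):

      # Minimal sketch contrasting unsupervised and supervised learning on
      # synthetic data (illustrative only; not the R workflow cited in the review).
      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.cluster import KMeans
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      # Synthetic "expression-like" matrix: 500 samples x 20 features, 2 classes.
      X, y = make_classification(n_samples=500, n_features=20, n_informative=5,
                                 random_state=0)

      # Unsupervised: cluster the samples without using the labels.
      clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

      # Supervised: learn a classifier from labeled training data.
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

      print("cluster sizes:", np.bincount(clusters))
      print("test accuracy:", round(clf.score(X_te, y_te), 3))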

  11. Changing the personality of a face: Perceived Big Two and Big Five personality factors modeled in real photographs.

    PubMed

    Walker, Mirella; Vetter, Thomas

    2016-04-01

    General, spontaneous evaluations of strangers based on their faces have been shown to reflect judgments of these persons' intention and ability to harm. These evaluations can be mapped onto a 2D space defined by the dimensions trustworthiness (intention) and dominance (ability). Here we go beyond general evaluations and focus on more specific personality judgments derived from the Big Two and Big Five personality concepts. In particular, we investigate whether Big Two/Big Five personality judgments can be mapped onto the 2D space defined by the dimensions trustworthiness and dominance. Results indicate that judgments of the Big Two personality dimensions almost perfectly map onto the 2D space. In contrast, at least 3 of the Big Five dimensions (i.e., neuroticism, extraversion, and conscientiousness) go beyond the 2D space, indicating that additional dimensions are necessary to describe more specific face-based personality judgments accurately. Building on this evidence, we model the Big Two/Big Five personality dimensions in real facial photographs. Results from 2 validation studies show that the Big Two/Big Five are perceived reliably across different samples of faces and participants. Moreover, results reveal that participants differentiate reliably between the different Big Two/Big Five dimensions. Importantly, this high level of agreement and differentiation in personality judgments from faces likely creates a subjective reality which may have serious consequences for those being perceived; notably, these consequences ensue because the subjective reality is socially shared, irrespective of the judgments' validity. The methodological approach introduced here might prove useful in various psychological disciplines. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  12. Big game hunting practices, meanings, motivations and constraints: a survey of Oregon big game hunters

    Treesearch

    Suresh K. Shrestha; Robert C. Burns

    2012-01-01

    We conducted a self-administered mail survey in September 2009 with randomly selected Oregon hunters who had purchased big game hunting licenses/tags for the 2008 hunting season. Survey questions explored hunting practices, the meanings of and motivations for big game hunting, the constraints to big game hunting participation, and the effects of age, years of hunting...

  13. The Big Bang Theory

    ScienceCinema

    Lincoln, Don

    2018-01-16

    The Big Bang is the name of the most respected theory of the creation of the universe. Basically, the theory says that the universe was once smaller and denser and has been expanding for eons. One common misconception is that the Big Bang theory says something about the instant that set the expansion into motion; however, this isn’t true. In this video, Fermilab’s Dr. Don Lincoln tells about the Big Bang theory and sketches some speculative ideas about what caused the universe to come into existence.

  14. The Big Bang Theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lincoln, Don

    The Big Bang is the name of the most respected theory of the creation of the universe. Basically, the theory says that the universe was once smaller and denser and has been expanding for eons. One common misconception is that the Big Bang theory says something about the instant that set the expansion into motion; however, this isn’t true. In this video, Fermilab’s Dr. Don Lincoln tells about the Big Bang theory and sketches some speculative ideas about what caused the universe to come into existence.

  15. Seeding considerations in restoring big sagebrush habitat

    Treesearch

    Scott M. Lambert

    2005-01-01

    This paper describes methods of managing or seeding to restore big sagebrush communities for wildlife habitat. The focus is on three big sagebrush subspecies, Wyoming big sagebrush (Artemisia tridentata ssp. wyomingensis), basin big sagebrush (Artemisia tridentata ssp. tridentata), and mountain...

  16. Vacuum Head Checks Foam/Substrate Bonds

    NASA Technical Reports Server (NTRS)

    Lloyd, James F.

    1989-01-01

    Electromechanical inspection system quickly gives measurements indicating adhesion, or lack thereof, between rigid polyurethane foam and aluminum substrate. Does not damage inspected article, easy to operate, and used to perform "go/no-go" evaluations or as supplement to conventional destructive pull-plug testing. Applies vacuum to small area of foam panel and measures distance through which foam pulled into vacuum. Probe head applied to specimen and evacuated through hose to controller/monitor unit. Digital voltmeter in unit reads deflection of LVDT probe head.

  17. ARTIST CONCEPT - BIG JOE

    NASA Image and Video Library

    1963-09-01

    S63-19317 (October 1963) --- Pen and ink views of comparative arrangements of several capsules including the existing "Big Joe" design, the compromise "Big Joe" design, and the "Little Joe". All capsule designs are labeled and include dimensions. Photo credit: NASA

  18. Big Society, Big Deal?

    ERIC Educational Resources Information Center

    Thomson, Alastair

    2011-01-01

    Political leaders like to put forward guiding ideas or themes which pull their individual decisions into a broader narrative. For John Major it was Back to Basics, for Tony Blair it was the Third Way and for David Cameron it is the Big Society. While Mr. Blair relied on Lord Giddens to add intellectual weight to his idea, Mr. Cameron's legacy idea…

  19. Population connectivity of endangered Ozark big-eared bats (Corynorhinus townsendii ingens)

    USGS Publications Warehouse

    Lee, Dana N.; Stark, Richard C.; Puckette, William L.; Hamilton, Meredith J.; Leslie, David M.; Van Den Bussche, Ronald A.

    2015-01-01

    The endangered Ozark big-eared bat (Corynorhinus townsendii ingens) is restricted to eastern Oklahoma and western and north-central Arkansas, where populations may be susceptible to losses of genetic variation due to patchy distribution of colonies and potentially small effective population sizes. We used mitochondrial D-loop DNA sequences and 15 nuclear microsatellite loci to determine population connectivity among Ozark big-eared bat caves. Assessment of 7 caves revealed a haplotype not detected in a previous study (2002–2003) and gene flow among colonies in eastern Oklahoma. Our data suggest genetic mixing of individuals, which may be occurring at nearby swarming sites in the autumn. Further evidence of limited gene flow between caves in Oklahoma with a cave in Arkansas highlights the importance of including samples from geographically widespread caves to fully understand gene flow in this subspecies. It appears autumn swarming sites and winter hibernacula play an important role in providing opportunities for mating; therefore, we suggest protection of these sites, maternity caves, and surrounding habitat to facilitate gene flow among populations of Ozark big-eared bats.

  20. Big Data Analytics in Medicine and Healthcare.

    PubMed

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics of value, volume, velocity, variety, veracity, and variability are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data, and electronic health record data. We underline the challenging issues of big data privacy and security. Regarding big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  1. The Big Bang Singularity

    NASA Astrophysics Data System (ADS)

    Ling, Eric

    The big bang theory is a model of the universe which makes the striking prediction that the universe began a finite amount of time in the past at the so-called "Big Bang singularity." We explore the physical and mathematical justification of this surprising result. After laying down the framework of the universe as a spacetime manifold, we combine physical observations with global symmetry assumptions to deduce the FRW cosmological models which predict a big bang singularity. Next we prove a couple of theorems due to Stephen Hawking which show that the big bang singularity exists even if one removes the global symmetry assumptions. Lastly, we investigate the conditions one needs to impose on a spacetime if one wishes to avoid a singularity. The ideas and concepts used here to study spacetimes are similar to those used to study Riemannian manifolds; therefore, we compare and contrast the two geometries throughout.

  2. 58. View of the big meadow at the Billings Farm ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    58. View of the big meadow at the Billings Farm & Museum, looking west toward Mount Tom. The view illustrates the relation of the forested hillside lands to the agricultural fields in the valley, and includes the east facade of the mansion, visible as a small gap in the distant trees at center. - Marsh-Billings-Rockefeller National Historical Park, 54 Elm Street, Woodstock, Windsor County, VT

  3. Big Data Science: Opportunities and Challenges to Address Minority Health and Health Disparities in the 21st Century

    PubMed Central

    Zhang, Xinzhi; Pérez-Stable, Eliseo J.; Bourne, Philip E.; Peprah, Emmanuel; Duru, O. Kenrik; Breen, Nancy; Berrigan, David; Wood, Fred; Jackson, James S.; Wong, David W.S.; Denny, Joshua

    2017-01-01

    Addressing minority health and health disparities has been a missing piece of the puzzle in Big Data science. This article focuses on three priority opportunities that Big Data science may offer to the reduction of health and health care disparities. One opportunity is to incorporate standardized information on demographic and social determinants in electronic health records in order to target ways to improve quality of care for the most disadvantaged populations over time. A second opportunity is to enhance public health surveillance by linking geographical variables and social determinants of health for geographically defined populations to clinical data and health outcomes. Third and most importantly, Big Data science may lead to a better understanding of the etiology of health disparities and understanding of minority health in order to guide intervention development. However, the promise of Big Data needs to be considered in light of significant challenges that threaten to widen health disparities. Care must be taken to incorporate diverse populations to realize the potential benefits. Specific recommendations include investing in data collection on small sample populations, building a diverse workforce pipeline for data science, actively seeking to reduce digital divides, developing novel ways to assure digital data privacy for small populations, and promoting widespread data sharing to benefit under-resourced minority-serving institutions and minority researchers. With deliberate efforts, Big Data presents a dramatic opportunity for reducing health disparities but without active engagement, it risks further widening them. PMID:28439179

  4. Big Data Science: Opportunities and Challenges to Address Minority Health and Health Disparities in the 21st Century.

    PubMed

    Zhang, Xinzhi; Pérez-Stable, Eliseo J; Bourne, Philip E; Peprah, Emmanuel; Duru, O Kenrik; Breen, Nancy; Berrigan, David; Wood, Fred; Jackson, James S; Wong, David W S; Denny, Joshua

    2017-01-01

    Addressing minority health and health disparities has been a missing piece of the puzzle in Big Data science. This article focuses on three priority opportunities that Big Data science may offer to the reduction of health and health care disparities. One opportunity is to incorporate standardized information on demographic and social determinants in electronic health records in order to target ways to improve quality of care for the most disadvantaged populations over time. A second opportunity is to enhance public health surveillance by linking geographical variables and social determinants of health for geographically defined populations to clinical data and health outcomes. Third and most importantly, Big Data science may lead to a better understanding of the etiology of health disparities and understanding of minority health in order to guide intervention development. However, the promise of Big Data needs to be considered in light of significant challenges that threaten to widen health disparities. Care must be taken to incorporate diverse populations to realize the potential benefits. Specific recommendations include investing in data collection on small sample populations, building a diverse workforce pipeline for data science, actively seeking to reduce digital divides, developing novel ways to assure digital data privacy for small populations, and promoting widespread data sharing to benefit under-resourced minority-serving institutions and minority researchers. With deliberate efforts, Big Data presents a dramatic opportunity for reducing health disparities but without active engagement, it risks further widening them.

  5. Reliability-Based Life Assessment of Stirling Convertor Heater Head

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Halford, Gary R.; Korovaichuk, Igor

    2004-01-01

    Onboard radioisotope power systems being developed and planned for NASA's deep-space missions require reliable design lifetimes of up to 14 yr. The structurally critical heater head of the high-efficiency Stirling power convertor has undergone extensive computational analysis of operating temperatures, stresses, and creep resistance of the thin-walled Inconel 718 bill of material. A preliminary assessment of the effect of uncertainties in the material behavior was also performed. Creep failure resistance of the thin-walled heater head could show variation due to small deviations in the manufactured thickness and in uncertainties in operating temperature and pressure. Durability prediction and reliability of the heater head are affected by these deviations from nominal design conditions. Therefore, it is important to include the effects of these uncertainties in predicting the probability of survival of the heater head under mission loads. Furthermore, it may be possible for the heater head to experience rare incidences of small temperature excursions of short duration. These rare incidences would affect the creep strain rate and, therefore, the life. This paper addresses the effects of such rare incidences on the reliability. In addition, the sensitivities of variables affecting the reliability are quantified, and guidelines developed to improve the reliability are outlined. Heater head reliability is being quantified with data from NASA Glenn Research Center's accelerated benchmark testing program.
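    A hedged Monte Carlo sketch shows the general idea of such a probabilistic life assessment; the creep-life surrogate and every numeric value below are invented for illustration and are not taken from the NASA analysis:

      # Hedged Monte Carlo sketch: propagate uncertainty in wall thickness and
      # operating temperature through an INVENTED creep-life surrogate to a
      # probability of survival. All parameter values are illustrative only.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 200_000
      mission_hours = 14 * 365.25 * 24                 # ~14-year design life

      thickness = rng.normal(1.0, 0.02, n)             # mm, manufacturing scatter
      temp = rng.normal(923.0, 5.0, n)                 # K, operating scatter

      # Invented surrogate: life grows with thickness, shrinks rapidly with temperature.
      life_hours = 3.0e5 * (thickness / 1.0) ** 4 * np.exp(-(temp - 923.0) / 12.0)

      reliability = np.mean(life_hours > mission_hours)
      print(f"estimated probability of survival: {reliability:.4f}")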

  6. Constraining antimatter domains in the early universe with big bang nucleosynthesis.

    PubMed

    Kurki-Suonio, H; Sihvola, E

    2000-04-24

    We consider the effect of a small-scale matter-antimatter domain structure on big bang nucleosynthesis and place upper limits on the amount of antimatter in the early universe. For small domains, which annihilate before nucleosynthesis, this limit comes from underproduction of 4He. For larger domains, the limit comes from 3He overproduction. Since most of the 3He nuclei from p̄-4He annihilation are themselves annihilated, the main source of primordial 3He is the photodisintegration of 4He by the electromagnetic cascades initiated by the annihilation.

  7. Medical big data: promise and challenges.

    PubMed

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-03-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis generation rather than hypothesis testing. Big data focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology.

  8. Medical big data: promise and challenges

    PubMed Central

    Lee, Choong Ho; Yoon, Hyung-Jin

    2017-01-01

    The concept of big data, commonly characterized by volume, variety, velocity, and veracity, goes far beyond the data type and includes the aspects of data analysis, such as hypothesis generation rather than hypothesis testing. Big data focuses on the temporal stability of associations rather than on causal relationships, and underlying probability distribution assumptions are frequently not required. Medical big data as material to be analyzed has various features that are not only distinct from big data of other disciplines, but also distinct from traditional clinical epidemiology. Big data technology has many areas of application in healthcare, such as predictive modeling and clinical decision support, disease or safety surveillance, public health, and research. Big data analytics frequently exploits analytic methods developed in data mining, including classification, clustering, and regression. Medical big data analyses are complicated by many technical issues, such as missing values, curse of dimensionality, and bias control, and share the inherent limitations of observational studies, namely the inability to test causality resulting from residual confounding and reverse causation. Recently, propensity score analysis and instrumental variable analysis have been introduced to overcome these limitations, and they have accomplished a great deal. Many challenges, such as the absence of evidence of practical benefits of big data, methodological issues including legal and ethical issues, and clinical integration and utility issues, must be overcome to realize the promise of medical big data as the fuel of a continuous learning healthcare system that will improve patient outcomes and reduce waste in areas including nephrology. PMID:28392994
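    The propensity score analysis mentioned in this abstract can be sketched, under stated assumptions, as inverse probability of treatment weighting on synthetic data (the variable names, effect sizes, and weighting variant below are illustrative choices, not the methods of any study cited here):

      # Toy propensity-score sketch: inverse probability of treatment weighting
      # on synthetic observational data. Illustrative only.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(0)
      n = 5000
      confounder = rng.normal(size=n)                  # e.g., disease severity
      p_treat = 1 / (1 + np.exp(-confounder))          # sicker patients treated more often
      treated = rng.binomial(1, p_treat)
      # True treatment effect is +1.0; the outcome also depends on the confounder.
      outcome = 1.0 * treated + 2.0 * confounder + rng.normal(size=n)

      # Naive comparison is biased by confounding.
      naive = outcome[treated == 1].mean() - outcome[treated == 0].mean()

      # Estimate propensity scores, then weight each subject by 1/P(observed treatment).
      ps = LogisticRegression().fit(confounder.reshape(-1, 1), treated)
      e = ps.predict_proba(confounder.reshape(-1, 1))[:, 1]
      w = np.where(treated == 1, 1 / e, 1 / (1 - e))
      iptw = (np.average(outcome[treated == 1], weights=w[treated == 1])
              - np.average(outcome[treated == 0], weights=w[treated == 0]))

      print(f"naive difference: {naive:.2f}   IPTW estimate: {iptw:.2f}   (truth: 1.00)")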

  9. Measuring the Promise of Big Data Syllabi

    ERIC Educational Resources Information Center

    Friedman, Alon

    2018-01-01

    Growing interest in Big Data is leading industries, academics and governments to accelerate Big Data research. However, how teachers should teach Big Data has not been fully examined. This article suggests criteria for redesigning Big Data syllabi in public and private degree-awarding higher education establishments. The author conducted a survey…

  10. Forget the hype or reality. Big data presents new opportunities in Earth Science.

    NASA Astrophysics Data System (ADS)

    Lee, T. J.

    2015-12-01

    Earth science is arguably one of the most mature science disciplines, constantly acquiring, curating, and utilizing a large volume of highly diverse data. We dealt with big data before there was big data. For example, while developing the EOS program in the 1980s, the EOS Data and Information System (EOSDIS) was developed to manage the vast amount of data acquired by the EOS fleet of satellites. EOSDIS has continued to be a shining example of modern science data systems over the past two decades. With the explosion of the internet, the usage of social media, and the provision of sensors everywhere, the big data era has brought new challenges. First, Google developed its search algorithm and a distributed data management system. The open source communities quickly followed up and developed the Hadoop file system to facilitate MapReduce workloads. The internet continues to generate tens of petabytes of data every day. There is a significant shortage of algorithms and knowledgeable manpower to mine the data. In response, the federal government developed big data programs that fund research and development projects and training programs to tackle these new challenges. Meanwhile, compared to the internet data explosion, the Earth science big data problem has become quite small. Nevertheless, the big data era presents an opportunity for Earth science to evolve. We have learned about MapReduce algorithms, in-memory data mining, machine learning, graph analysis, and semantic web technologies. How do we apply these new technologies to our discipline and bring the hype down to Earth? In this talk, I will discuss how we might apply some of the big data technologies to our discipline and solve many of our challenging problems. More importantly, I will propose a new Earth science data system architecture to enable new types of scientific inquiries.
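    As a minimal sketch of the MapReduce pattern named in the abstract, the following hypothetical pure-Python word count shows the map, shuffle, and reduce phases that systems such as Hadoop distribute across clusters:

      # Minimal map/shuffle/reduce sketch in pure Python (hypothetical word count).
      from collections import defaultdict
      from itertools import chain

      documents = ["big data in earth science", "earth observation data", "big archives"]

      # Map: emit (key, value) pairs from each input record.
      def map_phase(doc):
          return [(word, 1) for word in doc.split()]

      # Shuffle: group all emitted values by key.
      grouped = defaultdict(list)
      for key, value in chain.from_iterable(map_phase(d) for d in documents):
          grouped[key].append(value)

      # Reduce: combine the values for each key.
      counts = {key: sum(values) for key, values in grouped.items()}
      print(counts)   # e.g. {'big': 2, 'data': 2, 'earth': 2, ...}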

  11. The big bang

    NASA Astrophysics Data System (ADS)

    Silk, Joseph

    Our universe was born billions of years ago in a hot, violent explosion of elementary particles and radiation - the big bang. What do we know about this ultimate moment of creation, and how do we know it? Drawing upon the latest theories and technology, this new edition of The big bang is a sweeping, lucid account of the event that set the universe in motion. Joseph Silk begins his story with the first microseconds of the big bang, on through the evolution of stars, galaxies, clusters of galaxies, quasars, and into the distant future of our universe. He also explores the fascinating evidence for the big bang model and recounts the history of cosmological speculation. Revised and updated, this new edition features all the most recent astronomical advances, including: photos and measurements from the Hubble Space Telescope, Cosmic Background Explorer Satellite (COBE), and Infrared Space Observatory; the latest estimates of the age of the universe; new ideas in string and superstring theory; recent experiments on neutrino detection; new theories about the presence of dark matter in galaxies; new developments in the theory of the formation and evolution of galaxies; the latest ideas about black holes, worm holes, quantum foam, and multiple universes.

  12. Big-BOE: Fusing Spanish Official Gazette with Big Data Technology.

    PubMed

    Basanta-Val, Pablo; Sánchez-Fernández, Luis

    2018-06-01

    The proliferation of new data sources stemming from the adoption of open-data schemes, in combination with increasing computing capacity, has given rise to a new type of analytics that processes Internet of Things data with low-cost engines, speeding up data processing through parallel computing. In this context, the article presents an initiative, called BIG-Boletín Oficial del Estado (BOE), designed to process the Spanish official government gazette (BOE) with state-of-the-art processing engines, to reduce computation time and to offer additional speed-up for big data analysts. The goal of including a big data infrastructure is to be able to process different BOE documents in parallel with specific analytics, to search for several issues across different documents. The application infrastructure and processing engine are described from architectural and performance perspectives, showing how this type of infrastructure improves the performance of different types of simple analytics as several machines cooperate.
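    A hedged sketch of the general approach, processing a batch of gazette documents in parallel with the Python standard library (the documents, search term, and worker layout are made up; this is not the BIG-BOE implementation):

      # Hedged sketch: fan a simple keyword count out over worker processes.
      # Hypothetical stand-in for batch analytics over gazette documents.
      from concurrent.futures import ProcessPoolExecutor

      def count_term(doc_and_term):
          doc, term = doc_and_term
          return doc.lower().count(term)

      if __name__ == "__main__":
          docs = ["Resolución sobre contratación pública ...",
                  "Orden ministerial sobre subvenciones ...",
                  "Anuncio de licitación pública ..."]
          term = "pública"
          with ProcessPoolExecutor() as pool:
              hits = list(pool.map(count_term, [(d, term) for d in docs]))
          print(dict(zip(range(len(docs)), hits)))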

  13. Big Data's Role in Precision Public Health.

    PubMed

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts.

  14. Antigravity and the big crunch/big bang transition

    NASA Astrophysics Data System (ADS)

    Bars, Itzhak; Chen, Shih-Hung; Steinhardt, Paul J.; Turok, Neil

    2012-08-01

    We point out a new phenomenon which seems to be generic in 4d effective theories of scalar fields coupled to Einstein gravity, when applied to cosmology. A lift of such theories to a Weyl-invariant extension allows one to define classical evolution through cosmological singularities unambiguously, and hence construct geodesically complete background spacetimes. An attractor mechanism ensures that, at the level of the effective theory, generic solutions undergo a big crunch/big bang transition by contracting to zero size, passing through a brief antigravity phase, shrinking to zero size again, and re-emerging into an expanding normal gravity phase. The result may be useful for the construction of complete bouncing cosmologies like the cyclic model.

  15. Restoring Wyoming big sagebrush

    Treesearch

    Cindy R. Lysne

    2005-01-01

    The widespread occurrence of big sagebrush can be attributed to many adaptive features. Big sagebrush plays an essential role in its communities by providing wildlife habitat, modifying local environmental conditions, and facilitating the reestablishment of native herbs. Currently, however, many sagebrush steppe communities are highly fragmented. As a result, restoring...

  16. Heading and head injuries in soccer.

    PubMed

    Kirkendall, D T; Jordan, S E; Garrett, W E

    2001-01-01

    In the world of sports, soccer is unique because of the purposeful use of the unprotected head for controlling and advancing the ball. This skill obviously places the player at risk of head injury and the game does carry some risk. Head injury can be a result of contact of the head with another head (or other body parts), ground, goal post, other unknown objects or even the ball. Such impacts can lead to contusions, fractures, eye injuries, concussions or even, in rare cases, death. Coaches, players, parents and physicians are rightly concerned about the risk of head injury in soccer. Current research shows that selected soccer players have some degree of cognitive dysfunction. It is important to determine the reasons behind such deficits. Purposeful heading has been blamed, but a closer look at the studies that focus on heading has revealed methodological concerns that question the validity of blaming purposeful heading of the ball. The player's history and age (did they play when the ball was leather and could absorb significant amounts of water), alcohol intake, drug intake, learning disabilities, concussion definition and control group use/composition are all factors that cloud the ability to blame purposeful heading. What does seem clear is that a player's history of concussive episodes is a more likely explanation for cognitive deficits. While it is likely that the subconcussive impact of purposeful heading is a doubtful factor in the noted deficits, it is unknown whether multiple subconcussive impacts might have some lingering effects. In addition, it is unknown whether the noted deficits have any effect on daily life. Proper instruction in the technique is critical because if the ball contacts an unprepared head (as in accidental head-ball contacts), the potential for serious injury exists. To further our understanding of the relationship of heading, head injury and cognitive deficits, we need to: learn more about the actual impact of a ball on the

  17. 35. View of the big meadow at the Billings Farm ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    35. View of the big meadow at the Billings Farm & Museum, looking west toward Mount Tom (more distant view). The view illustrates the relation of the forested hillside lands to the agricultural fields in the valley, and includes the east facade of the mansion, visible as a small gap in the distant trees at center. - Marsh-Billings-Rockefeller National Historical Park, 54 Elm Street, Woodstock, Windsor County, VT

  18. Printed circuit board for a CCD camera head

    DOEpatents

    Conder, Alan D.

    2002-01-01

    A charge-coupled device (CCD) camera head which can replace film for digital imaging of visible light, ultraviolet radiation, and soft to penetrating x-rays, such as within a target chamber where laser produced plasmas are studied. The camera head is small, capable of operating both in and out of a vacuum environment, and is versatile. The CCD camera head uses PC boards with an internal heat sink connected to the chassis for heat dissipation, which allows for close (0.04" for example) stacking of the PC boards. Integration of this CCD camera head into existing instrumentation provides a substantial enhancement of diagnostic capabilities for studying high energy density plasmas, for a variety of military industrial, and medical imaging applications.

  19. The effect of long and short head biceps loading on glenohumeral joint rotational range of motion and humeral head position.

    PubMed

    McGarry, Michelle H; Nguyen, Michael L; Quigley, Ryan J; Hanypsiak, Bryan; Gupta, Ranjan; Lee, Thay Q

    2016-06-01

    To evaluate the effect of loading the long and short heads of the biceps on glenohumeral range of motion and humeral head position. Eight cadaveric shoulders were tested in 60° abduction in the scapular and coronal planes. Muscle loading was applied based on cross-sectional area ratios. The short and long heads of the biceps were loaded individually, followed by combined loading. Range of motion was measured with 2.2 Nm torque, and the humeral head apex position was measured using a MicroScribe. A paired t test with Bonferroni correction was used for statistics. Long head loading decreased internal rotation in both the scapular (17.9 %) and coronal planes (5.7 %) and external rotation in the scapular plane (2.6 %) (P < 0.04). With only short head loading, maximum internal rotation was significantly increased in the scapular and coronal planes. Long head and short head loading shifted the humeral head apex posteriorly in maximum internal rotation in both planes, with the long head shift being significantly greater than that of the short head. Long head loading also shifted the humeral apex inferiorly in internal rotation and inferiorly and posteriorly in neutral rotation in the scapular plane. With the long head unloaded, there was a significant superior shift with short head loading in both planes. Loading the long head of the biceps had a much greater effect on glenohumeral range of motion and humeral head shift than the short head of the biceps; however, in the absence of long head loading, with the short head loaded, maximum internal rotation increases and the humeral head shifts superiorly, which may contribute to impingement following tenodesis of the long head of the biceps. These small changes in rotational range of motion and humeral head position with biceps tenodesis may not lead to pathologic conditions in low-demand patients; however, in throwers, biceps tenodesis may lead to increased contact pressures in late-cocking and deceleration that will likely translate

  20. Perspectives on making big data analytics work for oncology.

    PubMed

    El Naqa, Issam

    2016-12-01

    Oncology, with its unique combination of clinical, physical, technological, and biological data, provides an ideal case study for applying big data analytics to improve cancer treatment safety and outcomes. An oncology treatment course such as chemoradiotherapy can generate a large pool of information carrying the 5Vs hallmarks of big data. These data comprise a heterogeneous mixture of patient demographics, radiation/chemo dosimetry, multimodality imaging features, and biological markers generated over a treatment period that can span a few days to several weeks. Efforts using commercial and in-house tools are underway to facilitate data aggregation, ontology creation, sharing, visualization and varying analytics in a secure environment. However, open questions related to proper data structure representation and effective analytics tools to support oncology decision-making need to be addressed. It is recognized that oncology data constitutes a mix of structured (tabulated) and unstructured (electronic documents) data that need to be processed to facilitate searching and subsequent knowledge discovery from relational or NoSQL databases. In this context, methods based on advanced analytics and image feature extraction for oncology applications will be discussed. On the other hand, the classical p (variables)≫n (samples) inference problem of statistical learning is challenged in the Big data realm, and this is particularly true for oncology applications where p-omics is witnessing exponential growth while the number of cancer incidences has generally plateaued over the past 5 years, leading to a quasi-linear growth in samples per patient. Within the Big data paradigm, this kind of phenomenon may yield undesirable effects such as echo chamber anomalies, Yule-Simpson reversal paradox, or misleading ghost analytics. In this work, we will present these effects as they pertain to oncology and engage small thinking methodologies to counter these effects ranging from
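    The Yule-Simpson reversal mentioned above is easy to reproduce; the sketch below uses the classic kidney-stone treatment numbers (an illustration of the statistical effect only, not oncology data from this paper), where treatment A wins within each stratum yet loses in the pooled comparison:

      # Yule-Simpson reversal on the classic kidney-stone numbers (illustrative only).
      successes = {
          ("A", "small"): (81, 87),   ("A", "large"): (192, 263),
          ("B", "small"): (234, 270), ("B", "large"): (55, 80),
      }

      def rate(s, n):
          return s / n

      for stratum in ("small", "large"):
          a = rate(*successes[("A", stratum)])
          b = rate(*successes[("B", stratum)])
          print(f"{stratum} stones: A={a:.2f}  B={b:.2f}  -> A better")

      a_all = rate(*(sum(x) for x in zip(successes[("A", "small")], successes[("A", "large")])))
      b_all = rate(*(sum(x) for x in zip(successes[("B", "small")], successes[("B", "large")])))
      print(f"pooled:       A={a_all:.2f}  B={b_all:.2f}  -> B looks better (reversal)")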

  1. Small species indicate big changes? Arctic report card

    USDA-ARS?s Scientific Manuscript database

    As Arctic climate warms, how will terrestrial ecosystems and the communities that they support respond in the coming decades? Small mammals including shrews and their associated parasites can serve as key indicators and proxies of accelerating perturbation, contributing to general models for anticip...

  2. Pediatric head and neck masses.

    PubMed

    Gujar, Sachin; Gandhi, Dheeraj; Mukherji, Suresh K

    2004-04-01

    Most neck masses in the pediatric head and neck region are benign. Congenital, developmental, and inflammatory lesions make up most of the masses in the pediatric head and neck. For example, neck masses due to inflammatory lymphadenitis are common in children because of the frequency of upper respiratory tract infections. Although many of the malignant tumors in children are found in the head and neck, they account for only a small portion of the neck masses. The choice of the imaging modality is based on a number of factors, several of which are unique to the pediatric population. Although the bulk of disease entities are adequately evaluated by CT, MRI can provide additional vital information in many cases. MRI provides better soft tissue characterization than CT and has multiplanar capabilities. In this article, we will attempt to provide an overview of conditions that present as neck masses.

  3. Exploiting big data for critical care research.

    PubMed

    Docherty, Annemarie B; Lone, Nazir I

    2015-10-01

    Over recent years the digitalization, collection and storage of vast quantities of data, in combination with advances in data science, have opened up a new era of big data. In this review, we define big data, identify examples of critical care research using big data, discuss the limitations and ethical concerns of using these large datasets and finally consider scope for future research. Big data refers to datasets whose size, complexity and dynamic nature are beyond the scope of traditional data collection and analysis methods. The potential benefits to critical care are significant, with faster progress in improving health and better value for money. Although not replacing clinical trials, big data can improve their design and advance the field of precision medicine. However, there are limitations to analysing big data using observational methods. In addition, there are ethical concerns regarding maintaining confidentiality of patients who contribute to these datasets. Big data have the potential to improve medical care and reduce costs, both by individualizing medicine and by bringing together multiple sources of data about individual patients. As big data become increasingly mainstream, it will be important to maintain public confidence by safeguarding data security, governance and confidentiality.

  4. Riparian wetlands and visitor use management in Big Bend National Park, Texas

    Treesearch

    C. M. Fleming; S. H. Kunkle; M. D. Flora

    1996-01-01

    Wetlands and riparian habitats constitute a small, but nonetheless vital component in the Chihuahuan Desert. Big Bend National Park, 801,000 acres, contains about 27,000 acres of wetland. The park has riparian or wetland habitat distributed around 315 water sources, some perennial streams, and along 118 miles of the Rio Grande. These areas contain unique vegetation...

  5. Big domains are novel Ca²+-binding modules: evidences from big domains of Leptospira immunoglobulin-like (Lig) proteins.

    PubMed

    Raman, Rajeev; Rajanikanth, V; Palaniappan, Raghavan U M; Lin, Yi-Pin; He, Hongxuan; McDonough, Sean P; Sharma, Yogendra; Chang, Yung-Fu

    2010-12-29

    Many bacterial surface-exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca²+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface-exposed proteins containing Bacterial immunoglobulin-like (Big) domains. The function of proteins that contain the Big fold is not known. Based on the possible similarities of immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca²+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted as Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9(th) (Lig A9) and 10(th) repeats (Lig A10); and one from the variable region of LigB, i.e., LigBCen2). We have also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All the four selected domains bind Ca²+ with dissociation constants of 2-4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have a β-sheet conformation, and form homodimers. Fluorescence spectra of Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of selected Big domains is similar and follows a two-state model, suggesting the similarity in their fold. We demonstrate that the Lig proteins are Ca²+-binding proteins, with Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca²+. This work thus sets up a strong possibility for classifying the proteins containing Big domains as a novel family of Ca²+-binding proteins. Since the Big domain is a part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca²+ binding.
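    Assuming simple 1:1 binding (an assumption made here only for illustration; the abstract reports dissociation constants of 2-4 µM but does not specify the binding model), the reported Kd values translate into fractional occupancy via the standard single-site isotherm:

      % Generic single-site binding isotherm (illustrative; not a model fitted in the paper)
      \theta = \frac{[\mathrm{Ca}^{2+}]}{K_d + [\mathrm{Ca}^{2+}]},
      \qquad \theta = 0.5 \ \text{at} \ [\mathrm{Ca}^{2+}] = K_d \approx 2\text{--}4\ \mu\mathrm{M},
      \qquad \theta = \frac{30}{30 + 3} \approx 0.91 \ \text{at} \ [\mathrm{Ca}^{2+}] = 30\ \mu\mathrm{M},\ K_d = 3\ \mu\mathrm{M}.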

  6. Act-Frequency Signatures of the Big Five.

    PubMed

    Chapman, Benjamin P; Goldberg, Lewis R

    2017-10-01

    The traditional focus of work on personality and behavior has tended toward "major outcomes" such as health or antisocial behavior, or small sets of behaviors observable over short periods in laboratories or in convenience samples. In a community sample, we examined a wide set (400) of mundane, incidental or "everyday" behavioral acts, the frequencies of which were reported over the past year. Using an exploratory methodology similar to genomic approaches (relying on the False Discovery Rate) revealed 26 prototypical acts for Intellect, 24 acts for Extraversion, 13 for Emotional Stability, nine for Conscientiousness, and six for Agreeableness. Many links were consistent with general intuition; for instance, low Conscientiousness with work and procrastination. Some of the most robust associations, however, were for acts too specific for a priori hypotheses. For instance, Extraversion was strongly associated with telling dirty jokes, Intellect with "loung[ing] around [the] house without clothes on", and Agreeableness with singing in the shower. Frequency categories for these acts changed with marked non-linearity across Big Five Z-scores. Findings may help ground trait scores in emblematic acts, and enrich understanding of mundane or common behavioral signatures of the Big Five.
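    The False Discovery Rate control relied on above can be shown with a compact, generic Benjamini-Hochberg sketch in Python (synthetic p-values; this does not reproduce the study's 400-act data or its exact procedure):

      # Generic Benjamini-Hochberg FDR procedure on synthetic p-values.
      import numpy as np

      def benjamini_hochberg(pvals, q=0.05):
          """Return a boolean mask of p-values declared significant at FDR level q."""
          p = np.asarray(pvals)
          m = p.size
          order = np.argsort(p)
          ranked = p[order]
          # Largest rank k with p_(k) <= (k/m) * q; reject hypotheses ranked 1..k.
          below = ranked <= (np.arange(1, m + 1) / m) * q
          keep = np.zeros(m, dtype=bool)
          if below.any():
              k = np.max(np.nonzero(below)[0])      # index of the largest qualifying rank
              keep[order[: k + 1]] = True
          return keep

      rng = np.random.default_rng(1)
      pvals = np.concatenate([rng.uniform(0, 1, 380),       # "null" associations
                              rng.uniform(0, 0.001, 20)])   # a few real signals
      print("discoveries at q=0.05:", int(benjamini_hochberg(pvals).sum()))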

  7. Big Joe Capsule Assembly Activities

    NASA Image and Video Library

    1959-08-01

    Big Joe Capsule Assembly Activities in 1959 at NASA Glenn Research Center (formerly NASA Lewis). Big Joe was an Atlas missile that successfully launched a boilerplate model of the Mercury capsule on September 9, 1959.

  8. Urgent Call for Nursing Big Data.

    PubMed

    Delaney, Connie W

    2016-01-01

    The purpose of this panel is to expand internationally a National Action Plan for sharable and comparable nursing data for quality improvement and big data science. There is an urgent need to assure that nursing has sharable and comparable data for quality improvement and big data science. A national collaborative, Nursing Knowledge and Big Data Science, includes multi-stakeholder groups focused on a National Action Plan toward implementing and using sharable and comparable nursing big data. Panelists will share accomplishments and future plans with an eye toward international collaboration. This presentation is suitable for any audience attending the NI2016 conference.

  9. The big data potential of epidemiological studies for criminology and forensics.

    PubMed

    DeLisi, Matt

    2018-07-01

    Big data, the analysis of original datasets with large samples ranging from ∼30,000 to one million participants to mine unexplored data, has been under-utilized in criminology. However, there have been recent calls for greater synthesis between epidemiology and criminology, and a small number of scholars have utilized epidemiological studies that were designed to measure alcohol and substance use to harvest behavioral and psychiatric measures that relate to the study of crime. These studies have been helpful in producing knowledge about the most serious, violent, and chronic offenders, but applications to more pathological forensic populations are lagging. Unfortunately, big data relating to crime and justice are restricted and limited to criminal justice purposes and not easily available to the research community. Thus, the study of criminal and forensic populations is limited in terms of data volume, velocity, and variety. Additional forays into epidemiology, increased use of available online judicial and correctional data, and unknown new frontiers are needed to bring criminology up to speed in the big data arena. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  10. References for Haplotype Imputation in the Big Data Era

    PubMed Central

    Li, Wenzhi; Xu, Wei; Li, Qiling; Ma, Li; Song, Qing

    2016-01-01

    Imputation is a powerful in silico approach for filling in missing values in big datasets. The process requires a reference panel, a collection of big data from which the missing information can be extracted and imputed. Haplotype imputation requires ethnicity-matched references; a mismatched reference panel will significantly reduce the quality of imputation. However, currently existing big datasets cover only a small number of ethnicities, and there is a lack of ethnicity-matched references for many ethnic populations in the world, which has hampered the imputation of haplotypes and its downstream applications. To address this issue, several approaches have been proposed and explored, including the mixed reference panel, the internal reference panel, and the genotype-converted reference panel. This review article describes and compares these approaches. Increasing evidence shows that gene activity and function are dictated not by one or two genetic elements but by cis-interactions among multiple elements. Cis-interactions require the interacting elements to be on the same chromosome molecule; therefore, haplotype analysis is essential for investigating cis-interactions among multiple genetic variants at different loci and appears to be especially important for studying common diseases. It will be valuable in a wide spectrum of applications, from academic research to clinical diagnosis, prevention, treatment, and the pharmaceutical industry. PMID:27274952
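
    The reliance on a matched reference panel can be illustrated with a deliberately simplified sketch: copy alleles at missing sites from the reference haplotype that best matches the sample at the typed sites. Real imputation tools use Li-Stephens-style hidden Markov models over large, ethnicity-matched panels; the toy panel and sample below are hypothetical.

        # Toy illustration of reference-panel imputation (not a production method; real tools
        # use Li-Stephens-style HMMs over large, ethnicity-matched panels).
        import numpy as np

        def impute_from_panel(observed, panel):
            """observed: 1-D array of 0/1 alleles with np.nan at missing sites.
            panel: 2-D array (haplotypes x sites) of 0/1 reference alleles.
            Copies the missing alleles from the closest reference haplotype."""
            typed = ~np.isnan(observed)
            # Hamming distance to each reference haplotype over the typed sites only.
            dists = (panel[:, typed] != observed[typed]).sum(axis=1)
            best = panel[np.argmin(dists)]
            imputed = observed.copy()
            imputed[~typed] = best[~typed]
            return imputed

        # Hypothetical panel of 4 haplotypes over 8 sites; one sample missing two sites.
        panel = np.array([[0, 0, 1, 1, 0, 1, 0, 1],
                          [0, 1, 1, 0, 0, 1, 1, 1],
                          [1, 0, 0, 1, 1, 0, 0, 0],
                          [0, 0, 1, 1, 0, 1, 0, 0]], dtype=float)
        sample = np.array([0, 0, 1, np.nan, 0, 1, np.nan, 1], dtype=float)
        print(impute_from_panel(sample, panel))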

  11. bigSCale: an analytical framework for big-scale single-cell data.

    PubMed

    Iacono, Giovanni; Mereu, Elisabetta; Guillaumet-Adkins, Amy; Corominas, Roser; Cuscó, Ivon; Rodríguez-Esteban, Gustavo; Gut, Marta; Pérez-Jurado, Luis Alberto; Gut, Ivo; Heyn, Holger

    2018-06-01

    Single-cell RNA sequencing (scRNA-seq) has significantly deepened our insights into complex tissues, with the latest techniques capable of processing tens of thousands of cells simultaneously. Analyzing increasing numbers of cells, however, generates extremely large data sets, extending processing time and challenging computing resources. Current scRNA-seq analysis tools are not designed to interrogate large data sets and often lack sensitivity to identify marker genes. With bigSCale, we provide a scalable analytical framework to analyze millions of cells, which addresses the challenges associated with large data sets. To handle the noise and sparsity of scRNA-seq data, bigSCale uses large sample sizes to estimate an accurate numerical model of noise. The framework further includes modules for differential expression analysis, cell clustering, and marker identification. A directed convolution strategy allows processing of extremely large data sets, while preserving transcript information from individual cells. We evaluated the performance of bigSCale using both a biological model of aberrant gene expression in patient-derived neuronal progenitor cells and simulated data sets, which underlines the speed and accuracy in differential expression analysis. To test its applicability for large data sets, we applied bigSCale to assess 1.3 million cells from the mouse developing forebrain. Its directed down-sampling strategy accumulates information from single cells into index cell transcriptomes, thereby defining cellular clusters with improved resolution. Accordingly, index cell clusters identified rare populations, such as reelin ( Reln )-positive Cajal-Retzius neurons, for which we report previously unrecognized heterogeneity associated with distinct differentiation stages, spatial organization, and cellular function. Together, bigSCale presents a solution to address future challenges of large single-cell data sets. © 2018 Iacono et al.; Published by Cold Spring Harbor
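
    The directed down-sampling idea, accumulating many similar cells into "index cell" transcriptomes before clustering, can be sketched as follows. This is not bigSCale's actual algorithm; it is a conceptual illustration with synthetic counts, using k-means as a stand-in for both the pooling and the final clustering.

        # Conceptual sketch of the "index cell" idea: pool transcriptionally similar cells into
        # aggregate profiles before clustering, so very large matrices reduce to a tractable set.
        # This is NOT bigSCale's actual algorithm; it only illustrates the pooling principle.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(1)
        counts = rng.poisson(1.0, size=(5_000, 200))        # hypothetical cells x genes counts

        # Step 1: group cells into many small, homogeneous pools ("index cells").
        n_pools = 100
        pools = KMeans(n_clusters=n_pools, n_init=3, random_state=0).fit_predict(np.log1p(counts))
        index_cells = np.vstack([counts[pools == k].sum(axis=0) for k in range(n_pools)])

        # Step 2: cluster the far smaller index-cell matrix at higher resolution.
        clusters = KMeans(n_clusters=12, n_init=3, random_state=0).fit_predict(np.log1p(index_cells))
        print(index_cells.shape, np.bincount(clusters))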

  12. [Big data in medicine and healthcare].

    PubMed

    Rüping, Stefan

    2015-08-01

    Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today are often too large and heterogeneous, and change too quickly, to be stored, processed, and transformed into value by previous technologies. Technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves (e.g., in social networks), and digitalization keeps advancing. Currently, several new trends towards new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data, and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.

  13. High School Students as Mentors: Findings from the Big Brothers Big Sisters School-Based Mentoring Impact Study

    ERIC Educational Resources Information Center

    Herrera, Carla; Kauh, Tina J.; Cooney, Siobhan M.; Grossman, Jean Baldwin; McMaken, Jennifer

    2008-01-01

    High schools have recently become a popular source of mentors for school-based mentoring (SBM) programs. The high school Bigs program of Big Brothers Big Sisters of America, for example, currently involves close to 50,000 high-school-aged mentors across the country. While the use of these young mentors has several potential advantages, their age…

  14. The nice people who live up in the cold place above you put lots of money into sense things to look into the big deep water and see weird-ass things

    NASA Astrophysics Data System (ADS)

    Pelz, M.; Scherwath, M.; Hoeberechts, M.

    2017-12-01

    There is lots of stuff in the very big water we want to look at. But because our bodies are soft and can't hold air good, we use computer senses to help us look at all the stuff down there instead. It's actually really good thinking because we don't have to get wet and we can use computer senses under the water all the time, even when the air is cold and it sucks to be outside. We can also go really deep which is cool because weird-ass stuff is down there and we would get pressed too small if we tried to go in person. The sense things idea also save us lots of money because we only have to use other people's water cars once a year to make sure our sense things are working all the time and that we can still see stuff right. Our sense things are made of power lines that go out into the big water and come back to our work-house so if we don't want to keep looking at the same thing, we can tell the sense things to do it different from our house using the lines. This is pretty good because we can change our minds a lot and still get good ideas about what is happening in the big deep water where the weird-ass stuff is. Our head-guys give us money for this thing because we think it will let us know if the ground will shake and kill us before it starts shaking. This is kind of important because we can get out of the way and we can take our good stuff with us too if we know early that it will start shaking and making big-ass waves. Head-guys like to make people feel safe and we are good at helping with that, we think. But we made sure our sense thing can be used for more than just being ready to run away if the ground moves (even though this is a good use). There are also lots of weird-ass and weird-front animals in the big water. Some are not good looking at all, but they do cool stuff with their bodies or they are really good for eating and that makes them really interesting so we look at them too. Last but not least, we use our sense things up in the really cold big water

  15. Making big sense from big data in toxicology by read-across.

    PubMed

    Hartung, Thomas

    2016-01-01

    Modern information technologies have made big data available in safety sciences, i.e., extremely large data sets that may be analyzed only computationally to reveal patterns, trends and associations. This happens by (1) compilation of large sets of existing data, e.g., as a result of the European REACH regulation, (2) the use of omics technologies and (3) systematic robotized testing in a high-throughput manner. All three approaches, and some other high-content technologies, leave us with big data; the challenge is now to make big sense of these data. Read-across, i.e., the local similarity-based intrapolation of properties, is gaining momentum with increasing data availability and consensus on how to process and report it. It is predominantly applied to in vivo test data as a gap-filling approach, but can similarly complement other incomplete datasets. Big data are, first of all, repositories for finding similar substances and ensuring that the available data are fully exploited. High-content and high-throughput approaches similarly require focusing on clusters, in this case formed by underlying mechanisms such as pathways of toxicity. The closely connected properties, i.e., structural and biological similarity, create the confidence needed for predictions of toxic properties. Here, among other developments, a new web-based tool under development called REACH-across, which aims to support and automate structure-based read-across, is presented.
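
    The core of read-across, predicting a property of a data-poor substance from structurally similar, data-rich analogues, can be sketched with a nearest-neighbour estimate over binary fingerprints. This illustrates the general principle only, not the REACH-across tool itself; the fingerprints and toxicity values below are invented.

        # Minimal read-across sketch: fill a data gap for a query substance from the properties of
        # structurally similar substances (Tanimoto similarity over binary fingerprints).
        # Illustrative only; not the REACH-across tool described in the abstract.
        import numpy as np

        def tanimoto(a, b):
            """Tanimoto similarity between two binary fingerprint vectors."""
            a, b = np.asarray(a, bool), np.asarray(b, bool)
            union = np.logical_or(a, b).sum()
            return np.logical_and(a, b).sum() / union if union else 0.0

        def read_across(query_fp, analogue_fps, analogue_values, k=3):
            """Predict a property as the similarity-weighted mean of the k most similar analogues."""
            sims = np.array([tanimoto(query_fp, fp) for fp in analogue_fps])
            top = np.argsort(sims)[-k:]
            return float(np.average(np.asarray(analogue_values)[top], weights=sims[top]))

        # Hypothetical 16-bit fingerprints and measured toxicity values for ten analogues.
        rng = np.random.default_rng(2)
        analogues = rng.integers(0, 2, size=(10, 16))
        values = rng.normal(3.0, 0.5, size=10)              # made-up endpoint, e.g. a log dose
        query = rng.integers(0, 2, size=16)
        print("read-across estimate:", read_across(query, analogues, values))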

  16. [Big data in official statistics].

    PubMed

    Zwick, Markus

    2015-08-01

    The concept of "big data" stands to change the face of official statistics over the coming years, having an impact on almost all aspects of data production. The tasks of future statisticians will not necessarily be to produce new data, but rather to identify and make use of existing data to adequately describe social and economic phenomena. Until big data can be used correctly in official statistics, a lot of questions need to be answered and problems solved: the quality of data, data protection, privacy, and the sustainable availability are some of the more pressing issues to be addressed. The essential skills of official statisticians will undoubtedly change, and this implies a number of challenges to be faced by statistical education systems, in universities, and inside the statistical offices. The national statistical offices of the European Union have concluded a concrete strategy for exploring the possibilities of big data for official statistics, by means of the Big Data Roadmap and Action Plan 1.0. This is an important first step and will have a significant influence on implementing the concept of big data inside the statistical offices of Germany.

  17. Considerations on Geospatial Big Data

    NASA Astrophysics Data System (ADS)

    LIU, Zhen; GUO, Huadong; WANG, Changlin

    2016-11-01

    Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deep into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and it is important to explore the generality of growing geospatial data ‘from bottom to top’. Particularly, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Also, some challenges and thoughts are raised for future discussion.

  18. 48 CFR 719.271-4 - Heads of contracting activities.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... activities. 719.271-4 Section 719.271-4 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT SOCIOECONOMIC PROGRAMS SMALL BUSINESS PROGRAMS Policies 719.271-4 Heads of contracting activities. In order for the agency small business program to be effective, the active support of top management...

  19. Big-Leaf Mahogany on CITES Appendix II: Big Challenge, Big Opportunity

    Treesearch

    JAMES GROGAN; PAULO BARRETO

    2005-01-01

    On 15 November 2003, big-leaf mahogany (Swietenia macrophylla King, Meliaceae), the most valuable widely traded Neotropical timber tree, gained strengthened regulatory protection from its listing on Appendix II of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES). CITES is a United Nations-chartered agreement signed by 164...

  20. Big Data in Medicine is Driving Big Changes

    PubMed Central

    Verspoor, K.

    2014-01-01

    Summary Objectives To summarise current research that takes advantage of “Big Data” in health and biomedical informatics applications. Methods Survey of trends in this work, and exploration of literature describing how large-scale structured and unstructured data sources are being used to support applications from clinical decision making and health policy, to drug design and pharmacovigilance, and further to systems biology and genetics. Results The survey highlights ongoing development of powerful new methods for turning that large-scale, and often complex, data into information that provides new insights into human health, in a range of different areas. Consideration of this body of work identifies several important paradigm shifts that are facilitated by Big Data resources and methods: in clinical and translational research, from hypothesis-driven research to data-driven research, and in medicine, from evidence-based practice to practice-based evidence. Conclusions The increasing scale and availability of large quantities of health data require strategies for data management, data linkage, and data integration beyond the limits of many existing information systems, and substantial effort is underway to meet those needs. As our ability to make sense of that data improves, the value of the data will continue to increase. Health systems, genetics and genomics, population and public health; all areas of biomedicine stand to benefit from Big Data and the associated technologies. PMID:25123716

  1. Health Informatics Scientists' Perception About Big Data Technology.

    PubMed

    Minou, John; Routsis, Fotios; Gallos, Parisis; Mantas, John

    2017-01-01

    The aim of this paper is to present the perceptions of Health Informatics Scientists about Big Data Technology in Healthcare. An empirical study was conducted among 46 scientists to assess their knowledge about Big Data Technology and their perceptions about using this technology in healthcare. Based on the study findings, 86.7% of the scientists had knowledge of Big Data Technology. Furthermore, 59.1% of the scientists believed that Big Data Technology refers to structured data. Additionally, 100% of the population believed that Big Data Technology can be implemented in Healthcare. Finally, the majority did not know of any use cases of Big Data Technology in Greece, while 57.8% of them mentioned that they knew of use cases of Big Data Technology abroad.

  2. Harnessing the Power of Big Data to Improve Graduate Medical Education: Big Idea or Bust?

    PubMed

    Arora, Vineet M

    2018-06-01

    With the advent of electronic medical records (EMRs) fueling the rise of big data, the use of predictive analytics, machine learning, and artificial intelligence is touted as a transformational tool to improve clinical care. While major investments are being made in using big data to transform health care delivery, little effort has been directed toward exploiting big data to improve graduate medical education (GME). Because our current system relies on faculty observations of competence, it is not unreasonable to ask whether big data in the form of clinical EMRs and other novel data sources can answer questions of importance in GME, such as when a resident is ready for independent practice. The timing is ripe for such a transformation. A recent National Academy of Medicine report called for reforms to how GME is delivered and financed. While many agree on the need to ensure that GME meets our nation's health needs, there is little consensus on how to measure the performance of GME in meeting this goal. During a recent workshop at the National Academy of Medicine on GME outcomes and metrics in October 2017, a key theme emerged: Big data holds great promise to inform GME performance at individual, institutional, and national levels. In this Invited Commentary, several examples are presented, such as using big data to inform clinical experience and provide clinically meaningful data to trainees, and using novel data sources, including ambient data, to better measure the quality of GME training.

  3. A SWOT Analysis of Big Data

    ERIC Educational Resources Information Center

    Ahmadi, Mohammad; Dileepan, Parthasarati; Wheatley, Kathleen K.

    2016-01-01

    This is the decade of data analytics and big data, but not everyone agrees with the definition of big data. Some researchers see it as the future of data analysis, while others consider it as hype and foresee its demise in the near future. No matter how it is defined, big data for the time being is having its glory moment. The most important…

  4. A survey of big data research

    PubMed Central

    Fang, Hua; Zhang, Zhaoyang; Wang, Chanpaul Jin; Daneshmand, Mahmoud; Wang, Chonggang; Wang, Honggang

    2015-01-01

    Big data create values for business and research, but pose significant challenges in terms of networking, storage, management, analytics and ethics. Multidisciplinary collaborations from engineers, computer scientists, statisticians and social scientists are needed to tackle, discover and understand big data. This survey presents an overview of big data initiatives, technologies and research in industries and academia, and discusses challenges and potential solutions. PMID:26504265

  5. Software Architecture for Big Data Systems

    DTIC Science & Technology

    2014-03-27

    [Report documentation-page fragment; recoverable information only.] Software Architecture for Big Data Systems. Presented in the Software Architecture: Trends and New Directions series (#SEIswArch), Carnegie Mellon University, 2014.

  6. smallWig: parallel compression of RNA-seq WIG files.

    PubMed

    Wang, Zhiying; Weissman, Tsachy; Milenkovic, Olgica

    2016-01-15

    We developed a new lossless compression method for WIG data, named smallWig, offering the best known compression rates for RNA-seq data and featuring random access functionalities that enable visualization, summary statistics analysis and fast queries from the compressed files. Our approach results in order of magnitude improvements compared with bigWig and ensures compression rates only a fraction of those produced by cWig. The key features of the smallWig algorithm are statistical data analysis and a combination of source coding methods that ensure high flexibility and make the algorithm suitable for different applications. Furthermore, for general-purpose file compression, the compression rate of smallWig approaches the empirical entropy of the tested WIG data. For compression with random query features, smallWig uses a simple block-based compression scheme that introduces only a minor overhead in the compression rate. For archival or storage space-sensitive applications, the method relies on context mixing techniques that lead to further improvements of the compression rate. Implementations of smallWig can be executed in parallel on different sets of chromosomes using multiple processors, thereby enabling desirable scaling for future transcriptome Big Data platforms. The development of next-generation sequencing technologies has led to a dramatic decrease in the cost of DNA/RNA sequencing and expression profiling. RNA-seq has emerged as an important and inexpensive technology that provides information about whole transcriptomes of various species and organisms, as well as different organs and cellular communities. The vast volume of data generated by RNA-seq experiments has significantly increased data storage costs and communication bandwidth requirements. Current compression tools for RNA-seq data such as bigWig and cWig either use general-purpose compressors (gzip) or suboptimal compression schemes that leave significant room for improvement. To substantiate
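
    The parallel, per-chromosome block structure that gives smallWig its scalability and random access can be sketched as follows, with zlib standing in for the statistical and context-mixing coders the tool actually uses; the chromosome names and WIG content below are synthetic assumptions.

        # Structural sketch of per-chromosome parallel compression with coarse random access.
        # zlib stands in for smallWig's statistical and context-mixing coders.
        import zlib
        from concurrent.futures import ProcessPoolExecutor

        def compress_chromosome(item):
            name, wig_text = item
            return name, zlib.compress(wig_text.encode(), 9)

        def compress_wig(tracks, workers=4):
            """tracks: dict mapping chromosome name -> WIG text block.
            Returns independently compressed blocks, decompressible one at a time."""
            with ProcessPoolExecutor(max_workers=workers) as pool:
                return dict(pool.map(compress_chromosome, tracks.items()))

        if __name__ == "__main__":
            tracks = {f"chr{i}": "\n".join(f"{pos}\t{pos % 7}" for pos in range(50_000))
                      for i in range(1, 5)}
            blocks = compress_wig(tracks)
            # Random access: decompress only the chromosome that a query touches.
            chr2 = zlib.decompress(blocks["chr2"]).decode()
            print({name: len(blob) for name, blob in blocks.items()}, len(chr2))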

  7. 78 FR 3911 - Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-17

    ... DEPARTMENT OF THE INTERIOR Fish and Wildlife Service [FWS-R3-R-2012-N259; FXRS1265030000-134-FF03R06000] Big Stone National Wildlife Refuge, Big Stone and Lac Qui Parle Counties, MN; Final Comprehensive... significant impact (FONSI) for the environmental assessment (EA) for Big Stone National Wildlife Refuge...

  8. Study on compensation algorithm of head skew in hard disk drives

    NASA Astrophysics Data System (ADS)

    Xiao, Yong; Ge, Xiaoyu; Sun, Jingna; Wang, Xiaoyan

    2011-10-01

    In hard disk drives (HDDs), head skew among multiple heads is pre-calibrated during the manufacturing process. In real high-capacity applications, the head stack may tilt due to environmental change, producing additional head skew errors from the outer diameter (OD) to the inner diameter (ID). When these errors remain below the preset threshold for power-on recalibration, the current strategy does not detect them, and drive performance degrades in severe environments. In this paper, in-the-field compensation of small DC head skew variation across the stroke is proposed, based on a zone table. Test results are provided demonstrating its effectiveness in reducing observer error and enhancing drive performance through accurate prediction of DC head skew.
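
    A zone-table compensation scheme of the kind proposed here can be sketched as a table of per-zone DC skew offsets interpolated across the stroke. The zone boundaries, offset values, and servo units below are illustrative assumptions, not values from the paper.

        # Sketch of zone-table DC head-skew compensation across the stroke (OD to ID).
        # Zone boundaries, measured offsets, and units are illustrative assumptions.
        import numpy as np

        class HeadSkewTable:
            def __init__(self, zone_tracks, zone_offsets):
                """zone_tracks: track numbers at zone centers (OD -> ID).
                zone_offsets: field-measured DC skew offsets (servo counts) for one head."""
                self.zone_tracks = np.asarray(zone_tracks, float)
                self.zone_offsets = np.asarray(zone_offsets, float)

            def correction(self, track):
                """Linearly interpolate the DC skew correction for an arbitrary track."""
                return float(np.interp(track, self.zone_tracks, self.zone_offsets))

        # Hypothetical 8-zone table re-estimated in the field after a thermal tilt.
        table = HeadSkewTable(zone_tracks=np.linspace(0, 400_000, 8),
                              zone_offsets=[0.0, 0.4, 0.9, 1.3, 1.9, 2.4, 3.1, 3.8])
        print("offset to apply at track 125000:", table.correction(125_000))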

  9. Big Crater as Viewed by Pathfinder Lander - Anaglyph

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The 'Big Crater' is actually a relatively small Martian crater to the southeast of the Mars Pathfinder landing site. It is 1500 meters (4900 feet) in diameter, or about the same size as Meteor Crater in Arizona. Superimposed on the rim of Big Crater (the central part of the rim as seen here) is a smaller crater nicknamed 'Rimshot Crater.' The distance to this smaller crater, and the nearest portion of the rim of Big Crater, is 2200 meters (7200 feet). To the right of Big Crater, south from the spacecraft, almost lost in the atmospheric dust 'haze,' is the large streamlined mountain nicknamed 'Far Knob.' This mountain is over 450 meters (1480 feet) tall, and is over 30 kilometers (19 miles) from the spacecraft. Another, smaller and closer knob, nicknamed 'Southeast Knob' can be seen as a triangular peak to the left of the flanks of the Big Crater rim. This knob is 21 kilometers (13 miles) southeast from the spacecraft.

    The larger features visible in this scene - Big Crater, Far Knob, and Southeast Knob - were discovered on the first panoramas taken by the IMP camera on the 4th of July, 1997, and subsequently identified in Viking Orbiter images taken over 20 years ago. The scene includes rocky ridges and swales or 'hummocks' of flood debris that range from a few tens of meters away from the lander to the distance of South Twin Peak. The largest rock in the nearfield, just left of center in the foreground, nicknamed 'Otter', is about 1.5 meters (4.9 feet) long and 10 meters (33 feet) from the spacecraft.

    This view of Big Crater was produced by combining 6 individual 'Superpan' scenes from the left and right eyes of the IMP camera. Each frame consists of 8 individual frames (left eye) and 7 frames (right eye) taken with different color filters that were enlarged by 500% and then co-added using Adobe Photoshop to produce, in effect, a super-resolution panchromatic frame that is sharper than an individual frame would be.
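
    The co-adding step described above can be approximated in a few lines: enlarge each frame, then average the stack so that noise drops roughly with the square root of the number of frames. The synthetic frames below omit the sub-pixel offsets between exposures that give a real 'Superpan' its extra sharpness, so this is only a schematic of the averaging, not the IMP processing pipeline.

        # Schematic of co-adding enlarged frames into one higher-SNR panchromatic frame.
        # Synthetic data; the real 'Superpan' gains sharpness from sub-pixel offsets between
        # frames, which this toy example omits.
        import numpy as np

        def upsample(frame, factor=5):
            """Nearest-neighbour enlargement (stand-in for the 500% resize)."""
            return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

        rng = np.random.default_rng(3)
        truth = rng.uniform(0, 1, size=(64, 64))
        frames = [truth + rng.normal(0, 0.2, truth.shape) for _ in range(8)]   # 8 noisy exposures

        coadded = np.mean([upsample(f) for f in frames], axis=0)               # enlarge, then average
        noise_single = float(np.std(frames[0] - truth))
        noise_coadd = float(np.std(coadded[::5, ::5] - truth))
        print(coadded.shape, noise_single, noise_coadd)                        # noise drops ~1/sqrt(8)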

    The anaglyph view of Big Crater was

  10. 13 CFR 101.109 - Do SBA regulations include the section headings?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Do SBA regulations include the section headings? 101.109 Section 101.109 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION ADMINISTRATION Overview § 101.109 Do SBA regulations include the section headings? Yes. All SBA regulations must...

  11. 13 CFR 101.109 - Do SBA regulations include the section headings?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Do SBA regulations include the section headings? 101.109 Section 101.109 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION ADMINISTRATION Overview § 101.109 Do SBA regulations include the section headings? Yes. All SBA regulations must...

  12. PC Utilities: Small Programs with a Big Impact

    ERIC Educational Resources Information Center

    Baule, Steven

    2004-01-01

    Three kinds of utility programs are available on the Internet: commercial programs, which are like software packages purchased through a vendor or over the Internet; shareware programs, which are developed by individuals and distributed via the Internet for a small fee that unlocks the complete version of the product; and freeware programs, which are distributed via the Internet free of cost.…

  13. Big Domains Are Novel Ca2+-Binding Modules: Evidences from Big Domains of Leptospira Immunoglobulin-Like (Lig) Proteins

    PubMed Central

    Palaniappan, Raghavan U. M.; Lin, Yi-Pin; He, Hongxuan; McDonough, Sean P.; Sharma, Yogendra; Chang, Yung-Fu

    2010-01-01

    Background Many bacterial surface-exposed proteins mediate the host-pathogen interaction more effectively in the presence of Ca2+. Leptospiral immunoglobulin-like (Lig) proteins, LigA and LigB, are surface-exposed proteins containing bacterial immunoglobulin-like (Big) domains. The function of proteins containing the Big fold is not known. Based on the possible similarities of the immunoglobulin and βγ-crystallin folds, we here explore the important question of whether Ca2+ binds to Big domains, which would provide a novel functional role for proteins containing the Big fold. Principal Findings We selected six individual Big domains for this study (three from the conserved part of LigA and LigB, denoted Lig A3, Lig A4, and LigBCon5; two from the variable region of LigA, i.e., the 9th (Lig A9) and 10th (Lig A10) repeats; and one from the variable region of LigB, i.e., LigBCen2). We also studied the conserved region covering the three and six repeats (LigBCon1-3 and LigCon). All these proteins bind the calcium-mimic dye Stains-all. All four selected domains bind Ca2+ with dissociation constants of 2–4 µM. The Lig A9 and Lig A10 domains fold well with moderate thermal stability, have β-sheet conformation, and form homodimers. Fluorescence spectra of the Big domains show a specific doublet (at 317 and 330 nm), probably due to Trp interaction with a Phe residue. Equilibrium unfolding of the selected Big domains is similar and follows a two-state model, suggesting similarity in their fold. Conclusions We demonstrate that the Lig proteins are Ca2+-binding proteins, with the Big domains harbouring the binding motif. We conclude that despite differences in sequence, a Big motif binds Ca2+. This work thus sets up a strong possibility for classifying proteins containing Big domains as a novel family of Ca2+-binding proteins. Since the Big domain is part of many proteins in the bacterial kingdom, we suggest a possible function for these proteins via Ca2+ binding. PMID:21206924

  14. Big sagebrush seed bank densities following wildfires

    USDA-ARS?s Scientific Manuscript database

    Big sagebrush (Artemisia spp.) is a critical shrub to many wildlife species including sage grouse (Centrocercus urophasianus), mule deer (Odocoileus hemionus), and pygmy rabbit (Brachylagus idahoensis). Big sagebrush is killed by wildfires and big sagebrush seed is generally short-lived and do not s...

  15. [Measurement of external pressure of peroneal nerve tract coming in contact with lithotomy leg holders using pressure distribution measurement system BIG-MAT®].

    PubMed

    Mizuno, Ju; Namba, Chikara; Takahashi, Toru

    2014-10-01

    We investigated the external pressure on the peroneal nerve tract coming in contact with two kinds of leg holders, using the pressure distribution measurement system BIG-MAT® (Nitta Corp., Osaka), in the lithotomy position. Peak contact (active) pressure at the left fibular head region in contact with the knee-crutch-type leg holder M® (Takara Belmont Corp., Osaka), which supports the left popliteal fossa, was 78.0 ± 26.4 mmHg. On the other hand, peak contact pressure at the left lateral lower leg region in contact with the boot-support-type leg holder Bel Flex® (Takara Belmont Corp., Osaka), which supports the left lower leg and foot, was 26.3 ± 7.9 mmHg. Because capillary blood pressure is known to be 32 mmHg, these results suggest that use of the knee-crutch-type leg holder is more likely to induce common peroneal nerve palsy at the fibular head region, whereas the boot-support-type leg holder does not easily induce superficial peroneal nerve palsy at the lateral lower leg region. Safer positioning holders to prevent nerve palsy can be developed based on analysis of chronological changes in external pressure measured with the BIG-MAT® system during anesthesia.

  16. Epidemiology in wonderland: Big Data and precision medicine.

    PubMed

    Saracci, Rodolfo

    2018-03-01

    Big Data and precision medicine, two major contemporary challenges for epidemiology, are critically examined from two different angles. In Part 1, Big Data collected for research purposes (Big research Data) and Big Data used for research although collected for other primary purposes (Big secondary Data) are discussed in the light of the fundamental common requirement of data validity, which prevails over "bigness". Precision medicine is treated by developing the key point that high relative risks are, as a rule, required to make a variable or combination of variables suitable for prediction of disease occurrence, outcome, or response to treatment; the commercial proliferation of allegedly predictive tests of unknown or poor validity is commented on. Part 2 proposes a "wise epidemiology" approach to: (a) choosing, in a context imprinted by Big Data and precision medicine, epidemiological research projects actually relevant to population health; (b) training epidemiologists; (c) investigating the impact of the influx of Big Data and computerized medicine on clinical practices and the doctor-patient relationship; and (d) clarifying whether "health" may today be redefined, as some maintain, in purely technological terms.

  17. Big Data and Analytics in Healthcare.

    PubMed

    Tan, S S-L; Gao, G; Koch, S

    2015-01-01

    This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.

  18. "Big data" in economic history.

    PubMed

    Gutmann, Myron P; Merchant, Emily Klancher; Roberts, Evan

    2018-03-01

    Big data is an exciting prospect for the field of economic history, which has long depended on the acquisition, keying, and cleaning of scarce numerical information about the past. This article examines two areas in which economic historians are already using big data - population and environment - discussing ways in which increased frequency of observation, denser samples, and smaller geographic units allow us to analyze the past with greater precision and often to track individuals, places, and phenomena across time. We also explore promising new sources of big data: organically created economic data, high resolution images, and textual corpora.

  19. Big Data and Ambulatory Care

    PubMed Central

    Thorpe, Jane Hyatt; Gray, Elizabeth Alexandra

    2015-01-01

    Big data is heralded as having the potential to revolutionize health care by making large amounts of data available to support care delivery, population health, and patient engagement. Critics argue that big data's transformative potential is inhibited by privacy requirements that restrict health information exchange. However, there are a variety of permissible activities involving use and disclosure of patient information that support care delivery and management. This article presents an overview of the legal framework governing health information, dispels misconceptions about privacy regulations, and highlights how ambulatory care providers in particular can maximize the utility of big data to improve care. PMID:25401945

  20. Big Data Knowledge in Global Health Education.

    PubMed

    Olayinka, Olaniyi; Kekeh, Michele; Sheth-Chandra, Manasi; Akpinar-Elci, Muge

    The ability to synthesize and analyze massive amounts of data is critical to the success of organizations, including those that involve global health. As countries become highly interconnected, increasing the risk for pandemics and outbreaks, the demand for big data is likely to increase. This requires a global health workforce that is trained in the effective use of big data. To assess implementation of big data training in global health, we conducted a pilot survey of members of the Consortium of Universities of Global Health. More than half the respondents did not have a big data training program at their institution. Additionally, the majority agreed that big data training programs will improve global health deliverables, among other favorable outcomes. Given the observed gap and benefits, global health educators may consider investing in big data training for students seeking a career in global health. Copyright © 2017 Icahn School of Medicine at Mount Sinai. Published by Elsevier Inc. All rights reserved.

  1. Big data for bipolar disorder.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Whybrow, Peter C; Bauer, Michael

    2016-12-01

    The delivery of psychiatric care is changing with a new emphasis on integrated care, preventative measures, population health, and the biological basis of disease. Fundamental to this transformation are big data and advances in the ability to analyze these data. The impact of big data on the routine treatment of bipolar disorder today and in the near future is discussed, with examples that relate to health policy, the discovery of new associations, and the study of rare events. The primary sources of big data today are electronic medical records (EMR), claims, and registry data from providers and payers. In the near future, data created by patients from active monitoring, passive monitoring of Internet and smartphone activities, and from sensors may be integrated with the EMR. Diverse data sources from outside of medicine, such as government financial data, will be linked for research. Over the long term, genetic and imaging data will be integrated with the EMR, and there will be more emphasis on predictive models. Many technical challenges remain when analyzing big data that relates to size, heterogeneity, complexity, and unstructured text data in the EMR. Human judgement and subject matter expertise are critical parts of big data analysis, and the active participation of psychiatrists is needed throughout the analytical process.

  2. GEOSS: Addressing Big Data Challenges

    NASA Astrophysics Data System (ADS)

    Nativi, S.; Craglia, M.; Ochiai, O.

    2014-12-01

    In the sector of Earth Observation, the explosion of data is due to many factors including: new satellite constellations, the increased capabilities of sensor technologies, social media, crowdsourcing, and the need for multidisciplinary and collaborative research to face Global Changes. In this area, there are many expectations and concerns about Big Data. Vendors have attempted to use this term for their commercial purposes. It is necessary to understand whether Big Data is a radical shift or an incremental change for the existing digital infrastructures. This presentation tries to explore and discuss the impact of Big Data challenges and new capabilities on the Global Earth Observation System of Systems (GEOSS) and particularly on its common digital infrastructure called GCI. GEOSS is a global and flexible network of content providers allowing decision makers to access an extraordinary range of data and information at their desk. The impact of the Big Data dimensionalities (commonly known as 'V' axes: volume, variety, velocity, veracity, visualization) on GEOSS is discussed. The main solutions and experimentation developed by GEOSS along these axes are introduced and analyzed. GEOSS is a pioneering framework for global and multidisciplinary data sharing in the Earth Observation realm; its experience on Big Data is valuable for the many lessons learned.

  3. Small Schools in a Big World: Thinking about a Wicked Problem

    ERIC Educational Resources Information Center

    Corbett, Michael; Tinkham, Jennifer

    2014-01-01

    The position of small rural schools is precarious in much of rural Canada today. What is to be done about small schools in rural communities which are often experiencing population decline and aging, economic restructuring, and the loss of employment and services? We argue this issue is a classic "wicked" policy problem. Small schools…

  4. Big Questions: Missing Antimatter

    ScienceCinema

    Lincoln, Don

    2018-06-08

    Einstein's equation E = mc2 is often said to mean that energy can be converted into matter. More accurately, energy can be converted to matter and antimatter. During the first moments of the Big Bang, the universe was smaller and hotter, and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.

  5. Big data in biomedicine.

    PubMed

    Costa, Fabricio F

    2014-04-01

    The increasing availability and growth rate of biomedical information, also known as 'big data', provides an opportunity for future personalized medicine programs that will significantly improve patient care. Recent advances in information technology (IT) applied to biomedicine are changing the landscape of privacy and personal information, with patients getting more control of their health information. Conceivably, big data analytics is already impacting health decisions and patient care; however, specific challenges need to be addressed to integrate current discoveries into medical practice. In this article, I will discuss the major breakthroughs achieved in combining omics and clinical health data in terms of their application to personalized medicine. I will also review the challenges associated with using big data in biomedicine and translational science. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Big Data’s Role in Precision Public Health

    PubMed Central

    Dolley, Shawn

    2018-01-01

    Precision public health is an emerging practice to more granularly predict and understand public health risks and customize treatments for more specific and homogeneous subpopulations, often using new data, technologies, and methods. Big data is one element that has consistently helped to achieve these goals, through its ability to deliver to practitioners a volume and variety of structured or unstructured data not previously possible. Big data has enabled more widespread and specific research and trials of stratifying and segmenting populations at risk for a variety of health problems. Examples of success using big data are surveyed in surveillance and signal detection, predicting future risk, targeted interventions, and understanding disease. Using novel big data or big data approaches has risks that remain to be resolved. The continued growth in volume and variety of available data, decreased costs of data capture, and emerging computational methods mean big data success will likely be a required pillar of precision public health into the future. This review article aims to identify the precision public health use cases where big data has added value, identify classes of value that big data may bring, and outline the risks inherent in using big data in precision public health efforts. PMID:29594091

  7. Big Programs from a Small State: Less Commonly Taught Languages Find Their Home in Delaware Elementary Schools

    ERIC Educational Resources Information Center

    Fulkerson, Gregory

    2009-01-01

    This article describes three big programs from Delaware where the less commonly taught languages find their home in Delaware elementary schools. Odyssey Charter School, located in Wilmington, is one of the very few Greek-language-focused public schools in the nation. The school began in 2006 as a Greek immersion program that concentrated on the…

  8. Big data in forensic science and medicine.

    PubMed

    Lefèvre, Thomas

    2018-07-01

    In less than a decade, big data in medicine has become quite a phenomenon, and many biomedical disciplines have gotten their own tribune on the topic. Perspectives and debates are flourishing while a consensus definition of big data is still lacking. The 3Vs paradigm is frequently evoked to define the big data principles and stands for Volume, Variety and Velocity. Even according to this paradigm, genuine big data studies are still scarce in medicine and may not meet all expectations. On one hand, techniques usually presented as specific to big data, such as machine learning, are supposed to support the ambition of personalized, predictive and preventive medicine. These techniques are mostly far from new; the most ancient are more than 50 years old. On the other hand, several issues closely related to the properties of big data and inherited from other scientific fields such as artificial intelligence are often underestimated, if not ignored. Besides, a few papers temper the almost unanimous big data enthusiasm and are worth attention, since they delineate what is at stake. In this context, forensic science is still awaiting its position papers as well as a comprehensive outline of what kind of contribution big data could bring to the field. The present situation calls for definitions and actions to rationally guide research and practice in big data. It is an opportunity for grounding a truly interdisciplinary approach in forensic science and medicine that is mainly based on evidence. Copyright © 2017 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  9. Big Data and Perioperative Nursing.

    PubMed

    Westra, Bonnie L; Peterson, Jessica J

    2016-10-01

    Big data are large volumes of digital data that can be collected from disparate sources and are challenging to analyze. These data are often described with the five "Vs": volume, velocity, variety, veracity, and value. Perioperative nurses contribute to big data through documentation in the electronic health record during routine surgical care, and these data have implications for clinical decision making, administrative decisions, quality improvement, and big data science. This article explores methods to improve the quality of perioperative nursing data and provides examples of how these data can be combined with broader nursing data for quality improvement. We also discuss a national action plan for nursing knowledge and big data science and how perioperative nurses can engage in collaborative actions to transform health care. Standardized perioperative nursing data has the potential to affect care far beyond the original patient. Copyright © 2016 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  10. Modeling in Big Data Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Endert, Alexander; Szymczak, Samantha; Gunning, Dave

    Human-Centered Big Data Research (HCBDR) is an area of work focused on the methodologies and research areas concerned with understanding how humans interact with "big data". In the context of this paper, we refer to "big data" in a holistic sense, including most (if not all) of the dimensions defining the term, such as complexity, variety, velocity, veracity, etc. Simply put, big data requires us as researchers to question and reconsider existing approaches, with the opportunity to illuminate new kinds of insights that were traditionally out of reach to humans. The purpose of this article is to summarize the discussions and ideas about the role of models in HCBDR at a recent workshop. Models, within the context of this paper, include both computational and conceptual mental models. As such, the discussions summarized in this article seek to understand the connection between these two categories of models.

  11. NASA's Big Data Task Force

    NASA Astrophysics Data System (ADS)

    Holmes, C. P.; Kinter, J. L.; Beebe, R. F.; Feigelson, E.; Hurlburt, N. E.; Mentzel, C.; Smith, G.; Tino, C.; Walker, R. J.

    2017-12-01

    Two years ago NASA established the Ad Hoc Big Data Task Force (BDTF - https://science.nasa.gov/science-committee/subcommittees/big-data-task-force), an advisory working group within the NASA Advisory Council system. The scope of the Task Force included all NASA Big Data programs, projects, missions, and activities. The Task Force focused on such topics as exploring the existing and planned evolution of NASA's science data cyber-infrastructure that supports broad access to data repositories for NASA Science Mission Directorate missions; best practices within NASA, other Federal agencies, private industry and research institutions; and Federal initiatives related to big data and data access. The BDTF has completed its two-year term and produced several recommendations plus four white papers for NASA's Science Mission Directorate. This presentation will discuss the activities and results of the TF, including summaries of key points from its focused study topics. The paper serves as an introduction to the papers following in this ESSI session.

  12. Big Data Technologies

    PubMed Central

    Bellazzi, Riccardo; Dagliati, Arianna; Sacchi, Lucia; Segagni, Daniele

    2015-01-01

    The so-called big data revolution provides substantial opportunities to diabetes management. At least 3 important directions are currently of great interest. First, the integration of different sources of information, from primary and secondary care to administrative information, may allow depicting a novel view of patient’s care processes and of single patient’s behaviors, taking into account the multifaceted nature of chronic care. Second, the availability of novel diabetes technologies, able to gather large amounts of real-time data, requires the implementation of distributed platforms for data analysis and decision support. Finally, the inclusion of geographical and environmental information into such complex IT systems may further increase the capability of interpreting the data gathered and extract new knowledge from them. This article reviews the main concepts and definitions related to big data, it presents some efforts in health care, and discusses the potential role of big data in diabetes care. Finally, as an example, it describes the research efforts carried on in the MOSAIC project, funded by the European Commission. PMID:25910540

  13. Future Lunar Sampling Missions: Big Returns on Small Samples

    NASA Astrophysics Data System (ADS)

    Shearer, C. K.; Borg, L.

    2002-01-01

    The next sampling missions to the Moon will return sample masses (100 g to 1 kg) substantially smaller than those returned by the Apollo missions (380 kg). Lunar samples to be returned by these missions are vital for: (1) calibrating the late impact history of the inner solar system, which can then be extended to other planetary surfaces; (2) deciphering the effects of catastrophic impacts on a planetary body (i.e., Aitken crater); (3) understanding the very late-stage thermal and magmatic evolution of a cooling planet; (4) exploring the interior of a planet; and (5) examining volatile reservoirs and transport on an airless planetary body. Can small lunar samples be used to answer these and other pressing questions concerning important solar system processes? Two potential problems with small, robotically collected samples are placing them in a geologic context and extracting robust planetary information. Although geologic context will always be a potential problem with any planetary sample, new lunar samples can be placed within the context of the important Apollo - Luna collections and the burgeoning planet-scale data sets for the lunar surface and interior. Here we illustrate the usefulness of applying both new and refined analytical approaches in deciphering information locked in small lunar samples.

  14. The Berlin Inventory of Gambling behavior - Screening (BIG-S): Validation using a clinical sample.

    PubMed

    Wejbera, Martin; Müller, Kai W; Becker, Jan; Beutel, Manfred E

    2017-05-18

    Published diagnostic questionnaires for gambling disorder in German are either based on DSM-III criteria or focus on aspects other than lifetime prevalence. This study was designed to assess the usability of the DSM-IV-criteria-based Berlin Inventory of Gambling Behavior Screening tool (BIG-S) in a clinical sample and adapt it to DSM-5 criteria. In a sample of 432 patients presenting for behavioral addiction assessment at the University Medical Center Mainz, we checked the screening tool's results against clinical diagnosis and compared a subsample of n=300 clinically diagnosed gambling disorder patients with a comparison group of n=132. The BIG-S produced a sensitivity of 99.7% and a specificity of 96.2%. The instrument's unidimensionality and the diagnostic improvements of DSM-5 criteria were verified by exploratory and confirmatory factor analysis as well as receiver operating characteristic analysis. The BIG-S is a reliable and valid screening tool for gambling disorder and demonstrated its concise and comprehensible operationalization of current DSM-5 criteria in a clinical setting.
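
    For readers unfamiliar with the reported figures, the sketch below shows how sensitivity and specificity of a screener such as the BIG-S are computed from a two-by-two table of screening result against clinical diagnosis. The counts are hypothetical, chosen only to be consistent with the reported group sizes and rates; they are not taken from the study.

        # How screening accuracy figures like the BIG-S's are computed from a 2x2 table of
        # screener result vs. clinical diagnosis. The counts below are hypothetical, merely
        # consistent with the reported group sizes and rates.
        def screening_accuracy(tp, fn, fp, tn):
            sensitivity = tp / (tp + fn)    # proportion of true cases the screener catches
            specificity = tn / (tn + fp)    # proportion of non-cases the screener clears
            ppv = tp / (tp + fp)            # positive predictive value
            return sensitivity, specificity, ppv

        sens, spec, ppv = screening_accuracy(tp=299, fn=1, fp=5, tn=127)
        print(f"sensitivity={sens:.3f}  specificity={spec:.3f}  PPV={ppv:.3f}")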

  15. Interaction of vestibular, echolocation, and visual modalities guiding flight by the big brown bat, Eptesicus fuscus.

    PubMed

    Horowitz, Seth S; Cheney, Cheryl A; Simmons, James A

    2004-01-01

    The big brown bat (Eptesicus fuscus) is an aerial-feeding insectivorous species that relies on echolocation to avoid obstacles and to detect flying insects. Spatial perception in the dark using echolocation challenges the vestibular system to function without substantial visual input for orientation. IR thermal video recordings show the complexity of bat flights in the field and suggest a highly dynamic role for the vestibular system in orientation and flight control. To examine this role, we carried out laboratory studies of flight behavior under illuminated and dark conditions in both static and rotating obstacle tests while administering heavy water (D2O) to impair vestibular inputs. Eptesicus carried out complex maneuvers through both fixed arrays of wires and a rotating obstacle array using both vision and echolocation, or when guided by echolocation alone. When treated with D2O in combination with lack of visual cues, bats showed considerable decrements in performance. These data indicate that big brown bats use both vision and echolocation to provide spatial registration for head position information generated by the vestibular system.

  16. MAGNETIC RECORDING HEAD

    DOEpatents

    Merrill, L.C.

    1958-06-17

    An electromagnetic recording head is described for the simultaneous recording of a plurality of signals within a small space on a magnetically sensitized medium. Basically, the head structure comprises a non-magnetic centerpiece provided with only first and second groups of spaced cut-out slots, respectively, on opposite sides of the centerpiece. The two groups of slots are in parallel alignment, and the slots of one group are staggered with respect to the slots of the other group so that no slot is directly opposite another slot. Each slot has a magnet pole piece disposed therein, cooperating with a second pole and coil to provide a magnetic flux gap at the upper end of the slot. As a tape is drawn over the upper end of the centerpiece, the individual magnetic circuits are disposed along its width to provide means for simultaneously recording information on separate portions, or tracks, of the tape.

  17. Danio rerio: Small Fish Making a Big Splash in Leukemia.

    PubMed

    Squiban, Barbara; Frazer, J Kimble

    2014-06-01

    Zebrafish ( Danio rerio ) are widely used for developmental biology studies. In the past decade, D. rerio have become an important oncology model as well. Leukemia is one type of cancer where zebrafish are particularly valuable. As vertebrates, fish have great anatomic and biologic similarity to humans, including their hematopoietic and immune systems. As an experimental platform, D. rerio offer many advantages that mammalian models lack. These include their ease of genetic manipulation, capacity for imaging, and suitability for large-scale phenotypic and drug screens. In this review, we present examples of these strategies and others to illustrate how zebrafish have been and can be used to study leukemia. Besides appraising the techniques researchers apply and introducing the leukemia models they have created, we also highlight recent and exciting discoveries made using D. rerio with an eye to where the field is likely headed.

  18. Small scale sequence automation pays big dividends

    NASA Technical Reports Server (NTRS)

    Nelson, Bill

    1994-01-01

    Galileo sequence design and integration are supported by a suite of formal software tools. Sequence review, however, is largely a manual process with reviewers scanning hundreds of pages of cryptic computer printouts to verify sequence correctness. Beginning in 1990, a series of small, PC based sequence review tools evolved. Each tool performs a specific task but all have a common 'look and feel'. The narrow focus of each tool means simpler operation, and easier creation, testing, and maintenance. Benefits from these tools are (1) decreased review time by factors of 5 to 20 or more with a concomitant reduction in staffing, (2) increased review accuracy, and (3) excellent returns on time invested.

  19. Traffic information computing platform for big data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, Zongtao (E-mail: ztduan@chd.edu.cn); Li, Ying; Zheng, Xibin

    The big data environment creates the data conditions needed to improve the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the characteristics of big data and of traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be offered to traffic information users.

  20. Head ballistocardiogram based on wireless multi-location sensors.

    PubMed

    Onizuka, Kohei; Sodini, Charles G

    2015-08-01

    Recently, a wearable BCG monitoring technique based on an accelerometer worn at the ear was demonstrated as a replacement for a conventional bulky BCG acquisition system. In this work, a multi-location wireless vital signs monitor was developed, and in eight healthy human subjects at least two common acceleration vectors correlating with sitting BCG were found in the supine position, using the head PPG signal as a reference. The head-side amplitude in the supine position is roughly proportional to the sitting amplitude, which is in turn proportional to stroke volume. Signal processing techniques to identify J-waves in subjects with small-amplitude signals were also developed based on the two common vectors at the side and top of the head.
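
    As a rough illustration of the kind of peak-picking step such a pipeline needs, the sketch below marks candidate J-wave peaks in a head-acceleration trace with a generic prominence-based peak detector. It is not the authors' algorithm; the waveform, sampling rate, and thresholds are hypothetical.

        # Hypothetical J-wave candidate detection on a head-acceleration trace.
        # Illustrative only: the signal, sampling rate, and thresholds are made up
        # and do not come from the paper.
        import numpy as np
        from scipy.signal import find_peaks

        fs = 250  # Hz, assumed sampling rate
        t = np.arange(0, 10, 1 / fs)
        # Synthetic BCG-like signal: narrow ~1 Hz beats plus noise.
        signal = 0.4 * np.sin(2 * np.pi * 1.0 * t) ** 7 + 0.05 * np.random.randn(t.size)

        # Require peaks to be at least ~0.6 s apart and reasonably prominent
        # relative to the noise floor.
        peaks, props = find_peaks(signal, distance=int(0.6 * fs), prominence=0.2)

        print(len(peaks), "candidate J-wave peaks")
        print("times (s):", np.round(t[peaks], 2))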

  1. Quantum nature of the big bang.

    PubMed

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background independent methods, unlike in other approaches the quantum evolution is deterministic across the deep Planck regime.
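
    For readers unfamiliar with the bounce result, the effective dynamics of such loop-quantum-cosmology models are often summarized (stated here only as general background, not as a formula quoted from this paper) by a modified Friedmann equation:

        \[
          H^{2} \;=\; \frac{8\pi G}{3}\,\rho\left(1-\frac{\rho}{\rho_{c}}\right),
        \]

    where \(\rho_{c}\) is a critical density of the order of the Planck density. The Hubble rate \(H\) vanishes as \(\rho \to \rho_{c}\), so contraction turns into expansion and the classical big-bang singularity is replaced by a big bounce.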

  2. Mentoring in Schools: An Impact Study of Big Brothers Big Sisters School-Based Mentoring

    ERIC Educational Resources Information Center

    Herrera, Carla; Grossman, Jean Baldwin; Kauh, Tina J.; McMaken, Jennifer

    2011-01-01

    This random assignment impact study of Big Brothers Big Sisters School-Based Mentoring involved 1,139 9- to 16-year-old students in 10 cities nationwide. Youth were randomly assigned to either a treatment group (receiving mentoring) or a control group (receiving no mentoring) and were followed for 1.5 school years. At the end of the first school…

  3. Eolian deposits in the Neoproterozoic Big Bear Group, San Bernardino Mountains, California, USA

    NASA Astrophysics Data System (ADS)

    Stewart, John H.

    2005-12-01

    Strata interpreted to be eolian are recognized in the Neoproterozoic Big Bear Group in the San Bernardino Mountains of southern California, USA. The strata consist of medium- to large-scale (30 cm to > 6 m) cross-stratified quartzite considered to be eolian dune deposits and interstratified thinly laminated quartzite that are problematically interpreted as either eolian translatent climbing ripple laminae, or as tidal-flat deposits. High index ripples and adhesion structures considered to be eolian are associated with the thinly laminated and cross-stratified strata. The eolian strata are in a succession that is characterized by flaser bedding, aqueous ripple marks, mudcracks, and interstratified small-scale cross-strata that are suggestive of a tidal environment containing local fluvial deposits. The eolian strata may have formed in a near-shore environment inland of a tidal flat. The Neoproterozoic Big Bear Group is unusual in the western United States and may represent a remnant of strata that were originally more widespread and part of the hypothetical Neoproterozoic supercontinent of Rodinia. The Big Bear Group perhaps is preserved only in blocks that were downdropped along Neoproterozoic extensional faults. The eolian deposits of the Big Bear Group may have been deposited during arid conditions that preceded worldwide glacial events in the late Neoproterozoic. Possibly similar pre-glacial arid events are recognized in northern Mexico, northeast Washington, Australia, and northwest Canada.

  4. Eolian deposits in the Neoproterozoic Big Bear Group, San Bernardino Mountains, California, USA

    USGS Publications Warehouse

    Stewart, John H.

    2005-01-01

    Strata interpreted to be eolian are recognized in the Neoproterozoic Big Bear Group in the San Bernardino Mountains of southern California, USA. The strata consist of medium- to large-scale (30 cm to > 6 m) cross-stratified quartzite considered to be eolian dune deposits and interstratified thinly laminated quartzite that are problematically interpreted as either eolian translatent climbing ripple laminae, or as tidal-flat deposits. High index ripples and adhesion structures considered to be eolian are associated with the thinly laminated and cross-stratified strata. The eolian strata are in a succession that is characterized by flaser bedding, aqueous ripple marks, mudcracks, and interstratified small-scale cross-strata that are suggestive of a tidal environment containing local fluvial deposits. The eolian strata may have formed in a near-shore environment inland of a tidal flat. The Neoproterozoic Big Bear Group is unusual in the western United States and may represent a remnant of strata that were originally more widespread and part of the hypothetical Neoproterozoic supercontinent of Rodinia. The Big Bear Group perhaps is preserved only in blocks that were downdropped along Neoproterozoic extensional faults. The eolian deposits of the Big Bear Group may have been deposited during arid conditions that preceded worldwide glacial events in the late Neoproterozoic. Possibly similar pre-glacial arid events are recognized in northern Mexico, northeast Washington, Australia, and northwest Canada.

  5. Big data processing in the cloud - Challenges and platforms

    NASA Astrophysics Data System (ADS)

    Zhelev, Svetoslav; Rozeva, Anna

    2017-12-01

    Choosing the appropriate architecture and technologies for a big data project is a difficult task that requires extensive knowledge of both the problem domain and the big data landscape. The paper analyzes the main big data architectures and the most widely implemented technologies for processing and persisting big data. Clouds provide dynamic resource scaling, which makes them a natural fit for big data applications. Basic cloud computing service models are presented. Two architectures for processing big data, the Lambda and Kappa architectures, are discussed. Technologies for big data persistence are presented and analyzed. Stream processing, as the most important and most difficult aspect to manage, is outlined. The paper highlights the main advantages of the cloud as well as potential problems.
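
    As a toy illustration of the Lambda architecture mentioned above, the sketch below keeps a slow but complete batch view and a fast incremental speed view, and merges them at query time. The event names and counts are hypothetical; real systems would back these layers with technologies such as those the paper surveys.

        # Minimal sketch of the Lambda architecture idea (illustrative only;
        # names and data are hypothetical, not from the paper).
        from collections import Counter

        # Batch layer: recompute a complete view from the immutable master dataset.
        master_dataset = ["page_a", "page_b", "page_a", "page_c"]

        def batch_view(events):
            """Full recomputation over all historical events (high latency, accurate)."""
            return Counter(events)

        # Speed layer: incrementally cover events that arrived after the last
        # batch run (low latency, partial).
        recent_events = ["page_a", "page_c", "page_c"]

        def speed_view(events):
            return Counter(events)

        # Serving layer: answer queries by merging the batch and real-time views.
        def query(page, batch, realtime):
            return batch.get(page, 0) + realtime.get(page, 0)

        if __name__ == "__main__":
            b = batch_view(master_dataset)
            s = speed_view(recent_events)
            print("page_a views:", query("page_a", b, s))  # -> 3
            print("page_c views:", query("page_c", b, s))  # -> 3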

  6. Ethics and Epistemology in Big Data Research.

    PubMed

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian; Ioannidis, John P A

    2017-12-01

    Biomedical innovation and translation are increasingly emphasizing research using "big data." The hope is that big data methods will both speed up research and make its results more applicable to "real-world" patients and health services. While big data research has been embraced by scientists, politicians, industry, and the public, numerous ethical, organizational, and technical/methodological concerns have also been raised. With respect to technical and methodological concerns, there is a view that these will be resolved through sophisticated information technologies, predictive algorithms, and data analysis techniques. While such advances will likely go some way towards resolving technical and methodological issues, we believe that the epistemological issues raised by big data research have important ethical implications and raise questions about the very possibility of big data research achieving its goals.

  7. Regional Big Injun (Price/Pocono) subsurface stratigraphy of West Virginia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Donaldson, A.C.; Zou, Xiangdong

    1992-01-01

    The lower Big Injun (Lower Mississippian) is the oil reservoir of the Granny Creek and Rock Creek fields and consists of multiple sandstones that were deposited in different fluvial-deltaic depositional environments. These multiple sandstones became amalgamated and now appear as a widespread blanket sandstone as a result of ancient cut and fill processes associated with river-channel sedimentation. The regional study of this Price Formation subsurface equivalent considers the continuity and thickness variations of the composite sandstones of the Big Injun mainly within western West Virginia. The major fluvial drainage system apparently flowed southward through Ohio (much of it later eroded by the pre-Pottsville unconformity) during Big Injun time (and earlier) and part of the system was diverted into southwestern West Virginia as vertically stacked channel and river-mouth bar deposits (Rock Creek field). This ancient Ontario River system apparently drained a huge area including the northern craton as well as the orogenic belt. The emerging West Virginia Dome probably sourced the sediment transported by small rivers developing southwestward prograding deltas across Clay County (Granny Creek field). Sedimentation was affected by differential subsidence in the basin. Paleovalley fill was considered for areas with vertically stacked sandstones, but evidence for their origin is not convincing. Oil-reservoir sandstones are classified as dip-trending river channel (D1) and deltaic shoreline (D2) deposits.

  8. Big Questions: Missing Antimatter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lincoln, Don

    2013-08-27

    Einstein's equation E = mc2 is often said to mean that energy can be converted into matter. More accurately, energy can be converted into matter and antimatter. During the first moments of the Big Bang, the universe was smaller and hotter, and energy was everywhere. As the universe expanded and cooled, the energy converted into matter and antimatter. According to our best understanding, these two substances should have been created in equal quantities. However, when we look out into the cosmos we see only matter and no antimatter. The absence of antimatter is one of the Big Mysteries of modern physics. In this video, Fermilab's Dr. Don Lincoln explains the problem, although he doesn't answer it. The answer, as in all Big Mysteries, is still unknown and one of the leading research topics of contemporary science.

  9. A Great Year for the Big Blue Water

    NASA Astrophysics Data System (ADS)

    Leinen, M.

    2016-12-01

    It has been a great year for the big blue water. Last year the 'United_Nations' decided that it would focus on long time remain alright for the big blue water as one of its 'Millenium_Development_Goals'. This is new. In the past the big blue water was never even considered as a part of this world long time remain alright push. Also, last year the big blue water was added to the words of the group of world people paper #21 on cooling the air and things. It is hard to believe that the big blue water was not in the paper before because 70% of the world is covered by the big blue water! Many people at the group of world meeting were from our friends at 'AGU'.

  10. New camera-based microswitch technology to monitor small head and mouth responses of children with multiple disabilities.

    PubMed

    Lancioni, Giulio E; Bellini, Domenico; Oliva, Doretta; Singh, Nirbhay N; O'Reilly, Mark F; Green, Vanessa A; Furniss, Fred

    2014-06-01

    The aim was to assess a new camera-based microswitch technology that did not require the use of color marks on the participants' faces. Two children with extensive multiple disabilities participated. The responses selected for them consisted of small lateral head movements and mouth closing or opening. The intervention was carried out according to a multiple probe design across responses. The technology involved a computer with a CPU using a 2-GHz clock, a USB video camera with a 16-mm lens, a USB cable connecting the camera and the computer, and a special software program written in ISO C++. The new technology was used satisfactorily with both children. Large increases in their responding were observed during the intervention periods (i.e., when the responses were followed by preferred stimulation). The new technology may be an important resource for persons with multiple disabilities and minimal motor behavior.
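
    The abstract does not describe the detection algorithm itself, and the authors' software was written in ISO C++. Purely as a hypothetical illustration of how small head or mouth movements can be picked up from a camera feed, the following Python/OpenCV sketch flags frames whose difference from the previous frame exceeds a pixel-count threshold; the camera index, blur size, and thresholds are assumptions, not values from the study.

        # Hypothetical frame-differencing detector for small movements in a webcam
        # feed (illustrative only; not the authors' ISO C++ implementation).
        import cv2

        MOTION_PIXELS = 500   # assumed: minimum changed pixels to count as a response
        DIFF_THRESHOLD = 25   # assumed: per-pixel intensity change considered "real"

        cap = cv2.VideoCapture(0)  # default camera
        ok, frame = cap.read()
        prev = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
            diff = cv2.absdiff(prev, gray)           # per-pixel change vs. last frame
            _, mask = cv2.threshold(diff, DIFF_THRESHOLD, 255, cv2.THRESH_BINARY)
            if cv2.countNonZero(mask) > MOTION_PIXELS:
                print("movement detected -> trigger preferred stimulation")
            prev = gray

        cap.release()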

  11. A three-component system incorporating Ppd-D1, copy number variation at Ppd-B1, and numerous small-effect quantitative trait loci facilitates adaptation of heading time in winter wheat cultivars of worldwide origin.

    PubMed

    Würschum, Tobias; Langer, Simon M; Longin, C Friedrich H; Tucker, Matthew R; Leiser, Willmar L

    2018-06-01

    The broad adaptability of heading time has contributed to the global success of wheat in a diverse array of climatic conditions. Here, we investigated the genetic architecture underlying heading time in a large panel of 1,110 winter wheat cultivars of worldwide origin. Genome-wide association mapping, in combination with the analysis of major phenology loci, revealed a three-component system that facilitates the adaptation of heading time in winter wheat. The photoperiod sensitivity locus Ppd-D1 was found to account for almost half of the genotypic variance in this panel and can advance or delay heading by many days. In addition, copy number variation at Ppd-B1 was the second most important source of variation in heading, explaining 8.3% of the genotypic variance. Results from association mapping and genomic prediction indicated that the remaining variation is attributed to numerous small-effect quantitative trait loci that facilitate fine-tuning of heading to the local climatic conditions. Collectively, our results underpin the importance of the two Ppd-1 loci for the adaptation of heading time in winter wheat and illustrate how the three components have been exploited for wheat breeding globally. © 2018 John Wiley & Sons Ltd.

  12. 48 CFR 419.201-71 - Small business coordinators.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 Federal Acquisition Regulations System 4, 2010-10-01. SOCIOECONOMIC PROGRAMS; SMALL BUSINESS PROGRAMS; Policies; 419.201-71 Small business coordinators. The head of the contracting activity (HCA) or a representative of the HCA shall designate in writing a small business...

  13. Real-Time Information Extraction from Big Data

    DTIC Science & Technology

    2015-10-01

    Institute for Defense Analyses. Real-Time Information Extraction from Big Data. Jagdeep Shah, Robert M. Rolfe, Francisco L. Loaiza-Lemos. October 7, 2015. Abstract: We are drowning under the 3 Vs (volume, velocity, and variety) of big data.

  14. When the Big Fish Turns Small: Effects of Participating in Gifted Summer Programs on Academic Self-Concepts

    ERIC Educational Resources Information Center

    Dai, David Yun; Rinn, Anne N.; Tan, Xiaoyuan

    2013-01-01

    The purposes of this study were to (a) examine the presence and prevalence of the big-fish-little-pond effect (BFLPE) in summer programs for the gifted, (b) identify group and individual difference variables that help predict those who are more susceptible to the BFLPE, and (c) put the possible BFLPE on academic self-concept in a larger context of…

  15. Head-bobbing behavior in foraging Whooping Cranes

    USGS Publications Warehouse

    Cronin, T.; Kinloch, M.; Olsen, Glenn H.

    2006-01-01

    Many species of cursorial birds 'head-bob', that is, they alternately thrust the head forward, then hold it still as they walk. Such a motion stabilizes visual fields intermittently and could be critical for visual search; yet the time available for stabilization vs. forward thrust varies with walking speed. Whooping Cranes (Grus americana) are extremely tall birds that visually search the ground for seeds, berries, and small prey. We examined head movements in unrestrained Whooping Cranes using digital video subsequently analyzed with a computer graphical overlay. When foraging, the cranes walk at speeds that allow the head to be held still for at least 50% of the time. This behavior is thought to balance the two needs for covering as much ground as possible and for maximizing the time for visual fixation of the ground in the search for prey. Our results strongly suggest that in cranes, and probably many other bird species, visual fixation of the ground is required for object detection and identification. The thrust phase of the head-bobbing cycle is probably also important for vision. As the head moves forward, the movement generates visual flow and motion parallax, providing visual cues for distances and the relative locations of objects. The eyes commonly change their point of fixation when the head is moving too, suggesting that they remain visually competent throughout the entire cycle of thrust and stabilization.

  16. Artificial gravity: head movements during short-radius centrifugation

    NASA Technical Reports Server (NTRS)

    Young, L. R.; Hecht, H.; Lyne, L. E.; Sienko, K. H.; Cheung, C. C.; Kavelaars, J.

    2001-01-01

    Short-radius centrifugation is a potential countermeasure to long-term weightlessness. Unfortunately, head movements in a rotating environment induce serious discomfort, non-compensatory vestibulo-ocular reflexes, and subjective illusions of body tilt. In two experiments we investigated the effects of pitch and yaw head movements in participants placed supine on a rotating bed with their head at the center of rotation, feet at the rim. The vast majority of participants experienced motion sickness, inappropriate vertical nystagmus and illusory tilt and roll as predicted by a semicircular canal model. However, a small but significant number of the 28 participants experienced tilt in the predicted plane but in the opposite direction. Heart rate was elevated following one-second duration head turns. Significant adaptation occurred following a series of head turns in the light. Vertical nystagmus, motion sickness and illusory tilt all decreased with adaptation. Consequences for artificial gravity produced by short-radius centrifuges as a countermeasure are discussed. Grant numbers: NCC 9-58. c 2001. Elsevier Science Ltd. All rights reserved.

  17. Big data and biomedical informatics: a challenging opportunity.

    PubMed

    Bellazzi, R

    2014-05-22

    Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations.

  18. Head turning as a prominent motor symptom in status epilepticus.

    PubMed

    Bauer, Gerhard; Broessner, Gregor; Unterberger, Iris; Walser, Gerald; Pfausler, Bettina; Trinka, Eugen

    2008-06-01

    Head and eye turning is frequently observed during seizures. Versions with tonic and/or clonic symptoms can be differentiated from smooth head deviations. Head turning as a prominent symptom of status epilepticus has not previously been reported. We present eight case reports (7 women/1 man; mean age 41 years, median 41.5, range 10 to 74) of status epilepticus (SE) with head turning as a prominent motor symptom. Six were accompanied by continuous frontal, occipital, and temporal ictal epileptiform discharges. Furthermore, two patients had absence status with rhythmic and clonic head versions. While the localizing significance of head turning in SE is low, in our cases the direction was away from the discharging hemisphere in all cases of focal SE, regardless of whether the turning was classified as version (three cases) or deviation (three cases). In this small series of SE, the classical observation of a patient looking away from the discharging hemisphere is still valid.

  19. Personality and job performance: the Big Five revisited.

    PubMed

    Hurtz, G M; Donovan, J J

    2000-12-01

    Prior meta-analyses investigating the relation between the Big 5 personality dimensions and job performance have all contained a threat to construct validity, in that much of the data included within these analyses was not derived from actual Big 5 measures. In addition, these reviews did not address the relations between the Big 5 and contextual performance. Therefore, the present study sought to provide a meta-analytic estimate of the criterion-related validity of explicit Big 5 measures for predicting job performance and contextual performance. The results for job performance closely paralleled 2 of the previous meta-analyses, whereas analyses with contextual performance showed more complex relations among the Big 5 and performance. A more critical interpretation of the Big 5-performance relationship is presented, and suggestions for future research aimed at enhancing the validity of personality predictors are provided.

  20. Adding Big Data Analytics to GCSS-MC

    DTIC Science & Technology

    2014-09-30

    Keywords: Big Data, Hadoop, MapReduce, GCSS-MC. 93 pages. Report contents include: Hadoop; The Experiment Design; Why Add a Big Data Element; Adding a Big Data Element to GCSS-MC; Building a Hadoop Cluster.

  1. Ethics and Epistemology of Big Data.

    PubMed

    Lipworth, Wendy; Mason, Paul H; Kerridge, Ian

    2017-12-01

    In this Symposium on the Ethics and Epistemology of Big Data, we present four perspectives on the ways in which the rapid growth in size of research databanks-i.e. their shift into the realm of "big data"-has changed their moral, socio-political, and epistemic status. While there is clearly something different about "big data" databanks, we encourage readers to place the arguments presented in this Symposium in the context of longstanding debates about the ethics, politics, and epistemology of biobank, database, genetic, and epidemiological research.

  2. Foul weather friends: big business and health care reform in the 1990s in historical perspective.

    PubMed

    Swenson, Peter; Greer, Scott

    2002-08-01

    Existing accounts of the Clinton health reform efforts of the early 1990s neglect to examine how the change in big business reform interests during the short period between the late 1980s and 1994 might have altered the trajectory of compulsory health insurance legislation in Congress. This article explores evidence that big employers lost their early interest in reform because they believed their private remedies for bringing down health cost inflation were finally beginning to work. This had a discouraging effect on reform efforts. Historical analysis shows how hard times during the Great Depression also aligned big business interests with those of reformers seeking compulsory social insurance. Unlike the present case, however, the economic climate did not quickly improve, and the social insurance reform of the New Deal succeeded. The article speculates, therefore, that had employer health expenditures not flattened out, continuing and even growing big business support might have neutralized small business and other opposition that contributed heavily to the failure of reform. Thus in light of the Clinton administration's demonstrated willingness to compromise with business on details of its plan, some kind of major reform might have succeeded.

  3. The challenges of big data.

    PubMed

    Mardis, Elaine R

    2016-05-01

    The largely untapped potential of big data analytics is a feeding frenzy that has been fueled by the production of many next-generation-sequencing-based data sets that are seeking to answer long-held questions about the biology of human diseases. Although these approaches are likely to be a powerful means of revealing new biological insights, there are a number of substantial challenges that currently hamper efforts to harness the power of big data. This Editorial outlines several such challenges as a means of illustrating that the path to big data revelations is paved with perils that the scientific community must overcome to pursue this important quest. © 2016. Published by The Company of Biologists Ltd.

  4. Big³. Editorial.

    PubMed

    Lehmann, C U; Séroussi, B; Jaulent, M-C

    2014-05-22

    To provide an editorial introduction to the 2014 IMIA Yearbook of Medical Informatics with an overview of the content, the new publishing scheme, and the upcoming 25th anniversary. A brief overview of the 2014 special topic, Big Data - Smart Health Strategies, and an outline of the novel publishing model are provided in conjunction with a call for proposals to celebrate the 25th anniversary of the Yearbook. 'Big Data' has become the latest buzzword in informatics and promises new approaches and interventions that can improve health, well-being, and quality of life. This edition of the Yearbook acknowledges the fact that we have just started to explore the opportunities that 'Big Data' will bring. However, it will become apparent to the reader that its pervasive nature has invaded all aspects of biomedical informatics - some to a higher degree than others. It was our goal to provide a comprehensive view of the state of 'Big Data' today, explore its strengths, weaknesses, and risks, discuss emerging trends, tools, and applications, and stimulate the development of the field through the aggregation of excellent survey papers and working group contributions to the topic. For the first time in its history, the IMIA Yearbook will be published in an open-access online format, allowing a broader readership, especially in resource-poor countries. Also for the first time, thanks to the online format, the IMIA Yearbook will be published twice in the year, with two different tracks of papers. We anticipate that the important role of the IMIA Yearbook will further increase with these changes, just in time for its 25th anniversary in 2016.

  5. Fever in pregnancy and offspring head circumference.

    PubMed

    Dreier, Julie Werenberg; Strandberg-Larsen, Katrine; Uldall, Peter Vilhelm; Nybo Andersen, Anne-Marie

    2018-02-01

    To examine whether maternal fever during pregnancy is associated with reduced head circumference and risk of microcephaly at birth. A prospective study of 86,980 live-born singletons within the Danish National Birth Cohort was carried out. Self-reported maternal fever exposure was ascertained in two interviews during pregnancy and information on head circumference at birth was extracted from the Danish Medical Birth Registry. Fever in pregnancy was reported by 27% of the mothers, and we identified 3370 cases of microcephaly (head circumference less than or equal to third percentile for sex and gestational age) and 1140 cases of severe microcephaly (head circumference less than or equal to first percentile for sex and gestational age). In this study, maternal fever exposure was not associated with reduced head circumference (adjusted β = 0.03, 95% confidence intervals [CI]: 0.01-0.05), increased risk of microcephaly (odds ratio: 0.95, 95% CI: 0.88-1.03) nor severe microcephaly (odds ratio: 1.01, 95% CI: 0.88-1.15) in the offspring. These findings were consistent for increasing numbers of fever episodes, for increasing fever severity, and for exposure in both early pregnancy and midpregnancy. In this most comprehensive study to date, we found no indication that maternal fever in pregnancy is associated with small head size in the offspring. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. The Big Read: Case Studies

    ERIC Educational Resources Information Center

    National Endowment for the Arts, 2009

    2009-01-01

    The Big Read evaluation included a series of 35 case studies designed to gather more in-depth information on the program's implementation and impact. The case studies gave readers a valuable first-hand look at The Big Read in context. Both formal and informal interviews, focus groups, attendance at a wide range of events--all showed how…

  7. Small Schools in the Big City: Neoliberalism, Bureaucracy and the Sustainability of Small by Design Schools in Chicago

    ERIC Educational Resources Information Center

    Pitluck, Corrin

    2010-01-01

    Assuming the strength of small by design schools for poor urban students of color to be a settled question, this project attempts to analyze the sustainability of small by design schools in a large, complex urban district. Asking what causes small schools to converge toward or diverge from the small by design model, I analyze three sets of design…

  8. Seed bank and big sagebrush plant community composition in a range margin for big sagebrush

    USGS Publications Warehouse

    Martyn, Trace E.; Bradford, John B.; Schlaepfer, Daniel R.; Burke, Ingrid C.; Laurenroth, William K.

    2016-01-01

    The potential influence of seed bank composition on range shifts of species due to climate change is unclear. Seed banks can provide a means of both species persistence in an area and local range expansion in the case of increasing habitat suitability, as may occur under future climate change. However, a mismatch between the seed bank and the established plant community may represent an obstacle to persistence and expansion. In big sagebrush (Artemisia tridentata) plant communities in Montana, USA, we compared the seed bank to the established plant community. There was less than a 20% similarity in the relative abundance of species between the established plant community and the seed bank. This difference was primarily driven by an overrepresentation of native annual forbs and an underrepresentation of big sagebrush in the seed bank compared to the established plant community. Even though we expect an increase in habitat suitability for big sagebrush under future climate conditions at our sites, the current mismatch between the plant community and the seed bank could impede big sagebrush range expansion into increasingly suitable habitat in the future.

  9. Application and Prospect of Big Data in Water Resources

    NASA Astrophysics Data System (ADS)

    Xi, Danchi; Xu, Xinyi

    2017-04-01

    Because of advances in information technology and affordable data storage, we have entered an era of data explosion. The term "Big Data" and the technology related to it have emerged and are commonly applied in many fields. However, academic studies have only recently turned their attention to Big Data applications in water resources, so water resource Big Data technology has not been fully developed. This paper introduces the concept of Big Data and its key technologies, including the Hadoop system and MapReduce. In addition, this paper focuses on the significance of applying big data in water resources and summarizes prior research by others. Most studies in this field only set up a theoretical frame; here we define "Water Big Data" and explain its three dimensions: the time dimension, the spatial dimension, and the intelligent dimension. Based on HBase, a classification system for Water Big Data is introduced: hydrology data, ecology data, and socio-economic data. Then, after analyzing the challenges in water resources management, a series of solutions using Big Data technologies such as data mining and web crawlers is proposed. Finally, the prospect of applying big data in water resources is discussed; it can be predicted that as Big Data technology keeps developing, "3D" (Data-Driven Decision) will be used more in water resources management in the future.
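
    As a minimal, self-contained illustration of the MapReduce pattern the paper cites, the sketch below counts water-resource records per category with an explicit map step, shuffle step, and reduce step. The categories mirror the classification named in the abstract (hydrology, ecology, socio-economic); the record values themselves are invented.

        # Toy MapReduce-style count of water-resource records per category.
        # Illustrative only; a real deployment would run on Hadoop/HBase.
        from collections import defaultdict

        records = [
            ("hydrology", "river_stage_gauge_0417"),
            ("ecology", "fish_survey_site_12"),
            ("hydrology", "rainfall_station_88"),
            ("socio-economic", "municipal_water_use_2016"),
            ("hydrology", "groundwater_well_3"),
        ]

        # Map: emit (key, 1) pairs.
        def map_phase(recs):
            for category, _value in recs:
                yield category, 1

        # Shuffle: group intermediate pairs by key.
        def shuffle(pairs):
            groups = defaultdict(list)
            for key, value in pairs:
                groups[key].append(value)
            return groups

        # Reduce: sum the counts for each key.
        def reduce_phase(groups):
            return {key: sum(values) for key, values in groups.items()}

        if __name__ == "__main__":
            print(reduce_phase(shuffle(map_phase(records))))
            # -> {'hydrology': 3, 'ecology': 1, 'socio-economic': 1}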

  10. Toward a Literature-Driven Definition of Big Data in Healthcare.

    PubMed

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    The aim of this study was to provide a definition of big data in healthcare. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. A total of 196 papers were included. Big data can be defined as datasets with Log(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data.
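
    The authors' volume criterion is easy to operationalize; the snippet below applies it, assuming the usual base-10 reading of "Log" in their Log(n*p) ≥ 7 threshold.

        # Check the volume criterion Log10(n * p) >= 7, where n is the number of
        # statistical individuals and p the number of variables.
        import math

        def is_big_data(n: int, p: int, threshold: float = 7.0) -> bool:
            return math.log10(n * p) >= threshold

        # e.g. 20,000 patients x 500 variables -> log10(1e7) = 7.0 -> big data
        print(is_big_data(20_000, 500))  # True
        # 1,000 patients x 100 variables -> log10(1e5) = 5.0 -> not big data
        print(is_big_data(1_000, 100))   # False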

  11. Big Data Analytic, Big Step for Patient Management and Care in Puerto Rico.

    PubMed

    Borrero, Ernesto E

    2018-01-01

    This letter provides an overview of the application of big data in the health care system to improve quality of care, including predictive modelling for risk and resource use, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications, among others. The author delineates the tremendous potential of big data analytics and discusses how it can be successfully implemented in clinical practice as an important component of a learning health-care system.

  12. Big Data and Biomedical Informatics: A Challenging Opportunity

    PubMed Central

    2014-01-01

    Summary Big data are receiving increasing attention in biomedicine and healthcare. It is therefore important to understand the reason why big data are assuming a crucial role for the biomedical informatics community. The capability of handling big data is becoming an enabler to carry out unprecedented research studies and to implement new models of healthcare delivery. Therefore, it is first necessary to deeply understand the four elements that constitute big data, namely Volume, Variety, Velocity, and Veracity, and their meaning in practice. Then, it is mandatory to understand where big data are present, and where they can be beneficially collected. There are research fields, such as translational bioinformatics, which need to rely on big data technologies to withstand the shock wave of data that is generated every day. Other areas, ranging from epidemiology to clinical care, can benefit from the exploitation of the large amounts of data that are nowadays available, from personal monitoring to primary care. However, building big data-enabled systems carries relevant implications in terms of reproducibility of research studies and management of privacy and data access; proper actions should be taken to deal with these issues. An interesting consequence of the big data scenario is the availability of new software, methods, and tools, such as map-reduce, cloud computing, and concept drift machine learning algorithms, which will not only contribute to big data research, but may be beneficial in many biomedical informatics applications. The way forward with the big data opportunity will require properly applied engineering principles to design studies and applications, to avoid preconceptions or over-enthusiasm, to fully exploit the available technologies, and to improve data processing and data management regulations. PMID:24853034

  13. 48 CFR 619.505 - Rejecting Small Business Administration recommendations.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    48 Federal Acquisition Regulations System 4, 2014-10-01. STATE; SOCIOECONOMIC PROGRAMS; SMALL BUSINESS PROGRAMS; Set-Asides for Small Business; 619.505 Rejecting Small Business Administration recommendations. The Procurement Executive is the agency head for the...

  14. 48 CFR 619.505 - Rejecting Small Business Administration recommendations.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    48 Federal Acquisition Regulations System 4, 2012-10-01. STATE; SOCIOECONOMIC PROGRAMS; SMALL BUSINESS PROGRAMS; Set-Asides for Small Business; 619.505 Rejecting Small Business Administration recommendations. The Procurement Executive is the agency head for the...

  15. 48 CFR 619.505 - Rejecting Small Business Administration recommendations.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    48 Federal Acquisition Regulations System 4, 2013-10-01. STATE; SOCIOECONOMIC PROGRAMS; SMALL BUSINESS PROGRAMS; Set-Asides for Small Business; 619.505 Rejecting Small Business Administration recommendations. The Procurement Executive is the agency head for the...

  16. Integrating the Apache Big Data Stack with HPC for Big Data

    NASA Astrophysics Data System (ADS)

    Fox, G. C.; Qiu, J.; Jha, S.

    2014-12-01

    There is perhaps a broad consensus as to the important issues in practical parallel computing as applied to large-scale simulations; this is reflected in supercomputer architectures, algorithms, libraries, languages, compilers, and best practices for application development. However, the same is not so true for data-intensive computing, even though commercial clouds devote far more resources to data analytics than supercomputers devote to simulations. We look at a sample of over 50 big data applications to identify characteristics of data-intensive applications and to deduce the runtimes and architectures needed. We suggest a big data version of the famous Berkeley dwarfs and NAS parallel benchmarks and use these to identify a few key classes of hardware/software architectures. Our analysis builds on combining HPC and ABDS, the Apache big data software stack that is widely used in modern cloud computing. Initial results on clouds and HPC systems are encouraging. We propose the development of SPIDAL - the Scalable Parallel Interoperable Data Analytics Library - built on system and data abstractions suggested by the HPC-ABDS architecture. We discuss how it can be used in several application areas, including Polar Science.

  17. Issues in Big-Data Database Systems

    DTIC Science & Technology

    2014-06-01

    Berman, Jules K. (2013). Principles of Big Data: Preparing, Sharing, and Analyzing Complex Information. New York: Elsevier. 261pp.

  18. The p53-reactivating small molecule RITA induces senescence in head and neck cancer cells.

    PubMed

    Chuang, Hui-Ching; Yang, Liang Peng; Fitzgerald, Alison L; Osman, Abdullah; Woo, Sang Hyeok; Myers, Jeffrey N; Skinner, Heath D

    2014-01-01

    TP53 is the most commonly mutated gene in head and neck cancer (HNSCC), with mutations being associated with resistance to conventional therapy. Restoring normal p53 function has previously been investigated via the use of RITA (reactivation of p53 and induction of tumor cell apoptosis), a small molecule that induces a conformational change in p53, leading to activation of its downstream targets. In the current study we found that RITA indeed exerts significant effects in HNSCC cells. However, in this model, we found that a significant outcome of RITA treatment was accelerated senescence. RITA induced senescence in a variety of p53 backgrounds, including p53 null cells. Also, inhibition of p53 expression did not appear to significantly inhibit RITA-induced senescence. Thus, this phenomenon appears to be partially p53-independent. Additionally, RITA-induced senescence appears to be partially mediated by activation of the DNA damage response and SIRT1 (Silent information regulator T1) inhibition, with a synergistic effect seen by combining either ionizing radiation or SIRT1 inhibition with RITA treatment. These data point toward a novel mechanism of RITA function as well as hint at its possible therapeutic benefit in HNSCC.

  19. WE-H-BRB-00: Big Data in Radiation Oncology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Big Data in Radiation Oncology: (1) Overview of the NIH 2015 Big Data Workshop, (2) Where do we stand in the applications of big data in radiation oncology?, and (3) Learning Health Systems for Radiation Oncology: Needs and Challenges for Future Success. The overriding goal of this trio panel of presentations is to improve awareness of the wide ranging opportunities for big data impact on patient quality care and enhancing potential for research and collaboration opportunities with NIH and a host of new big data initiatives. This presentation will also summarize the Big Data workshop that was held at the NIH Campus on August 13–14, 2015 and sponsored by AAPM, ASTRO, and NIH. The workshop included discussion of current Big Data cancer registry initiatives, safety and incident reporting systems, and other strategies that will have the greatest impact on radiation oncology research, quality assurance, safety, and outcomes analysis. Learning Objectives: (1) to discuss current and future sources of big data for use in radiation oncology research; (2) to optimize our current data collection by adopting new strategies from outside radiation oncology; (3) to determine what new knowledge big data can provide for clinical decision support for personalized medicine. L. Xing, NIH/NCI, Google Inc.

  20. Epidemiology in the Era of Big Data

    PubMed Central

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-01-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called ‘3 Vs’: variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that, while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field’s future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future. PMID:25756221

  1. Toward a Literature-Driven Definition of Big Data in Healthcare

    PubMed Central

    Baro, Emilie; Degoul, Samuel; Beuscart, Régis; Chazard, Emmanuel

    2015-01-01

    Objective. The aim of this study was to provide a definition of big data in healthcare. Methods. A systematic search of PubMed literature published until May 9, 2014, was conducted. We noted the number of statistical individuals (n) and the number of variables (p) for all papers describing a dataset. These papers were classified into fields of study. Characteristics attributed to big data by authors were also considered. Based on this analysis, a definition of big data was proposed. Results. A total of 196 papers were included. Big data can be defined as datasets with Log⁡(n∗p) ≥ 7. Properties of big data are its great variety and high velocity. Big data raises challenges on veracity, on all aspects of the workflow, on extracting meaningful information, and on sharing information. Big data requires new computational methods that optimize data management. Related concepts are data reuse, false knowledge discovery, and privacy issues. Conclusion. Big data is defined by volume. Big data should not be confused with data reuse: data can be big without being reused for another purpose, for example, in omics. Inversely, data can be reused without being necessarily big, for example, secondary use of Electronic Medical Records (EMR) data. PMID:26137488

  2. Big-Eyed Bugs Have Big Appetite for Pests

    USDA-ARS?s Scientific Manuscript database

    Many kinds of arthropod natural enemies (predators and parasitoids) inhabit crop fields in Arizona and can have a large negative impact on several pest insect species that also infest these crops. Geocoris spp., commonly known as big-eyed bugs, are among the most abundant insect predators in field c...

  3. [The problem of small "n" and big "P" in neuropsycho-pharmacology, or how to keep the rate of false discoveries under control].

    PubMed

    Petschner, Péter; Bagdy, György; Tóthfalusi, Laszló

    2015-03-01

    One of the characteristics of many methods used in neuropsychopharmacology is that a large number of parameters (P) are measured in relatively few subjects (n). Functional magnetic resonance imaging, electroencephalography (EEG), and genomic studies are typical examples; one microarray chip, for instance, can contain thousands of probes, so in studies using microarray chips P may be several thousand-fold larger than n. Statistical analysis of such studies is a challenging task, and they are referred to in the statistical literature as the small "n", big "P" problem. The problem has many facets, including the controversies associated with multiple hypothesis testing. A typical scenario in this context is when two or more groups are compared on the individual attributes. If the increased classification error due to multiple testing is neglected, many highly significant differences will be discovered; in reality, some of these significant differences are coincidental, not reproducible findings. Several methods have been proposed to solve this problem. In this review we discuss two of the proposed solutions: algorithms to compare sets, and statistical hypothesis tests controlling the false discovery rate.
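
    To make the second class of solutions concrete, the sketch below implements the Benjamini-Hochberg step-up procedure, the standard textbook example of a false-discovery-rate-controlling test. It is offered as general background rather than as one of the specific algorithms the review evaluates, and the example p-values are invented.

        # Minimal Benjamini-Hochberg step-up procedure for FDR control.

        def benjamini_hochberg(p_values, q=0.05):
            """Return indices of hypotheses rejected at FDR level q."""
            m = len(p_values)
            # Sort p-values, remembering their original positions.
            order = sorted(range(m), key=lambda i: p_values[i])
            # Find the largest rank k with p_(k) <= (k / m) * q.
            k_max = 0
            for rank, idx in enumerate(order, start=1):
                if p_values[idx] <= rank / m * q:
                    k_max = rank
            # Reject the hypotheses with the k_max smallest p-values.
            return sorted(order[:k_max])

        if __name__ == "__main__":
            # e.g. p-values from probe-wise group comparisons on a microarray
            pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.59]
            print(benjamini_hochberg(pvals, q=0.05))  # -> [0, 1] for these numbers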

  4. Big Data - What is it and why it matters.

    PubMed

    Tattersall, Andy; Grant, Maria J

    2016-06-01

    Big data, like MOOCs, altmetrics and open access, is a term that has been commonplace in the library community for some time yet, despite its prevalence, many in the library and information sector remain unsure of the relationship between big data and their roles. This editorial explores what big data could mean for the day-to-day practice of health library and information workers, presenting examples of big data in action, considering the ethics of accessing big data sets and the potential for new roles for library and information workers. © 2016 Health Libraries Group.

  5. Research on information security in big data era

    NASA Astrophysics Data System (ADS)

    Zhou, Linqi; Gu, Weihong; Huang, Cheng; Huang, Aijun; Bai, Yongbin

    2018-05-01

    Big data is becoming another hotspot in the field of information technology, after cloud computing and the Internet of Things. However, existing information security methods can no longer meet the information security requirements of the big data era. This paper analyzes the data security challenges brought by big data and their causes, discusses the trends in network attacks against the background of big data, and puts forward the authors' own opinions on the development of security defenses in terms of technology, strategy, and products.

  6. The big data challenges of connectomics.

    PubMed

    Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir

    2014-11-01

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces 'big data', unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.

  7. Active head rotations and eye-head coordination

    NASA Technical Reports Server (NTRS)

    Zangemeister, W. H.; Stark, L.

    1981-01-01

    It is pointed out that head movements play an important role in gaze. The interaction between eye and head movements involves both their shared role in directing gaze and the compensatory vestibular ocular reflex. The dynamics of head trajectories are discussed, taking into account the use of parameterization to obtain the peak velocity, peak accelerations, the times of these extrema, and the duration of the movement. Attention is given to the main sequence, neck muscle EMG and details of the head-movement trajectory, types of head model accelerations, the latency of eye and head movement in coordinated gaze, gaze latency as a function of various factors, and coordinated gaze types. Clinical examples of gaze-plane analysis are considered along with the instantaneous change of compensatory eye movement (CEM) gain, and aspects of variability.

  8. Big Sib Students' Perceptions of the Educational Environment at the School of Medical Sciences, Universiti Sains Malaysia, using Dundee Ready Educational Environment Measure (DREEM) Inventory.

    PubMed

    Arzuman, Hafiza; Yusoff, Muhamad Saiful Bahri; Chit, Som Phong

    2010-07-01

    A cross-sectional descriptive study was conducted among Big Sib students to explore their perceptions of the educational environment at the School of Medical Sciences, Universiti Sains Malaysia (USM) and its weak areas using the Dundee Ready Educational Environment Measure (DREEM) inventory. The DREEM inventory is a validated global instrument for measuring educational environments in undergraduate medical and health professional education. The English version of the DREEM inventory was administered to all Year 2 Big Sib students (n = 67) at a regular Big Sib session. The purpose of the study as well as confidentiality and ethical issues were explained to the students before the questionnaire was administered. The response rate was 62.7% (42 out of 67 students). The overall DREEM score was 117.9/200 (SD 14.6). The DREEM indicated that the Big Sib students' perception of the educational environment of the medical school was more positive than negative. Nevertheless, the study also revealed some problem areas within the educational environment. This pilot study revealed that Big Sib students perceived a positive learning environment at the School of Medical Sciences, USM. It also identified some low-scored areas that require further exploration to pinpoint the exact problems. The relatively small study population selected from a particular group of students was the major limitation of the study. This small sample size also means that the study findings cannot be generalised.

  9. Density Resolution Artifacts Encountered When Scanning Infant Heads With X-Ray Computed Tomography (CT)

    NASA Astrophysics Data System (ADS)

    Thompson, Joseph R.; Moore, Robert J.; Hinshaw, David B.; Hasso, Anton N.

    1982-12-01

    Density resolution (the accuracy of CT numbers) is generally recognized by radiologists who interpret children's CT to be very poor. A CT scanning phantom was made in order to document the brain attenuation inaccuracies which do occur and also to derive normal brain attenuation values for varying sized heads, given the skull diameters and thicknesses. In scanning this phantom, other factors, some of equal importance to small head size, were found to affect the Hounsfield numbers of brain. The phantom was scanned in order to determine the magnitude of these specific factors, using the GE 8800 model scanner. After head size (412 to 25 H), the variables of head support (up to 15 H) and centering within the field of view (6-23 H) were of similar importance for small heads. Kilovoltage, software, and machine drift were less important, although only kVp settings of 105 and 120 were employed. Manufacturers may improve CT number accuracy if they recognize the relative magnitude of the various factors which alter measured attenuation.

  10. ["Big data" - large data, a lot of knowledge?].

    PubMed

    Hothorn, Torsten

    2015-01-28

    For the past several years, the term Big Data has described technologies to extract knowledge from data. Applications of Big Data and their consequences are also increasingly discussed in the mass media. Because medicine is an empirical science, we discuss the meaning of Big Data and its potential for future medical research.

  11. Big Ideas in Primary Mathematics: Issues and Directions

    ERIC Educational Resources Information Center

    Askew, Mike

    2013-01-01

    This article is located within the literature arguing for attention to Big Ideas in teaching and learning mathematics for understanding. The focus is on surveying the literature of Big Ideas and clarifying what might constitute Big Ideas in the primary Mathematics Curriculum based on both theoretical and pragmatic considerations. This is…

  12. Big Data - Smart Health Strategies

    PubMed Central

    2014-01-01

    Summary Objectives To select best papers published in 2013 in the field of big data and smart health strategies, and summarize outstanding research efforts. Methods A systematic search was performed using two major bibliographic databases for relevant journal papers. The references obtained were reviewed in a two-stage process, starting with a blinded review performed by the two section editors, and followed by a peer review process operated by external reviewers recognized as experts in the field. Results The complete review process selected four best papers, illustrating various aspects of the special theme, among them: (a) using large volumes of unstructured data and, specifically, clinical notes from Electronic Health Records (EHRs) for pharmacovigilance; (b) knowledge discovery via querying large volumes of complex (both structured and unstructured) biological data using big data technologies and relevant tools; (c) methodologies for applying cloud computing and big data technologies in the field of genomics, and (d) system architectures enabling high-performance access to and processing of large datasets extracted from EHRs. Conclusions The potential of big data in biomedicine has been pinpointed in various viewpoint papers and editorials. The review of current scientific literature illustrated a variety of interesting methods and applications in the field, but still the promises exceed the current outcomes. As we are getting closer towards a solid foundation with respect to common understanding of relevant concepts and technical aspects, and the use of standardized technologies and tools, we can anticipate to reach the potential that big data offer for personalized medicine and smart health strategies in the near future. PMID:25123721

  13. Big Data Management in US Hospitals: Benefits and Barriers.

    PubMed

    Schaeffer, Chad; Booton, Lawrence; Halleck, Jamey; Studeny, Jana; Coustasse, Alberto

    Big data has been considered as an effective tool for reducing health care costs by eliminating adverse events and reducing readmissions to hospitals. The purposes of this study were to examine the emergence of big data in the US health care industry, to evaluate a hospital's ability to effectively use complex information, and to predict the potential benefits that hospitals might realize if they are successful in using big data. The findings of the research suggest that there were a number of benefits expected by hospitals when using big data analytics, including cost savings and business intelligence. By using big data, many hospitals have recognized that there have been challenges, including lack of experience and the cost of developing the analytics. Many hospitals will need to invest in acquiring adequate personnel with experience in big data analytics and data integration. The findings of this study suggest that the adoption, implementation, and utilization of big data technology will have a profound positive effect among health care providers.

  14. Big Data in Caenorhabditis elegans: quo vadis?

    PubMed Central

    Hutter, Harald; Moerman, Donald

    2015-01-01

    A clear definition of what constitutes “Big Data” is difficult to identify, but we find it most useful to define Big Data as a data collection that is complete. By this criterion, researchers on Caenorhabditis elegans have a long history of collecting Big Data, since the organism was selected with the idea of obtaining a complete biological description and understanding of development. The complete wiring diagram of the nervous system, the complete cell lineage, and the complete genome sequence provide a framework to phrase and test hypotheses. Given this history, it might be surprising that the number of “complete” data sets for this organism is actually rather small—not because of lack of effort, but because most types of biological experiments are not currently amenable to complete large-scale data collection. Many are also not inherently limited, so that it becomes difficult to even define completeness. At present, we only have partial data on mutated genes and their phenotypes, gene expression, and protein–protein interaction—important data for many biological questions. Big Data can point toward unexpected correlations, and these unexpected correlations can lead to novel investigations; however, Big Data cannot establish causation. As a result, there is much excitement about Big Data, but there is also a discussion on just what Big Data contributes to solving a biological problem. Because of its relative simplicity, C. elegans is an ideal test bed to explore this issue and at the same time determine what is necessary to build a multicellular organism from a single cell. PMID:26543198

  15. [Relevance of big data for molecular diagnostics].

    PubMed

    Bonin-Andresen, M; Smiljanovic, B; Stuhlmüller, B; Sörensen, T; Grützkau, A; Häupl, T

    2018-04-01

    Big data analysis raises the expectation that computerized algorithms may extract new knowledge from otherwise unmanageable vast data sets. What are the algorithms behind the big data discussion? In principle, high throughput technologies in molecular research already introduced big data and the development and application of analysis tools into the field of rheumatology some 15 years ago. This includes especially omics technologies, such as genomics, transcriptomics and cytomics. Some basic methods of data analysis are provided along with the technology; however, functional analysis and interpretation require adaptation of existing software tools or the development of new ones. For these steps, structuring and evaluating according to the biological context is extremely important and not only a mathematical problem. This aspect has to be considered much more for molecular big data than for data analyzed in health economy or epidemiology. Molecular data are structured in a first order determined by the applied technology and present quantitative characteristics that follow the principles of their biological nature. These biological dependencies have to be integrated into software solutions, which may require networks of molecular big data of the same or even different technologies in order to achieve cross-technology confirmation. Increasingly extensive recording of molecular processes, also in individual patients, is generating personal big data and requires new management strategies in order to develop data-driven individualized interpretation concepts. With this perspective in mind, translation of information derived from molecular big data will also require new specifications for education and professional competence.

  16. Big data in psychology: A framework for research advancement.

    PubMed

    Adjerid, Idris; Kelley, Ken

    2018-02-22

    The potential for big data to provide value for psychology is significant. However, the pursuit of big data remains an uncertain and risky undertaking for the average psychological researcher. In this article, we address some of this uncertainty by discussing the potential impact of big data on the type of data available for psychological research, addressing the benefits and most significant challenges that emerge from these data, and organizing a variety of research opportunities for psychology. Our article yields two central insights. First, we highlight that big data research efforts are more readily accessible than many researchers realize, particularly with the emergence of open-source research tools, digital platforms, and instrumentation. Second, we argue that opportunities for big data research are diverse and differ both in their fit for varying research goals, as well as in the challenges they bring about. Ultimately, our outlook for researchers in psychology using and benefiting from big data is cautiously optimistic. Although not all big data efforts are suited for all researchers or all areas within psychology, big data research prospects are diverse, expanding, and promising for psychology and related disciplines. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  17. 'Big data' in pharmaceutical science: challenges and opportunities.

    PubMed

    Dossetter, Al G; Ecker, Gerhard; Laverty, Hugh; Overington, John

    2014-05-01

    Future Medicinal Chemistry invited a selection of experts to express their views on the current impact of big data in drug discovery and design, as well as speculate on future developments in the field. The topics discussed include the challenges of implementing big data technologies, maintaining the quality and privacy of data sets, and how the industry will need to adapt to welcome the big data era. Their enlightening responses provide a snapshot of the many and varied contributions being made by big data to the advancement of pharmaceutical science.

  18. Addressing Data Veracity in Big Data Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aman, Saima; Chelmis, Charalampos; Prasanna, Viktor

    Big data applications such as in smart electric grids, transportation, and remote environment monitoring involve geographically dispersed sensors that periodically send back information to central nodes. In many cases, data from sensors is not available at central nodes at a frequency that is required for real-time modeling and decision-making. This may be due to physical limitations of the transmission networks, or due to consumers limiting frequent transmission of data from sensors located at their premises for security and privacy concerns. Such scenarios lead to a partial data problem and raise the issue of data veracity in big data applications. We describe a novel solution to the problem of making short term predictions (up to a few hours ahead) in the absence of real-time data from sensors in the Smart Grid. A key implication of our work is that by using real-time data from only a small subset of influential sensors, we are able to make predictions for all sensors. We thus reduce the communication complexity involved in transmitting sensory data in Smart Grids. We use real-world electricity consumption data from smart meters to empirically demonstrate the usefulness of our method. Our dataset consists of data collected at 15-min intervals from 170 smart meters in the USC Microgrid for 7 years, totaling 41,697,600 data points.
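    The abstract does not spell out the prediction model, so the following is only a hypothetical sketch of the general idea of estimating all sensor readings from a small, influential subset, here using ordinary least squares on synthetic historical data; the sensor indices and array shapes are illustrative.

    ```python
    import numpy as np

    # Hypothetical historical readings: T time steps x N sensors (synthetic data).
    rng = np.random.default_rng(0)
    T, N = 1000, 170
    history = rng.normal(size=(T, N)).cumsum(axis=0)   # consumption-like series

    influential = [0, 5, 42]     # indices of the few sensors reporting in real time

    # Fit one linear model per sensor: full reading ~ readings of the influential subset.
    X = np.hstack([history[:, influential], np.ones((T, 1))])   # add intercept
    coef, *_ = np.linalg.lstsq(X, history, rcond=None)          # (len(influential)+1, N)

    # At prediction time only the influential sensors are observed.
    x_now = np.append(history[-1, influential], 1.0)
    predicted_all = x_now @ coef      # estimated readings for all N sensors
    print(predicted_all.shape)        # (170,)
    ```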

  19. Sports and the Big6: The Information Advantage.

    ERIC Educational Resources Information Center

    Eisenberg, Mike

    1997-01-01

    Explores the connection between sports and the Big6 information problem-solving process and how sports provides an ideal setting for learning and teaching about the Big6. Topics include information aspects of baseball, football, soccer, basketball, figure skating, track and field, and golf; and the Big6 process applied to sports. (LRW)

  20. Big brown bats (Eptesicus fuscus) maintain hearing sensitivity after exposure to intense band-limited noise.

    PubMed

    Simmons, Andrea Megela; Hom, Kelsey N; Simmons, James A

    2017-03-01

    Thresholds to short-duration narrowband frequency-modulated (FM) sweeps were measured in six big brown bats (Eptesicus fuscus) in a two-alternative forced choice passive listening task before and after exposure to band-limited noise (lower and upper frequencies between 10 and 50 kHz, 1 h, 116-119 dB sound pressure level root mean square; sound exposure level 152 dB). At recovery time points of 2 and 5 min post-exposure, thresholds varied from -4 to +4 dB from pre-exposure threshold estimates. Thresholds after sham (control) exposures varied from -6 to +2 dB from pre-exposure estimates. The small differences in thresholds after noise and sham exposures support the hypothesis that big brown bats do not experience significant temporary threshold shifts under these experimental conditions. These results confirm earlier findings showing stability of thresholds to broadband FM sweeps at longer recovery times after exposure to broadband noise. Big brown bats may have evolved a lessened susceptibility to noise-induced hearing losses, related to the special demands of echolocation.

  1. Acute Subdural Hematoma in a Judo Player with Repeated Head Injuries.

    PubMed

    Yokota, Hiroshi; Ida, Yuki

    2016-07-01

    Acute subdural hematoma (ASDH) is the most important cause of severe head injuries occurring during judo practice in Japan. Repeated head injuries have been reported as a cause of fatal ASDH, although the mechanism remains unknown. A 16-year-old boy visited an emergency department with vomiting 3 days after a strong blow to the occipital region during judo practice. Although computed tomography was performed at that time, a small interhemispheric ASDH was overlooked. The patient sustained another head injury 19 days after the first, which led to convulsions and disturbance of consciousness. The ASDH was increased in size on computed tomography. We performed a surgical evacuation, which revealed tearing of a bridging vein, after which the patient showed a good recovery. It is important to be aware of the possibility of a small ASDH in concussed judo players after an initial impact, which may lead to subsequent fatal ASDH after another impact incident. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Small ponds play big role in greenhouse gas emissions from inland waters

    NASA Astrophysics Data System (ADS)

    Holgerson, M.; Raymond, P. A.

    2017-12-01

    Inland waters are an important part of the global carbon cycle, but there is uncertainty in estimating their greenhouse gas emissions. Uncertainty stems from different models and variable estimates of surface water gas concentrations, gas exchange rates, and the global size distribution of water bodies. Emissions from small water bodies are especially difficult to estimate because they are not globally mapped and few studies have assessed their greenhouse gas concentrations and gas exchange rates. To overcome these limitations, we studied greenhouse gases and gas exchange rates in small ponds in temperate forests of the northeastern United States. We then compiled our data with direct measurements of CO2 and CH4 concentrations from 427 ponds and lakes worldwide, and upscaled to estimate greenhouse gas emissions using estimates of gas exchange rates and the size distribution of lakes. We found that small ponds play a disproportionately large role in greenhouse gas emissions. While small ponds only account for about 9% of global lakes and ponds by area, they contribute 15% of CO2 and 41% of diffusive CH4 emissions from inland freshwaters. Secondly, we measured gas exchange velocities (k) in small ponds and compiled direct measurements of k from 67 global water bodies. We found that k is low but highly variable in small ponds, and increases and becomes even more variable with lake size, a finding that is not currently included in global carbon models. In a third study, we found that gas exchange in small ponds is highly sensitive to overnight cooling, which can lead to short bursts of increased k at night, with implications for greenhouse gas emissions. Overall, these studies show that small ponds are a critical part of the global carbon cycle, and also highlight many knowledge gaps. Therefore, understanding small pond carbon cycling is an important research priority.
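    A minimal sketch of the kind of upscaling described above, using the standard diffusive flux relation F = k * (C_water - C_equilibrium) multiplied by surface area and summed over size classes; all numbers are placeholders rather than the study's values.

    ```python
    # Minimal upscaling sketch: diffusive gas flux F = k * (C_water - C_equilibrium),
    # summed over water-body size classes. All numbers below are placeholders,
    # not the values reported in the study.

    size_classes = [
        # (name, total area in km^2, k in m/day, CO2 excess in mmol/m^3)
        ("small ponds",  1.0e5, 0.5, 80.0),
        ("medium lakes", 8.0e5, 1.0, 40.0),
        ("large lakes",  2.0e6, 1.5, 15.0),
    ]

    total_flux = 0.0   # mmol CO2 per day
    for name, area_km2, k_m_per_day, excess_mmol_m3 in size_classes:
        area_m2 = area_km2 * 1.0e6
        flux = k_m_per_day * excess_mmol_m3 * area_m2      # mmol/day
        total_flux += flux
        print(f"{name:12s}: {flux:.3e} mmol CO2/day")

    print(f"total       : {total_flux:.3e} mmol CO2/day")
    ```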

  3. Current applications of big data in obstetric anesthesiology.

    PubMed

    Klumpner, Thomas T; Bauer, Melissa E; Kheterpal, Sachin

    2017-06-01

    The narrative review aims to highlight several recently published 'big data' studies pertinent to the field of obstetric anesthesiology. Big data has been used to study rare outcomes, to identify trends within the healthcare system, to identify variations in practice patterns, and to highlight potential inequalities in obstetric anesthesia care. Big data studies have helped define the risk of rare complications of obstetric anesthesia, such as the risk of neuraxial hematoma in thrombocytopenic parturients. Also, large national databases have been used to better understand trends in anesthesia-related adverse events during cesarean delivery as well as outline potential racial/ethnic disparities in obstetric anesthesia care. Finally, real-time analysis of patient data across a number of disparate health information systems through the use of sophisticated clinical decision support and surveillance systems is one promising application of big data technology on the labor and delivery unit. 'Big data' research has important implications for obstetric anesthesia care and warrants continued study. Real-time electronic surveillance is a potentially useful application of big data technology on the labor and delivery unit.

  4. [Big data and their perspectives in radiation therapy].

    PubMed

    Guihard, Sébastien; Thariat, Juliette; Clavier, Jean-Baptiste

    2017-02-01

    The concept of big data indicates a change of scale in the use of data and data aggregation into large databases through improved computer technology. One of the current challenges in the creation of big data in the context of radiation therapy is the transformation of routine care items into dark data, i.e. data not yet collected, and the fusion of databases collecting different types of information (dose-volume histograms and toxicity data for example). Processes and infrastructures devoted to big data collection should not impact negatively on the doctor-patient relationship, the general process of care or the quality of the data collected. The use of big data requires a collective effort of physicians, physicists, software manufacturers and health authorities to create, organize and exploit big data in radiotherapy and, beyond, oncology. Big data involve a new culture to build an appropriate infrastructure legally and ethically. Processes and issues are discussed in this article. Copyright © 2016 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.

  5. Implications of improved diagnostic imaging of small nodal metastases in head and neck cancer: Radiotherapy target volume transformation and dose de-escalation.

    PubMed

    van den Bosch, Sven; Vogel, Wouter V; Raaijmakers, Cornelis P; Dijkema, Tim; Terhaard, Chris H J; Al-Mamgani, Abrahim; Kaanders, Johannes H A M

    2018-05-03

    Diagnostic imaging continues to evolve, and now has unprecedented accuracy for detecting small nodal metastasis. This influences the tumor load in elective target volumes and subsequently has consequences for the radiotherapy dose required to control disease in these volumes. Small metastases that used to remain subclinical and were included in elective volumes, will nowadays be detected and included in high-dose volumes. Consequentially, high-dose volumes will more often contain low-volume disease. These target volume transformations lead to changes in the tumor burden in elective and "gross" tumor volumes with implications for the radiotherapy dose prescribed to these volumes. For head and neck tumors, nodal staging has evolved from mere palpation to combinations of high-resolution imaging modalities. A traditional nodal gross tumor volume in the neck typically had a minimum diameter of 10-15 mm, while nowadays much smaller tumor deposits are detected in lymph nodes. However, the current dose levels for elective nodal irradiation were empirically determined in the 1950s, and have not changed since. In this report the radiobiological consequences of target volume transformation caused by modern imaging of the neck are evaluated, and theoretically derived reductions of dose in radiotherapy for head and neck cancer are proposed. The concept of target volume transformation and subsequent strategies for dose adaptation applies to many other tumor types as well. Awareness of this concept may result in new strategies for target definition and selection of dose levels with the aim to provide optimal tumor control with less toxicity. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
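    The underlying radiobiological argument, that low-volume nodal deposits contain fewer clonogenic cells and therefore need less dose for the same control probability, can be illustrated with a generic Poisson tumour-control model and linear-quadratic cell survival. The parameter values below are textbook-style assumptions, not the dose levels derived in the report.

    ```python
    import math

    # Poisson TCP with linear-quadratic survival, delivered in 2 Gy fractions:
    #   SF(D) = exp(-alpha*D - beta*d*D),   TCP = exp(-N0 * SF(D))
    alpha, beta, d = 0.3, 0.03, 2.0      # Gy^-1, Gy^-2, Gy per fraction (assumed)

    def tcp(total_dose, n_clonogens):
        sf = math.exp(-alpha * total_dose - beta * d * total_dose)
        return math.exp(-n_clonogens * sf)

    def dose_for_tcp(target_tcp, n_clonogens):
        # Smallest total dose (in 2 Gy steps) reaching the target control probability.
        dose = 0.0
        while tcp(dose, n_clonogens) < target_tcp:
            dose += d
        return dose

    for n in (1e4, 1e6, 1e8):            # microscopic vs gross disease (illustrative)
        print(f"N0 = {n:.0e}: ~{dose_for_tcp(0.90, n):.0f} Gy for 90% control")
    ```

    With these assumed parameters the dose needed for 90% control grows roughly logarithmically with the number of clonogens, which is the rationale for prescribing lower doses to low-volume disease.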

  6. Volume and Value of Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions.

  7. Volume and Value of Big Healthcare Data

    PubMed Central

    Dinov, Ivo D.

    2016-01-01

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws of exponential increase of computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions like: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309

  8. Big Data Analytics Methodology in the Financial Industry

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony

    2017-01-01

    Firms in industry continue to be attracted by the benefits of Big Data Analytics. The benefits of Big Data Analytics projects may not be as evident as frequently indicated in the literature. The authors of the study evaluate factors in a customized methodology that may increase the benefits of Big Data Analytics projects. Evaluating firms in the…

  9. Big data: survey, technologies, opportunities, and challenges.

    PubMed

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Ali, Waleed Kamaleldin Mahmoud; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data.

  10. Big Data: Survey, Technologies, Opportunities, and Challenges

    PubMed Central

    Khan, Nawsher; Yaqoob, Ibrar; Hashem, Ibrahim Abaker Targio; Inayat, Zakira; Mahmoud Ali, Waleed Kamaleldin; Alam, Muhammad; Shiraz, Muhammad; Gani, Abdullah

    2014-01-01

    Big Data has gained much attention from the academia and the IT industry. In the digital and computing world, information is generated and collected at a rate that rapidly exceeds the boundary range. Currently, over 2 billion people worldwide are connected to the Internet, and over 5 billion individuals own mobile phones. By 2020, 50 billion devices are expected to be connected to the Internet. At this point, predicted data production will be 44 times greater than that in 2009. As information is transferred and shared at light speed on optic fiber and wireless networks, the volume of data and the speed of market growth increase. However, the fast growth rate of such large data generates numerous challenges, such as the rapid growth of data, transfer speed, diverse data, and security. Nonetheless, Big Data is still in its infancy stage, and the domain has not been reviewed in general. Hence, this study comprehensively surveys and classifies the various attributes of Big Data, including its nature, definitions, rapid growth rate, volume, management, analysis, and security. This study also proposes a data life cycle that uses the technologies and terminologies of Big Data. Future research directions in this field are determined based on opportunities and several open issues in Big Data domination. These research directions facilitate the exploration of the domain and the development of optimal techniques to address Big Data. PMID:25136682

  11. Use of Head Guards in AIBA Boxing Tournaments-A Cross-Sectional Observational Study.

    PubMed

    Loosemore, Michael P; Butler, Charles F; Khadri, Abdelhamid; McDonagh, David; Patel, Vimal A; Bailes, Julian E

    2017-01-01

    This study looks at the changes in injuries after the implementation of a new rule by the International Boxing Association (AIBA) to remove head guards from its competitions. A cross-sectional observational study performed prospectively. This brief report examines the removal of head guards in 2 different ways. The first was to examine the stoppages due to blows to the head by comparing World Series Boxing (WSB), without head guards, to other AIBA competitions with head guards. Secondly, we examined the last 3 world championships: 2009 and 2011 (with head guards) and 2013 (without head guards). World Series Boxing and AIBA world championship boxing. Boxers from WSB and AIBA world championships. The information was recorded by ringside medical physicians. Stoppages per 10 000 rounds; stoppages per 1000 hours. Both studies show that the number of stoppages due to head blows was significantly decreased without head guards. The studies also showed that there was a notable increase in cuts. Removing head guards may reduce the already small risk of acute brain injury in amateur boxing.

  12. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    NASA Astrophysics Data System (ADS)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

    Big Data Analytics is a big term nowadays. With the demand for more scalable processing and data generation capabilities, data acquisition and storage have become crucial issues. Cloud storage is a widely used platform; the technology will become crucial to executives handling data powered by analytics. The trend towards “big data-as-a-service” is now talked about everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost. On the other, researchers are working to solve security and other real-time problems of big data migration to cloud-based platforms. This article is specifically focused on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration, and the possibility of doing big data analytics on a cloud platform, is in demand for a new era of growth. This article also gives information about available technologies and techniques for migration of big data to the cloud.

  13. Curating Big Data Made Simple: Perspectives from Scientific Communities.

    PubMed

    Sowe, Sulayman K; Zettsu, Koji

    2014-03-01

    The digital universe is exponentially producing an unprecedented volume of data that has brought benefits as well as fundamental challenges for enterprises and scientific communities alike. This trend is inherently exciting for the development and deployment of cloud platforms to support scientific communities curating big data. The excitement stems from the fact that scientists can now access and extract value from the big data corpus, establish relationships between bits and pieces of information from many types of data, and collaborate with a diverse community of researchers from various domains. However, despite these perceived benefits, to date, little attention is focused on the people or communities who are both beneficiaries and, at the same time, producers of big data. The technical challenges posed by big data are as big as understanding the dynamics of communities working with big data, whether scientific or otherwise. Furthermore, the big data era also means that big data platforms for data-intensive research must be designed in such a way that research scientists can easily search and find data for their research, upload and download datasets for onsite/offsite use, perform computations and analysis, share their findings and research experience, and seamlessly collaborate with their colleagues. In this article, we present the architecture and design of a cloud platform that meets some of these requirements, and a big data curation model that describes how a community of earth and environmental scientists is using the platform to curate data. Motivation for developing the platform, lessons learnt in overcoming some challenges associated with supporting scientists to curate big data, and future research directions are also presented.

  14. Big data analytics in healthcare: promise and potential.

    PubMed

    Raghupathi, Wullianallur; Raghupathi, Viju

    2014-01-01

    To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however there remain challenges to overcome.

  15. Head lice

    MedlinePlus

    Pediculosis capitis - head lice ... Head lice infect hair on the head. Tiny eggs on the hair look like flakes of dandruff. However, ... flaking off the scalp, they stay in place. Head lice can live up to 30 days on a ...

  16. Big data are coming to psychiatry: a general introduction.

    PubMed

    Monteith, Scott; Glenn, Tasha; Geddes, John; Bauer, Michael

    2015-12-01

    Big data are coming to the study of bipolar disorder and all of psychiatry. Data are coming from providers and payers (including EMR, imaging, insurance claims and pharmacy data), from omics (genomic, proteomic, and metabolomic data), and from patients and non-providers (data from smart phone and Internet activities, sensors and monitoring tools). Analysis of the big data will provide unprecedented opportunities for exploration, descriptive observation, hypothesis generation, and prediction, and the results of big data studies will be incorporated into clinical practice. Technical challenges remain in the quality, analysis and management of big data. This paper discusses some of the fundamental opportunities and challenges of big data for psychiatry.

  17. Magnetic heading reference

    NASA Technical Reports Server (NTRS)

    Garner, H. D. (Inventor)

    1977-01-01

    This invention employs a magnetometer as a magnetic heading reference for a vehicle such as a small aircraft. The magnetometer is mounted on a directional dial in the aircraft in the vicinity of the pilot such that it is free to turn with the dial about the yaw axis of the aircraft. The invention includes a circuit for generating a signal proportional to the northerly turning error produced in the magnetometer due to the vertical component of the earth's magnetic field. This generated signal is then subtracted from the output of the magnetometer to compensate for the northerly turning error.

  18. Head Rotation Detection in Marmoset Monkeys

    NASA Astrophysics Data System (ADS)

    Simhadri, Sravanthi

    Head movement is known to have the benefit of improving the accuracy of sound localization for humans and animals. The marmoset is a small-bodied New World monkey species and it has become an emerging model for studying auditory functions. This thesis aims to detect the horizontal and vertical rotation of head movement in marmoset monkeys. Experiments were conducted in a sound-attenuated acoustic chamber. Head movement of marmoset monkeys was studied under various auditory and visual stimulation conditions. With increasing complexity, these conditions are (1) idle, (2) sound-alone, (3) sound and visual signals, and (4) alert signal by opening and closing of the chamber door. All of these conditions were tested with either house light on or off. An infrared camera with a frame rate of 90 Hz was used to capture the head movement of the monkeys. To assist signal detection, two circular markers were attached to the top of the monkey's head. The data analysis used an image-based marker detection scheme. Images were processed using the Computer Vision Toolbox in Matlab. The markers and their positions were detected using blob detection techniques. Based on the frame-by-frame information of marker positions, the angular position, velocity and acceleration were extracted in horizontal and vertical planes. Adaptive Otsu thresholding, Kalman filtering and bound setting for marker properties were used to overcome a number of challenges encountered during this analysis, such as finding the image segmentation threshold, continuously tracking markers during large head movements, and false alarm detection. The results show that the blob detection method together with Kalman filtering yielded better performance than other image-based techniques like optical flow and SURF features. The median of the maximal head turn in the horizontal plane was in the range of 20 to 70 degrees and the median of the maximal velocity in the horizontal plane was in the range of a few hundreds of degrees per
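    As a rough illustration of the angle-extraction step described above (two markers per frame, horizontal rotation from the line joining them, then differentiation for velocity), here is a minimal sketch with synthetic marker coordinates; the thesis itself used Matlab's Computer Vision Toolbox with blob detection and Kalman filtering rather than this simplified Python version.

    ```python
    import numpy as np

    # Synthetic marker centroids (pixels) for a few frames: two markers on the head.
    # In the study these came from blob detection on 90 Hz infrared video.
    frames = np.array([
        [[100.0, 200.0], [160.0, 200.0]],   # frame 0: markers roughly level
        [[101.0, 198.0], [160.5, 205.0]],   # frame 1: slight rotation
        [[103.0, 195.0], [161.0, 212.0]],   # frame 2
    ])
    fps = 90.0

    # Horizontal head angle: orientation of the line joining the two markers.
    d = frames[:, 1, :] - frames[:, 0, :]                 # vector marker1 -> marker2
    angle_deg = np.degrees(np.arctan2(d[:, 1], d[:, 0]))  # per-frame angle

    velocity_deg_s = np.gradient(angle_deg) * fps         # angular velocity estimate
    print("angles (deg):    ", np.round(angle_deg, 2))
    print("velocity (deg/s):", np.round(velocity_deg_s, 1))
    ```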

  19. True Randomness from Big Data.

    PubMed

    Papakonstantinou, Periklis A; Woodruff, David P; Yang, Guang

    2016-09-26

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.
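    The paper's extractor construction is not described in the abstract; purely as a contrast, the sketch below shows the classic von Neumann extractor, a much simpler classical scheme that does rely on a statistical assumption (independent, identically biased bits), exactly the kind of assumption the authors aim to avoid.

    ```python
    import random

    def von_neumann_extract(bits):
        """Classic von Neumann extractor: for each non-overlapping pair,
        emit 0 for (0,1), 1 for (1,0), and discard (0,0) and (1,1).
        Assumes independent bits with a fixed (unknown) bias."""
        out = []
        for a, b in zip(bits[0::2], bits[1::2]):
            if a != b:
                out.append(a)
        return out

    # Example: a heavily biased source still yields unbiased output bits.
    random.seed(1)
    biased = [1 if random.random() < 0.8 else 0 for _ in range(10000)]
    extracted = von_neumann_extract(biased)
    print(len(extracted), sum(extracted) / len(extracted))   # fraction of ones ~0.5
    ```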

  20. True Randomness from Big Data

    NASA Astrophysics Data System (ADS)

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-09-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests.

  1. True Randomness from Big Data

    PubMed Central

    Papakonstantinou, Periklis A.; Woodruff, David P.; Yang, Guang

    2016-01-01

    Generating random bits is a difficult task, which is important for physical systems simulation, cryptography, and many applications that rely on high-quality random bits. Our contribution is to show how to generate provably random bits from uncertain events whose outcomes are routinely recorded in the form of massive data sets. These include scientific data sets, such as in astronomics, genomics, as well as data produced by individuals, such as internet search logs, sensor networks, and social network feeds. We view the generation of such data as the sampling process from a big source, which is a random variable of size at least a few gigabytes. Our view initiates the study of big sources in the randomness extraction literature. Previous approaches for big sources rely on statistical assumptions about the samples. We introduce a general method that provably extracts almost-uniform random bits from big sources and extensively validate it empirically on real data sets. The experimental findings indicate that our method is efficient enough to handle large enough sources, while previous extractor constructions are not efficient enough to be practical. Quality-wise, our method at least matches quantum randomness expanders and classical world empirical extractors as measured by standardized tests. PMID:27666514

  2. Auditory compensation for head rotation is incomplete.

    PubMed

    Freeman, Tom C A; Culling, John F; Akeroyd, Michael A; Brimijoin, W Owen

    2017-02-01

    Hearing is confronted by a similar problem to vision when the observer moves. The image motion that is created remains ambiguous until the observer knows the velocity of eye and/or head. One way the visual system solves this problem is to use motor commands, proprioception, and vestibular information. These "extraretinal signals" compensate for self-movement, converting image motion into head-centered coordinates, although not always perfectly. We investigated whether the auditory system also transforms coordinates by examining the degree of compensation for head rotation when judging a moving sound. Real-time recordings of head motion were used to change the "movement gain" relating head movement to source movement across a loudspeaker array. We then determined psychophysically the gain that corresponded to a perceptually stationary source. Experiment 1 showed that the gain was small and positive for a wide range of trained head speeds. Hence, listeners perceived a stationary source as moving slightly opposite to the head rotation, in much the same way that observers see stationary visual objects move against a smooth pursuit eye movement. Experiment 2 showed the degree of compensation remained the same for sounds presented at different azimuths, although the precision of performance declined when the sound was eccentric. We discuss two possible explanations for incomplete compensation, one based on differences in the accuracy of signals encoding image motion and self-movement and one concerning statistical optimization that sacrifices accuracy for precision. We then consider the degree to which such explanations can be applied to auditory motion perception in moving listeners. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  3. Big Data, Big Problems: Incorporating Mission, Values, and Culture in Provider Affiliations.

    PubMed

    Shaha, Steven H; Sayeed, Zain; Anoushiravani, Afshin A; El-Othmani, Mouhanad M; Saleh, Khaled J

    2016-10-01

    This article explores how integration of data from clinical registries and electronic health records produces a quality impact within orthopedic practices. Data are differentiated from information, and several types of data that are collected and used in orthopedic outcome measurement are defined. Furthermore, the concept of comparative effectiveness and its impact on orthopedic clinical research are assessed. This article places emphasis on how the concept of big data produces health care challenges balanced with benefits that may be faced by patients and orthopedic surgeons. Finally, essential characteristics of an electronic health record that interlinks musculoskeletal care and big data initiatives are reviewed. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. AirMSPI PODEX BigSur Terrain Images

    Atmospheric Science Data Center

    2013-12-13

    ... Browse Images from the PODEX 2013 Campaign: Big Sur target (Big Sur, California), 02/03/2013, Terrain-projected. For more information, see the Data Product Specifications (DPS) ...

  5. A New Look at Big History

    ERIC Educational Resources Information Center

    Hawkey, Kate

    2014-01-01

    The article sets out a "big history" which resonates with the priorities of our own time. A globalizing world calls for new spacial scales to underpin what the history curriculum addresses, "big history" calls for new temporal scales, while concern over climate change calls for a new look at subject boundaries. The article…

  6. West Virginia's big trees: setting the record straight

    Treesearch

    Melissa Thomas-Van Gundy; Robert Whetsell

    2016-01-01

    People love big trees, people love to find big trees, and people love to find big trees in the place they call home. Having been suspicious for years, my coauthor and historian Rob Whetsell approached me with a species identification challenge. There are several photographs of giant trees used by many people to illustrate the past forests of West Virginia,...

  7. Markerless rat head motion tracking using structured light for brain PET imaging of unrestrained awake small animals

    NASA Astrophysics Data System (ADS)

    Miranda, Alan; Staelens, Steven; Stroobants, Sigrid; Verhaeghe, Jeroen

    2017-03-01

    Preclinical positron emission tomography (PET) imaging in small animals is generally performed under anesthesia to immobilize the animal during scanning. More recently, for rat brain PET studies, methods to perform scans of unrestrained awake rats are being developed in order to avoid the unwanted effects of anesthesia on the brain response. Here, we investigate the use of a projected structure stereo camera to track the motion of the rat head during the PET scan. The motion information is then used to correct the PET data. The stereo camera calculates a 3D point cloud representation of the scene and the tracking is performed by point cloud matching using the iterative closest point algorithm. The main advantage of the proposed motion tracking is that no intervention, e.g. for marker attachment, is needed. A manually moved microDerenzo phantom experiment and 3 awake rat [18F]FDG experiments were performed to evaluate the proposed tracking method. The tracking accuracy was 0.33 mm rms. After motion-corrected image reconstruction, the microDerenzo phantom was recovered, albeit with some loss of resolution. The reconstructed FWHM of the 2.5 and 3 mm rods increased by 0.94 and 0.51 mm, respectively, in comparison with the motion-free case. In the rat experiments, the average tracking success rate was 64.7%. The correlation of relative brain regional [18F]FDG uptake between the anesthesia and awake scan reconstructions was increased from on average 0.291 (not significant) before correction to 0.909 (p < 0.0001) after motion correction. Markerless motion tracking using structured light can be successfully used for tracking of the rat head for motion correction in awake rat PET scans.
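    The core computation inside the iterative-closest-point matching mentioned above is estimating the rigid rotation and translation that best align two matched point sets. A minimal SVD-based (Kabsch) sketch is shown below; a full ICP pipeline would additionally establish point correspondences and iterate, so this is an illustration rather than the authors' implementation.

    ```python
    import numpy as np

    def best_rigid_transform(src, dst):
        """Least-squares rotation R and translation t mapping src onto dst
        (points are N x 3 arrays with known one-to-one correspondence)."""
        c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
        H = (src - c_src).T @ (dst - c_dst)          # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                     # fix possible reflection
            Vt[-1, :] *= -1
            R = Vt.T @ U.T
        t = c_dst - R @ c_src
        return R, t

    # Quick self-check with a synthetic 3D point cloud and a known motion.
    rng = np.random.default_rng(0)
    cloud = rng.normal(size=(200, 3))
    theta = np.radians(10.0)
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
    moved = cloud @ R_true.T + np.array([0.5, -0.2, 0.1])
    R_est, t_est = best_rigid_transform(cloud, moved)
    print(np.allclose(R_est, R_true, atol=1e-6), np.round(t_est, 3))
    ```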

  8. Multilevel groundwater monitoring of hydraulic head and temperature in the eastern Snake River Plain aquifer, Idaho National Laboratory, Idaho, 2009–10

    USGS Publications Warehouse

    Twining, Brian V.; Fisher, Jason C.

    2012-01-01

    During 2009 and 2010, the U.S. Geological Survey’s Idaho National Laboratory Project Office, in cooperation with the U.S. Department of Energy, collected quarterly, depth-discrete measurements of fluid pressure and temperature in nine boreholes located in the eastern Snake River Plain aquifer. Each borehole was instrumented with a multilevel monitoring system consisting of a series of valved measurement ports, packer bladders, casing segments, and couplers. Multilevel monitoring at the Idaho National Laboratory has been ongoing since 2006. This report summarizes data collected from three multilevel monitoring wells installed during 2009 and 2010 and presents updates to six multilevel monitoring wells. Hydraulic heads (heads) and groundwater temperatures were monitored from 9 multilevel monitoring wells, including 120 hydraulically isolated depth intervals from 448.0 to 1,377.6 feet below land surface. Quarterly head and temperature profiles reveal unique patterns for vertical examination of the aquifer’s complex basalt and sediment stratigraphy, proximity to aquifer recharge and discharge, and groundwater flow. These features contribute to some of the localized variability even though the general profile shape remained consistent over the period of record. Major inflections in the head profiles almost always coincided with low-permeability sediment layers and occasionally thick sequences of dense basalt. However, the presence of a sediment layer or dense basalt layer was insufficient for identifying the location of a major head change within a borehole without knowing the true areal extent and relative transmissivity of the lithologic unit. Temperature profiles for boreholes completed within the Big Lost Trough indicate linear conductive trends; whereas, temperature profiles for boreholes completed within the axial volcanic high indicate mostly convective heat transfer resulting from the vertical movement of groundwater. Additionally, temperature profiles

  9. Head Lice

    MedlinePlus

    What are head lice? Head lice are tiny insects that live on people's heads. Adult lice are about the size of sesame seeds. The eggs, called ... often at the neckline and behind the ears. Head lice are parasites, and they need to feed on ...

  10. 77 FR 49779 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-17

    ... DEPARTMENT OF AGRICULTURE Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee... Big Horn County Weed and Pest Building, 4782 Highway 310, Greybull, Wyoming. Written comments about...

  11. 75 FR 71069 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... DEPARTMENT OF AGRICULTURE Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee... held at the Big Horn County Weed and Pest Building, 4782 Highway 310, Greybull, Wyoming. Written...

  12. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    ScienceCinema

    None

    2017-12-09

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe

  13. The big data challenges of connectomics

    PubMed Central

    Lichtman, Jeff W; Pfister, Hanspeter; Shavit, Nir

    2015-01-01

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces ‘big data’, unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them. PMID:25349911

  14. Reviews Book: Nucleus Book: The Wonderful World of Relativity Book: Head Shot Book: Cosmos Close-Up Places to Visit: Physics DemoLab Book: Quarks, Leptons and the Big Bang EBook: Shooting Stars Equipment: Victor 70C USB Digital Multimeter Web Watch

    NASA Astrophysics Data System (ADS)

    2012-09-01

    WE RECOMMEND Nucleus: A Trip into the Heart of Matter A coffee-table book for everyone to dip into and learn from The Wonderful World of Relativity A charming, stand-out introduction to relativity The Physics DemoLab, National University of Singapore A treasure trove of physics for hands-on science experiences Quarks, Leptons and the Big Bang Perfect to polish up on particle physics for older students Victor 70C USB Digital Multimeter Equipment impresses for usability and value WORTH A LOOK Cosmos Close-Up Weighty tour of the galaxy that would make a good display Shooting Stars Encourage students to try astrophotography with this ebook HANDLE WITH CARE Head Shot: The Science Behind the JFK Assassination Exploration of the science behind the crime fails to impress WEB WATCH App-lied science for education: a selection of free Android apps are reviewed and iPhone app options are listed

  15. Outcomes of fetuses with small head circumference on second-trimester ultrasonography.

    PubMed

    Deloison, Benjamin; Chalouhi, Gihad E; Bernard, Jean-Pierre; Ville, Yves; Salomon, Laurent J

    2012-09-01

    We examined the outcomes of pregnancies in which the fetal head circumference (HC) was below the 5th centile at the routine second-trimester scan. We retrospectively analysed outcomes of 18,377 women according to HC Z scores at second-trimester ultrasound examination between 2001 and 2008. We collected all major malformations, intrauterine deaths and other abnormal outcomes. Six hundred seventy-four fetuses (3.7%) had an HC below the 5th centile. Twenty-one major malformations were noted, consisting mainly of neurological abnormalities (3.1%). There were seven intrauterine fetal deaths (1.3%). Of all the fetuses, 26% were lost to follow-up. Outcome and neurological development were normal in 467 cases, based on neonatal examination and/or parent or general practitioner reports. Major abnormalities were noted in 26.2%, 3.0% and 1.1% of fetuses with Z scores < -2.5, -2.5 to -2.0, and -2.0 to -1.645, respectively, compared with 0.3% of fetuses with normal HC (p < 10^-4). A head circumference below the 5th centile at the second-trimester scan is associated with various abnormalities, especially neurological disorders. The outcome was worse when the HC was smaller. An HC Z score below -2.5 was strongly associated with neurological and chromosomal abnormalities. Conversely, an HC Z score below -1.645 but above -2, excluding cases with prenatally diagnosed malformations, seems to be reassuring for a favorable neonatal outcome. © 2012 John Wiley & Sons, Ltd.
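    The classification above rests on head-circumference Z scores against a gestational-age reference; a minimal sketch of that logic follows, with a hypothetical reference mean and SD rather than the biometric charts used in the study.

    ```python
    # Sketch of the Z-score logic behind the head-circumference classification.
    # The reference mean and SD are placeholder values for illustration,
    # not the biometric charts used in the study.

    def hc_z_score(hc_mm, mean_mm, sd_mm):
        return (hc_mm - mean_mm) / sd_mm

    # Hypothetical reference at a mid-second-trimester gestational age.
    ref_mean_mm, ref_sd_mm = 200.0, 9.0

    for hc in (205.0, 184.0, 178.0, 176.0):
        z = hc_z_score(hc, ref_mean_mm, ref_sd_mm)
        if z < -2.5:
            group = "Z < -2.5"
        elif z < -2.0:
            group = "-2.5 <= Z < -2.0"
        elif z < -1.645:
            group = "-2.0 <= Z < -1.645 (below 5th centile)"
        else:
            group = "within normal range"
        print(f"HC {hc:.0f} mm -> Z = {z:+.2f} ({group})")
    ```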

  16. Harvesting small stems -- A Southern USA perspective

    Treesearch

    William F. Watson; Bryce J. Stokes

    1989-01-01

    Operations that harvest small stems using conventional equipment are discussed. A typical operation consists of rubber-tired feller-bunchers with shear heads, rubber-tired grapple skidders, and in-woods chippers. These systems harvest the small stems either in a pre-harvest, postharvest, or integrated-harvest method.

  17. The p53-Reactivating Small Molecule RITA Induces Senescence in Head and Neck Cancer Cells

    PubMed Central

    Chuang, Hui-Ching; Yang, Liang Peng; Fitzgerald, Alison L.; Osman, Abdullah; Woo, Sang Hyeok; Myers, Jeffrey N.; Skinner, Heath D.

    2014-01-01

    TP53 is the most commonly mutated gene in head and neck cancer (HNSCC), with mutations being associated with resistance to conventional therapy. Restoring normal p53 function has previously been investigated via the use of RITA (reactivation of p53 and induction of tumor cell apoptosis), a small molecule that induces a conformational change in p53, leading to activation of its downstream targets. In the current study we found that RITA indeed exerts significant effects in HNSCC cells. However, in this model, we found that a significant outcome of RITA treatment was accelerated senescence. RITA induced senescence in a variety of p53 backgrounds, including p53-null cells. Also, inhibition of p53 expression did not appear to significantly inhibit RITA-induced senescence. Thus, this phenomenon appears to be partially p53-independent. Additionally, RITA-induced senescence appears to be partially mediated by activation of the DNA damage response and SIRT1 (Silent information regulator T1) inhibition, with a synergistic effect seen by combining either ionizing radiation or SIRT1 inhibition with RITA treatment. These data point toward a novel mechanism of RITA function as well as hint at its possible therapeutic benefit in HNSCC. PMID:25119136

  18. Toward a manifesto for the 'public understanding of big data'.

    PubMed

    Michael, Mike; Lupton, Deborah

    2016-01-01

    In this article, we sketch a 'manifesto' for the 'public understanding of big data'. On the one hand, this entails such public understanding of science and public engagement with science and technology-tinged questions as follows: How, when and where are people exposed to, or do they engage with, big data? Who are regarded as big data's trustworthy sources, or credible commentators and critics? What are the mechanisms by which big data systems are opened to public scrutiny? On the other hand, big data generate many challenges for public understanding of science and public engagement with science and technology: How do we address publics that are simultaneously the informant, the informed and the information of big data? What counts as understanding of, or engagement with, big data, when big data themselves are multiplying, fluid and recursive? As part of our manifesto, we propose a range of empirical, conceptual and methodological exhortations. We also provide Appendix 1 that outlines three novel methods for addressing some of the issues raised in the article. © The Author(s) 2015.

  19. An embedding for the big bang

    NASA Technical Reports Server (NTRS)

    Wesson, Paul S.

    1994-01-01

    A cosmological model is given that has good physical properties for the early and late universe but is a hypersurface in a flat five-dimensional manifold. The big bang can therefore be regarded as an effect of a choice of coordinates in a truncated higher-dimensional geometry. Thus the big bang is in some sense a geometrical illusion.

  20. Heading Tuning in Macaque Area V6.

    PubMed

    Fan, Reuben H; Liu, Sheng; DeAngelis, Gregory C; Angelaki, Dora E

    2015-12-16

    Cortical areas, such as the dorsal subdivision of the medial superior temporal area (MSTd) and the ventral intraparietal area (VIP), have been shown to integrate visual and vestibular self-motion signals. Area V6 is interconnected with areas MSTd and VIP, allowing for the possibility that V6 also integrates visual and vestibular self-motion cues. An alternative hypothesis in the literature is that V6 does not use these sensory signals to compute heading but instead discounts self-motion signals to represent object motion. However, the responses of V6 neurons to visual and vestibular self-motion cues have never been studied, thus leaving the functional roles of V6 unclear. We used a virtual reality system to examine the 3D heading tuning of macaque V6 neurons in response to optic flow and inertial motion stimuli. We found that the majority of V6 neurons are selective for heading defined by optic flow. However, unlike areas MSTd and VIP, V6 neurons are almost universally unresponsive to inertial motion in the absence of optic flow. We also explored the spatial reference frames of heading signals in V6 by measuring heading tuning for different eye positions, and we found that the visual heading tuning of most V6 cells was eye-centered. Similar to areas MSTd and VIP, the population of V6 neurons was best able to discriminate small variations in heading around forward and backward headings. Our findings support the idea that V6 is involved primarily in processing visual motion signals and does not appear to play a role in visual-vestibular integration for self-motion perception. To understand how we successfully navigate our world, it is important to understand which parts of the brain process cues used to perceive our direction of self-motion (i.e., heading). Cortical area V6 has been implicated in heading computations based on human neuroimaging data, but direct measurements of heading selectivity in individual V6 neurons have been lacking. We provide the first
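
    Heading tuning of the kind measured in the record above is often summarized by fitting a circular, von Mises-style tuning function to firing rates collected across headings. The short Python sketch below fits such a curve to synthetic data; the tuning function, parameter values and data are assumptions of this illustration and are unrelated to the recorded V6 dataset.

      # Illustrative fit of a von Mises-style heading tuning curve to synthetic rates.
      # The tuning function and the synthetic data are assumptions of this sketch.
      import numpy as np
      from scipy.optimize import curve_fit

      def tuning(theta, baseline, amp, kappa, pref):
          """Firing rate (spikes/s) as a function of heading angle (radians)."""
          return baseline + amp * np.exp(kappa * (np.cos(theta - pref) - 1.0))

      rng = np.random.default_rng(1)
      headings = np.deg2rad(np.arange(0, 360, 30))                 # 12 tested headings
      true_rates = tuning(headings, 5.0, 40.0, 2.0, np.deg2rad(90))
      rates = true_rates + rng.normal(0, 2.0, headings.size)       # add measurement noise

      params, _ = curve_fit(tuning, headings, rates, p0=[5, 30, 1, 1.0])
      print("preferred heading ≈ %.1f deg" % np.rad2deg(params[3] % (2 * np.pi)))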

  1. 76 FR 26240 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-06

    ... words Big Horn County RAC in the subject line. Facsimiles may be sent to 307-674-2668. All comments... DEPARTMENT OF AGRICULTURE Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee...

  2. The method of UCN "small heating" measurement in the big gravitational spectrometer (BGS) and studies of this effect on Fomblin oil Y-HVAC 18/8

    NASA Astrophysics Data System (ADS)

    Nesvizhevsky, V. V.; Voronin, A. Yu.; Lambrecht, A.; Reynaud, S.; Lychagin, E. V.; Muzychka, A. Yu.; Nekhaev, G. V.; Strelkov, A. V.

    2018-02-01

    The Big Gravitational Spectrometer (BGS) takes advantage of the strong influence of the Earth's gravity on the motion of ultracold neutrons (UCNs) that makes it possible to shape and measure UCN spectra. We optimized the BGS to investigate the "small heating" of UCNs, that is, the inelastic reflection of UCNs from a surface accompanied by an energy change comparable with the initial UCN energy. UCNs whose energy increases are referred to as "Vaporized UCNs" (VUCNs). The BGS provides the narrowest UCN spectra of a few cm and the broadest "visible" VUCN energy range of up to ~150 cm (UCN energy is given in units of its maximum height in the Earth's gravitational field, where 1.00 cm ≈ 1.02 neV). The dead-zone between the UCN and VUCN spectra is the narrowest ever achieved (a few cm). We performed measurements with and without samples without breaking vacuum. BGS provides the broadest range of temperatures (77-600 K) and the highest sensitivity to the small heating effect, up to ~10⁻⁸ per bounce, i.e., two orders of magnitude higher than the sensitivity of alternative methods. We describe the method to measure the probability of UCN "small heating" using the BGS and illustrate it with a study of samples of the hydrogen-free oil Fomblin Y-HVAC 18/8. The data obtained are well reproducible, do not depend on sample thickness, and do not evolve over time. The measured model-independent probability P+ of UCN small heating from an energy "mono-line" 30.2 ± 2.5 cm to the energy range 35-140 cm is in the range (1.05 ± 0.02_stat) × 10⁻⁵ to (1.31 ± 0.24_stat) × 10⁻⁵ at a temperature of 24 °C. The associated systematic uncertainty would disappear if a VUCN spectrum shape were known, for instance, from a particular model of small heating. This experiment provides the most precise and reliable value of small heating probability on Fomblin measured so far. These results are of importance for studies of UCN small heating as well as for analyzing and designing neutron
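
    As a cross-check of the unit convention quoted above (UCN energy expressed as the maximum rise height in the Earth's gravitational field, with 1.00 cm ≈ 1.02 neV), the conversion follows from E = m·g·h for a neutron. The short Python sketch below reproduces the quoted figure; the constants and the 1 cm test height are assumptions of this illustration, not values taken from the record.

      # Illustrative check of the UCN energy-height convention E = m_n * g * h.
      m_n = 1.674927e-27   # neutron mass, kg
      g   = 9.80665        # standard gravity, m/s^2
      eV  = 1.602177e-19   # joules per electronvolt

      h = 0.01                          # rise height of 1.00 cm, in metres
      E_neV = m_n * g * h / eV * 1e9    # corresponding energy in nano-electronvolts

      print(f"1.00 cm of rise height corresponds to about {E_neV:.3f} neV")
      # -> about 1.025 neV, consistent with the 1.00 cm ≈ 1.02 neV convention above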

  3. The method of UCN "small heating" measurement in the big gravitational spectrometer (BGS) and studies of this effect on Fomblin oil Y-HVAC 18/8.

    PubMed

    Nesvizhevsky, V V; Voronin, A Yu; Lambrecht, A; Reynaud, S; Lychagin, E V; Muzychka, A Yu; Nekhaev, G V; Strelkov, A V

    2018-02-01

    The Big Gravitational Spectrometer (BGS) takes advantage of the strong influence of the Earth's gravity on the motion of ultracold neutrons (UCNs) that makes it possible to shape and measure UCN spectra. We optimized the BGS to investigate the "small heating" of UCNs, that is, the inelastic reflection of UCNs from a surface accompanied by an energy change comparable with the initial UCN energy. UCNs whose energy increases are referred to as "Vaporized UCNs" (VUCNs). The BGS provides the narrowest UCN spectra of a few cm and the broadest "visible" VUCN energy range of up to ∼150 cm (UCN energy is given in units of its maximum height in the Earth's gravitational field, where 1.00 cm ≈ 1.02 neV). The dead-zone between the UCN and VUCN spectra is the narrowest ever achieved (a few cm). We performed measurements with and without samples without breaking vacuum. BGS provides the broadest range of temperatures (77-600 K) and the highest sensitivity to the small heating effect, up to ∼10⁻⁸ per bounce, i.e., two orders of magnitude higher than the sensitivity of alternative methods. We describe the method to measure the probability of UCN "small heating" using the BGS and illustrate it with a study of samples of the hydrogen-free oil Fomblin Y-HVAC 18/8. The data obtained are well reproducible, do not depend on sample thickness, and do not evolve over time. The measured model-independent probability P+ of UCN small heating from an energy "mono-line" 30.2 ± 2.5 cm to the energy range 35-140 cm is in the range (1.05 ± 0.02_stat) × 10⁻⁵ to (1.31 ± 0.24_stat) × 10⁻⁵ at a temperature of 24 °C. The associated systematic uncertainty would disappear if a VUCN spectrum shape were known, for instance, from a particular model of small heating. This experiment provides the most precise and reliable value of small heating probability on Fomblin measured so far. These results are of importance for studies of UCN small heating as well as for analyzing and designing neutron lifetime

  4. Addition of lysophospholipids with large head groups to cells inhibits Shiga toxin binding.

    PubMed

    Ailte, Ieva; Lingelem, Anne Berit Dyve; Kavaliauskiene, Simona; Bergan, Jonas; Kvalvaag, Audun Sverre; Myrann, Anne-Grethe; Skotland, Tore; Sandvig, Kirsten

    2016-07-26

    Shiga toxin (Stx), an AB5 toxin, binds specifically to the neutral glycosphingolipid Gb3 at the cell surface before being transported into cells. We here demonstrate that the addition of conical lysophospholipids (LPLs) with large head groups inhibits Stx binding to cells, whereas LPLs with small head groups do not. Lysophosphatidylinositol (LPI 18:0), the most efficient LPL with the largest head group, was selected for in-depth investigations to study how the binding of Stx is regulated. We show that the inhibition of Stx binding by LPI is reversible and possibly regulated by cholesterol since addition of methyl-β-cyclodextrin (mβCD) reversed the ability of LPI to inhibit binding. LPI-induced inhibition of Stx binding is independent of signalling and membrane turnover as it occurs in fixed cells as well as after depletion of cellular ATP. Furthermore, data obtained with fluorescent membrane dyes suggest that LPI treatment has a direct effect on plasma membrane lipid packing, with a shift towards a liquid-disordered phase in the outer leaflet, while lysophosphoethanolamine (LPE), which has a small head group, does not. In conclusion, our data show that cellular treatment with conical LPLs with large head groups changes intrinsic properties of the plasma membrane and modulates Stx binding to Gb3.

  5. Firing properties of rat lateral mammillary single units: head direction, head pitch, and angular head velocity.

    PubMed

    Stackman, R W; Taube, J S

    1998-11-01

    Many neurons in the rat anterodorsal thalamus (ADN) and postsubiculum (PoS) fire selectively when the rat points its head in a specific direction in the horizontal plane, independent of the animal's location and ongoing behavior. The lateral mammillary nuclei (LMN) are interconnected with both the ADN and PoS and, therefore, are in a pivotal position to influence ADN/PoS neurophysiology. To further understand how the head direction (HD) cell signal is generated, we recorded single neurons from the LMN of freely moving rats. The majority of cells discharged as a function of one of three types of spatial correlates: (1) directional heading, (2) head pitch, or (3) angular head velocity (AHV). LMN HD cells exhibited higher peak firing rates and greater range of directional firing than that of ADN and PoS HD cells. LMN HD cells were modulated by angular head velocity, turning direction, and anticipated the rat's future HD by a greater amount of time (approximately 95 msec) than that previously reported for ADN HD cells (approximately 25 msec). Most head pitch cells discharged when the rostrocaudal axis of the rat's head was orthogonal to the horizontal plane. Head pitch cell firing was independent of the rat's location, directional heading, and its body orientation (i.e., the cell discharged whenever the rat pointed its head up, whether standing on all four limbs or rearing). AHV cells were categorized as fast or slow AHV cells depending on whether their firing rate increased or decreased in proportion to angular head velocity. These data demonstrate that LMN neurons code direction and angular motion of the head in both horizontal and vertical planes and support the hypothesis that the LMN play an important role in processing both egocentric and allocentric spatial information.

  6. Brief report: a preliminary study of fetal head circumference growth in autism spectrum disorder.

    PubMed

    Whitehouse, Andrew J O; Hickey, Martha; Stanley, Fiona J; Newnham, John P; Pennell, Craig E

    2011-01-01

    Fetal head circumference (HC) growth was examined prospectively in children with autism spectrum disorder (ASD). ASD participants (N = 14) were each matched with four control participants (N = 56) on a range of parameters known to influence fetal growth. HC was measured using ultrasonography at approximately 18 weeks gestation and again at birth using a paper tape-measure. Overall body size was indexed by fetal femur-length and birth length. There was no between-groups difference in head circumference at either time-point. While a small number of children with ASD had disproportionately large head circumference relative to body size at both time-points, the between-groups difference did not reach statistical significance in this small sample. These preliminary findings suggest that further investigation of fetal growth in ASD is warranted.

  7. Commentary: Epidemiology in the era of big data.

    PubMed

    Mooney, Stephen J; Westreich, Daniel J; El-Sayed, Abdulrahman M

    2015-05-01

    Big Data has increasingly been promoted as a revolutionary development in the future of science, including epidemiology. However, the definition and implications of Big Data for epidemiology remain unclear. We here provide a working definition of Big Data predicated on the so-called "three V's": variety, volume, and velocity. From this definition, we argue that Big Data has evolutionary and revolutionary implications for identifying and intervening on the determinants of population health. We suggest that as more sources of diverse data become publicly available, the ability to combine and refine these data to yield valid answers to epidemiologic questions will be invaluable. We conclude that while epidemiology as practiced today will continue to be practiced in the Big Data future, a component of our field's future value lies in integrating subject matter knowledge with increased technical savvy. Our training programs and our visions for future public health interventions should reflect this future.

  8. Big Data and the Future of Radiology Informatics.

    PubMed

    Kansagra, Akash P; Yu, John-Paul J; Chatterjee, Arindam R; Lenchik, Leon; Chow, Daniel S; Prater, Adam B; Yeh, Jean; Doshi, Ankur M; Hawkins, C Matthew; Heilbrun, Marta E; Smith, Stacy E; Oselkin, Martin; Gupta, Pushpender; Ali, Sayed

    2016-01-01

    Rapid growth in the amount of data that is electronically recorded as part of routine clinical operations has generated great interest in the use of Big Data methodologies to address clinical and research questions. These methods can efficiently analyze and deliver insights from high-volume, high-variety, and high-growth rate datasets generated across the continuum of care, thereby forgoing the time, cost, and effort of more focused and controlled hypothesis-driven research. By virtue of an existing robust information technology infrastructure and years of archived digital data, radiology departments are particularly well positioned to take advantage of emerging Big Data techniques. In this review, we describe four areas in which Big Data is poised to have an immediate impact on radiology practice, research, and operations. In addition, we provide an overview of the Big Data adoption cycle and describe how academic radiology departments can promote Big Data development. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.

  9. Natural regeneration processes in big sagebrush (Artemisia tridentata)

    USGS Publications Warehouse

    Schlaepfer, Daniel R.; Lauenroth, William K.; Bradford, John B.

    2014-01-01

    Big sagebrush, Artemisia tridentata Nuttall (Asteraceae), is the dominant plant species of large portions of semiarid western North America. However, much of historical big sagebrush vegetation has been removed or modified. Thus, regeneration is recognized as an important component for land management. Limited knowledge about key regeneration processes, however, represents an obstacle to identifying successful management practices and to gaining greater insight into the consequences of increasing disturbance frequency and global change. Therefore, our objective is to synthesize knowledge about natural big sagebrush regeneration. We identified and characterized the controls of big sagebrush seed production, germination, and establishment. The largest knowledge gaps and associated research needs include quiescence and dormancy of embryos and seedlings; variation in seed production and germination percentages; wet-thermal time model of germination; responses to frost events (including freezing/thawing of soils), CO2 concentration, and nutrients in combination with water availability; suitability of microsite vs. site conditions; competitive ability as well as seedling growth responses; and differences among subspecies and ecoregions. Potential impacts of climate change on big sagebrush regeneration could include that temperature increases may not have a large direct influence on regeneration due to the broad temperature optimum for regeneration, whereas indirect effects could include selection for populations with less stringent seed dormancy. Drier conditions will have direct negative effects on germination and seedling survival and could also lead to lighter seeds, which lowers germination success further. The short seed dispersal distance of big sagebrush may limit its tracking of suitable climate; whereas, the low competitive ability of big sagebrush seedlings may limit successful competition with species that track climate. An improved understanding of the

  10. Big Data Provenance: Challenges, State of the Art and Opportunities.

    PubMed

    Wang, Jianwu; Crawl, Daniel; Purawat, Shweta; Nguyen, Mai; Altintas, Ilkay

    2015-01-01

    The ability to track provenance is a key feature of scientific workflows to support data lineage and reproducibility. The challenges introduced by the volume, variety and velocity of Big Data also pose related challenges for the provenance and quality of Big Data, defined as veracity. The increasing size and variety of distributed Big Data provenance information bring new technical challenges and opportunities throughout the provenance lifecycle including recording, querying, sharing and utilization. This paper discusses the challenges and opportunities of Big Data provenance related to the veracity of the datasets themselves and the provenance of the analytical processes that analyze these datasets. It also explains our current efforts towards tracking and utilizing Big Data provenance using workflows as a programming model to analyze Big Data.

  11. Big Data Provenance: Challenges, State of the Art and Opportunities

    PubMed Central

    Wang, Jianwu; Crawl, Daniel; Purawat, Shweta; Nguyen, Mai; Altintas, Ilkay

    2017-01-01

    The ability to track provenance is a key feature of scientific workflows to support data lineage and reproducibility. The challenges introduced by the volume, variety and velocity of Big Data also pose related challenges for the provenance and quality of Big Data, defined as veracity. The increasing size and variety of distributed Big Data provenance information bring new technical challenges and opportunities throughout the provenance lifecycle including recording, querying, sharing and utilization. This paper discusses the challenges and opportunities of Big Data provenance related to the veracity of the datasets themselves and the provenance of the analytical processes that analyze these datasets. It also explains our current efforts towards tracking and utilizing Big Data provenance using workflows as a programming model to analyze Big Data. PMID:29399671
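
    As an illustration of the step-level provenance capture the two records above describe (recording inputs, parameters, outputs and timing for each workflow step so that lineage can be queried later), here is a minimal, hypothetical Python sketch. The names (run_step, PROVENANCE) and the in-memory log are assumptions of this example and do not reflect the authors' workflow system.

      # Minimal, hypothetical sketch of step-level provenance capture in a workflow.
      import hashlib, json, time

      PROVENANCE = []  # in practice this log would be persisted, e.g. to a database

      def _digest(obj):
          """Stable fingerprint of a step's inputs/outputs for lineage queries."""
          payload = json.dumps(obj, sort_keys=True, default=str).encode()
          return hashlib.sha256(payload).hexdigest()[:12]

      def run_step(name, func, inputs, params=None):
          """Run one workflow step and record what ran, on what, and for how long."""
          params = params or {}
          start = time.time()
          outputs = func(inputs, **params)
          PROVENANCE.append({
              "step": name,
              "inputs": _digest(inputs),
              "params": params,
              "outputs": _digest(outputs),
              "seconds": round(time.time() - start, 4),
          })
          return outputs

      # Example two-step pipeline: filter a list of readings, then aggregate it.
      raw = [3, 7, 2, 9, 12, 5]
      clean = run_step("filter", lambda xs, threshold: [x for x in xs if x >= threshold],
                       raw, {"threshold": 5})
      total = run_step("aggregate", lambda xs: sum(xs), clean)

      print(total)                            # 33
      print(json.dumps(PROVENANCE, indent=2))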

  12. Simultaneous multi-headed imager geometry calibration method

    DOEpatents

    Tran, Vi-Hoa [Newport News, VA; Meikle, Steven Richard [Penshurst, AU; Smith, Mark Frederick [Yorktown, VA

    2008-02-19

    A method for calibrating multi-headed high sensitivity and high spatial resolution dynamic imaging systems, especially those useful in the acquisition of tomographic images of small animals. The method of the present invention comprises: simultaneously calibrating two or more detectors to the same coordinate system; and functionally correcting for unwanted detector movement due to gantry flexing.

  13. 1976 Big Thompson flood, Colorado

    USGS Publications Warehouse

    Jarrett, R. D.; Vandas, S.J.

    2006-01-01

    In the early evening of July 31, 1976, a large stationary thunderstorm released as much as 7.5 inches of rainfall in about an hour (about 12 inches in a few hours) in the upper reaches of the Big Thompson River drainage. This large amount of rainfall in such a short period of time produced a flash flood that caught residents and tourists by surprise. The immense volume of water that churned down the narrow Big Thompson Canyon scoured the river channel and destroyed everything in its path, including 418 homes, 52 businesses, numerous bridges, paved and unpaved roads, power and telephone lines, and many other structures. The tragedy claimed the lives of 144 people. Scores of other people narrowly escaped with their lives. The Big Thompson flood ranks among the deadliest of Colorado's recorded floods. It is one of several destructive floods in the United States that has shown the necessity of conducting research to determine the causes and effects of floods. The U.S. Geological Survey (USGS) conducts research and operates a Nationwide streamgage network to help understand and predict the magnitude and likelihood of large streamflow events such as the Big Thompson Flood. Such research and streamgage information are part of an ongoing USGS effort to reduce flood hazards and to increase public awareness.

  14. [Embracing medical innovation in the era of big data].

    PubMed

    You, Suning

    2015-01-01

    With the worldwide advent of the big data era, the medical field will inevitably be drawn into it. The current article introduces the basics of big data and points out that its advantages and disadvantages coexist. Although innovation in medicine is a struggle, the current pattern of medical practice will be changed fundamentally by big data. The article also describes how the relevant analyses are changing rapidly in the big data era, outlines a vision for digital medicine, and offers practical advice to surgeons.

  15. Application and Exploration of Big Data Mining in Clinical Medicine.

    PubMed

    Zhang, Yue; Guo, Shu-Li; Han, Li-Na; Li, Tie-Ling

    2016-03-20

    To review the theories and technologies of big data mining and their application in clinical medicine. Literature published in English or Chinese regarding the theories and technologies of big data mining and the concrete applications of data mining technology in clinical medicine was obtained from PubMed and the Chinese Hospital Knowledge Database from 1975 to 2015. Original articles regarding big data mining theory/technology and its application in the medical field were selected. This review characterized the basic theories and technologies of big data mining including fuzzy theory, rough set theory, cloud theory, Dempster-Shafer theory, artificial neural network, genetic algorithm, inductive learning theory, Bayesian network, decision tree, pattern recognition, high-performance computing, and statistical analysis. The application of big data mining in clinical medicine was analyzed in the fields of disease risk assessment, clinical decision support, prediction of disease development, guidance of rational use of drugs, medical management, and evidence-based medicine. Big data mining has the potential to play an important role in clinical medicine.
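
    Several of the techniques listed above (decision trees, Bayesian networks, artificial neural networks) are available in standard libraries. As a hedged illustration only, the sketch below fits a decision tree to a tiny synthetic "risk assessment" dataset with scikit-learn; the feature names and data are invented for the example and are not drawn from the reviewed literature.

      # Illustrative only: a decision tree on a synthetic clinical-style dataset.
      # Features (age, systolic BP, smoker) and labels are fabricated for the sketch.
      from sklearn.tree import DecisionTreeClassifier, export_text

      X = [
          [45, 120, 0],
          [63, 150, 1],
          [37, 118, 0],
          [70, 160, 1],
          [52, 135, 0],
          [58, 148, 1],
      ]
      y = [0, 1, 0, 1, 0, 1]  # 1 = "high risk" in this toy example

      clf = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
      print(export_text(clf, feature_names=["age", "systolic_bp", "smoker"]))
      print(clf.predict([[49, 142, 1]]))  # classify a new, hypothetical patient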

  16. Big Data in Public Health: Terminology, Machine Learning, and Privacy.

    PubMed

    Mooney, Stephen J; Pejaver, Vikas

    2018-04-01

    The digital world is generating data at a staggering and still increasing rate. While these "big data" have unlocked novel opportunities to understand public health, they hold still greater potential for research and practice. This review explores several key issues that have arisen around big data. First, we propose a taxonomy of sources of big data to clarify terminology and identify threads common across some subtypes of big data. Next, we consider common public health research and practice uses for big data, including surveillance, hypothesis-generating research, and causal inference, while exploring the role that machine learning may play in each use. We then consider the ethical implications of the big data revolution with particular emphasis on maintaining appropriate care for privacy in a world in which technology is rapidly changing social norms regarding the need for (and even the meaning of) privacy. Finally, we make suggestions regarding structuring teams and training to succeed in working with big data in research and practice.

  17. 76 FR 3827 - Regulatory Flexibility, Small Business, and Job Creation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-21

    ... Flexibility, Small Business, and Job Creation Memorandum for the Heads of Executive Departments and Agencies Small businesses play an essential role in the American economy; they help to fuel productivity... employed by a small business or own one. During a recent 15-year period, small businesses created more than...

  18. Big data analytics to improve cardiovascular care: promise and challenges.

    PubMed

    Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M

    2016-06-01

    The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.

  19. A proposed framework of big data readiness in public sectors

    NASA Astrophysics Data System (ADS)

    Ali, Raja Haslinda Raja Mohd; Mohamad, Rosli; Sudin, Suhizaz

    2016-08-01

    Growing interest in big data is mainly linked to its great potential to unveil unforeseen patterns or profiles that support an organisation's key business decisions. Following the private sector's moves to embrace big data, the government sector is now getting on the bandwagon. Big data has been considered one of the potential tools to enhance service delivery in the public sector within its financial resource constraints. The Malaysian government, in particular, has made big data one of the main items on the national agenda. Regardless of government commitment to promoting big data amongst government agencies, the degree of readiness of the government agencies as well as their employees is crucial in ensuring successful deployment of big data. This paper, therefore, proposes a conceptual framework to investigate perceived readiness for big data amongst Malaysian government agencies. Perceived readiness of 28 ministries and their respective employees will be assessed using both qualitative (interview) and quantitative (survey) approaches. The outcome of the study is expected to offer meaningful insight into the factors affecting change readiness for big data among public agencies and the outcomes expected from greater or lower change readiness in the public sector.

  20. Big Bang Day : The Great Big Particle Adventure - 3. Origins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    In this series, comedian and physicist Ben Miller asks the CERN scientists what they hope to find. If the LHC is successful, it will explain the nature of the Universe around us in terms of a few simple ingredients and a few simple rules. But the Universe now was forged in a Big Bang where conditions were very different, and the rules were very different, and those early moments were crucial to determining how things turned out later. At the LHC they can recreate conditions as they were billionths of a second after the Big Bang, before atoms and nuclei existed. They can find out why matter and antimatter didn't mutually annihilate each other to leave behind a Universe of pure, brilliant light. And they can look into the very structure of space and time - the fabric of the Universe.

  1. 78 FR 33326 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-04

    ... DEPARTMENT OF AGRICULTURE Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee... will be held July 15, 2013 at 3:00 p.m. ADDRESSES: The meeting will be held at Big Horn County Weed and...

  2. 76 FR 7810 - Big Horn County Resource Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... DEPARTMENT OF AGRICULTURE Forest Service Big Horn County Resource Advisory Committee AGENCY: Forest Service, USDA. ACTION: Notice of meeting. SUMMARY: The Big Horn County Resource Advisory Committee... will be held on March 3, 2011, and will begin at 10 a.m. ADDRESSES: The meeting will be held at the Big...

  3. In Search of the Big Bubble

    ERIC Educational Resources Information Center

    Simoson, Andrew; Wentzky, Bethany

    2011-01-01

    Freely rising air bubbles in water sometimes assume the shape of a spherical cap, a shape also known as the "big bubble". Is it possible to find some objective function involving a combination of a bubble's attributes for which the big bubble is the optimal shape? Following the basic idea of the definite integral, we define a bubble's surface as…

  4. Does Choice of Head Size and Neck Geometry Affect Stem Migration in Modular Large-Diameter Metal-on-Metal Total Hip Arthroplasty? A Preliminary Analysis

    PubMed Central

    Georgiou, CS; Evangelou, KG; Theodorou, EG; Provatidis, CG; Megas, PD

    2012-01-01

    Due to their theoretical advantages, hip systems combining modular necks and large diameter femoral heads have gradually gained popularity. However, among others, concerns regarding changes in the load transfer patterns were raised. Recent stress analyses have indeed shown that the use of modular necks and big femoral heads causes significant changes in the strain distribution along the femur. Our original hypothesis was that these changes may affect early distal migration of a modular stem. We examined the effect of head diameter and neck geometry on migration at two years of follow-up in a case series of 116 patients (125 hips), who have undergone primary Metal-on-Metal total hip arthroplasty with the modular grit-blasted Profemur®E stem combined with large-diameter heads (>36 mm). We found that choice of neck geometry and head diameter has no effect on stem migration. A multivariate regression analysis including the potential confounding variables of the body mass index, bone quality, canal fill and stem positioning revealed only a negative correlation between subsidence and canal fill in midstem area. Statistical analysis, despite its limitations, did not confirm our hypothesis that choice of neck geometry and/or head diameter affects early distal migration of a modular stem. However, the importance of correct stem sizing was revealed. PMID:23284597

  5. Does Choice of Head Size and Neck Geometry Affect Stem Migration in Modular Large-Diameter Metal-on-Metal Total Hip Arthroplasty? A Preliminary Analysis.

    PubMed

    Georgiou, Cs; Evangelou, Kg; Theodorou, Eg; Provatidis, Cg; Megas, Pd

    2012-01-01

    Due to their theoretical advantages, hip systems combining modular necks and large diameter femoral heads have gradually gained popularity. However, among others, concerns regarding changes in the load transfer patterns were raised. Recent stress analyses have indeed shown that the use of modular necks and big femoral heads causes significant changes in the strain distribution along the femur. Our original hypothesis was that these changes may affect early distal migration of a modular stem. We examined the effect of head diameter and neck geometry on migration at two years of follow-up in a case series of 116 patients (125 hips), who have undergone primary Metal-on-Metal total hip arthroplasty with the modular grit-blasted Profemur®E stem combined with large-diameter heads (>36 mm). We found that choice of neck geometry and head diameter has no effect on stem migration. A multivariate regression analysis including the potential confounding variables of the body mass index, bone quality, canal fill and stem positioning revealed only a negative correlation between subsidence and canal fill in midstem area. Statistical analysis, despite its limitations, did not confirm our hypothesis that choice of neck geometry and/or head diameter affects early distal migration of a modular stem. However, the importance of correct stem sizing was revealed.
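
    The multivariate regression described in the two records above (distal migration regressed on body mass index, bone quality, canal fill and stem positioning) can in principle be expressed as an ordinary least squares model. The sketch below is a hypothetical illustration with fabricated data and column names, not the authors' analysis or dataset.

      # Hypothetical sketch of the kind of multivariate regression described above.
      # All values and column names are fabricated; this is not the study's dataset.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 125  # hips, matching the size of the case series
      bmi        = rng.normal(28, 4, n)
      canal_fill = rng.uniform(70, 95, n)      # % canal fill in the mid-stem area
      bone_q     = rng.integers(1, 4, n)       # ordinal bone-quality grade
      # Simulate subsidence (mm) with a negative dependence on canal fill.
      subsidence = 5.0 - 0.04 * canal_fill + 0.02 * bmi + rng.normal(0, 0.3, n)

      X = sm.add_constant(np.column_stack([bmi, canal_fill, bone_q]))
      model = sm.OLS(subsidence, X).fit()
      print(model.summary(xname=["const", "bmi", "canal_fill", "bone_quality"]))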

  6. Biochemotherapy in patients with advanced head and neck mucosal melanoma.

    PubMed

    Bartell, Holly L; Bedikian, Agop Y; Papadopoulos, Nicholas E; Dett, Tina K; Ballo, Matthew T; Myers, Jeffrey N; Hwu, Patrick; Kim, Kevin B

    2008-12-01

    No systemic therapy regimen has been recognized as effective for metastatic mucosal melanoma of the head and neck. We retrospectively analyzed the effectiveness of biochemotherapy in patients with advanced head and neck mucosal melanoma. We evaluated the medical records of 15 patients at our institution who had received various biochemotherapy regimens for advanced head and neck mucosal melanoma. After a median follow-up duration of 13 months, 3 patients (20%) had partial response, and 4 patients (27%) had complete response. The median time to disease progression for all 15 patients was 10 months. The median overall survival duration for all patients was 22 months. Although this was a small study, our results, especially the high complete response and overall response rates, indicate that biochemotherapy for advanced head and neck mucosal melanoma should be considered as a systemic treatment option for patients with this aggressive malignancy.

  7. Concurrence of big data analytics and healthcare: A systematic review.

    PubMed

    Mehta, Nishita; Pandit, Anil

    2018-06-01

    The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges in its adoption in healthcare. It also intends to identify the strategies to overcome the challenges. A systematic search of the articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. The articles on Big Data analytics in healthcare published in English language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome the challenges in healthcare. A total of 58 articles were selected as per the inclusion criteria and analyzed. The analyses of these articles found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from internal sources within hospitals or clinics as well as external sources including government, laboratories, pharma companies, data aggregators, medical journals, etc.; (3) natural language processing (NLP) is the most widely used Big Data analytical technique for healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds application in clinical decision support, optimization of clinical operations, and reduction of the cost of care; and (5) the major challenge in the adoption of Big Data analytics is the non-availability of evidence of its practical benefits in healthcare. This review study unveils that there is a paucity of information on evidence of real-world use of
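
    As a conceptual illustration of the Hadoop-style (MapReduce) processing mentioned in the findings above, the sketch below runs a map phase and a reduce phase over a few toy clinical notes in plain Python. It is a teaching sketch of the programming model only, not Hadoop and not any tool from the reviewed literature.

      # Conceptual MapReduce-style term count over toy clinical notes (not Hadoop).
      from collections import Counter
      from itertools import chain

      notes = [
          "patient reports chest pain and shortness of breath",
          "follow-up for hypertension; chest pain resolved",
          "hypertension controlled, no chest pain",
      ]

      def map_phase(note):
          """Map: emit (term, 1) pairs for each token in a note."""
          return [(token.strip(".,;").lower(), 1) for token in note.split()]

      def reduce_phase(pairs):
          """Reduce: sum the counts for each term."""
          counts = Counter()
          for term, n in pairs:
              counts[term] += n
          return counts

      term_counts = reduce_phase(chain.from_iterable(map_phase(n) for n in notes))
      print(term_counts.most_common(3))  # e.g. [('chest', 3), ('pain', 3), ('hypertension', 2)]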

  8. Big Data Analytics in Healthcare

    PubMed Central

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S. M. Reza; Beard, Daniel A.

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined. PMID:26229957

  9. Big Data Analytics in Healthcare.

    PubMed

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  10. Mountain big sagebrush (Artemisia tridentata spp vaseyana) seed production

    Treesearch

    Melissa L. Landeen

    2015-01-01

    Big sagebrush (Artemisia tridentata Nutt.) is the most widespread and common shrub in the sagebrush biome of western North America. Of the three most common subspecies of big sagebrush (Artemisia tridentata), mountain big sagebrush (ssp. vaseyana; MBS) is the most resilient to disturbance, but still requires favorable climatic conditions and a viable post-...

  11. New Evidence on the Development of the Word "Big."

    ERIC Educational Resources Information Center

    Sena, Rhonda; Smith, Linda B.

    1990-01-01

    Results indicate that curvilinear trend in children's understanding of word "big" is not obtained in all stimulus contexts. This suggests that meaning and use of "big" is complex, and may not refer simply to larger objects in a set. Proposes that meaning of "big" constitutes a dynamic system driven by many perceptual,…

  12. Investigating Seed Longevity of Big Sagebrush (Artemisia tridentata)

    USGS Publications Warehouse

    Wijayratne, Upekala C.; Pyke, David A.

    2009-01-01

    The Intermountain West is dominated by big sagebrush communities (Artemisia tridentata subspecies) that provide habitat and forage for wildlife, prevent erosion, and are economically important to recreation and livestock industries. The two most prominent subspecies of big sagebrush in this region are Wyoming big sagebrush (A. t. ssp. wyomingensis) and mountain big sagebrush (A. t. ssp. vaseyana). Increased understanding of seed bank dynamics will assist with sustainable management and persistence of sagebrush communities. For example, mountain big sagebrush may be subjected to shorter fire return intervals and prescribed fire is a tool used often to rejuvenate stands and reduce tree (Juniperus sp. or Pinus sp.) encroachment into these communities. A persistent seed bank for mountain big sagebrush would be advantageous under these circumstances. Laboratory germination trials indicate that seed dormancy in big sagebrush may be habitat-specific, with collections from colder sites being more dormant. Our objective was to investigate seed longevity of both subspecies by evaluating viability of seeds in the field with a seed retrieval experiment and sampling for seeds in situ. We chose six study sites for each subspecies. These sites were dispersed across eastern Oregon, southern Idaho, northwestern Utah, and eastern Nevada. Ninety-six polyester mesh bags, each containing 100 seeds of a subspecies, were placed at each site during November 2006. Seed bags were placed in three locations: (1) at the soil surface above litter, (2) on the soil surface beneath litter, and (3) 3 cm below the soil surface to determine whether dormancy is affected by continued darkness or environmental conditions. Subsets of seeds were examined in April and November in both 2007 and 2008 to determine seed viability dynamics. Seed bank samples were taken at each site, separated into litter and soil fractions, and assessed for number of germinable seeds in a greenhouse. Community composition data

  13. Smart Information Management in Health Big Data.

    PubMed

    Muteba A, Eustache

    2017-01-01

    The smart information management system (SIMS) is concerned with the organization of anonymous patient records in a big data setting and their extraction in order to provide needful real-time intelligence. The purpose of the present study is to highlight the design and the implementation of the smart information management system. We emphasize, on the one hand, the organization of big data in flat files to simulate a NoSQL database and, on the other hand, the extraction of information based on a lookup table and a cache mechanism. In health big data, the SIMS aims at the identification of new therapies and approaches to delivering care.
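
    To make the architecture described above concrete (a flat file standing in for a NoSQL store, a lookup table of record offsets, and a small cache for repeated reads), here is a minimal, hypothetical Python sketch. The file layout, function names and record fields are assumptions of this example, not the paper's SIMS implementation.

      # Hypothetical sketch: flat-file record store with a lookup table and a cache.
      import json
      from functools import lru_cache

      DATA_FILE = "records.jsonl"  # one anonymous patient record per line (JSON)

      def build_store(records):
          """Write records to the flat file; return a lookup table: id -> byte offset."""
          lookup = {}
          with open(DATA_FILE, "w", encoding="utf-8") as f:
              for rec in records:
                  lookup[rec["id"]] = f.tell()
                  f.write(json.dumps(rec) + "\n")
          return lookup

      LOOKUP = build_store([
          {"id": "p001", "age": 54, "dx": "T2D"},
          {"id": "p002", "age": 61, "dx": "HTN"},
      ])

      @lru_cache(maxsize=1024)           # cache repeated reads of the same record
      def get_record(record_id):
          with open(DATA_FILE, encoding="utf-8") as f:
              f.seek(LOOKUP[record_id])  # jump straight to the record via the lookup table
              return json.loads(f.readline())

      print(get_record("p002"))  # first call reads from disk
      print(get_record("p002"))  # second call is served from the in-memory cache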

  14. Bringing Out Head Start Talents (BOHST). Talent Programming.

    ERIC Educational Resources Information Center

    Amundsen, Jane; And Others

    Designed for preschoolers identified as talented by the Bringing Out Head Start Talents (BOHST) project, the small-group lessons contained in this manual focus on nine areas of talent programming and are presented in color-coded sections: creative, intellectual, leadership, art, music, reading, math, science, and psychomotor talent development.…

  15. Integrative methods for analyzing big data in precision medicine.

    PubMed

    Gligorijević, Vladimir; Malod-Dognin, Noël; Pržulj, Nataša

    2016-03-01

    We provide an overview of recent developments in big data analyses in the context of precision medicine and health informatics. With the advance in technologies capturing molecular and medical data, we have entered the era of "Big Data" in biology and medicine. These data offer many opportunities to advance precision medicine. We outline key challenges in precision medicine and present recent advances in data integration-based methods to uncover personalized information from big data produced by various omics studies. We survey recent integrative methods for disease subtyping, biomarker discovery, and drug repurposing, and list the tools that are available to domain scientists. Given the ever-growing nature of these big data, we highlight key issues that big data integration methods will face. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Big Dreams

    ERIC Educational Resources Information Center

    Benson, Michael T.

    2015-01-01

    The Keen Johnson Building is symbolic of Eastern Kentucky University's historic role as a School of Opportunity. It is a place that has inspired generations of students, many from disadvantaged backgrounds, to dream big dreams. The construction of the Keen Johnson Building was inspired by a desire to create a student union facility that would not…

  17. Translating Big Data into Smart Data for Veterinary Epidemiology.

    PubMed

    VanderWaal, Kimberly; Morrison, Robert B; Neuhauser, Claudia; Vilalta, Carles; Perez, Andres M

    2017-01-01

    The increasing availability and complexity of data has led to new opportunities and challenges in veterinary epidemiology around how to translate abundant, diverse, and rapidly growing "big" data into meaningful insights for animal health. Big data analytics are used to understand health risks and minimize the impact of adverse animal health issues through identifying high-risk populations, combining data or processes acting at multiple scales through epidemiological modeling approaches, and harnessing high velocity data to monitor animal health trends and detect emerging health threats. The advent of big data requires the incorporation of new skills into veterinary epidemiology training, including, for example, machine learning and coding, to prepare a new generation of scientists and practitioners to engage with big data. Establishing pipelines to analyze big data in near real-time is the next step for progressing from simply having "big data" to create "smart data," with the objective of improving understanding of health risks, effectiveness of management and policy decisions, and ultimately preventing or at least minimizing the impact of adverse animal health issues.

  18. Linac head scatter factor for asymmetric radiation field

    NASA Astrophysics Data System (ADS)

    Soubra, Mazen Ahmed

    1997-11-01

    The head scatter factor, Sh, is an important dosimetric quantity used in radiation therapy dose calculation. It is empirically determined and its field size dependence reflects changes in photon scatter from components in the linac treatment head. In this work a detailed study of the physical factors influencing the determination of Sh was performed with particular attention given to asymmetric field geometries. Ionization measurements for 6 and 18 MV photon beams were made to examine the factors which determine Sh. These include: phantom size and material, collimator backscatter, non-lateral electronic equilibrium (LEE) conditions, electron contamination, collimator exchange, photon energy, flattening filter and off-axis distance (OAD). Results indicated that LEE is not required for Sh measurements if electron contamination is minimized. Brass caps or polystyrene miniphantoms can both be used in Sh measurements provided the phantom thickness is large enough to stop contaminant electrons. Backscatter radiation effects into the monitor chamber were found to be negligible for the Siemens linac. It was found that the presence and shape of the flattening filter had a significant effect on the empirically determined value of Sh. Sh was also shown to be a function of OAD, particularly for small fields. For fields larger than 12×12 cm², Sh was independent of OAD. A flattening filter mass model was introduced to explain qualitatively the above results. A detailed Monte Carlo simulation of the Siemens KD2 linac head in 6 MV mode was performed to investigate the sources of head scatter which contribute to the measured Sh. The simulated head components include the flattening filter, the electron beam stopper, the primary collimator, the photon monitor chamber and the secondary collimators. The simulations showed that the scatter from the head of the Siemens linac is a complex function of the head components. On the central axis the flattening filter played the dominant role in
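
    For context, the head scatter factor is commonly defined as the ratio of the in-air ionization reading (measured with a brass cap or miniphantom) for a given collimator setting to the reading for a reference setting, usually 10 × 10 cm². The LaTeX formulation below states this standard textbook definition as background; it is not quoted from the thesis abstract.

      S_h(c) \;=\; \frac{M_{\mathrm{air}}(c)}{M_{\mathrm{air}}(c_{\mathrm{ref}})},
      \qquad c_{\mathrm{ref}} = 10 \times 10\ \mathrm{cm}^2,

    where M_air(c) is the in-air detector reading for collimator setting c at a fixed source-to-detector distance.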

  19. A Review of Head-Worn Display Research at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis (Trey) J., III; Bailey, Randall E.; Williams, Steven P.; Prinzel, Lawrence J., III; Shelton, Kevin J.; Jones, Denise R.; Houston, Vincent

    2015-01-01

    NASA Langley has conducted research in the area of helmet-mounted/head-worn displays over the past 30 years. Initially, NASA Langley's research focused on military applications, but recently it has conducted a line of research in the area of head-worn displays for commercial and business aircraft. This work has revolved around numerous simulation experiments as well as flight tests to develop technology and data for industry and regulatory guidance. The paper summarizes the results of NASA's helmet-mounted/head-worn display research. Of note, the work tracks progress in wearable collimated optics, head tracking, latency reduction, and weight. The research lends credence that a small, sunglasses-type form factor of the head-worn display would be acceptable to commercial pilots, and this goal is now becoming technologically feasible. The research further suggests that a head-worn display may serve as an "equivalent" Head-Up Display (HUD) with safety, operational, and cost benefits. "HUD equivalence" appears to be the economic avenue by which head-worn displays can become main-stream on the commercial and business aircraft flight deck. If this happens, NASA's research suggests that additional operational benefits using the unique capabilities of the head-worn display can open up new operational paradigms.

  20. A review of head-worn display research at NASA Langley Research Center

    NASA Astrophysics Data System (ADS)

    Arthur, Jarvis J.; Bailey, Randall E.; Williams, Steven P.; Prinzel, Lawrence J.; Shelton, Kevin J.; Jones, Denise R.; Houston, Vincent

    2015-05-01

    NASA Langley has conducted research in the area of helmet-mounted/head-worn displays over the past 30 years. Initially, NASA Langley's research focused on military applications, but recently it has conducted a line of research in the area of head-worn displays for commercial and business aircraft. This work has revolved around numerous simulation experiments as well as flight tests to develop technology and data for industry and regulatory guidance. The paper summarizes the results of NASA's helmet-mounted/head-worn display research. Of note, the work tracks progress in wearable collimated optics, head tracking, latency reduction, and weight. The research lends credence that a small, sunglasses-type form factor of the head-worn display would be acceptable to commercial pilots, and this goal is now becoming technologically feasible. The research further suggests that a head-worn display may serve as an "equivalent" Head-Up Display (HUD) with safety, operational, and cost benefits. "HUD equivalence" appears to be the economic avenue by which head-worn displays can become main-stream on the commercial and business aircraft flight deck. If this happens, NASA's research suggests that additional operational benefits using the unique capabilities of the head-worn display can open up new operational paradigms.

  1. Contact Analog/Compressed Symbology Heading Tape Assessment

    NASA Technical Reports Server (NTRS)

    Shively, R. Jay; Atencio, Adolph; Turpin, Terry; Dowell, Susan

    2002-01-01

    A simulation assessed the performance, handling qualities and workload associated with a contact-analog, world-referenced heading tape as implemented on the Comanche Helmet Integrated Display Sight System (HIDSS) when compared with a screen-fixed, compressed heading tape. Six pilots, four active duty Army Aviators and two civilians flew three ADS-33 maneuvers and a traffic pattern in the Ames Vertical Motion Simulation facility. Small, but statistically significant advantages were found for the compressed symbology for handling qualities, workload, and some of the performance measures. It should be noted however that the level of performance and handling qualities for both symbology sets fell within the acceptable tolerance levels. Both symbology sets yield satisfactory handling qualities and performance in velocity stabilization mode and adequate handling qualities in the automatic flight control mode. Pilot comments about the contact analog symbology highlighted the lack of useful rate of change information in the heading tape and "blurring" due to the rapid movement of the heading tape. These issues warrant further study. Care must be taken in interpreting the operational significance of these results. The symbology sets yielded categorically similar data, i.e., acceptable handling qualities and adequate performance, so while the results point to the need for further study, their operational significance has yet to be determined.

  2. Machine learning for Big Data analytics in plants.

    PubMed

    Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng

    2014-12-01

    Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Quality of Big Data in Healthcare

    DOE PAGES

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    2015-01-01

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  4. Quality of Big Data in Healthcare

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay

    The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.

  5. The big data challenges of connectomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces 'big data', unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.

  6. The big data challenges of connectomics

    DOE PAGES

    Lichtman, Jeff W.; Pfister, Hanspeter; Shavit, Nir

    2014-10-28

    The structure of the nervous system is extraordinarily complicated because individual neurons are interconnected to hundreds or even thousands of other cells in networks that can extend over large volumes. Mapping such networks at the level of synaptic connections, a field called connectomics, began in the 1970s with the study of the small nervous system of a worm and has recently garnered general interest thanks to technical and computational advances that automate the collection of electron-microscopy data and offer the possibility of mapping even large mammalian brains. However, modern connectomics produces 'big data', unprecedented quantities of digital information at unprecedented rates, and will require, as with genomics at the time, breakthrough algorithmic and computational solutions. Here we describe some of the key difficulties that may arise and provide suggestions for managing them.
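
    To give a sense of the "unprecedented quantities of digital information" the two records above refer to, the back-of-the-envelope sketch below estimates the raw image volume of one cubic millimetre of tissue at an assumed electron-microscopy resolution. The voxel size (4 × 4 × 30 nm) and one byte per voxel are illustrative assumptions, not figures taken from the paper.

      # Back-of-the-envelope estimate of raw EM data for 1 mm^3 of brain tissue.
      # Voxel size (4 x 4 x 30 nm) and 1 byte/voxel are assumptions for illustration.
      voxel_nm3   = 4 * 4 * 30           # nm^3 per voxel
      volume_nm3  = (1e6) ** 3           # 1 mm = 1e6 nm, so 1 mm^3 = 1e18 nm^3
      voxels      = volume_nm3 / voxel_nm3
      bytes_total = voxels * 1           # assume 1 byte (8-bit grayscale) per voxel

      print(f"{voxels:.2e} voxels ≈ {bytes_total / 1e15:.1f} petabytes per mm^3")
      # -> roughly 2e15 voxels, i.e. on the order of a couple of petabytes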

  7. Database Resources of the BIG Data Center in 2018.

    PubMed

    2018-01-04

    The BIG Data Center at Beijing Institute of Genomics (BIG) of the Chinese Academy of Sciences provides freely open access to a suite of database resources in support of worldwide research activities in both academia and industry. With the vast amounts of omics data generated at ever-greater scales and rates, the BIG Data Center is continually expanding, updating and enriching its core database resources through big-data integration and value-added curation, including BioCode (a repository archiving bioinformatics tool codes), BioProject (a biological project library), BioSample (a biological sample library), Genome Sequence Archive (GSA, a data repository for archiving raw sequence reads), Genome Warehouse (GWH, a centralized resource housing genome-scale data), Genome Variation Map (GVM, a public repository of genome variations), Gene Expression Nebulas (GEN, a database of gene expression profiles based on RNA-Seq data), Methylation Bank (MethBank, an integrated databank of DNA methylomes), and Science Wikis (a series of biological knowledge wikis for community annotations). In addition, three featured web services are provided, viz., BIG Search (search as a service; a scalable inter-domain text search engine), BIG SSO (single sign-on as a service; a user access control system to gain access to multiple independent systems with a single ID and password) and Gsub (submission as a service; a unified submission service for all relevant resources). All of these resources are publicly accessible through the home page of the BIG Data Center at http://bigd.big.ac.cn. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  8. Database Resources of the BIG Data Center in 2018

    PubMed Central

    Xu, Xingjian; Hao, Lili; Zhu, Junwei; Tang, Bixia; Zhou, Qing; Song, Fuhai; Chen, Tingting; Zhang, Sisi; Dong, Lili; Lan, Li; Wang, Yanqing; Sang, Jian; Hao, Lili; Liang, Fang; Cao, Jiabao; Liu, Fang; Liu, Lin; Wang, Fan; Ma, Yingke; Xu, Xingjian; Zhang, Lijuan; Chen, Meili; Tian, Dongmei; Li, Cuiping; Dong, Lili; Du, Zhenglin; Yuan, Na; Zeng, Jingyao; Zhang, Zhewen; Wang, Jinyue; Shi, Shuo; Zhang, Yadong; Pan, Mengyu; Tang, Bixia; Zou, Dong; Song, Shuhui; Sang, Jian; Xia, Lin; Wang, Zhennan; Li, Man; Cao, Jiabao; Niu, Guangyi; Zhang, Yang; Sheng, Xin; Lu, Mingming; Wang, Qi; Xiao, Jingfa; Zou, Dong; Wang, Fan; Hao, Lili; Liang, Fang; Li, Mengwei; Sun, Shixiang; Zou, Dong; Li, Rujiao; Yu, Chunlei; Wang, Guangyu; Sang, Jian; Liu, Lin; Li, Mengwei; Li, Man; Niu, Guangyi; Cao, Jiabao; Sun, Shixiang; Xia, Lin; Yin, Hongyan; Zou, Dong; Xu, Xingjian; Ma, Lina; Chen, Huanxin; Sun, Yubin; Yu, Lei; Zhai, Shuang; Sun, Mingyuan; Zhang, Zhang; Zhao, Wenming; Xiao, Jingfa; Bao, Yiming; Song, Shuhui; Hao, Lili; Li, Rujiao; Ma, Lina; Sang, Jian; Wang, Yanqing; Tang, Bixia; Zou, Dong; Wang, Fan

    2018-01-01

    Abstract The BIG Data Center at Beijing Institute of Genomics (BIG) of the Chinese Academy of Sciences provides freely open access to a suite of database resources in support of worldwide research activities in both academia and industry. With the vast amounts of omics data generated at ever-greater scales and rates, the BIG Data Center is continually expanding, updating and enriching its core database resources through big-data integration and value-added curation, including BioCode (a repository archiving bioinformatics tool codes), BioProject (a biological project library), BioSample (a biological sample library), Genome Sequence Archive (GSA, a data repository for archiving raw sequence reads), Genome Warehouse (GWH, a centralized resource housing genome-scale data), Genome Variation Map (GVM, a public repository of genome variations), Gene Expression Nebulas (GEN, a database of gene expression profiles based on RNA-Seq data), Methylation Bank (MethBank, an integrated databank of DNA methylomes), and Science Wikis (a series of biological knowledge wikis for community annotations). In addition, three featured web services are provided, viz., BIG Search (search as a service; a scalable inter-domain text search engine), BIG SSO (single sign-on as a service; a user access control system to gain access to multiple independent systems with a single ID and password) and Gsub (submission as a service; a unified submission service for all relevant resources). All of these resources are publicly accessible through the home page of the BIG Data Center at http://bigd.big.ac.cn. PMID:29036542

  9. The BIG Data Center: from deposition to integration to translation

    PubMed Central

    2017-01-01

    Biological data are generated at unprecedentedly exponential rates, posing considerable challenges in big data deposition, integration and translation. The BIG Data Center, established at Beijing Institute of Genomics (BIG), Chinese Academy of Sciences, provides a suite of database resources, including (i) Genome Sequence Archive, a data repository specialized for archiving raw sequence reads, (ii) Gene Expression Nebulas, a data portal of gene expression profiles based entirely on RNA-Seq data, (iii) Genome Variation Map, a comprehensive collection of genome variations for featured species, (iv) Genome Warehouse, a centralized resource housing genome-scale data with particular focus on economically important animals and plants, (v) Methylation Bank, an integrated database of whole-genome single-base resolution methylomes and (vi) Science Wikis, a central access point for biological wikis developed for community annotations. The BIG Data Center is dedicated to constructing and maintaining biological databases through big data integration and value-added curation, conducting basic research to translate big data into big knowledge and providing freely open access to a variety of data resources in support of worldwide research activities in both academia and industry. All of these resources are publicly available and can be found at http://bigd.big.ac.cn. PMID:27899658

  10. The BIG Data Center: from deposition to integration to translation.

    PubMed

    2017-01-04

    Biological data are generated at unprecedentedly exponential rates, posing considerable challenges in big data deposition, integration and translation. The BIG Data Center, established at Beijing Institute of Genomics (BIG), Chinese Academy of Sciences, provides a suite of database resources, including (i) Genome Sequence Archive, a data repository specialized for archiving raw sequence reads, (ii) Gene Expression Nebulas, a data portal of gene expression profiles based entirely on RNA-Seq data, (iii) Genome Variation Map, a comprehensive collection of genome variations for featured species, (iv) Genome Warehouse, a centralized resource housing genome-scale data with particular focus on economically important animals and plants, (v) Methylation Bank, an integrated database of whole-genome single-base resolution methylomes and (vi) Science Wikis, a central access point for biological wikis developed for community annotations. The BIG Data Center is dedicated to constructing and maintaining biological databases through big data integration and value-added curation, conducting basic research to translate big data into big knowledge and providing freely open access to a variety of data resources in support of worldwide research activities in both academia and industry. All of these resources are publicly available and can be found at http://bigd.big.ac.cn. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  11. Application and Exploration of Big Data Mining in Clinical Medicine

    PubMed Central

    Zhang, Yue; Guo, Shu-Li; Han, Li-Na; Li, Tie-Ling

    2016-01-01

    Objective: To review theories and technologies of big data mining and their application in clinical medicine. Data Sources: Literature published in English or Chinese regarding theories and technologies of big data mining and the concrete applications of data mining technology in clinical medicine were obtained from PubMed and Chinese Hospital Knowledge Database from 1975 to 2015. Study Selection: Original articles regarding big data mining theory/technology and big data mining's application in the medical field were selected. Results: This review characterized the basic theories and technologies of big data mining including fuzzy theory, rough set theory, cloud theory, Dempster–Shafer theory, artificial neural network, genetic algorithm, inductive learning theory, Bayesian network, decision tree, pattern recognition, high-performance computing, and statistical analysis. The application of big data mining in clinical medicine was analyzed in the fields of disease risk assessment, clinical decision support, prediction of disease development, guidance of rational use of drugs, medical management, and evidence-based medicine. Conclusion: Big data mining has the potential to play an important role in clinical medicine. PMID:26960378

  12. Managing Change in Small Scottish Primary Schools: Is There a Small School Management Style?

    ERIC Educational Resources Information Center

    Wilson, Valerie; McPake, Joanna

    2000-01-01

    Identifies management activities and strategies used by 863 heads of small Scottish schools to implement 4 major national initiatives during the past decade. Headteachers valued teamwork and employed a "plan-implement-review" strategy involving a quick audit, realistic planning for achievable targets, inclusive implementation, and…

  13. The ownership of small private forest-land holdings in 23 New England towns

    Treesearch

    Solon Barraclough; James C. Rettie

    1950-01-01

    Much of the forest land in New England, as elsewhere in the United States, is in small private holdings. How to get these small holdings under reasonably good forest management so that they can better contribute to the country's need for a high sustained yield of good timber products is one of forestry's big problems.

  14. Radar system components to detect small and fast objects

    NASA Astrophysics Data System (ADS)

    Hülsmann, Axel; Zech, Christian; Klenner, Mathias; Tessmann, Axel; Leuther, Arnulf; Lopez-Diaz, Daniel; Schlechtweg, Michael; Ambacher, Oliver

    2015-05-01

    Small and fast objects, for example bullets of caliber 5 to 10 mm fired from guns like the AK-47, can cause serious problems for aircraft in asymmetric warfare. Slow and big aircraft in particular, such as heavy transport helicopters, are an easy mark for small-caliber handheld firearms. These aircraft produce so much noise that the crew is not able to recognize an attack until serious problems occur and important systems of the aircraft fail. This is just one of many scenarios where the detection of fast and small objects is desirable. Another scenario is the collision of space debris particles with satellites.

  15. Head circumference

    MedlinePlus

    ... a child's head circumference. Normal ranges for a child's sex and age (weeks, months), based on values that experts have obtained for normal growth rates of infants' and children's heads. Measurement of the head circumference is an ...

  16. Rethinking big data: A review on the data quality and usage issues

    NASA Astrophysics Data System (ADS)

    Liu, Jianzheng; Li, Jie; Li, Weifeng; Wu, Jiansheng

    2016-05-01

    The recent explosive publications of big data studies have well documented the rise of big data and its ongoing prevalence. Different types of "big data" have emerged and have greatly enriched spatial information sciences and related fields in terms of breadth and granularity. Studies that were difficult to conduct in the past due to data availability can now be carried out. However, big data brings lots of "big errors" in data quality and data usage, and it cannot be used as a substitute for sound research design and solid theories. We indicated and summarized the problems faced by current big data studies with regard to data collection, processing and analysis: inauthentic data collection, information incompleteness and noise of big data, unrepresentativeness, consistency and reliability, and ethical issues. Cases of empirical studies are provided as evidence for each problem. We propose that big data research should closely follow good scientific practice to provide reliable and scientific "stories", as well as explore and develop techniques and methods to mitigate or rectify those "big errors" brought by big data.

  17. Analyzing big data with the hybrid interval regression methods.

    PubMed

    Huang, Chia-Hui; Yang, Keng-Chieh; Kao, Han-Ying

    2014-01-01

    Big data is a new trend at present, exerting significant impacts on information technologies. In big data applications, one of the most pressing issues is dealing with large-scale data sets that often require computation resources provided by public cloud services. How to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. The SSVM was recently proposed as an alternative to the standard SVM and has been proved more efficient than the traditional SVM in processing large-scale data. In addition, the soft margin method is proposed to modify the excursion of the separation margin and to remain effective in the gray zone, where the distribution of the data becomes hard to describe and the separation margin between classes hard to determine.

  18. Analyzing Big Data with the Hybrid Interval Regression Methods

    PubMed Central

    Kao, Han-Ying

    2014-01-01

    Big data is a new trend at present, exerting significant impacts on information technologies. In big data applications, one of the most pressing issues is dealing with large-scale data sets that often require computation resources provided by public cloud services. How to analyze big data efficiently becomes a big challenge. In this paper, we combine interval regression with the smooth support vector machine (SSVM) to analyze big data. The SSVM was recently proposed as an alternative to the standard SVM and has been proved more efficient than the traditional SVM in processing large-scale data. In addition, the soft margin method is proposed to modify the excursion of the separation margin and to remain effective in the gray zone, where the distribution of the data becomes hard to describe and the separation margin between classes hard to determine. PMID:25143968
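
    The smooth support vector machine itself is not available in standard scikit-learn; purely as a stand-in for the large-scale classification setting these two records describe, the sketch below trains a linear soft-margin SVM by stochastic gradient descent on a synthetic data set. Feature counts and sample sizes are arbitrary placeholders.

```python
# Not the authors' SSVM/interval-regression hybrid: a stand-in linear
# soft-margin SVM fitted with SGD, a common choice for large-scale data.
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier

X, y = make_classification(n_samples=100_000, n_features=50, random_state=0)

# loss="hinge" makes SGDClassifier a linear soft-margin SVM; alpha controls
# the regularization (i.e., the width/softness of the margin).
clf = SGDClassifier(loss="hinge", alpha=1e-4, max_iter=5, tol=None)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```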

  19. 5 CFR 370.107 - Details to small business concerns.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Details to small business concerns. 370... INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.107 Details to small business concerns. (a) The head of each... organizations in each calendar year, at least 20 percent are to small business concerns, in accordance with 5 U...

  20. 5 CFR 370.107 - Details to small business concerns.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 1 2011-01-01 2011-01-01 false Details to small business concerns. 370... INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.107 Details to small business concerns. (a) The head of each... organizations in each calendar year, at least 20 percent are to small business concerns, in accordance with 5 U...

  1. 5 CFR 370.107 - Details to small business concerns.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false Details to small business concerns. 370... INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.107 Details to small business concerns. (a) The head of each... organizations in each calendar year, at least 20 percent are to small business concerns, in accordance with 5 U...

  2. 5 CFR 370.107 - Details to small business concerns.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Details to small business concerns. 370... INFORMATION TECHNOLOGY EXCHANGE PROGRAM § 370.107 Details to small business concerns. (a) The head of each... organizations in each calendar year, at least 20 percent are to small business concerns, in accordance with 5 U...

  3. Distribution and movement of Big Spring spinedace (Lepidomeda mollispinis pratensis) in Condor Canyon, Meadow Valley Wash, Nevada

    USGS Publications Warehouse

    Jezorek, Ian G.; Connolly, Patrick J.

    2013-01-01

    Big Spring spinedace (Lepidomeda mollispinis pratensis) is a cyprinid whose entire population occurs within a section of Meadow Valley Wash, Nevada. Other spinedace species have suffered population and range declines (one species is extinct). Managers, concerned about the vulnerability of Big Spring spinedace, have considered habitat restoration actions or translocation, but they have lacked data on distribution or habitat use. Our study occurred in an 8.2-km section of Meadow Valley Wash, including about 7.2 km in Condor Canyon and 0.8 km upstream of the canyon. Big Spring spinedace were present upstream of the currently listed critical habitat, including in the tributary Kill Wash. We found no Big Spring spinedace in the lower 3.3 km of Condor Canyon. We tagged Big Spring spinedace ≥70 mm fork length (range 70–103 mm) with passive integrated transponder tags during October 2008 (n = 100) and March 2009 (n = 103) to document movement. At least 47 of these individuals moved from their release location (up to 2 km). Thirty-nine individuals moved to Kill Wash or its confluence area with Meadow Valley Wash. Ninety-three percent of movement occurred in spring 2009. Fish moved both upstream and downstream. We found no movement downstream over a small waterfall at river km 7.9 and recorded only one fish that moved downstream over Delmue Falls (a 12-m drop) at river km 6.1. At the time of tagging, there was no significant difference in fork length or condition between Big Spring spinedace that were later detected moving and those not detected moving. Kill Wash and its confluence area appeared important to Big Spring spinedace; connectivity with these areas may be key to species persistence. These areas may provide a habitat template for restoration or translocation. The lower 3.3 km of

  4. Processing Solutions for Big Data in Astronomy

    NASA Astrophysics Data System (ADS)

    Fillatre, L.; Lepiller, D.

    2016-09-01

    This paper gives a simple introduction to processing solutions applied to massive amounts of data. It proposes a general presentation of the Big Data paradigm. The Hadoop framework, which is considered as the pioneering processing solution for Big Data, is described together with YARN, the integrated Hadoop tool for resource allocation. This paper also presents the main tools for the management of both the storage (NoSQL solutions) and computing capacities (MapReduce parallel processing schema) of a cluster of machines. Finally, more recent processing solutions like Spark are discussed. Big Data frameworks are now able to run complex applications while keeping the programming simple and greatly improving the computing speed.
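
    The MapReduce-style processing model surveyed above (with Spark as the more recent engine) can be illustrated with a minimal PySpark job; the HDFS path and cluster setup below are hypothetical, and the word-count task is only a stand-in for a real analysis.

```python
# A minimal PySpark job in the MapReduce style the paper surveys:
# count word occurrences across a (potentially huge) text collection.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount-sketch").getOrCreate()
lines = spark.sparkContext.textFile("hdfs:///data/corpus/*.txt")  # hypothetical HDFS path

counts = (lines.flatMap(lambda line: line.split())   # "map": emit one record per word
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))      # "reduce": sum the counts per word

for word, n in counts.takeOrdered(10, key=lambda kv: -kv[1]):
    print(word, n)
spark.stop()
```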

  5. Big Bend National Park

    NASA Image and Video Library

    2017-12-08

    Alternately known as a geologist’s paradise and a geologist’s nightmare, Big Bend National Park in southwestern Texas offers a multitude of rock formations. Sparse vegetation makes finding and observing the rocks easy, but they document a complicated geologic history extending back 500 million years. On May 10, 2002, the Enhanced Thematic Mapper Plus on NASA’s Landsat 7 satellite captured this natural-color image of Big Bend National Park. A black line delineates the park perimeter. The arid landscape appears in muted earth tones, some of the darkest hues associated with volcanic structures, especially the Rosillos and Chisos Mountains. Despite its bone-dry appearance, Big Bend National Park is home to some 1,200 plant species, and hosts more kinds of cacti, birds, and bats than any other U.S. national park. Read more: go.nasa.gov/2bzGaZU Credit: NASA/Landsat7

  6. Cohort differences in Big Five personality factors over a period of 25 years.

    PubMed

    Smits, Iris A M; Dolan, Conor V; Vorst, Harrie C M; Wicherts, Jelte M; Timmerman, Marieke E

    2011-06-01

    The notion of personality traits implies a certain degree of stability in the life span of an individual. But what about generational effects? Are there generational changes in the distribution or structure of personality traits? This article examines cohort changes on the Big Five personality factors Extraversion, Agreeableness, Conscientiousness, Neuroticism, and Openness to Experience, among first-year psychology students in The Netherlands, ages 18 to 25 years, between 1982 and 2007. Because measurement invariance of a personality test is essential for a sound interpretation of cohort differences in personality, we first assessed measurement invariance with respect to cohort for males and females separately on the Big Five personality factors, as measured by the Dutch instrument Five Personality Factors Test. Results identified 11 (females) and 2 (males) biased items with respect to cohort, out of a total of 70 items. Analyzing the unbiased items, results indicated small linear increases over time in Extraversion, Agreeableness, and Conscientiousness and small linear decreases over time in Neuroticism. No clear patterns were found on the Openness to Experience factor. Secondary analyses on students from 1971 to 2007 of females and males of different ages together revealed linear trends comparable to those in the main analyses among young adults between 1982 onward. The results imply that the broad sociocultural context may affect personality factors. 2011 APA, all rights reserved

  7. Characterization of small-to-medium head-and-face dimensions for developing respirator fit test panels and evaluating fit of filtering facepiece respirators with different faceseal design

    PubMed Central

    Lin, Yi-Chun

    2017-01-01

    A respirator fit test panel (RFTP) with facial size distribution representative of intended users is essential to the evaluation of respirator fit for new models of respirators. In this study an anthropometric survey was conducted among youths representing respirator users in mid-Taiwan to characterize head-and-face dimensions key to RFTPs for application to small-to-medium facial features. The participants were fit-tested for three N95 masks of different facepiece design and the results compared to facial size distribution specified in the RFTPs of bivariate and principal component analysis design developed in this study to realize the influence of facial characteristics to respirator fit in relation to facepiece design. Nineteen dimensions were measured for 206 participants. In fit testing the qualitative fit test (QLFT) procedures prescribed by the U.S. Occupational Safety and Health Administration were adopted. As the results show, the bizygomatic breadth of the male and female participants were 90.1 and 90.8% of their counterparts reported for the U.S. youths (P < 0.001), respectively. Compared to the bivariate distribution, the PCA design better accommodated variation in facial contours among different respirator user groups or populations, with the RFTPs reported in this study and from literature consistently covering over 92% of the participants. Overall, the facial fit of filtering facepieces increased with increasing facial dimensions. The total percentages of the tests wherein the final maneuver being completed was “Moving head up-and-down”, “Talking” or “Bending over” in bivariate and PCA RFTPs were 13.3–61.9% and 22.9–52.8%, respectively. The respirators with a three-panel flat fold structured in the facepiece provided greater fit, particularly when the users moved heads. When the facial size distribution in a bivariate RFTP did not sufficiently represent petite facial size, the fit testing was inclined to overestimate the general fit

  8. Characterization of small-to-medium head-and-face dimensions for developing respirator fit test panels and evaluating fit of filtering facepiece respirators with different faceseal design.

    PubMed

    Lin, Yi-Chun; Chen, Chen-Peng

    2017-01-01

    A respirator fit test panel (RFTP) with facial size distribution representative of intended users is essential to the evaluation of respirator fit for new models of respirators. In this study an anthropometric survey was conducted among youths representing respirator users in mid-Taiwan to characterize head-and-face dimensions key to RFTPs for application to small-to-medium facial features. The participants were fit-tested for three N95 masks of different facepiece design and the results compared to facial size distribution specified in the RFTPs of bivariate and principal component analysis design developed in this study to realize the influence of facial characteristics to respirator fit in relation to facepiece design. Nineteen dimensions were measured for 206 participants. In fit testing the qualitative fit test (QLFT) procedures prescribed by the U.S. Occupational Safety and Health Administration were adopted. As the results show, the bizygomatic breadth of the male and female participants were 90.1 and 90.8% of their counterparts reported for the U.S. youths (P < 0.001), respectively. Compared to the bivariate distribution, the PCA design better accommodated variation in facial contours among different respirator user groups or populations, with the RFTPs reported in this study and from literature consistently covering over 92% of the participants. Overall, the facial fit of filtering facepieces increased with increasing facial dimensions. The total percentages of the tests wherein the final maneuver being completed was "Moving head up-and-down", "Talking" or "Bending over" in bivariate and PCA RFTPs were 13.3-61.9% and 22.9-52.8%, respectively. The respirators with a three-panel flat fold structured in the facepiece provided greater fit, particularly when the users moved heads. When the facial size distribution in a bivariate RFTP did not sufficiently represent petite facial size, the fit testing was inclined to overestimate the general fit, thus for small
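
    A minimal sketch of the principal-component step behind a PCA-based fit test panel of the kind described in the two records above: project the measured head-and-face dimensions onto the first two components and grid the PC space into candidate panel cells. The anthropometric values here are random placeholders standing in for the 19 measured dimensions.

```python
# Sketch of a PCA-based panel design step (placeholder data, not the study's).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
measurements = rng.normal(size=(206, 19))    # 206 participants x 19 dimensions (placeholder)

# Standardize, then keep the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(measurements))

# Divide PC1/PC2 space into a coarse grid of panel cells and count occupancy.
pc1_edges = np.quantile(scores[:, 0], [0.0, 0.25, 0.5, 0.75, 1.0])
pc2_edges = np.quantile(scores[:, 1], [0.0, 0.25, 0.5, 0.75, 1.0])
cell_counts, _, _ = np.histogram2d(scores[:, 0], scores[:, 1], bins=[pc1_edges, pc2_edges])
print(cell_counts)   # how many subjects fall in each candidate panel cell
```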

  9. Growth in Head Size during Infancy: Implications for Sound Localization.

    ERIC Educational Resources Information Center

    Clifton, Rachel K.; And Others

    1988-01-01

    Compared head circumference and interaural distance in infants between birth and 22 weeks of age and in a small sample of preschool children and adults. Calculated changes in interaural time differences according to age. Found a large shift in distance. (SKC)

  10. Clinical and anatomical observations of a two-headed lamb.

    PubMed

    Fisher, K R; Partlow, G D; Walker, A F

    1986-04-01

    The clinical and anatomical features of a live-born diprosopic lamb are described. There are no complete anatomical analyses of two-faced lambs in the literature despite the frequency of conjoined twinning in sheep. The lamb had two heads fused in the occipital region. Each head had two eyes. The pinnae of the medial ears were fused. Caudal to the neck the lamb appeared grossly normal. The lamb was unable to raise its heads or stand. Both heads showed synchronous sucking motions and cranial reflexes were present. Nystagmus, strabismus, and limb incoordination were present. The respiratory and heart rates were elevated. There was a grade IV murmur over the left heart base and a palpable thrill on the left side. Each head possessed a normal nasopharynx, oropharynx, and tongue. There was a single laryngopharynx and esophagus although the hyoid apparatus was partially duplicated. The cranial and cervical musculature reflected the head duplications. The aortic trunk emerged from the right ventricle just to the right of the conus arteriosus. A ventricular septal defect, patent foramen ovale, and ductus arteriosus were present along with malformed atrioventricular valves. Brainstem fusion began at the cranial medulla oblongata between cranial nerves IX and XII. The cerebella were separate but small. The ventromedial structures from each medulla oblongata were compressed into an extraneous midline remnant of tissue which extended caudally to the level of T2. The clinical signs therefore reflected the anatomical anomalies. A possible etiology for this diprosopus might be the presence early in development of an excessively large block of chordamesoderm. This would allow for the formation of two head folds and hence two "heads."

  11. Semantic Web technologies for the big data in life sciences.

    PubMed

    Wu, Hongyan; Yamaguchi, Atsuko

    2014-08-01

    The life sciences field is entering an era of big data with the breakthroughs of science and technology. More and more big data-related projects and activities are being performed in the world. Life sciences data generated by new technologies are continuing to grow in not only size but also variety and complexity, with great speed. To ensure that big data has a major influence in the life sciences, comprehensive data analysis across multiple data sources and even across disciplines is indispensable. The increasing volume of data and the heterogeneous, complex varieties of data are two principal issues mainly discussed in life science informatics. The ever-evolving next-generation Web, characterized as the Semantic Web, is an extension of the current Web, aiming to provide information for not only humans but also computers to semantically process large-scale data. The paper presents a survey of big data in life sciences, big data related projects and Semantic Web technologies. The paper introduces the main Semantic Web technologies and their current situation, and provides a detailed analysis of how Semantic Web technologies address the heterogeneous variety of life sciences big data. The paper helps to understand the role of Semantic Web technologies in the big data era and how they provide a promising solution for the big data in life sciences.
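
    As a small illustration of the Semantic Web technologies surveyed above, the sketch below loads an RDF graph and runs a SPARQL query with the Python rdflib library. The file name and the example vocabulary are hypothetical; real life-sciences resources expose comparable RDF/SPARQL interfaces.

```python
# Load a (hypothetical) RDF file of gene annotations and query it with SPARQL.
from rdflib import Graph

g = Graph()
g.parse("genes.ttl", format="turtle")   # hypothetical Turtle file

query = """
PREFIX ex: <http://example.org/bio#>
SELECT ?gene ?pathway
WHERE {
    ?gene a ex:Gene ;
          ex:memberOf ?pathway .
}
"""
for gene, pathway in g.query(query):
    print(gene, pathway)
```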

  12. Big data analytics to aid developing livable communities.

    DOT National Transportation Integrated Search

    2015-12-31

    In transportation, ubiquitous deployment of low-cost sensors combined with powerful : computer hardware and high-speed network makes big data available. USDOT defines big : data research in transportation as a number of advanced techniques applied to...

  13. Ontogeny of Big endothelin-1 effects in newborn piglet pulmonary vasculature.

    PubMed

    Liben, S; Stewart, D J; De Marte, J; Perreault, T

    1993-07-01

    Endothelin-1 (ET-1), a 21-amino acid peptide produced by endothelial cells, results from the cleavage of preproendothelin, generating Big ET-1, which is then cleaved by the ET-converting enzyme (ECE) to form ET-1. Big ET-1, like ET-1, is released by endothelial cells. Big ET-1 is equipotent to ET-1 in vivo, whereas its vasoactive effects are less in vitro. It has been suggested that the effects of Big ET-1 depend on its conversion to ET-1. ET-1 has potent vasoactive effects in the newborn pig pulmonary circulation; however, the effects of Big ET-1 remain unknown. Therefore, we studied the effects of Big ET-1 in isolated perfused lungs from 1- and 7-day-old piglets using the ECE inhibitor, phosphoramidon, and the ETA receptor antagonist, BQ-123Na. The rate of conversion of Big ET-1 to ET-1 was measured using radioimmunoassay. ET-1 (10(-13) to 10(-8) M) produced an initial vasodilation, followed by a dose-dependent potent vasoconstriction (P < 0.001), which was equal at both ages. Big ET-1 (10(-11) to 10(-8) M) also produced a dose-dependent vasoconstriction (P < 0.001). The constrictor effects of Big ET-1 and ET-1 were similar in the 1-day-old, whereas in the 7-day-old, the constrictor effect of Big ET-1 was less than that of ET-1 (P < 0.017).(ABSTRACT TRUNCATED AT 250 WORDS)

  14. Using variable importance measures to identify a small set of single nucleotide polymorphisms capable of predicting heading date in perennial ryegrass

    USDA-ARS?s Scientific Manuscript database

    Prior knowledge on heading date enables the selection of parents for synthetic cultivars that are well-matched with respect to heading date, which is necessary to ensure plants put together will successfully cross with each other. Heading date of individual plants can be determined directly, which h...

  15. Deep learning based classification for head and neck cancer detection with hyperspectral imaging in an animal model

    NASA Astrophysics Data System (ADS)

    Ma, Ling; Lu, Guolan; Wang, Dongsheng; Wang, Xu; Chen, Zhuo Georgia; Muller, Susan; Chen, Amy; Fei, Baowei

    2017-03-01

    Hyperspectral imaging (HSI) is an emerging imaging modality that can provide a noninvasive tool for cancer detection and image-guided surgery. HSI acquires high-resolution images at hundreds of spectral bands, providing big data for differentiating different types of tissue. We proposed a deep learning based method for the detection of head and neck cancer with hyperspectral images. Since a deep learning algorithm can learn features hierarchically, the learned features are more discriminative and concise than handcrafted features. In this study, we adopt convolutional neural networks (CNN) to learn deep features of pixels for classifying each pixel into tumor or normal tissue. We evaluated our proposed classification method on a dataset containing hyperspectral images from 12 tumor-bearing mice. Experimental results show that our method achieved an average accuracy of 91.36%. This preliminary study demonstrated that our deep learning method can be applied to hyperspectral images for detecting head and neck tumors in animal models.
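
    A minimal sketch, not the authors' architecture, of pixel-wise spectral classification with a small 1-D convolutional network in PyTorch; the band count, batch and labels are placeholders standing in for real hyperspectral pixels.

```python
# Classify each pixel's spectrum (many bands) as tumor vs. normal with a 1-D CNN.
import torch
import torch.nn as nn

n_bands = 91                      # hypothetical number of spectral bands
model = nn.Sequential(
    nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
    nn.MaxPool1d(2),
    nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
    nn.Linear(32, 2),             # two classes: tumor / normal
)

spectra = torch.randn(8, 1, n_bands)     # batch of 8 pixel spectra (placeholder data)
labels = torch.randint(0, 2, (8,))       # placeholder labels

loss = nn.CrossEntropyLoss()(model(spectra), labels)
loss.backward()                          # one illustrative backward pass
print(float(loss))
```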

  16. Infrastructure for Big Data in the Intensive Care Unit.

    PubMed

    Zelechower, Javier; Astudillo, José; Traversaro, Francisco; Redelico, Francisco; Luna, Daniel; Quiros, Fernan; San Roman, Eduardo; Risk, Marcelo

    2017-01-01

    The Big Data paradigm can be applied in the intensive care unit in order to improve the treatment of patients, with the aim of customized decisions. This poster is about the infrastructure necessary to build a Big Data system for the ICU. Together with the infrastructure, the formation of a multidisciplinary team is essential to develop Big Data for use in critical care medicine.

  17. Modelling the Species Distribution of Flat-Headed Cats (Prionailurus planiceps), an Endangered South-East Asian Small Felid

    PubMed Central

    Hearn, Andrew J.; Hesse, Deike; Mohamed, Azlan; Traeholdt, Carl; Cheyne, Susan M.; Sunarto, Sunarto; Jayasilan, Mohd-Azlan; Ross, Joanna; Shapiro, Aurélie C.; Sebastian, Anthony; Dech, Stefan; Breitenmoser, Christine; Sanderson, Jim; Duckworth, J. W.; Hofer, Heribert

    2010-01-01

    Background The flat-headed cat (Prionailurus planiceps) is one of the world's least known, highly threatened felids with a distribution restricted to tropical lowland rainforests in Peninsular Thailand/Malaysia, Borneo and Sumatra. Throughout its geographic range large-scale anthropogenic transformation processes, including the pollution of fresh-water river systems and landscape fragmentation, raise concerns regarding its conservation status. Despite an increasing number of camera-trapping field surveys for carnivores in South-East Asia during the past two decades, few of these studies recorded the flat-headed cat. Methodology/Principal Findings In this study, we designed a predictive species distribution model using the Maximum Entropy (MaxEnt) algorithm to reassess the potential current distribution and conservation status of the flat-headed cat. Eighty-eight independent species occurrence records were gathered from field surveys, literature records, and museum collections. These current and historical records were analysed in relation to bioclimatic variables (WorldClim), altitude (SRTM) and minimum distance to larger water resources (Digital Chart of the World). Distance to water was identified as the key predictor for the occurrence of flat-headed cats (>50% explanation). In addition, we used different land cover maps (GLC2000, GlobCover and SarVision LLC for Borneo), information on protected areas and regional human population density data to extract suitable habitats from the potential distribution predicted by the MaxEnt model. Between 54% and 68% of suitable habitat has already been converted to unsuitable land cover types (e.g. croplands, plantations), and only between 10% and 20% of suitable land cover is categorised as fully protected according to the IUCN criteria. The remaining habitats are highly fragmented and only a few larger forest patches remain. Conclusion/Significance Based on our findings, we recommend that future conservation efforts for

  18. Modelling the species distribution of flat-headed cats (Prionailurus planiceps), an endangered South-East Asian small felid.

    PubMed

    Wilting, Andreas; Cord, Anna; Hearn, Andrew J; Hesse, Deike; Mohamed, Azlan; Traeholdt, Carl; Cheyne, Susan M; Sunarto, Sunarto; Jayasilan, Mohd-Azlan; Ross, Joanna; Shapiro, Aurélie C; Sebastian, Anthony; Dech, Stefan; Breitenmoser, Christine; Sanderson, Jim; Duckworth, J W; Hofer, Heribert

    2010-03-17

    The flat-headed cat (Prionailurus planiceps) is one of the world's least known, highly threatened felids with a distribution restricted to tropical lowland rainforests in Peninsular Thailand/Malaysia, Borneo and Sumatra. Throughout its geographic range large-scale anthropogenic transformation processes, including the pollution of fresh-water river systems and landscape fragmentation, raise concerns regarding its conservation status. Despite an increasing number of camera-trapping field surveys for carnivores in South-East Asia during the past two decades, few of these studies recorded the flat-headed cat. In this study, we designed a predictive species distribution model using the Maximum Entropy (MaxEnt) algorithm to reassess the potential current distribution and conservation status of the flat-headed cat. Eighty-eight independent species occurrence records were gathered from field surveys, literature records, and museum collections. These current and historical records were analysed in relation to bioclimatic variables (WorldClim), altitude (SRTM) and minimum distance to larger water resources (Digital Chart of the World). Distance to water was identified as the key predictor for the occurrence of flat-headed cats (>50% explanation). In addition, we used different land cover maps (GLC2000, GlobCover and SarVision LLC for Borneo), information on protected areas and regional human population density data to extract suitable habitats from the potential distribution predicted by the MaxEnt model. Between 54% and 68% of suitable habitat has already been converted to unsuitable land cover types (e.g. croplands, plantations), and only between 10% and 20% of suitable land cover is categorised as fully protected according to the IUCN criteria. The remaining habitats are highly fragmented and only a few larger forest patches remain. Based on our findings, we recommend that future conservation efforts for the flat-headed cat should focus on the identified remaining key
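
    The MaxEnt software used in the two records above is not reproduced here; as a rough stand-in, presence/background species distribution modelling can be sketched with penalized logistic regression. The environmental predictors echo those named in the abstracts (distance to water, altitude, temperature), but the values are random placeholders.

```python
# Stand-in presence/background model (penalized logistic regression), not MaxEnt itself.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
# 88 occurrence records vs. 1000 random background points (placeholder values).
presence = rng.normal(loc=[0.2, 300, 26], scale=[0.2, 150, 2], size=(88, 3))
background = rng.normal(loc=[3.0, 800, 24], scale=[2.0, 400, 3], size=(1000, 3))

X = np.vstack([presence, background])
y = np.r_[np.ones(len(presence)), np.zeros(len(background))]

sdm = LogisticRegression(C=1.0, max_iter=1000).fit(X, y)
print(dict(zip(["dist_to_water_km", "altitude_m", "mean_temp_C"], sdm.coef_[0])))
# A strongly negative distance-to-water coefficient would mirror the papers'
# finding that proximity to water is the key predictor of occurrence.
```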

  19. Phylogeny and Bayesian divergence time estimations of small-headed flies (Diptera: Acroceridae) using multiple molecular markers.

    PubMed

    Winterton, Shaun L; Wiegmann, Brian M; Schlinger, Evert I

    2007-06-01

    The first formal analysis of phylogenetic relationships among small-headed flies (Acroceridae) is presented based on DNA sequence data from two ribosomal (16S and 28S) and two protein-encoding genes: carbamoylphosphate synthase (CPS) domain of CAD (i.e., rudimentary locus) and cytochrome oxidase I (COI). DNA sequences from 40 species in 22 genera of Acroceridae (representing all three subfamilies) were compared with outgroup exemplars from Nemestrinidae, Stratiomyidae, Tabanidae, and Xylophagidae. Parsimony and Bayesian simultaneous analyses of the full data set recover a well-resolved and strongly supported hypothesis of phylogenetic relationships for major lineages within the family. Molecular evidence supports the monophyly of traditionally recognised subfamilies Philopotinae and Panopinae, but Acrocerinae are polyphyletic. Panopinae, sometimes considered "primitive" based on morphology and host-use, are always placed in a more derived position in the current study. Furthermore, these data support emerging morphological evidence that the type genus Acrocera Meigen, and its sister genus Sphaerops, are atypical acrocerids, comprising a sister lineage to all other Acroceridae. Based on the phylogeny generated in the simultaneous analysis, historical divergence times were estimated using Bayesian methodology constrained with fossil data. These estimates indicate Acroceridae likely evolved during the late Triassic but did not diversify greatly until the Cretaceous.

  20. 76 FR 7837 - Big Rivers Electric Corporation; Notice of Filing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-11

    ... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. NJ11-11-000] Big Rivers Electric Corporation; Notice of Filing Take notice that on February 4, 2011, Big Rivers Electric Corporation (Big Rivers) filed a notice of cancellation of its Second Revised and Restated Open Access...

  1. Data management by using R: big data clinical research series.

    PubMed

    Zhang, Zhongheng

    2015-11-01

    The electronic medical record (EMR) system has been widely used in clinical practice. By replacing traditional hand-written records, the EMR makes big data clinical research feasible. The most important feature of big data research is its real-world setting. Furthermore, big data research can provide all aspects of information related to healthcare. However, big data research requires some skills in data management, which are always lacking in the curriculum of medical education. This greatly hinders doctors from testing their clinical hypotheses by using the EMR. To bridge this gap, a series of articles introducing data management techniques is put forward to guide clinicians toward big data clinical research. The present educational article first introduces some basic knowledge of the R language, followed by data management skills for creating new variables, recoding variables and renaming variables. These are very basic skills and may be used in every big data research project.
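
    The article walks through these three data-management steps in R; purely for illustration, a rough Python/pandas analogue of creating, recoding and renaming variables on a toy EMR-style table looks like this (column names and values are invented).

```python
# Pandas analogue of the three data-management steps the article teaches in R.
import pandas as pd

emr = pd.DataFrame({
    "wt_kg": [82, 61, 95],
    "ht_cm": [178, 160, 171],
    "sexcode": [1, 2, 1],
})

emr["bmi"] = emr["wt_kg"] / (emr["ht_cm"] / 100) ** 2           # create a new variable
emr["sex"] = emr["sexcode"].map({1: "male", 2: "female"})        # recode a variable
emr = emr.rename(columns={"wt_kg": "weight_kg", "ht_cm": "height_cm"})  # rename variables
print(emr)
```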

  2. Head lice.

    PubMed

    Devore, Cynthia D; Schutze, Gordon E

    2015-05-01

    Head lice infestation is associated with limited morbidity but causes a high level of anxiety among parents of school-aged children. Since the 2010 clinical report on head lice was published by the American Academy of Pediatrics, newer medications have been approved for the treatment of head lice. This revised clinical report clarifies current diagnosis and treatment protocols and provides guidance for the management of children with head lice in the school setting. Copyright © 2015 by the American Academy of Pediatrics.

  3. (Quasi)-convexification of Barta's (multi-extrema) bounding theorem: $\inf_x\big(\frac{H\Phi(x)}{\Phi(x)}\big) \le E_{gr} \le \sup_x\big(\frac{H\Phi(x)}{\Phi(x)}\big)$

    NASA Astrophysics Data System (ADS)

    Handy, C. R.

    2006-03-01

    There has been renewed interest in the exploitation of Barta's configuration space theorem (BCST) (Barta 1937 C. R. Acad. Sci. Paris 204 472), which bounds the ground-state energy by $\inf_x\big(\frac{H\Phi(x)}{\Phi(x)}\big) \le E_{gr} \le \sup_x\big(\frac{H\Phi(x)}{\Phi(x)}\big)$ for any Φ lying within the space $\mathcal{C}$ of positive, bounded, and sufficiently smooth functions. Mouchet's (Mouchet 2005 J. Phys. A: Math. Gen. 38 1039) BCST analysis is based on gradient optimization (GO). However, it overlooks significant difficulties: (i) the appearance of multi-extrema; (ii) the inefficiency of GO for stiff (singular perturbation/strong coupling) problems; (iii) the nonexistence of a systematic procedure for arbitrarily improving the bounds within $\mathcal{C}$. These deficiencies can be corrected by transforming BCST into a moments' representation equivalent, and exploiting a generalization of the eigenvalue moment method (EMM), within the context of the well-known generalized eigenvalue problem (GEP), as developed here. EMM is an alternative eigenenergy bounding, variational procedure, overlooked by Mouchet, which also exploits the positivity of the desired physical solution. Furthermore, it is applicable to Hermitian and non-Hermitian systems with complex-number quantization parameters (Handy and Bessis 1985 Phys. Rev. Lett. 55 931, Handy et al 1988 Phys. Rev. Lett. 60 253, Handy 2001 J. Phys. A: Math. Gen. 34 5065, Handy et al 2002 J. Phys. A: Math. Gen. 35 6359). Our analysis exploits various quasi-convexity/concavity theorems common to the GEP representation. We outline the general theory and present some illustrative examples.
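
    Barta's bounds are easy to check numerically for a simple Hamiltonian. The sketch below uses $H = -\frac{d^2}{dx^2} + x^2$ (ground-state energy $E_{gr} = 1$) and the positive trial functions $\Phi_a(x) = e^{-a x^2/2}$, for which $H\Phi_a/\Phi_a = a + (1 - a^2)x^2$; evaluating the infimum and supremum of this ratio on a grid brackets $E_{gr}$, and the two bounds coincide when $\Phi_a$ is the exact ground state ($a = 1$). The grid range and the chosen values of $a$ are arbitrary.

```python
# Numerical illustration of Barta's bounds for H = -d^2/dx^2 + x^2 (E_gr = 1),
# using trial functions Phi_a(x) = exp(-a x^2 / 2), so H Phi / Phi = a + (1 - a^2) x^2.
import numpy as np

x = np.linspace(-3.0, 3.0, 2001)
for a in (0.8, 1.0, 1.25):
    ratio = a + (1.0 - a**2) * x**2
    print(f"a = {a}:  inf = {ratio.min():.3f} <= E_gr = 1 <= sup = {ratio.max():.3f}")
```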

  4. 48 CFR 619.505 - Rejecting Small Business Administration recommendations.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 4 2011-10-01 2011-10-01 false Rejecting Small Business Administration recommendations. 619.505 Section 619.505 Federal Acquisition Regulations System DEPARTMENT OF... Small Business Administration recommendations. The Procurement Executive is the agency head for the...

  5. 48 CFR 619.505 - Rejecting Small Business Administration recommendations.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Rejecting Small Business Administration recommendations. 619.505 Section 619.505 Federal Acquisition Regulations System DEPARTMENT OF... Small Business Administration recommendations. The Procurement Executive is the agency head for the...

  6. Delivering advanced therapies: the big pharma approach.

    PubMed

    Tarnowski, J; Krishna, D; Jespers, L; Ketkar, A; Haddock, R; Imrie, J; Kili, S

    2017-09-01

    After two decades of focused development and some recent clinical successes, cell and gene therapy (CGT) is emerging as a promising approach to personalized medicines. Genetically engineered cells as a medical modality are poised to stand alongside or in combination with small molecule and biopharmaceutical approaches to bring new therapies to patients globally. Big pharma can have a vital role in industrializing CGT by focusing on diseases with high unmet medical need and compelling genetic evidence. Pharma should invest in manufacturing and supply chain solutions that deliver reproducible, high-quality therapies at a commercially viable cost. Owing to the fast pace of innovation in this field proactive engagement with regulators is critical. It is also vital to understand the needs of patients all along the patient care pathway and to establish product pricing that is accepted by prescribers, payers and patients.

  7. Keeping up with Big Data--Designing an Introductory Data Analytics Class

    ERIC Educational Resources Information Center

    Hijazi, Sam

    2016-01-01

    Universities need to keep up with the demand of the business world when it comes to Big Data. The exponential increase in data has put additional demands on academia to meet the big gap in education. Business demand for Big Data has surpassed 1.9 million positions in 2015. Big Data, Business Intelligence, Data Analytics, and Data Mining are the…

  8. [Applications of eco-environmental big data: Progress and prospect].

    PubMed

    Zhao, Miao Miao; Zhao, Shi Cheng; Zhang, Li Yun; Zhao, Fen; Shao, Rui; Liu, Li Xiang; Zhao, Hai Feng; Xu, Ming

    2017-05-18

    With the advance of internet and wireless communication technology, the fields of ecology and environment have entered a new digital era, with the amount of data growing explosively and big data technologies attracting more and more attention. Eco-environmental big data is based on airborne, space-based and land-based observations of ecological and environmental factors, and its ultimate goal is to integrate multi-source and multi-scale data for information mining by taking advantage of cloud computation, artificial intelligence, and modeling technologies. In comparison with other fields, eco-environmental big data has its own characteristics, such as diverse data formats and sources, data collected with various protocols and standards, and serving different clients and organizations with special requirements. Big data technology has been applied worldwide in ecological and environmental fields including global climate prediction, ecological network observation and modeling, and regional air pollution control. The development of eco-environmental big data in China is facing many problems, such as data sharing issues, outdated monitoring facilities and technologies, and insufficient data mining capacity. Despite all this, big data technology is critical to solving eco-environmental problems, improving prediction and warning accuracy for eco-environmental catastrophes, and boosting scientific research in the field in China. We expect that eco-environmental big data will contribute significantly to policy making and environmental services and management, and thus to sustainable development and eco-civilization construction in China in the coming decades.

  9. Big system: Interactive graphics for the engineer

    NASA Technical Reports Server (NTRS)

    Quenneville, C. E.

    1975-01-01

    The BCS Interactive Graphics System (BIG System) approach to graphics was presented, along with several significant engineering applications. The BIG System precompiler, the graphics support library, and the function requirements of graphics applications are discussed. It was concluded that graphics standardization and a device independent code can be developed to assure maximum graphic terminal transferability.

  10. Insights into big sagebrush seedling storage practices

    Treesearch

    Emily C. Overton; Jeremiah R. Pinto; Anthony S. Davis

    2013-01-01

    Big sagebrush (Artemisia tridentata Nutt. [Asteraceae]) is an essential component of shrub-steppe ecosystems in the Great Basin of the US, where degradation due to altered fire regimes, invasive species, and land use changes has led to increased interest in the production of high-quality big sagebrush seedlings for conservation and restoration projects. Seedling...

  11. Attitude Heading Reference System Using MEMS Inertial Sensors with Dual-Axis Rotation

    PubMed Central

    Kang, Li; Ye, Lingyun; Song, Kaichen; Zhou, Yang

    2014-01-01

    This paper proposes a low-cost and small-size attitude and heading reference system based on MEMS inertial sensors. A dual-axis rotation structure with a proper rotary scheme, designed according to the stated principles, is applied in the system to compensate for the attitude and heading drift caused by the large gyroscope biases. An optimization algorithm is applied to compensate for the installation angle error between the body frame and the rotation table's frame. Simulations and experiments are carried out to evaluate the performance of the AHRS. The results show that the proper rotation can significantly reduce the attitude and heading drifts. Moreover, the new AHRS is not affected by magnetic interference. With the rotation applied, the attitude and heading errors merely oscillate within a bounded range. The attitude error is about 3° and the heading error is less than 3°, which is at least 5 times better than in the non-rotation condition. PMID:25268911
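
    A toy numerical illustration of why rotation modulation suppresses gyro-bias drift (a simplified single-axis version of the paper's dual-axis scheme, with an invented bias value): a constant sensor-frame bias integrates into a linearly growing heading error when the sensor block is static, but only into a bounded oscillation when the block is carouselled at a steady rate, because the projected bias averages out over each revolution.

```python
# Compare integrated heading error from a constant gyro bias, with and
# without carousel rotation of the sensor block (toy simulation).
import numpy as np

bias = 0.01                            # deg/s constant gyro bias (placeholder value)
t = np.linspace(0.0, 600.0, 60001)     # 10 minutes at 0.01 s steps
dt = t[1] - t[0]

drift_static = np.cumsum(np.full_like(t, bias)) * dt        # no rotation: grows linearly

omega = 2 * np.pi / 60.0               # one table revolution per minute
drift_rotated = np.cumsum(bias * np.cos(omega * t)) * dt     # modulated bias: bounded

print("static drift after 10 min:  %.2f deg" % drift_static[-1])   # ~6 deg
print("rotated drift after 10 min: %.4f deg" % drift_rotated[-1])  # near zero
```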

  12. The p53-reactivating small-molecule RITA enhances cisplatin-induced cytotoxicity and apoptosis in head and neck cancer.

    PubMed

    Roh, Jong-Lyel; Ko, Jung Ho; Moon, Soo Jin; Ryu, Chang Hwan; Choi, Jun Young; Koch, Wayne M

    2012-12-01

    We evaluated whether the restoration of p53 function by the p53-reactivating small molecule RITA (reactivation of p53 and induction of tumor cell apoptosis) enhances cisplatin-induced cytotoxicity and apoptosis in head-and-neck cancer (HNC). RITA induced prominent accumulation and reactivation of p53 in a wild-type TP53-bearing HNC cell line. RITA showed maximal growth suppression in tumor cells showing MDM2-dependent p53 degradation. RITA promoted apoptosis in association with upregulation of p21, BAX, and cleaved caspase-3; notably, the apoptotic response was blocked by pifithrin-α, demonstrating its p53 dependence. With increasing concentrations, RITA strongly induced apoptosis rather than G2-phase arrest. In combination therapy, RITA enhanced cisplatin-induced growth inhibition and apoptosis of HNC cells in vitro and in vivo. Our data suggest that the restoration of p53 tumor-suppressive function by RITA enhances cisplatin-induced cytotoxicity and apoptosis, an action that may offer an attractive strategy for treating HNC. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  13. Principles of Experimental Design for Big Data Analysis.

    PubMed

    Drovandi, Christopher C; Holmes, Christopher; McGree, James M; Mengersen, Kerrie; Richardson, Sylvia; Ryan, Elizabeth G

    2017-08-01

    Big Datasets are endemic, but are often notoriously difficult to analyse because of their size, heterogeneity and quality. The purpose of this paper is to open a discourse on the potential for modern decision theoretic optimal experimental design methods, which by their very nature have traditionally been applied prospectively, to improve the analysis of Big Data through retrospective designed sampling in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has the potential for wide generality and advantageous inferential and computational properties. We highlight current hurdles and open research questions surrounding efficient computational optimisation in using retrospective designs, and in part this paper is a call to the optimisation and experimental design communities to work together in the field of Big Data analysis.

  14. Principles of Experimental Design for Big Data Analysis

    PubMed Central

    Drovandi, Christopher C; Holmes, Christopher; McGree, James M; Mengersen, Kerrie; Richardson, Sylvia; Ryan, Elizabeth G

    2016-01-01

    Big Datasets are endemic, but are often notoriously difficult to analyse because of their size, heterogeneity and quality. The purpose of this paper is to open a discourse on the potential for modern decision theoretic optimal experimental design methods, which by their very nature have traditionally been applied prospectively, to improve the analysis of Big Data through retrospective designed sampling in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has the potential for wide generality and advantageous inferential and computational properties. We highlight current hurdles and open research questions surrounding efficient computational optimisation in using retrospective designs, and in part this paper is a call to the optimisation and experimental design communities to work together in the field of Big Data analysis. PMID:28883686

  15. Big Data and Nursing: Implications for the Future.

    PubMed

    Topaz, Maxim; Pruinelli, Lisiane

    2017-01-01

    Big data is becoming increasingly prevalent and it affects the way nurses learn, practice, conduct research and develop policy. The discipline of nursing needs to maximize the benefits of big data to advance the vision of promoting human health and wellbeing. However, current practicing nurses, educators and nurse scientists often lack the skills and competencies necessary for meaningful use of big data. Some of the key skills for further development include the ability to mine narrative and structured data for new care or outcome patterns, effective data visualization techniques, and further integration of nursing-sensitive data into artificial intelligence systems for better clinical decision support. We provide growth-path vision recommendations for big data competencies for practicing nurses, nurse educators, researchers, and policy makers to help prepare the next generation of nurses and improve patient outcomes through better-quality connected health.

  16. Information Retrieval Using Hadoop Big Data Analysis

    NASA Astrophysics Data System (ADS)

    Motwani, Deepak; Madan, Madan Lal

    This paper concerns big data analysis, the cognitive operation of probing huge amounts of information in an attempt to uncover hidden patterns. Through big data analytics, applications in both public and private sector organizations have made the strategic decision to turn big data into competitive advantage. The primary task of extracting value from big data gives rise to a process for pulling information from multiple different sources, known as extract, transform and load (ETL). The approach in this paper extracts information from log files and research papers, reducing the effort needed for pattern finding and for summarizing documents from several locations. The work helps readers better understand basic Hadoop concepts and improves the user experience for research. We propose an approach that uses Hadoop to analyse log files and find concise, useful information in a time-saving way. The proposed approach will be applied to different research papers in a specific domain to obtain summarized content for further improvement and the creation of new content.
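
    As a concrete illustration of the log-analysis idea, the sketch below mimics a Hadoop Streaming job in plain Python: a mapper emits (key, 1) pairs from log lines and a reducer aggregates them, the same pattern a real Hadoop deployment would distribute across nodes. The log format and field positions are hypothetical, not taken from the paper.

      # Hadoop-Streaming-style mapper/reducer pair, run locally for illustration.
      from collections import Counter

      def mapper(lines):
          """Emit (log level, 1) for each well-formed log line."""
          for line in lines:
              parts = line.split()
              if len(parts) >= 2:
                  yield parts[1], 1          # assumed position of the log-level field

      def reducer(pairs):
          """Aggregate counts per key, as the reduce phase would after the shuffle."""
          counts = Counter()
          for key, value in pairs:
              counts[key] += value
          return counts

      sample_log = [
          "2023-10-10T13:55:36 INFO  login user=alice",
          "2023-10-10T13:55:40 ERROR timeout user=bob",
          "2023-10-10T13:56:01 INFO  logout user=alice",
      ]
      print(reducer(mapper(sample_log)))     # e.g. Counter({'INFO': 2, 'ERROR': 1})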

  17. Changes in Parents’ Spanking and Reading as Mechanisms for Head Start Impacts on Children

    PubMed Central

    Gershoff, Elizabeth T.; Ansari, Arya; Purtell, Kelly M.; Sexton, Holly R.

    2015-01-01

    This study examined whether Head Start, the nation’s main two-generation program for low-income families, benefits children in part through positive changes in parents’ use of spanking and reading to children. Data were drawn from the 3-year-old cohort of the national evaluation of the Head Start program known as the Head Start Impact Study (N = 2,063). Results indicated that Head Start had small indirect effects on children’s spelling ability at age 4 and their aggression at age 4 through an increase in parents’ reading to their children. Taken together, the results suggest that parents play a role in sustaining positive benefits of the Head Start program for children’s behavior and literacy skills, one that could be enhanced with a greater emphasis on parent involvement and education. PMID:26618521

  18. The big data processing platform for intelligent agriculture

    NASA Astrophysics Data System (ADS)

    Huang, Jintao; Zhang, Lichen

    2017-08-01

    Big data technology is another popular technology after the Internet of Things and cloud computing. Big data is widely used in many fields, such as social platforms, e-commerce, and financial analysis. Intelligent agriculture produces large amounts of complexly structured data in the course of its operation, and fully mining the value of these data will be very meaningful for the development of agriculture. This paper proposes an intelligent data processing platform based on Storm and Cassandra to realize the storage and management of big data for intelligent agriculture.
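
    As a sketch of the storage layer such a platform needs, the snippet below writes one sensor reading into Cassandra with the DataStax Python driver; in the platform described, a Storm bolt would perform this insert for every tuple in the stream. The keyspace, table, and column names are purely illustrative and not taken from the paper.

      # Illustrative Cassandra write for streaming farm-sensor data (hypothetical schema).
      from datetime import datetime, timezone
      from cassandra.cluster import Cluster

      cluster = Cluster(["127.0.0.1"])              # assumed local Cassandra node
      session = cluster.connect()
      session.execute("""
          CREATE KEYSPACE IF NOT EXISTS agri
          WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
      """)
      session.execute("""
          CREATE TABLE IF NOT EXISTS agri.sensor_readings (
              field_id text, ts timestamp, soil_moisture double, air_temp double,
              PRIMARY KEY (field_id, ts)
          )
      """)

      insert = session.prepare(
          "INSERT INTO agri.sensor_readings (field_id, ts, soil_moisture, air_temp) "
          "VALUES (?, ?, ?, ?)"
      )
      # In the real platform a stream-processing bolt would run this per tuple.
      session.execute(insert, ("field-07", datetime.now(timezone.utc), 0.31, 22.5))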

  19. Research Activities at Fermilab for Big Data Movement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mhashilkar, Parag; Wu, Wenji; Kim, Hyun W

    2013-01-01

    Adaptation of 100GE Networking Infrastructure is the next step towards management of Big Data. Being the US Tier-1 Center for the Large Hadron Collider's (LHC) Compact Muon Solenoid (CMS) experiment and the central data center for several other large-scale research collaborations, Fermilab has to constantly deal with the scaling and wide-area distribution challenges of big data. In this paper, we describe some of the challenges involved in the movement of big data over 100GE infrastructure and the research activities at Fermilab to address these challenges.

  20. [Utilization of Big Data in Medicine and Future Outlook].

    PubMed

    Kinosada, Yasutomi; Uematsu, Machiko; Fujiwara, Takuya

    2016-03-01

    "Big data" is a new buzzword. The point is not to be dazzled by the volume of data, but rather to analyze it, and convert it into insights, innovations, and business value. There are also real differences between conventional analytics and big data. In this article, we show some results of big data analysis using open DPC (Diagnosis Procedure Combination) data in areas of the central part of JAPAN: Toyama, Ishikawa, Fukui, Nagano, Gifu, Aichi, Shizuoka, and Mie Prefectures. These 8 prefectures contain 51 medical administration areas called the second medical area. By applying big data analysis techniques such as k-means, hierarchical clustering, and self-organizing maps to DPC data, we can visualize the disease structure and detect similarities or variations among the 51 second medical areas. The combination of a big data analysis technique and open DPC data is a very powerful method to depict real figures on patient distribution in Japan.

  1. Research on Technology Innovation Management in Big Data Environment

    NASA Astrophysics Data System (ADS)

    Ma, Yanhong

    2018-02-01

    With the continuous development and progress of the information age, the demand for information keeps growing, and the processing and analysis of data are moving toward ever larger scales. The increasing volume of data places higher demands on processing technology. The explosive growth of data in contemporary society has ushered in the era of big data. People now derive more value and significance from producing and processing various kinds of information and data in their lives. How to use big data technology to process and analyze data quickly, and thereby improve the level of big data management, is an important question for advancing the development of data processing technology in our country. To some extent, innovative research on information technology management methods in the era of big data can enhance our overall strength and keep China in an invincible position in the development of the big data era.

  2. Finite-frequency sensitivity kernels for head waves

    NASA Astrophysics Data System (ADS)

    Zhang, Zhigang; Shen, Yang; Zhao, Li

    2007-11-01

    Head waves are extremely important in determining the structure of the predominantly layered Earth. While several recent studies have shown the diffractive nature and the 3-D Fréchet kernels of finite-frequency turning waves, analogues of head waves in a continuous velocity structure, the finite-frequency effects and sensitivity kernels of head waves are yet to be carefully examined. We present the results of a numerical study focusing on the finite-frequency effects of head waves. Our model has a low-velocity layer over a high-velocity half-space and a cylindrical-shaped velocity perturbation placed beneath the interface at different locations. A 3-D finite-difference method is used to calculate synthetic waveforms. Traveltime and amplitude anomalies are measured by the cross-correlation of synthetic seismograms from models with and without the velocity perturbation and are compared to the 3-D sensitivity kernels constructed from full waveform simulations. The results show that the head wave arrival-time and amplitude are influenced by the velocity structure surrounding the ray path in a pattern that is consistent with the Fresnel zones. Unlike the `banana-doughnut' traveltime sensitivity kernels of turning waves, the traveltime sensitivity of the head wave along the ray path below the interface is weak, but non-zero. Below the ray path, the traveltime sensitivity reaches the maximum (absolute value) at a depth that depends on the wavelength and propagation distance. The sensitivity kernels vary with the vertical velocity gradient in the lower layer, but the variation is relatively small at short propagation distances when the vertical velocity gradient is within the range of the commonly accepted values. Finally, the depression or shoaling of the interface results in increased or decreased sensitivities, respectively, beneath the interface topography.
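
    The cross-correlation measurement used in the study can be sketched in a few lines: the traveltime anomaly is the lag that maximises the cross-correlation between synthetics from the perturbed and unperturbed models, and the amplitude anomaly is a log ratio of peak amplitudes. The waveforms below are simple Gaussian pulses chosen for illustration only.

      # Illustrative cross-correlation traveltime and amplitude anomaly measurement.
      import numpy as np

      dt = 0.01                                        # sample interval, s
      t = np.arange(0.0, 10.0, dt)
      ref = np.exp(-((t - 4.00) / 0.3) ** 2)           # reference head-wave pulse
      pert = 0.9 * np.exp(-((t - 4.12) / 0.3) ** 2)    # delayed, weakened by a perturbation

      xcorr = np.correlate(pert, ref, mode="full")
      lags = np.arange(-len(t) + 1, len(t)) * dt
      traveltime_anomaly = lags[np.argmax(xcorr)]      # positive = arrival delayed

      amplitude_anomaly = np.log(pert.max() / ref.max())
      print(f"dT ~ {traveltime_anomaly:.3f} s, dlnA ~ {amplitude_anomaly:.3f}")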

  3. Association of Big Endothelin-1 with Coronary Artery Calcification.

    PubMed

    Qing, Ping; Li, Xiao-Lin; Zhang, Yan; Li, Yi-Lin; Xu, Rui-Xia; Guo, Yuan-Lin; Li, Sha; Wu, Na-Qiong; Li, Jian-Jun

    2015-01-01

    Coronary artery calcification (CAC) is clinically considered one of the important predictors of atherosclerosis. Several studies have confirmed that endothelin-1 (ET-1) plays an important role in the process of atherosclerosis formation. The aim of this study was to investigate whether big ET-1 is associated with CAC. A total of 510 consecutively admitted patients from February 2011 to May 2012 in Fu Wai Hospital were analyzed. All patients received coronary computed tomography angiography and were then divided into two groups based on the results of the coronary artery calcium score (CACS). The clinical characteristics, including traditional and calcification-related risk factors, were collected, and the plasma big ET-1 level was measured by ELISA. Patients with CAC had a significantly elevated big ET-1 level compared with those without CAC (0.5 ± 0.4 vs. 0.2 ± 0.2, P<0.001). In the multivariate analysis, big ET-1 (Tertile 2, HR = 3.09, 95% CI 1.66-5.74, P<0.001; Tertile 3, HR = 10.42, 95% CI 3.62-29.99, P<0.001) appeared as an independent predictive factor for the presence of CAC. There was a positive correlation of the big ET-1 level with CACS (r = 0.567, P<0.001). The 10-year Framingham risk (%) was higher in the group with CACS>0 and the highest tertile of big ET-1 (P<0.01). The area under the receiver operating characteristic curve for the big ET-1 level in predicting CAC was 0.83 (95% CI 0.79-0.87, P<0.001), with a sensitivity of 70.6% and specificity of 87.7%. These data demonstrated for the first time that the plasma big ET-1 level is a valuable independent predictor of CAC in our study population.
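
    How such discrimination statistics are obtained can be sketched as follows, using simulated biomarker values rather than the study data: the ROC AUC summarises how well big ET-1 separates patients with and without CAC, and a Youden-index cut-off gives one sensitivity/specificity pair.

      # Illustrative ROC analysis on simulated data (means loosely echo the abstract).
      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      rng = np.random.default_rng(42)
      cac_present = rng.integers(0, 2, size=510)                    # 0 = no CAC, 1 = CAC
      big_et1 = np.where(cac_present == 1,
                         rng.normal(0.5, 0.4, 510),
                         rng.normal(0.2, 0.2, 510)).clip(min=0)

      auc = roc_auc_score(cac_present, big_et1)
      fpr, tpr, _ = roc_curve(cac_present, big_et1)
      best = np.argmax(tpr - fpr)                                   # Youden's J cut-off
      print(f"AUC = {auc:.2f}, sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")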

  4. Meta-analyses of Big Six Interests and Big Five Personality Factors.

    ERIC Educational Resources Information Center

    Larson, Lisa M.; Rottinghaus, Patrick J.; Borgen, Fred H.

    2002-01-01

    Meta-analysis of 24 samples demonstrated overlap between Holland's vocational interest domains (measured by Self Directed Search, Strong Interest Inventory, and Vocational Preference Inventory) and Big Five personality factors (measured by Revised NEO Personalty Inventory). The link is stronger for five interest-personality pairs:…

  5. [Big Data- challenges and risks].

    PubMed

    Krauß, Manuela; Tóth, Tamás; Hanika, Heinrich; Kozlovszky, Miklós; Dinya, Elek

    2015-12-06

    The term "Big Data" is commonly used to describe the growing mass of information being created recently. New conclusions can be drawn and new services can be developed by connecting, processing and analyzing this information. This affects all aspects of life, including health and medicine. The authors review the application areas of Big Data and present examples from health and other areas. However, there are several preconditions for the effective use of these opportunities: proper infrastructure and a well-defined regulatory environment, with particular emphasis on data protection and privacy. These issues and the current actions towards solutions are also presented.

  6. Big game habitat use in southeastern Montana

    Treesearch

    James G. MacCracken; Daniel W. Uresk

    1984-01-01

    The loss of suitable, high quality habitat is a major problem facing big game managers in the western United States. Agricultural, water, road and highway, housing, and recreational development have contributed to loss of natural big game habitat (Wallmo et al. 1976, Reed 1981). In the western United States, surface mining of minerals has great potential to adversely...

  7. Neck Strength Imbalance Correlates With Increased Head Acceleration in Soccer Heading

    PubMed Central

    Dezman, Zachary D.W.; Ledet, Eric H.; Kerr, Hamish A.

    2013-01-01

    Background: Soccer heading is the use of the head to directly contact the ball, often to advance the ball down the field or score. It is a skill fundamental to the game, yet it has come under scrutiny. Repeated subclinical effects of heading may compound over time, resulting in neurologic deficits. Greater head accelerations are linked to brain injury. Developing an understanding of how the neck muscles help stabilize and reduce head acceleration during impact may help prevent brain injury. Hypothesis: Neck strength imbalance correlates with increasing head acceleration during impact while heading a soccer ball. Study Design: Observational laboratory investigation. Methods: Sixteen Division I and II collegiate soccer players headed a ball in a controlled indoor laboratory setting while player motions were recorded by a 14-camera Vicon MX motion capture system. Neck flexor and extensor strength of each player was measured using a spring-type clinical dynamometer. Results: Players were served soccer balls by hand at a mean velocity of 4.29 m/s (±0.74 m/s). Players returned the ball to the server using a heading maneuver at a mean velocity of 5.48 m/s (±1.18 m/s). Mean neck strength difference was positively correlated with angular head acceleration (rho = 0.497; P = 0.05), with a trend toward significance for linear head acceleration (rho = 0.485; P = 0.057). Conclusion: This study suggests that symmetrical strength in neck flexors and extensors reduces head acceleration experienced during low-velocity heading in experienced collegiate players. Clinical Relevance: Balanced neck strength may reduce head acceleration and cumulative subclinical injury. Since neck strength is measurable and amenable to strength training intervention, it may represent a modifiable intrinsic risk factor for injury. PMID:24459547
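
    The correlation analysis reported can be reproduced in miniature with made-up numbers, shown below: Spearman's rho between a neck-strength imbalance measure and angular head acceleration. The values are simulated and the variable names are assumptions, not the study's data.

      # Illustrative Spearman correlation on simulated neck-strength / acceleration data.
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(7)
      n_players = 16
      strength_imbalance = rng.uniform(0, 40, n_players)        # flexor-extensor gap, N
      angular_accel = 800 + 15 * strength_imbalance + rng.normal(0, 150, n_players)

      rho, p_value = spearmanr(strength_imbalance, angular_accel)
      print(f"rho = {rho:.3f}, P = {p_value:.3f}")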

  8. Gender differences in head-neck segment dynamic stabilization during head acceleration.

    PubMed

    Tierney, Ryan T; Sitler, Michael R; Swanik, C Buz; Swanik, Kathleen A; Higgins, Michael; Torg, Joseph

    2005-02-01

    Recent epidemiological research has revealed that gender differences exist in concussion incidence but no study has investigated why females may be at greater risk of concussion. Our purpose was to determine whether gender differences existed in head-neck segment kinematic and neuromuscular control variable responses to an external force application with and without neck muscle preactivation. Forty (20 females and 20 males) physically active volunteers participated in the study. The independent variables were gender, force application (known vs unknown), and force direction (forced flexion vs forced extension). The dependent variables were kinematic and EMG variables, head-neck segment stiffness, and head-neck segment flexor and extensor isometric strength. Statistical analyses consisted of multiple multivariate and univariate analyses of variance, follow-up univariate analyses of variance, and t-tests (P ≤ 0.05). Gender differences existed in head-neck segment dynamic stabilization during head angular acceleration. Females exhibited significantly greater head-neck segment peak angular acceleration (50%) and displacement (39%) than males despite initiating muscle activity significantly earlier (SCM only) and using a greater percentage of their maximum head-neck segment muscle activity (79% peak activity and 117% muscle activity area). The head-neck segment angular acceleration differences may be because females exhibited significantly less isometric strength (49%), neck girth (30%), and head mass (43%), resulting in lower levels of head-neck segment stiffness (29%). For our subject demographic, the results revealed gender differences in head-neck segment dynamic stabilization during head acceleration in response to an external force application. Females exhibited significantly greater head-neck segment peak angular acceleration and displacement than males despite initiating muscle activity earlier (SCM only) and using a greater percentage of their maximum

  9. The Influence of Small Class Size, Duration, Intensity, and Heterogeneity on Head Start Fade

    ERIC Educational Resources Information Center

    Huss, Christopher D.

    2010-01-01

    The researcher conducted a nonexperimental study to investigate and analyze the influence of reduced class sizes, intensity (all day and every day), duration (five years), and heterogeneity (random class assignment) on the Head Start Fade effect. The researcher employed retrospective data analysis using a longitudinal explanatory design on data…

  10. A Big Data Analytics Methodology Program in the Health Sector

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony; Howell-Barber, H.

    2016-01-01

    The benefits of Big Data Analytics are cited frequently in the literature. However, the difficulties of implementing Big Data Analytics can limit the number of organizational projects. In this study, the authors evaluate business, procedural and technical factors in the implementation of Big Data Analytics, applying a methodology program. Focusing…

  11. View of New Big Oak Flat Road seen from Old ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    View of New Big Oak Flat Road seen from Old Wawona Road near location of photograph HAER CA-148-17. Note road cuts, alignment, and tunnels. Devils Dance Floor at left distance. Looking northwest - Big Oak Flat Road, Between Big Oak Flat Entrance & Merced River, Yosemite Village, Mariposa County, CA

  12. Reply to 'Comment on 'Heavy element production in inhomogeneous big bang nucleosynthesis''

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsuura, Shunji; Fujimoto, Shin-ichirou; Hashimoto, Masa-aki

    2007-03-15

    This is a reply to Rauscher [Phys. Rev. D 75, 068301 (2007)]. We studied heavy element production in the high baryon density region in the early universe [Phys. Rev. D 72, 123505 (2005)]. However, it is claimed by Rauscher [Phys. Rev. D 75, 068301 (2007)] that a small scale but high baryon density region contradicts observations for the light element abundance or, in order not to contradict the observations, the high density region must be so small that it cannot affect the present heavy element abundance. In this paper, we study big bang nucleosynthesis in the high baryon density region and show that in certain parameter spaces it is possible to produce enough of the heavy element without contradiction to cosmic microwave background and light element observations.

  13. BIG1 is required for the survival of deep layer neurons, neuronal polarity, and the formation of axonal tracts between the thalamus and neocortex in developing brain

    PubMed Central

    Teoh, Jia-Jie; Iwano, Tomohiko; Kunii, Masataka; Atik, Nur; Avriyanti, Erda; Yoshimura, Shin-ichiro; Moriwaki, Kenta

    2017-01-01

    BIG1, an activator protein of the small GTPase Arf, encoded by the Arfgef1 gene, is one of the candidate genes for epileptic encephalopathy. To investigate the involvement of BIG1 in epileptic encephalopathy, we analyzed BIG1-deficient mice and found that BIG1 regulates neurite outgrowth and brain development in vitro and in vivo. The loss of BIG1 decreased the size of the neocortex and hippocampus. In BIG1-deficient mice, the neuronal progenitor cells (NPCs) and the interneurons were unaffected. However, Tbr1+ and Ctip2+ deep layer (DL) neurons showed spatial-temporal dependent apoptosis. This apoptosis gradually progressed from the piriform cortex (PIR), peaked in the neocortex, and then progressed into the hippocampus from embryonic day 13.5 (E13.5) to E17.5. The upper layer (UL) and DL order in the neocortex was maintained in BIG1-deficient mice, but the excitatory neurons tended to accumulate before reaching their destination layers. A further pulse-chase migration assay showed that the migration defect was non-cell autonomous and secondary to the progression of apoptosis into the BIG1-deficient neocortex after E15.5. In BIG1-deficient mice, we observed an ectopic projection of corticothalamic axons from the primary somatosensory cortex (S1) into the dorsal lateral geniculate nucleus (dLGN). The thalamocortical axons were unable to cross the diencephalon–telencephalon boundary (DTB). In vitro, BIG1-deficient neurons showed a delay in neuronal polarization. BIG1-deficient neurons were also hypersensitive to low-dose glutamate (5 μM) and died via apoptosis. This study showed the role of BIG1 in the survival of DL neurons in the developing embryonic brain and in the generation of neuronal polarity. PMID:28414797

  14. Affordable Development and Demonstration of a Small NTR Engine and Stage: How Small is Big Enough?

    NASA Technical Reports Server (NTRS)

    Borowski, Stanley K.; Sefcik, Robert J.; Fittje, James E.; McCurdy, David R.; Qualls, Arthur L.; Schnitzler, Bruce G.; Werner, James E.; Weitzberg (Abraham); Joyner, Claude R.

    2015-01-01

    The Nuclear Thermal Rocket (NTR) derives its energy from fission of uranium-235 atoms contained within fuel elements that comprise the engine's reactor core. It generates high thrust and has a specific impulse potential of approximately 900 seconds - a 100% increase over today's best chemical rockets. The Nuclear Thermal Propulsion (NTP) project, funded by NASA's AES program, includes five key task activities: (1) Recapture, demonstration, and validation of heritage graphite composite (GC) fuel (selected as the "Lead Fuel" option); (2) Engine Conceptual Design; (3) Operating Requirements Definition; (4) Identification of Affordable Options for Ground Testing; and (5) Formulation of an Affordable Development Strategy. During FY'14, a preliminary DDT&E plan and schedule for NTP development was outlined by GRC, DOE and industry that involved significant system-level demonstration projects that included GTD tests at the NNSS, followed by a FTD mission. To reduce cost for the GTD tests and FTD mission, small NTR engines, in either the 7.5 or 16.5 klbf thrust class, were considered. Both engine options used GC fuel and a "common" fuel element (FE) design. The small approximately 7.5 klbf "criticality-limited" engine produces approximately 157 megawatts of thermal power (MWt) and its core is configured with parallel rows of hexagonal-shaped FEs and tie tubes (TTs) with a FE to TT ratio of approximately 1:1. The larger approximately 16.5 klbf Small Nuclear Rocket Engine (SNRE), developed by LANL at the end of the Rover program, produces approximately 367 MWt and has a FE to TT ratio of approximately 2:1. Although both engines use a common 35 inch (approximately 89 cm) long FE, the SNRE's larger diameter core contains approximately 300 more FEs needed to produce an additional 210 MWt of power. To reduce the cost of the FTD mission, a simple "1-burn" lunar flyby mission was considered to reduce the LH2 propellant loading, the stage size and complexity. Use of existing and

  15. Occurrence and Partial Characterization of Lettuce big vein associated virus and Mirafiori lettuce big vein virus in Lettuce in Iran.

    PubMed

    Alemzadeh, E; Izadpanah, K

    2012-12-01

    Mirafiori lettuce big vein virus (MiLBVV) and lettuce big vein associated virus (LBVaV) were found in association with big vein disease of lettuce in Iran. Analysis of part of the coat protein (CP) gene of Iranian isolates of LBVaV showed 97.1-100 % nucleotide sequence identity with other LBVaV isolates. Iranian isolates of MiLBVV belonged to subgroup A and showed 88.6-98.8 % nucleotide sequence identity with other isolates of this virus when amplified by PCR primer pair MiLV VP. The occurrence of both viruses in lettuce crop was associated with the presence of resting spores and zoosporangia of the fungus Olpidium brassicae in lettuce roots under field and greenhouse conditions. Two months after sowing lettuce seed in soil collected from a lettuce field with big vein affected plants, all seedlings were positive for LBVaV and MiLBVV, indicating soil transmission of both viruses.

  16. Meteoroid head echo polarization features studied by numerical electromagnetics modeling

    NASA Astrophysics Data System (ADS)

    Vertatschitsch, L. E.; Sahr, J. D.; Colestock, P.; Close, S.

    2011-12-01

    Meteoroid head echoes are radar returns associated with scatter from the dense plasma surrounding meteoroids striking the Earth's atmosphere. Such echoes are detected by high power, large aperture (HPLA) radars. Frequently such detections show large variations in signal strength that suggest constructive and destructive interference. Using the ARPA Long-Range Tracking and Instrumentation Radar (ALTAIR) we can also observe the polarization of the returns. Usually, scatter from head echoes resembles scatter from a small sphere; when transmitting right circular polarization (RC), the received signal consists entirely of left circular polarization (LC). For some detections, power is also received in the RC channel, which indicates the presence of a more complicated scattering process. Radar returns of a fragmenting meteoroid are simulated using a hard-sphere scattering model numerically evaluated in the resonant region of Mie scatter. The cross- and co-polar scattering cross-sections are computed for pairs of spheres lying within a few wavelengths of each other, simulating the earliest stages of fragmentation upon atmospheric impact. The likelihood of detecting this sort of idealized fragmentation event is small, but the simulations demonstrate that the measurements resulting from such an event would display RC power comparable to LC power, matching the anomalous data. The resulting computations show that fragmentation is a consistent interpretation of these head echo radar returns.
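
    A toy calculation (far simpler than the paper's hard-sphere Mie model) shows why two nearby fragments can produce the large signal-strength variations described: the coherent sum of two point scatterers interferes constructively or destructively as their separation changes by fractions of the radar wavelength. The wavelength and geometry here are hypothetical.

      # Two-point-scatterer interference toy model (monostatic, scatterers along the line of sight).
      import numpy as np

      wavelength = 0.75                      # m, assumed radar wavelength
      k = 2 * np.pi / wavelength
      for d in np.linspace(0.0, 1.5 * wavelength, 7):
          field = 1.0 + np.exp(1j * 2 * k * d)        # two-way path difference = 2 d
          print(f"separation {d:5.2f} m -> relative echo power {abs(field) ** 2:4.1f}")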

  17. [Contemplation on the application of big data in clinical medicine].

    PubMed

    Lian, Lei

    2015-01-01

    Medicine is another area where big data is being used. The link between clinical treatment and outcome is the key step when applying big data in medicine. In the era of big data, it is critical to collect complete outcome data. Patient follow-up, comprehensive integration of data resources, quality control and standardized data management are the predominant approaches to avoid missing data and data islands. Therefore, the establishment of systematic patient follow-up protocols and a prospective data management strategy are important aspects of big data in medicine.

  18. Cincinnati Big Area Additive Manufacturing (BAAM)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duty, Chad E.; Love, Lonnie J.

    Oak Ridge National Laboratory (ORNL) worked with Cincinnati Incorporated (CI) to demonstrate Big Area Additive Manufacturing which increases the speed of the additive manufacturing (AM) process by over 1000X, increases the size of parts by over 10X and shows a cost reduction of over 100X. ORNL worked with CI to transition the Big Area Additive Manufacturing (BAAM) technology from a proof-of-principle (TRL 2-3) demonstration to a prototype product stage (TRL 7-8).

  19. Potential Solution of a Hardware-Software System V-Cluster for Big Data Analysis

    NASA Astrophysics Data System (ADS)

    Morra, G.; Tufo, H.; Yuen, D. A.; Brown, J.; Zihao, S.

    2017-12-01

    Today it cannot be denied that the Big Data revolution is taking place and is replacing HPC and numerical simulation as the main driver in society. Outside the immediate scientific arena, the Big Data market encompasses much more than the AGU. There are many sectors in society that Big Data can ably serve, such as government finances, hospitals, tourism, and, last but not least, scientific and engineering problems. In many countries, education has not kept pace with the demand from students outside computer science to get into Big Data science. Ultimate Vision (UV) in Beijing attempts to address this need in China by focusing part of our energy on education and training outside the immediate university environment. UV plans a strategy to maximize profits from the outset; we will therefore focus on growing markets such as provincial governments, the medical sector, mass media, and education, and will not address areas such as performance for scientific collaboration (for example, seismic networks), where the market share and profits are small by comparison. We have developed a software-hardware system, called V-Cluster, built with the latest NVIDIA GPUs and Intel CPUs with ample amounts of RAM (over a couple of terabytes) and local storage. We have put in an internal network with high bandwidth (over 100 Gbits/sec), and each node of V-Cluster can run at around 40 Tflops. Our system can scale linearly with the number of codes. Our main strength in data analytics is the use of the graph-computing paradigm for optimizing the transfer rate in collaborative efforts. We focus on training and education with our clients in order to gain experience with new applications. We will present the philosophy of this second generation of our data analytic system, whose costs fall far below those offered elsewhere.

  20. Solution structure of leptospiral LigA4 Big domain

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mei, Song; Zhang, Jiahai; Zhang, Xuecheng

    Pathogenic Leptospira species express immunoglobulin-like proteins which serve as adhesins to bind to the extracellular matrices of host cells. Leptospiral immunoglobulin-like protein A (LigA), a surface-exposed protein containing tandem repeats of bacterial immunoglobulin-like (Big) domains, has been proved to be involved in the interaction of pathogenic Leptospira with the mammalian host. In this study, the solution structure of the fourth Big domain of LigA (LigA4 Big domain) from Leptospira interrogans was solved by nuclear magnetic resonance (NMR). The structure of the LigA4 Big domain displays a bacterial immunoglobulin-like fold similar to other Big domains, implying some common structural aspects of the Big domain family. On the other hand, it displays some structural characteristics significantly different from the classic Ig-like domain. Furthermore, a Stains-all assay and NMR chemical shift perturbation revealed the Ca2+ binding property of the LigA4 Big domain. - Highlights: • Determining the solution structure of a bacterial immunoglobulin-like domain from a surface protein of Leptospira. • The solution structure shows some structural characteristics significantly different from the classic Ig-like domains. • A potential Ca2+-binding site was identified by a Stains-all assay and NMR chemical shift perturbation.