Sample records for create point events

  1. EPA Office of Water (OW): Nonpoint Source Projects NHDPlus Indexed Dataset

    EPA Pesticide Factsheets

    GRTS locational data for nonpoint source projects. GRTS locations are coded onto NHDPlus v2.1 flowline features to create point and line events or coded onto NHDPlus v2.1 waterbody features to create area events. In addition to NHDPlus reach indexed data there may also be custom events (point, line or area) that are not associated with NHD and are in an EPA standard format that is compatible with EPA's Reach Address Database. Custom events are used to represent GRTS locations that are not represented well in NHDPlus.
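
As a hedged illustration of the reach-indexing idea in the record above: a point event reduces to a reach address, i.e. a reach identifier plus a measure along the flowline. The field names below are hypothetical, not the EPA RAD schema.

```python
from dataclasses import dataclass

@dataclass
class PointEvent:
    """Hypothetical linear-referenced point event (not the EPA RAD schema)."""
    event_id: str    # e.g. a GRTS project identifier
    reachcode: str   # 14-digit NHDPlus reach code the event is indexed to
    measure: float   # percent distance along the reach (0 = downstream end)

    def is_valid(self) -> bool:
        # A reach measure is a percentage of the flowline's length.
        return 0.0 <= self.measure <= 100.0

evt = PointEvent(event_id="GRTS-001", reachcode="03050101000123", measure=42.5)
print(evt.is_valid())  # True
```

Line and area events would carry a measure range or a waterbody identifier instead of a single measure.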

  2. The "perfect storm" in compensation: convergence of events leads to a greater need to review compensation strategies.

    PubMed

    Jones, Robert B

    2004-01-01

    The recent unprecedented convergence of significant strategic events in the compensation arena has created the need for ongoing and extensive compensation planning. This article reviews the events leading to this point, describes the implications of the results from a recent Aon study with WorldatWork, and suggests what employers can do to successfully navigate the "perfect storm" in compensation.

  3. EPA Office of Water (OW): Fish Consumption Advisories and Fish Tissue Sampling Stations NHDPlus Indexed Datasets

    EPA Pesticide Factsheets

    The Fish Consumption Advisories dataset contains information on Fish Advisory events that have been indexed to the EPA Office of Water NHDPlus v2.1 hydrology and stored in the Reach Addressing Database (RAD). NHDPlus is a database that interconnects and uniquely identifies the millions of stream segments, or reaches, that comprise the Nation's surface water drainage system. NHDPlus provides a national framework for assigning reach addresses to water-quality-related entities, such as fish advisory locations. Reach addresses establish the locations of these entities relative to one another within the NHD surface water drainage network, in a manner similar to street addresses. The assignment of reach addresses is accomplished through a process known as reach indexing. Fish consumption advisories and fish tissue sampling stations are reported to EPA by the states. Sampling stations are the locations where a state has collected fish tissue data for use in advisory determinations. Fish consumption advisory locations are coded onto NHDPlus flowline features to create point and linear events. Fish consumption advisory locations are also coded onto NHDPlus waterbody features to create area events. In addition to NHDPlus-reach indexed data, there may also be custom events (point, line, or area) that are not associated with NHDPlus. Although these fish consumption advisories are not represented in NHDPlus, the data created for them are in an EPA standard format that is compatible with EPA's Reach Address Database.

  4. Creating a Meeting Point of Understanding: Interpreters' Experiences in Swedish Childhood Cancer Care.

    PubMed

    Granhagen Jungner, Johanna; Tiselius, Elisabet; Lützén, Kim; Blomgren, Klas; Pergert, Pernilla

    2016-01-01

    Children and families with a foreign background and limited Swedish proficiency have to communicate through interpreters in childhood cancer care centers in Sweden. Interpreter-mediated events deal with many difficulties that potentially hinder the transfer of information. The purpose of our study was to explore interpreters' experiences of interpreting between health care staff and limited Swedish proficiency patients/families in childhood cancer care. Using purposive samples, we interviewed 11 interpreters individually. The interviews were analyzed using qualitative content analysis. Analyses of the data resulted in the main theme of creating a meeting point of understanding, constructed from 3 subthemes: balancing between cultures, bridging the gaps of knowledge, and balancing between compassion and professionalism. Our result shows that in order to create a sustainable meeting point of understanding, it is necessary to explain both the context and cultural differences. These results suggest that the responsibility for information transfer lies with both the health care profession and the interpreters. Tools have to be developed for both parties to contribute to creating the meeting point of understanding. © 2015 by Association of Pediatric Hematology/Oncology Nurses.

  5. Cross-cultural differences in mental representations of time: evidence from an implicit nonlinguistic task.

    PubMed

    Fuhrman, Orly; Boroditsky, Lera

    2010-11-01

    Across cultures people construct spatial representations of time. However, the particular spatial layouts created to represent time may differ across cultures. This paper examines whether people automatically access and use culturally specific spatial representations when reasoning about time. In Experiment 1, we asked Hebrew and English speakers to arrange pictures depicting temporal sequences of natural events, and to point to the hypothesized location of events relative to a reference point. In both tasks, English speakers (who read left to right) arranged temporal sequences to progress from left to right, whereas Hebrew speakers (who read right to left) arranged them from right to left, replicating previous work. In Experiments 2 and 3, we asked the participants to make rapid temporal order judgments about pairs of pictures presented one after the other (i.e., to decide whether the second picture showed a conceptually earlier or later time-point of an event than the first picture). Participants made responses using two adjacent keyboard keys. English speakers were faster to make "earlier" judgments when the "earlier" response needed to be made with the left response key than with the right response key. Hebrew speakers showed exactly the reverse pattern. Asking participants to use a space-time mapping inconsistent with the one suggested by writing direction in their language created interference, suggesting that participants were automatically creating writing-direction consistent spatial representations in the course of their normal temporal reasoning. It appears that people automatically access culturally specific spatial representations when making temporal judgments even in nonlinguistic tasks. Copyright © 2010 Cognitive Science Society, Inc.

  6. Patterns of emergency room visits, admissions and death following recommended pediatric vaccinations - a population based study of 969,519 vaccination events.

    PubMed

    Wilson, Kumanan; Hawken, Steven; Potter, Beth K; Chakraborty, Pranesh; Kwong, Jeff; Crowcroft, Natasha; Rothwell, Deanna; Manuel, Doug

    2011-05-12

    The risk of immediate adverse events due to the inflammation created by a vaccine is a potential concern for pediatric vaccine programs. We analyzed data on children born between March 2006 and March 2009 in the province of Ontario. Using the self-controlled case series design, we compared the risk of the combined endpoint of emergency room visit and hospital admission in the immediate 3 days post-vaccination to a control period 9-18 days after vaccination. We examined the endpoints of emergency room visits, hospital admissions, and death separately as secondary outcomes. We examined 969,519 separate vaccination events. The relative incidence of our combined endpoint was 0.85 (0.80-0.90) for vaccination at age 2 months, 0.74 (0.69-0.79) at age 4 months, and 0.68 (0.63-0.72) at age 6 months. The relative incidence was reduced for the individual endpoints of emergency room visits, admissions, and death. There were 5 or fewer deaths in the risk interval of all 969,519 vaccination events. In a post hoc analysis we observed a large reduction in events in the immediate 3 days prior to vaccination, suggesting a large healthy vaccinee effect. There was no increased incidence of the combined endpoint of emergency room visits and hospitalizations in the 3-day period immediately following vaccination, nor for the individual endpoints or death. The healthy vaccinee effect could create the perception of worsening health following vaccines in the absence of any vaccine adverse effect and could also mask an effect in the immediate post-vaccination period. Copyright © 2011 Elsevier Ltd. All rights reserved.
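
The relative incidence figures in the record above come from a self-controlled case series. As a rough sketch of the underlying idea (the study itself fits a conditional Poisson model, and the counts below are invented for illustration), a relative incidence is a rate ratio between the risk window and the control window:

```python
def relative_incidence(risk_events, risk_days, control_events, control_days):
    """Crude rate ratio between a post-vaccination risk window and a
    control window. Illustrative only: the actual self-controlled case
    series analysis uses a conditional Poisson regression."""
    risk_rate = risk_events / risk_days          # events per day, risk window
    control_rate = control_events / control_days # events per day, control window
    return risk_rate / control_rate

# Invented counts (not the study's data): a ratio below 1.0 means fewer
# events in the 3-day window after vaccination than in the day 9-18
# control window, consistent with a healthy vaccinee effect.
ri = relative_incidence(risk_events=255, risk_days=3,
                        control_events=1000, control_days=10)
print(round(ri, 2))  # 0.85
```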

  7. VizieR Online Data Catalog: Second ROSAT all-sky survey (2RXS) source catalog (Boller+, 2016)

    NASA Astrophysics Data System (ADS)

    Boller, T.; Freyberg, M. J.; Truemper, J.; Haberl, F.; Voges, W.; Nandra, K.

    2016-03-01

    We have re-analysed the photon event files from the ROSAT all-sky survey. The main goal was to create a catalogue of point-like sources, which is referred to as the 2RXS source catalogue. We improved the reliability of detections by an advanced detection algorithm and a complete screening process. New data products were created to allow timing and spectral analysis. Photon event files with corrected astrometry and Moon rejection (RASS-3.1 processing) were made available in FITS format. The 2RXS catalogue will serve as the basic X-ray all-sky survey catalogue until eROSITA data become available. (2 data files).

  8. Estimation of the displacements among distant events based on parallel tracking of events in seismic traces under uncertainty

    NASA Astrophysics Data System (ADS)

    Huamán Bustamante, Samuel G.; Cavalcanti Pacheco, Marco A.; Lazo Lazo, Juan G.

    2018-07-01

    The method we propose in this paper seeks to estimate interface displacements among strata related to reflection seismic events, in comparison to the interfaces at other reference points. To do so, we search for reflection events at the reference point of a second seismic trace taken from the same 3D survey and close to a well. However, the nature of the seismic data introduces uncertainty into the results. Therefore, we perform an uncertainty analysis using the standard deviation of results from several experiments with cross-correlation of signals. To estimate the displacements of events in depth between two seismic traces, we create a synthetic seismic trace from an empirical wavelet and the sonic log of the well close to the second seismic trace. Then, we relate the events of the seismic traces to the depth of the sonic log. Finally, we test the method with data from the Namorado Field in Brazil. The results show that the accuracy of the estimated event depth depends on the results of parallel cross-correlation, primarily those from the procedures used in the integration of seismic data with data from the well. The proposed approach can correctly identify several similar events in two seismic traces without requiring all seismic traces between two distant points of interest to correlate strata in the subsurface.
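
The displacement estimation described above rests on cross-correlating traces. A minimal sketch of lag estimation between two traces, with a synthetic wavelet standing in for real seismic data (this is the generic technique, not the paper's full parallel-tracking procedure):

```python
import numpy as np

def best_lag(trace_a, trace_b):
    """Estimate the sample shift that best aligns trace_b with trace_a
    via full cross-correlation."""
    corr = np.correlate(trace_a, trace_b, mode="full")
    # Recentre the peak index so that lag 0 means no shift.
    return int(np.argmax(corr)) - (len(trace_b) - 1)

# A synthetic "reflection event": a windowed oscillation, and the same
# wavelet shifted by 7 samples (the displacement to recover).
t = np.linspace(0, 1, 200)
wavelet = np.exp(-((t - 0.3) ** 2) / 0.002) * np.cos(60 * (t - 0.3))
shifted = np.roll(wavelet, 7)
print(best_lag(shifted, wavelet))  # 7
```

Repeating the correlation over perturbed windows and taking the standard deviation of the recovered lags would give the kind of uncertainty measure the abstract describes.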

  9. Geo Events (March 12-13, 2002).

    ERIC Educational Resources Information Center

    Turturice, Michael W.; Rothkopf, Stephen

    In this teaching activity, students track and present information about specific regions in the world on a bi-weekly basis. Emphasis is placed on the five themes of geography to develop a working knowledge of assigned regions. Students use Microsoft Publisher and PowerPoint to create presentations. The activity packet contains several documents…

  10. Best Practices in Supplier Relationship Management and Their Early Implementation in the Air Force Materiel Command

    DTIC Science & Technology

    2012-01-01

    …information-sharing and through systematic efforts to develop the supplier. These efforts can be direct involvement activities, such as kaizen events… Womack and Jones (1996) define kaizen as “Continuous, incremental improvement of an activity to create more value with less muda. Also called point kaizen and process kaizen.” They define muda as “any activity that consumes resources but creates no value,” i.e., waste…

  11. Creating Vibrant Communities & Economies in Rural America.

    ERIC Educational Resources Information Center

    Beaulieu, Lionel J.

    Although the economic expansion of the 1990s was felt even in small towns and rural areas, events in recent months point out that the economic health of rural America remains fragile. Rural manufacturing has suffered sizable employment declines in recent months and only modest expansion has occurred in the service sector--the lifeblood of rural…

  12. From what might have been to what must have been: counterfactual thinking creates meaning.

    PubMed

    Kray, Laura J; George, Linda G; Liljenquist, Katie A; Galinsky, Adam D; Tetlock, Philip E; Roese, Neal J

    2010-01-01

    Four experiments explored whether 2 uniquely human characteristics-counterfactual thinking (imagining alternatives to the past) and the fundamental drive to create meaning in life-are causally related. Rather than implying a random quality to life, the authors hypothesized and found that counterfactual thinking heightens the meaningfulness of key life experiences. Reflecting on alternative pathways to pivotal turning points even produced greater meaning than directly reflecting on the meaning of the event itself. Fate perceptions ("it was meant to be") and benefit-finding (recognition of positive consequences) were identified as independent causal links between counterfactual thinking and the construction of meaning. Through counterfactual reflection, the upsides to reality are identified, a belief in fate emerges, and ultimately more meaning is derived from important life events.

  13. Statistical Model Applied to NetFlow for Network Intrusion Detection

    NASA Astrophysics Data System (ADS)

    Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.

    Computers and network services are now a guaranteed presence in many places. As a consequence, illicit events have grown, and the security of computers and networks has become an essential point in any computing environment. Many methodologies have been created to identify these events; however, with the increasing number of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol, statistical methods, and timely monitoring of the environment.
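
As a hedged sketch of the kind of statistical test such a methodology might apply (the paper's actual model is not reproduced here, and the counts below are invented), a z-score against a baseline window can flag anomalous flow volumes:

```python
import statistics

def zscore(value, baseline):
    """Standard score of a new observation against a baseline window of
    per-bin flow counts -- a minimal stand-in for a statistical model
    over aggregated NetFlow records."""
    mean = statistics.mean(baseline)
    stdev = statistics.stdev(baseline)
    return (value - mean) / stdev

# Per-minute flow counts for one host under normal operation (made-up
# numbers), followed by one new bin with a sudden spike, e.g. a scan.
baseline = [120, 118, 125, 122, 119, 121, 120, 123]
spike = 950
print(zscore(spike, baseline) > 3.0)  # True: flag this bin as anomalous
```

Splitting the data into a baseline window and a scored bin keeps the outlier from inflating its own reference statistics.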

  14. A Reading Theorist's World View through the Lens of Terrence Malick: The "Poem" Created from Transacting "the Tree of Life" Trailer

    ERIC Educational Resources Information Center

    Malich, John; Kehus, Marcella J.

    2012-01-01

    In our essay we discuss Louise Rosenblatt's transactional theory of a reading event. Second, we summarize Carole Cox and Joyce Many who applied the transactional theory and designed a 1-5 point continuum to stories and films. Third, we summarize film theorists David Bordwell's constructivism; Richard Wollheim's central imagining and…

  15. Virtual reality for spherical images

    NASA Astrophysics Data System (ADS)

    Pilarczyk, Rafal; Skarbek, Władysław

    2017-08-01

    This paper presents a virtual reality application framework and an application concept for mobile devices. The framework uses the Google Cardboard library for the Android operating system and allows the creation of a virtual reality 360 video player using standard OpenGL ES rendering methods. It provides network methods for connecting to a web server that acts as the application resource provider; resources are delivered as JSON responses to HTTP requests. The web server also uses the Socket.IO library for synchronous communication between the application and the server. The framework implements methods to create an event-driven process for rendering additional content based on the video timestamp and the virtual reality head point of view.

  16. [Psychological approach to different skin diseases: life events and tendency to complain].

    PubMed

    Tordeurs, D; Poot, F; Janne, P; Reynaert, C; Salamon, V

    2001-01-01

    For nearly two decades, dermatology has been associated with psychology in search of a better way to care for dermatologic conditions. A scientific trend called psychosomatics is creating a link between dermatology and psychology. The purpose of this article was to examine two concepts closely linked to psychodermatology (life events and tendency to complain) and to emphasize the difference between factors playing a role in the onset of certain skin diseases (psoriasis, alopecia areata, benign tumors, eczema). We found that psoriasis patients have a greater tendency to complain than people with the other diseases. This points to the importance of taking emotions into account when studying psoriasis. We also found that life events play a role in the onset of psoriasis and alopecia areata. Moreover, these events occurred more than 12 months before onset in alopecia patients. We propose exploring emotions in psoriasis patients and life events over the prior year in alopecia areata patients.

  17. Socialization of hospice volunteers: members of the family.

    PubMed

    Sadler, C; Marty, F

    1998-01-01

    The purpose of this study is to examine the turning points volunteers found important in their hospice training and volunteer experiences. Seventeen individuals who had recently completed hospice training were asked about the turning points in their training and volunteering that were important in their becoming and remaining a hospice volunteer. The study finds that volunteers have a wide variety of intrapersonal, interpersonal, and group reasons for becoming and remaining a hospice volunteer. The findings suggest that hospice staff need to create a wide variety of events which volunteers can identify with to help people want to become and remain volunteers.

  18. Calculating point of origin of blood spatter using laser scanning technology.

    PubMed

    Hakim, Nashad; Liscio, Eugene

    2015-03-01

    The point of origin of an impact pattern is important in establishing the chain of events in a bloodletting incident. In this study, the accuracy and reproducibility of the point of origin estimation using the FARO Scene software with the FARO Focus(3D) laser scanner was determined. Five impact patterns were created for each of three combinations of distances from the floor (z) and the front wall (x). Fifteen spatters were created using a custom impact rig, scanned using the laser scanner, photographed using a DSLR camera, and processed using the Scene software. Overall results gave a SD = 3.49 cm (p < 0.0001) in the x-direction, SD = 1.14 cm (p = 0.9291) in the y-direction, and SD = 9.08 cm (p < 0.0115) in the z-direction. The technique performs within literature ranges of accepted accuracy and reproducibility and is comparable to results reported for other virtual stringing software. © 2015 American Academy of Forensic Sciences.
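
The point-of-origin estimation validated above is traditionally done by stringing. A minimal sketch of the underlying trigonometry (straight-line trajectories assumed, stain dimensions invented; this illustrates the classic manual method, not the FARO Scene software's algorithm):

```python
import math

def impact_angle(width, length):
    """Impact angle from a stain's elliptical dimensions, via the
    classic sine relation: sin(angle) = width / length."""
    return math.asin(width / length)

def origin_height(distance_to_convergence, width, length):
    """Height of origin above the surface, projecting the impact angle
    back over the stain's distance to the area of convergence.
    Straight-line trajectories are assumed, so gravity and drag are
    ignored -- a known source of overestimation in the z-direction."""
    return distance_to_convergence * math.tan(impact_angle(width, length))

# Hypothetical stain: 4 mm wide, 8 mm long, 50 cm from the area of
# convergence -> a 30-degree impact angle.
print(round(origin_height(50.0, 4.0, 8.0), 1))  # 28.9 (cm)
```

The systematic overestimation from the straight-line assumption is one reason reported z-direction deviations tend to exceed those in x and y.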

  19. Guidelines for Guest Conductors of Honor Choirs: When Planning and Conducting Honor Ensembles, Focus on Creating a Rewarding Experience for All from Start to Finish

    ERIC Educational Resources Information Center

    Freer, Patrick K.

    2007-01-01

    Because there are hundreds, perhaps thousands, of honor choir events each year across the United States, most choir directors will likely be invited to conduct an honor choir at one point or another. Additionally, many conductors will work with honor choirs at the city, municipal, and county levels. Still others will work with ensembles from…

  20. Complex Event Recognition Architecture

    NASA Technical Reports Server (NTRS)

    Fitzgerald, William A.; Firby, R. James

    2009-01-01

    Complex Event Recognition Architecture (CERA) is the name of a computational architecture, and software that implements the architecture, for recognizing complex event patterns that may be spread across multiple streams of input data. One of the main components of CERA is an intuitive event pattern language that simplifies what would otherwise be the complex, difficult tasks of creating logical descriptions of combinations of temporal events and defining rules for combining information from different sources over time. In this language, recognition patterns are defined in simple, declarative statements that combine point events from given input streams with those from other streams, using conjunction, disjunction, and negation. Patterns can be built on one another recursively to describe very rich, temporally extended combinations of events. Thereafter, a run-time matching algorithm in CERA efficiently matches these patterns against input data and signals when patterns are recognized. CERA can be used to monitor complex systems and to signal operators or initiate corrective actions when anomalous conditions are recognized. CERA can be run as a stand-alone monitoring system, or it can be integrated into a larger system to automatically trigger responses to changing environments or problematic situations.
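
The conjunction, disjunction, and negation described above can be sketched as a toy pattern-combinator language over point events; the names, operators, and example events here are illustrative, not CERA's actual syntax:

```python
# Toy combinators in the spirit of a declarative event pattern language:
# a pattern is a predicate over the set of event names seen so far.

def event(name):
    return lambda seen: name in seen

def all_of(*patterns):          # conjunction
    return lambda seen: all(p(seen) for p in patterns)

def any_of(*patterns):          # disjunction
    return lambda seen: any(p(seen) for p in patterns)

def not_(pattern):              # negation
    return lambda seen: not pattern(seen)

# Hypothetical monitoring rule: temperature spike AND (fan fault OR
# power fault), with no operator acknowledgement seen yet.
alarm = all_of(event("temp_spike"),
               any_of(event("fan_fault"), event("power_fault")),
               not_(event("operator_ack")))

seen = {"temp_spike", "power_fault"}
print(alarm(seen))  # True
```

Because patterns are just composed predicates, they nest recursively, mirroring the recursive pattern-building the abstract describes; a real engine would additionally track timestamps and stream identities.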

  1. Earthquake doublet that occurred in a pull-apart basin along the Sumatran fault and its seismotectonic implication

    NASA Astrophysics Data System (ADS)

    Nakano, M.; Kumagai, H.; Yamashina, T.; Inoue, H.; Toda, S.

    2007-12-01

    On March 6, 2007, an earthquake doublet occurred around Lake Singkarak, central Sumatra, Indonesia. An earthquake with magnitude (Mw) 6.4 at 03:49 was followed two hours later (05:49) by a similar-sized event (Mw 6.3). Lake Singkarak is located between the Sianok and Sumani fault segments of the Sumatran fault system and is a pull-apart basin formed at the segment boundary. We investigate the source processes of the earthquakes using waveform data obtained from JISNET, a broad-band seismograph network in Indonesia. We first estimate the centroid source locations and focal mechanisms by waveform inversion carried out in the frequency domain. Since the stations are distributed almost linearly in the NW-SE direction, coincident with the strike of the Sumatran fault, the estimated centroid locations are not well resolved, especially in the direction orthogonal to NW-SE. If we assume that these earthquakes occurred along the Sumatran fault, the first earthquake is located on the Sumani segment below Lake Singkarak and the second event a few tens of kilometers north of the first, on the Sianok segment. The focal mechanisms of both events point to almost identical right-lateral strike-slip vertical faulting, which is consistent with the geometry of the Sumatran fault system. We next investigate the rupture initiation points using the particle motions of the P-waves of these earthquakes observed at station PPI, located about 20 km north of Lake Singkarak. The initiation point of the first event is estimated north of the lake, which corresponds to the northern end of the Sumani segment. The initiation point of the second event is estimated at the southern end of the Sianok segment. The observed maximum amplitudes at stations located SE of the source region are larger for the first event than for the second.
    On the other hand, the amplitudes at station BSI, located NW of the source region, are larger for the second event than for the first. Since the magnitudes, focal mechanisms, and source locations are almost identical for the two events, the larger amplitudes for the second event at BSI may be due to the effect of rupture directivity. Accordingly, we obtain the following image of the source processes of the earthquake doublet: the first event initiated at the segment boundary and its rupture propagated along the Sumani segment in the SW direction. Then the second event, which may have been triggered by the first, initiated at a location close to the hypocenter of the first event, but its rupture propagated along the Sianok segment in the NE direction, opposite to the first event. The previous significant seismic activity along the Sianok and Sumani segments occurred in 1926 and was also an earthquake doublet, with magnitudes similar to those in 2007. If we assume that the time interval between the earthquake doublets in 1926 and 2007 represents the average recurrence interval and that typical slip in the individual earthquakes is 1 m, we obtain a slip rate of approximately 1 cm/year for the fault segments. Geological features indicate that Lake Singkarak is no more than a few million years old (Sieh and Natawidjaja, 2000, JGR). If the pull-apart basin has been forming over the past few million years at the estimated slip rate of the segments, we obtain roughly 20 km of total offset on the Sianok and Sumani segments, which is consistent with the observed offset. Our study supports the model of Sieh and Natawidjaja (2000) that the basin continues to be created by dextral slip on the en echelon Sumani and Sianok segments.
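
The closing back-of-envelope numbers in this record can be checked directly; the ~2 Myr basin age below is an assumed value consistent with the abstract's "a few million years":

```python
# Slip rate: ~1 m of slip per doublet over the 1926-2007 recurrence
# interval gives the quoted rate of roughly 1 cm/year.
slip_per_event_m = 1.0
recurrence_yr = 2007 - 1926               # 81 years between doublets
slip_rate_cm_per_yr = slip_per_event_m * 100 / recurrence_yr
print(round(slip_rate_cm_per_yr, 2))      # 1.23, i.e. roughly 1 cm/year

# Total offset: that rate sustained over an assumed ~2 Myr basin age
# lands at the same order of magnitude as the observed ~20 km offset.
basin_age_yr = 2e6
total_offset_km = slip_rate_cm_per_yr / 100 * basin_age_yr / 1000
print(round(total_offset_km))             # 25 (km)
```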

  2. [Risk management--a new aspect of quality assessment in intensive care medicine: first results of an analysis of the DIVI's interdisciplinary quality assessment research group].

    PubMed

    Stiletto, R; Röthke, M; Schäfer, E; Lefering, R; Waydhas, Ch

    2006-10-01

    Patient safety has become one of the major aspects of clinical management in recent years, with research crucially focused on malpractice. In contrast to process analysis in non-medical fields, the analysis of errors during in-patient treatment has been neglected. Patient risk management can be defined as a structured procedure in a clinical unit with the aim of reducing harmful events. A risk point model was created based on a Delphi process and founded on the DIVI data register, and was evaluated in clinically active ICU departments participating in the register database. The results of the risk point evaluation will be integrated in the next database update. This may be a step toward improving the reliability of the register for measuring quality assessment in the ICU.

  3. Identifying the turning point: using the transtheoretical model of change to map intimate partner violence disclosure in emergency department settings.

    PubMed

    Catallo, Cristina; Jack, Susan M; Ciliska, Donna; Macmillan, Harriet L

    2012-01-01

    Background. The transtheoretical model of change (TTM) was used as a framework to examine the steps that women took to disclose intimate partner violence (IPV) in urban emergency departments. Methods. Mapping methods portrayed the evolving nature of decisions that facilitated or inhibited disclosure. This paper is a secondary analysis of qualitative data from a mixed methods study that explored abused women's decision making process about IPV disclosure. Findings. Change maps were created for 19 participants with movement from the precontemplation to the maintenance stages of the model. Disclosure often occurred after a significant "turning point event" combined with a series of smaller events over a period of time. The significant life event often involved a weighing of options where participants considered the perceived risks against the potential benefits of disclosure. Conclusions. Abused women experienced intrusion from the chaotic nature of the emergency department. IPV disclosure was perceived as a positive experience when participants trusted the health care provider and felt control over their decisions to disclose IPV. Practice Implications. Nurses can use these findings to gauge the readiness of women to disclose IPV in the emergency department setting.

  4. Creating Special Events

    ERIC Educational Resources Information Center

    deLisle, Lee

    2009-01-01

    "Creating Special Events" is organized as a systematic approach to festivals and events for students who seek a career in event management. This book looks at the evolution and history of festivals and events and proceeds to the nuts and bolts of event management. The book presents event management as the means of planning, organizing, directing,…

  5. IBES: a tool for creating instructions based on event segmentation

    PubMed Central

    Mura, Katharina; Petersen, Nils; Huff, Markus; Ghose, Tandra

    2013-01-01

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, 20 participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, 10 and 12 participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool. PMID:24454296
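
The correspondence that IBES's validation tests, between user-chosen instruction steps and natural event boundaries, can be sketched as a nearest-boundary match (all timestamps below are invented):

```python
# Match each instruction-step timestamp to the nearest event boundary
# from a segmentation task -- a toy version of the correspondence check,
# not the IBES tool itself.
def nearest_boundary(step_time, boundaries):
    return min(boundaries, key=lambda b: abs(b - step_time))

coarse = [0.0, 14.5, 31.0, 52.0]                 # coarse event boundaries (s)
fine = [0.0, 6.0, 14.5, 22.0, 31.0, 40.0, 52.0]  # fine event boundaries (s)
steps = [6.2, 14.0, 30.5, 51.8]                  # where a user cut the video

matches = [nearest_boundary(s, fine) for s in steps]
print(matches)  # [6.0, 14.5, 31.0, 52.0]
```

Here the 4 chosen steps fall between the counts of coarse (4) and fine (7) boundaries, echoing the abstract's finding that step count is a compromise between the two granularities.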

  7. Multi-point laser ignition device

    DOEpatents

    McIntyre, Dustin L.; Woodruff, Steven D.

    2017-01-17

    A multi-point laser device comprising a plurality of optical pumping sources. Each optical pumping source is configured to create pumping excitation energy along a corresponding optical path directed through a high-reflectivity mirror and into substantially different locations within the laser media thereby producing atomic optical emissions at substantially different locations within the laser media and directed along a corresponding optical path of the optical pumping source. An output coupler and one or more output lenses are configured to produce a plurality of lasing events at substantially different times, locations or a combination thereof from the multiple atomic optical emissions produced at substantially different locations within the laser media. The laser media is a single continuous media, preferably grown on a single substrate.

  8. El Nino, from 1870 to 2014, and other Atmospheric Circulation Forcing by Extreme Apparitions of the Eight Annual, Continental Scale, Aerosol Plumes in the Satellite Era which Point to a Possible Cause for the Current Californian Drought

    NASA Astrophysics Data System (ADS)

    Potts, K. A.

    2015-12-01

    Eight continental scale aerosol plumes exist each year as the enclosed image shows. Apparitions of seven plumes only exist for a few months in the same season each year whilst the East Asian Plume is visible all year. The aerosol optical depth (AOD) of all the plumes varies enormously interannually with two studies showing the surface radiative forcing of the South East Asian Plume (SEAP) as -150W/m2 and -286W/m2/AOD. I show that the SEAP, created by volcanic aerosols (natural) and biomass burning and gas flares in the oil industry (anthropogenic), is the sole cause of all El Nino events, the greatest interannual perturbation of the atmospheric circulation system. The SEAP creates an El Nino by absorbing solar radiation at the top of the plume which heats the upper atmosphere and cools the surface. This creates a temperature inversion compared to periods without the plume and reduces convection. With reduced convection in SE Asia, the Maritime Continent, the Trade Winds blowing across the Pacific are forced to relax as their exit into the Hadley and Walker Cells is constrained and the reduced Trade Wind speed causes the Sea Surface Temperature (SST) to rise in the central tropical Pacific Ocean as there is a strong negative correlation between wind speed and SST. The warmer SST in the central Pacific creates convection in the region which further reduces the Trade Wind speed and causes the Walker Cell to reverse - a classic El Nino. Having established the ability of such extreme aerosol plumes to create El Nino events I will then show how the South American, West African, Middle East and SEAP plumes create drought in the Amazon, Spain, Darfur and Australia as well as causing the extremely warm autumn and winter in Europe in 2006-07. All these effects are created by the plumes reducing convection in the region of the plume which forces the regional Hadley Cells into anomalous positions thereby creating persistent high pressure cells in the mid latitudes. 
This perturbs the mid latitude storm tracks and creates persistent high and low pressure regions around the World at those latitudes giving rise to extreme events by causing the regional winds to blow persistently from one direction. Finally I will suggest which plumes may be causing the high pressure ridge in the NE Pacific which is causing the severe drought in California.

  9. Black Hole Firewalls and Lorentzian Relativity

    NASA Astrophysics Data System (ADS)

    Winterberg, Friedwardt

    2013-04-01

    In a paper published earlier (Z. f. Naturforsch. 56a, 889, 2001) I showed that the pre-Einstein theory of relativity by Lorentz and Poincare, extended to the general theory of relativity and quantum mechanics, predicts the disintegration of matter passing through the event horizon. The zero-point vacuum energy is there cut off at the Planck energy, but is Lorentz-invariant all the way up to this energy. The cut-off creates a distinguished reference system in which this energy is at rest. For non-relativistic velocities relative to this reference system, special and general relativity remain good approximations, with matter held together in a stable equilibrium by electrostatic forces (or forces acting like them) as a solution of an elliptic partial differential equation derived from Maxwell's equations. But in approaching and crossing the velocity of light in the distinguished reference system, which is equivalent to approaching and crossing the event horizon, the elliptic differential equation goes over into a hyperbolic differential equation (as in fluid dynamics from subsonic to supersonic flow), and there is no such equilibrium. According to Schwarzschild's interior solution, the event horizon of a collapsing mass appears first as a point in its center, thereafter moving radially outwards, thereby converting all the mass into energy and explaining the observed gamma-ray bursters.

  10. Application of data cubes for improving detection of water cycle extreme events

    NASA Astrophysics Data System (ADS)

    Teng, W. L.; Albayrak, A.

    2015-12-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on the fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are the archived data rearranged into spatio-temporal matrices, which allow easy access to the data both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case for our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme (WCE) events, a specific case of anomaly detection requiring time series data. We investigate the use of the sequential probability ratio test (SPRT) for anomaly detection and support vector machines (SVM) for anomaly classification. We show an example of detection of WCE events, using the Global Land Data Assimilation Systems (GLDAS) data set.
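The reorganization the abstract describes can be sketched in a few lines of NumPy. The function names and toy grids below are illustrative, not the GES DISC implementation: archived data arrive as one 2-D (lat x lon) grid per time step, and the cube rearranges them so a full time series at any grid point (a "data rod") is a single contiguous slice rather than one read per archived file.

```python
import numpy as np

def build_cube(time_step_grids):
    """Stack per-time-step 2-D grids into a (lat, lon, time) cube."""
    return np.stack(time_step_grids, axis=-1)

def extract_rod(cube, i_lat, i_lon):
    """Return the time series ("data rod") at one grid cell."""
    return cube[i_lat, i_lon, :]

# Three 2x2 grids standing in for three archived time steps.
grids = [np.full((2, 2), t, dtype=float) for t in range(3)]
cube = build_cube(grids)
rod = extract_rod(cube, 0, 1)  # time series at cell (0, 1) -> [0., 1., 2.]
```

This is the sense in which the gain grows with data set size: for N time steps, the rod extraction replaces N file reads with one array slice.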

  11. Application of Data Cubes for Improving Detection of Water Cycle Extreme Events

    NASA Technical Reports Server (NTRS)

    Albayrak, Arif; Teng, William

    2015-01-01

    As part of an ongoing NASA-funded project to remove a longstanding barrier to accessing NASA data (i.e., accessing archived time-step array data as point-time series) for the hydrology and other point-time series-oriented communities, "data cubes" are created from which time series files (aka "data rods") are generated on the fly and made available as Web services from the Goddard Earth Sciences Data and Information Services Center (GES DISC). Data cubes are the archived data rearranged into spatio-temporal matrices, which allow easy access to the data both spatially and temporally. A data cube is a specific case of the general optimal strategy of reorganizing data to match the desired means of access; the gain from such reorganization grows with the size of the data set. As a use case of our project, we are leveraging existing software to explore the application of the data cubes concept to machine learning, for the purpose of detecting water cycle extreme events, a specific case of anomaly detection requiring time series data. We investigate the use of support vector machines (SVM) for anomaly classification. We show an example of detection of water cycle extreme events, using data from the Tropical Rainfall Measuring Mission (TRMM).

  12. 3D Surface Reconstruction and Volume Calculation of Rills

    NASA Astrophysics Data System (ADS)

    Brings, Christine; Gronz, Oliver; Becker, Kerstin; Wirtz, Stefan; Seeger, Manuel; Ries, Johannes B.

    2015-04-01

    We use the low-cost, user-friendly photogrammetric Structure-from-Motion (SfM) technique, as implemented in the software VisualSfM, for 3D surface reconstruction and volume calculation of an 18-meter-long rill in Luxembourg. The images were taken with a Canon HD video camera 1) before a natural rainfall event, 2) after a natural rainfall event and before a rill experiment, and 3) after a rill experiment. Compared to a photo camera, recording with a video camera not only yields a huge time advantage; it also guarantees more than adequately overlapping, sharp images. For each model, approximately 8 minutes of video were taken. As SfM needs single images, we automatically selected the sharpest image from each 15-frame interval, estimating sharpness with a derivative-based metric. VisualSfM then detects feature points in each image, matches feature points across all image pairs, recovers the camera positions, and finally, by triangulation of camera positions and feature points, reconstructs a point cloud of the rill surface. From the point cloud, 3D surface models (meshes) are created, and difference calculations between the pre and post models allow a visualization of the changes (erosion and accumulation areas) and a quantification of erosion volumes. The calculated volumes are expressed in the models' internal spatial units, so real-world values must be obtained via reference measurements. The outputs are three models at three different points in time. The results show that, especially for images taken from suboptimal videos (bad lighting conditions, low surface contrast, too much motion blur), the sharpness algorithm yields many more matching features. The point densities of the 3D models are thus increased, making the volume calculations more reliable.
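The frame-selection step can be sketched as follows, assuming grayscale frames as 2-D arrays. The gradient-magnitude score used here is one common derivative-based sharpness metric, not necessarily the exact one the authors used, and the function names are illustrative:

```python
import numpy as np

def sharpness(frame):
    """Mean gradient magnitude as a simple derivative-based sharpness score."""
    gy, gx = np.gradient(frame.astype(float))
    return np.mean(np.hypot(gx, gy))

def select_sharpest(frames, interval=15):
    """Keep the sharpest frame from each `interval`-frame chunk of the video."""
    selected = []
    for start in range(0, len(frames), interval):
        chunk = frames[start:start + interval]
        selected.append(max(chunk, key=sharpness))
    return selected
```

A blurred or featureless frame has small gradients everywhere and scores near zero, so the per-interval maximum reliably picks the frame with the most fine detail.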

  13. Geophysical Event Casting: Assembling & Broadcasting Data Relevant to Events and Disasters

    NASA Astrophysics Data System (ADS)

    Manipon, G. M.; Wilson, B. D.

    2012-12-01

    Broadcast Atom feeds are already being used to publish metadata and support discovery of data collections, granules, and web services. Such data and service casting advertises the existence of new granules in a dataset and available services to access or transform data. Similarly, data and services relevant to studying topical geophysical events (earthquakes, hurricanes, etc.) or periodic/regional structures (El Nino, deep convection) can be broadcast by publishing new entries and links in a feed for that topic. By using the geoRSS conventions, the time and space location of the event (e.g. a moving hurricane track) is specified in the feed, along with science description, images, relevant data granules, and links to useful web services (e.g. OGC/WMS). The topic cast is used to assemble all of the relevant data/images as they come in, and publish the metadata (images, links, services) to a broad group of subscribers. All of the information in the feed is structured using standardized XML tags (e.g. georss for space & time, and tags to point to external data & services), and is thus machine-readable, which is an improvement over collecting ad hoc links on a wiki. We have created a software suite in python to generate such "event casts" when a geophysical event first happens, then update them with more information as it becomes available, and display them as an event album in a web browser. Figure 1 shows a snapshot of our Event Cast Browser displaying information from a set of casts about the hurricanes in the Western Pacific during the year 2011. The 19th cyclone is selected in the left panel, so the top right panels display the entries in that feed with metadata such as maximum wind speed, while the bottom right panel displays the hurricane track (positions every 12 hours) as KML in the Google Earth plug-in, where additional data/image layers from the feed can be turned on or off by the user. 
The software automatically converts (georss) space & time information to KML placemarks, and can also generate various KML visualizations for other data layers that are pointed to in the feed. The user can replay all of the data images as an animation over the several days as the cyclone develops. The goal of "event casting" is to standardize several metadata micro-formats and use them within Atom feeds to create a rich ecosystem of topical event data that can be automatically manipulated by scripts and many interfaces. For our event cast browser, the same code can display all kinds of casts, whether about hurricanes, fires, earthquakes, or even El Nino. The presentation will describe: the event cast format and its standard micro-formats, software to generate and augment casts, and the browser GUI with KML visualizations.
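The entry-generation step can be sketched with Python's standard library. The namespace URIs follow the Atom and geoRSS conventions the abstract describes; the function name and example values are illustrative, not the project's actual code:

```python
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
GEORSS = "http://www.georss.org/georss"

def make_entry(title, when, lat, lon, data_url):
    """Build a minimal Atom entry carrying the event's time and location
    in machine-readable tags alongside a link to a data granule."""
    entry = ET.Element(f"{{{ATOM}}}entry")
    ET.SubElement(entry, f"{{{ATOM}}}title").text = title
    ET.SubElement(entry, f"{{{ATOM}}}updated").text = when
    ET.SubElement(entry, f"{{{GEORSS}}}point").text = f"{lat} {lon}"
    ET.SubElement(entry, f"{{{ATOM}}}link", href=data_url, rel="enclosure")
    return entry

entry = make_entry("Cyclone 19 position", "2011-09-01T12:00:00Z",
                   18.5, 135.2, "http://example.org/granule.nc")
xml_text = ET.tostring(entry, encoding="unicode")
```

Because the space and time fields are standardized tags rather than free text, a subscriber script can parse every entry the same way, which is the improvement over ad hoc wiki links noted above.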

  14. Bridging naturalistic and laboratory assessment of memory: the Baycrest mask fit test.

    PubMed

    Armson, Michael J; Abdi, Hervé; Levine, Brian

    2017-09-01

    Autobiographical memory tests provide a naturalistic counterpoint to the artificiality of laboratory research methods, yet autobiographical events are uncontrolled and, in most cases, unverifiable. In this study, we capitalised on a scripted, complex naturalistic event - the mask fit test (MFT), a standardised procedure required of hospital employees - to bridge the gap between naturalistic and laboratory memory assessment. We created a test of recognition memory for the MFT and administered it to 135 hospital employees who had undertaken the MFT at various points over the past five years. Multivariate analysis revealed two dimensions defined by accuracy and response bias. Accuracy scores showed the expected relationship to encoding-test delay, supporting the validity of this measure. Relative to younger adults, older adults' memory for this naturalistic event was better than would be predicted from the cognitive ageing literature, a result consistent with the notion that older adults' memory performance is enhanced when stimuli are naturalistic and personally relevant. These results demonstrate that testing recognition memory for a scripted event is a viable method of studying autobiographical memory.

  15. Contrasting environments associated with storm prediction center tornado outbreak forecasts using synoptic-scale composite analysis

    NASA Astrophysics Data System (ADS)

    Bates, Alyssa Victoria

    Tornado outbreaks have significant human impact, so it is imperative that forecasts of these phenomena be accurate. As the synoptic setup lays the foundation for a forecast, synoptic-scale aspects of Storm Prediction Center (SPC) outbreak forecasts of varying accuracy were assessed. The percentage of tornado outbreaks falling within SPC 10% tornado probability polygons was calculated, with false alarm events considered separately. The outbreaks were separated into quartiles using a point-in-polygon algorithm. Statistical composite fields were created to represent the synoptic conditions of these groups and facilitate comparison. Overall, temperature advection showed the greatest differences between the groups. Additionally, there were significant differences in jet streak strength and in the amount of vertical wind shear. The events forecast with low accuracy had the weakest synoptic-scale setups. These results suggest that events with weak synoptic setups should be regarded as areas of concern by tornado outbreak forecasters.
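The point-in-polygon step mentioned above can be sketched with a standard ray-casting test. This is a generic illustration, assuming the forecast polygon is a simple (non-self-intersecting) ring of (x, y) vertices; it is not the SPC's or the author's actual code:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: count crossings of a horizontal ray from (x, y)."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Edge crosses the ray's horizontal line if exactly one endpoint
        # is above it; toggle on crossings to the right of the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
point_in_polygon(5, 5, square)   # True: outbreak point inside the polygon
point_in_polygon(15, 5, square)  # False: outside
```

An odd number of crossings means the point lies inside the polygon, which is how each outbreak location is attributed to (or excluded from) a 10% probability polygon.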

  16. Modeling of the Geosocial Process using GIS «Disasters»

    NASA Astrophysics Data System (ADS)

    Vikulina, Marina; Turchaninova, Alla; Dolgaya, Anna; Vikulin, Alexandr; Petrova, Elena

    2016-04-01

    Natural and social disasters generate huge stress in the world community. Most researchers searching for relationships between different catastrophic events consider limited sets of disasters and do not take their size into account, which casts doubt on the completeness and statistical significance of such an approach. The next indispensable step is therefore to move from narrowly framed studies of disasters to more complex research. In order to study the relationships between Nature and Society, a database of natural disasters and dreadful social events that occurred during the last 36 centuries of human history, weighted by magnitude, was created and became the core of the GIS «Disasters» (ArcGIS 10.0). To date the database includes more than 2500 of the most socially significant ("strong") catastrophic natural events (earthquakes, fires, floods, droughts, climatic anomalies, other natural disasters) as well as social ones (wars, revolts, genocide, epidemics, fires caused by human beings, other social disasters). So far, each event is represented as a point feature located at the center of the affected region on the world map; if an event affected several countries, it is placed at the approximate center of the affected area. Every event refers to the country or group of countries located in its zone of influence today. A grade J (I, II, or III) is assigned to each event according to the disaster force assessment scale developed by the authors. A GIS with such a detailed database of disastrous events, weighted by magnitude over a long period of time, is compiled for the first time and creates a fairly complete and statistically representative basis for studying the distribution of natural and social disasters and their relationship.
To date, the statistical analysis of the database, performed both for each aggregate (natural disasters and catastrophic social phenomena) and for particular statistically representative types of events, has led to the following conclusions: natural disasters and dreadful social events appear to be closely related to each other despite their apparently different nature. The numbers of events of different magnitude are distributed by a logarithmic law: the bigger the event, the less likely it happens. For each type of event and each aggregate, the existence of periodicities with periods of 280 ± 60 years was established. The identified properties of cyclicity, grouping, and interaction create a basis for modeling an essentially unified Geosocial Process at a high enough statistical level and prove the existence of the uniform planetary Geosocial Process. The evidence of interaction between "lifeless" Nature and Society is fundamental and provides a new approach to forecasting demographic crises that takes into account both natural disasters and social phenomena. The idea of the interaction of Nature and Society through the «exchange» of disasters as a uniform planetary Geosocial Process is an essentially new statement introduced here for the first time.

  17. DSCOVR Science Data and Retrospective Access

    NASA Astrophysics Data System (ADS)

    Rowland, W. F.; Codrescu, S.; Tilton, M.; Cartwright, J.; Redmon, R. J.; Loto'aniu, P. T. M.; Mccullough, H.; Denig, W. F.

    2016-12-01

    On July 27, 2016 the Deep Space Climate Observatory (DSCOVR) became the first operational satellite at the first Lagrange point (L1). This vantage, approximately one percent of the distance from the Earth to the Sun along the Earth-Sun line, means that DSCOVR data provide critical advanced warning of impending space weather events. As such, DSCOVR data are essential for forecasters, modelers, and the scientific community. The National Oceanic and Atmospheric Administration's National Centers for Environmental Information (NOAA/NCEI) archives the retrospective data and shares them with the public. We examine the data available, with a focus on some of the more interesting events that have occurred. We also discuss mechanisms created to facilitate search and access for those data, including a user-driven interface that allows one to dynamically generate plots and order relevant data of interest.

  18. Let's Talk About Water: Using Film Screenings to Engage Students and the Public in Water Science and Policy

    NASA Astrophysics Data System (ADS)

    Saleem Arrigo, J. A.; Berry, K.; Hooper, R. P.; Lilienfeld, L.

    2013-12-01

    "Let's Talk about Water" is a film symposium designed to bring together experts and the public to talk about the complex water issues facing society. The format of the event is quite simple: a panel of experts and the audience view a water documentary (such as "FLOW", "Liquid Assets", or "Gasland") together and there is an extended moderated discussion period following the film between the panel and the audience. Over the course of several events, we have developed best practices that make this simple format very effective. A film creates a context of subject and language for the discussion--it gets the audience and the panel on the same page. The moderators must actively manage the discussion, both challenging the panelists with follow up questions, asking questions to simplify the language the expert is using, and passing a question among panelists to bring out different points of view. The panelists are provided with the film in advance to view and, most importantly, meet the day before the event to discuss the film. This makes for a much more convivial discussion at the event. We have found that these discussions can easily be sustained for 90 to 120 minutes with active audience participation. We have found key element of the event is local relevance. Films should be carefully chosen to resonate with the audience, and the local host is critical in defining the audience, goals and identified panel members. Having local experts from universities and representatives from local water authorities and environmental groups bring a sense of community and a confidence in the audience that the panel members have local knowledge that is important for sustaining discussion. The discussion begins with points raised by the movie (are these issues real? Do they apply here? What are the scientific, engineering, and policy solutions to these problems?) 
and then segues into a discussion about career opportunities in the water sector, volunteer opportunities in the community or other ways for the audience to get involved. This format has been applied at college campuses with a target audience of lower-level undergraduates, in several universities in the United States and Canada. Additionally, we have held public events (at the NY Public Library, concurrent with World Water Week) and have documented experiences for other educators and researchers who want to employ this format. CUAHSI has created best practice "tips," hosting guides documenting our experiences with individual films, and other information on our website.

  19. Experimental and Petrological Constraints on Lunar Differentiation from the Apollo 15 Green Picritic Glasses

    NASA Technical Reports Server (NTRS)

    Elkins-Tanton, Linda T.; Chatterjee, Nilanjan; Grove, Timothy L.

    2003-01-01

    Phase equilibrium experiments on the most magnesian Apollo 15C green picritic glass composition indicate a multiple saturation point with olivine and orthopyroxene at 1520 C and 1.3 GPa (about 260 km depth in the moon). This composition has the highest Mg# of any lunar picritic glass and the shallowest multiple saturation point. Experiments on an Apollo 15A composition indicate a multiple saturation point with olivine and orthopyroxene at 1520 C and 2.2 GPa (about 440 km depth in the moon). The importance of the distinctive compositional trends of the Apollo 15 groups A, B, and C picritic glasses merits the reanalysis of NASA slide 15426,72 with modern electron microprobe techniques. We confirm the compositional trends reported by Delano (1979, 1986) in the major element oxides SiO2, TiO2, Al2O3, Cr2O3, FeO, MnO, MgO, and CaO, and we also obtained data for the trace elements P2O5, K2O, Na2O, NiO, S, Cu, Cl, Zn, and F. Petrogenetic modeling demonstrates that the Apollo 15 A-B-C glass trends could not have been formed by fractional crystallization or any continuous assimilation/fractional crystallization (AFC) process. The B and C glass compositional trends could not have been formed by batch or incremental melting of an olivine + orthopyroxene source or any other homogeneous source, though the A glasses may have been formed by congruent melting over a small pressure range at depth. The B compositional trend is well modeled by starting with an intermediate A composition and assimilating a shallower, melted cumulate, and the C compositional trend is well modeled by a second assimilation event. The assimilation process envisioned is one in which heat and mass transfer were separated in space and time. In an initial intrusive event, a picritic magma crystallized and provided heat to melt magma ocean cumulates. 
In a later replenishment event, the picritic magma incrementally mixed with the melted cumulate (creating the compositional trends in the green glass data set), ascended to the lunar surface, and erupted as a fire fountain. A barometer created from multiple saturation points provides a depth estimate of other glasses in the A-B-C trend and of the depths of assimilation. This barometer demonstrates that the Apollo 15 A-B-C trend originated over a depth range of approximately 460 km to approximately 260 km within the moon.

  20. Dip-dependent variations in LFE duration during ETS events

    NASA Astrophysics Data System (ADS)

    Chestler, S.; Creager, K.; Ghosh, A.

    2015-12-01

    Using data from the Array of Arrays experiment, we create a new, more spatially complete catalog of LFEs beneath the Olympic Peninsula, WA. Using stacked waveforms produced by stacking 1-minute windows of data from each array over the slowness with the greatest power [Ghosh et al., 2012], we pick out peaks in tremor activity that are consistent across multiple arrays. These peaks are potential LFE detections. Fifteen-second windows of raw data centered on each peak are scanned through time; if the waveform repeats, the detection is used as a new LFE family. Template waveforms for each family are created by stacking all windows that correlate with the initial detection. During an ETS event, activity at a given point on the plate interface (i.e., the activity of an LFE family) typically lasts for 3.5 days (downdip) to 5 days (updip). Activity generally begins with a flurry of LFEs lasting 8 hours (downdip) to 20 hours (updip), followed by many short bursts of activity separated by 5 hours or more. Updip families have more bursts (5-10) than downdip families (2-5). The later bursts often occur during times of encouraging tidal shear stress, while the initial flurries have no significant correlation with the tides. While updip LFE families are more active during ETS events than downdip families, they seldom light up between ETS events, which occur only every 12-14 months. Downdip LFE families, on the other hand, are active much more frequently during the year; the most downdip families exhibit activity every week or so. Because updip families are rarely active between ETS events, it is possible that little stress is released updip during inter-ETS periods. Hence, during ETS events more stress needs to be released updip than downdip, consistent with the longer-duration activity of updip LFE families.
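The template-matching step described above, stacking windows that correlate with an initial detection, rests on normalized cross-correlation between a template waveform and a continuous trace. A minimal generic sketch, with illustrative function names and an assumed detection threshold (not the authors' parameters):

```python
import numpy as np

def normalized_xcorr(trace, template):
    """Slide the template along the trace; return the normalized
    correlation coefficient (in [-1, 1]) at each offset."""
    m = len(template)
    t = (template - template.mean()) / (template.std() * m)
    cc = np.empty(len(trace) - m + 1)
    for i in range(len(cc)):
        w = trace[i:i + m]
        cc[i] = np.sum(t * (w - w.mean()) / (w.std() + 1e-12))
    return cc

def detect(trace, template, threshold=0.8):
    """Offsets where the correlation exceeds the detection threshold."""
    cc = normalized_xcorr(trace, template)
    return np.flatnonzero(cc >= threshold)
```

Windows that exceed the threshold would then be stacked to refine the family's template, as in the catalog-building procedure above.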

  1. Combining structure-from-motion derived point clouds from satellites and unmanned aircraft systems images with ground-truth data to create high-resolution digital elevation models

    NASA Astrophysics Data System (ADS)

    Palaseanu, M.; Thatcher, C.; Danielson, J.; Gesch, D. B.; Poppenga, S.; Kottermair, M.; Jalandoni, A.; Carlson, E.

    2016-12-01

    Coastal topographic and bathymetric (topobathymetric) data with high spatial resolution (1 meter or better) and high vertical accuracy are needed to assess the vulnerability of Pacific Islands to climate change impacts, including sea level rise. According to the Intergovernmental Panel on Climate Change reports, low-lying atolls in the Pacific Ocean are extremely vulnerable to king tide events, storm surge, tsunamis, and sea-level rise. The lack of coastal topobathymetric data has been identified as a critical data gap for climate vulnerability and adaptation efforts in the Republic of the Marshall Islands (RMI). For Majuro Atoll, home to the largest city of the RMI, the only elevation dataset currently available is the Shuttle Radar Topography Mission data, which has a 30-meter spatial resolution and 16-meter vertical accuracy (expressed as linear error at 90%). To generate high-resolution digital elevation models (DEMs) in the RMI, elevation information and photographic imagery have been collected from field surveys using GNSS/total station instruments and unmanned aerial vehicles for Structure-from-Motion (SfM) point cloud generation. DigitalGlobe WorldView-2 imagery was processed to create SfM point clouds to fill gaps in the point cloud derived from the higher-resolution UAS photos. The combined point cloud data are filtered, classified to bare earth, and georeferenced using the GNSS data acquired on roads and along survey transects perpendicular to the coast. A total station was used to collect elevation data under tree canopies where heavy vegetation blocked the view of GNSS satellites. A subset of the GNSS/total station data was set aside for error assessment of the resulting DEM.

  2. Identifying the Turning Point: Using the Transtheoretical Model of Change to Map Intimate Partner Violence Disclosure in Emergency Department Settings

    PubMed Central

    Catallo, Cristina; Jack, Susan M.; Ciliska, Donna; MacMillan, Harriet L.

    2012-01-01

    Background. The transtheoretical model of change (TTM) was used as a framework to examine the steps that women took to disclose intimate partner violence (IPV) in urban emergency departments. Methods. Mapping methods portrayed the evolving nature of decisions that facilitated or inhibited disclosure. This paper is a secondary analysis of qualitative data from a mixed methods study that explored abused women's decision making process about IPV disclosure. Findings. Change maps were created for 19 participants with movement from the precontemplation to the maintenance stages of the model. Disclosure often occurred after a significant “turning point event” combined with a series of smaller events over a period of time. The significant life event often involved a weighing of options where participants considered the perceived risks against the potential benefits of disclosure. Conclusions. Abused women experienced intrusion from the chaotic nature of the emergency department. IPV disclosure was perceived as a positive experience when participants trusted the health care provider and felt control over their decisions to disclose IPV. Practice Implications. Nurses can use these findings to gauge the readiness of women to disclose IPV in the emergency department setting. PMID:22792480

  3. RE-PLAN: An Extensible Software Architecture to Facilitate Disaster Response Planning

    PubMed Central

    O’Neill, Martin; Mikler, Armin R.; Indrakanti, Saratchandra; Tiwari, Chetan; Jimenez, Tamara

    2014-01-01

    Computational tools are needed to make data-driven disaster mitigation planning accessible to planners and policymakers without the need for programming or GIS expertise. To address this problem, we have created modules to facilitate quantitative analyses pertinent to a variety of different disaster scenarios. These modules, which comprise the REsponse PLan ANalyzer (RE-PLAN) framework, may be used to create tools for specific disaster scenarios that allow planners to harness large amounts of disparate data and execute computational models through a point-and-click interface. Bio-E, a user-friendly tool built using this framework, was designed to develop and analyze the feasibility of ad hoc clinics for treating populations following a biological emergency event. In this article, the design and implementation of the RE-PLAN framework are described, and the functionality of the modules used in the Bio-E biological emergency mitigation tool are demonstrated. PMID:25419503

  4. A Spectroscopic and Photometric Study of Gravitational Microlensing Events

    NASA Astrophysics Data System (ADS)

    Kane, Stephen R.

    2000-08-01

    Gravitational microlensing has generated a great deal of scientific interest over recent years. This has been largely due to the realization of its wide-reaching applications, such as the search for dark matter, the detection of planets, and the study of Galactic structure. A significant observational advance has been that most microlensing events can be identified in real-time while the source is still being lensed. More than 400 microlensing events have now been detected towards the Galactic bulge and Magellanic Clouds by the microlensing survey teams EROS, MACHO, OGLE, DUO, and MOA. The real-time detection of these events allows detailed follow-up observations with much denser sampling, both photometrically and spectroscopically. The research undertaken in this project on photometric studies of gravitational microlensing events has been performed as a member of the PLANET (Probing Lensing Anomalies NETwork) collaboration. This is a worldwide collaboration formed in the early part of 1995 to study microlensing anomalies - departures from an achromatic point-source, point-lens light curve - through rapidly-sampled, multi-band photometry. PLANET has demonstrated that it can achieve 1% photometry under ideal circumstances, making PLANET observations sensitive to Earth-mass planets, which require characterization of 1-2% deviations from a standard microlensing light curve. The photometric work in this project involved over 5 months using the 1.0 m telescope at Canopus Observatory in Australia, and 3 separate observing runs using the 0.9 m telescope at the Cerro Tololo Inter-American Observatory (CTIO) in Chile. Methods were developed to reduce the vast amount of photometric data using the image analysis software MIDAS and the photometry package DoPHOT. Modelling routines were then written to analyse a selection of the resulting light curves in order to detect any deviation from an achromatic point-source, point-lens light curve.
The photometric results presented in this thesis are from observations of 34 microlensing events over three consecutive bulge seasons. These results are presented along with a discussion of the observations and the data reduction procedures. The colour-magnitude diagrams indicate that the microlensed sources are main-sequence and red clump giant stars. Most of the events appear to exhibit standard Paczynski point-source, point-lens curves, whilst a few deviate significantly from the standard model. Various microlensing models that include anomalous structure are fitted to a selection of the observed events, resulting in the discovery of a possible binary source event. These fitted events are used to estimate the sensitivity to extra-solar planets, and it is found that the sampling rate for these events was insufficient by about a factor of 7.5 for detecting a Jupiter-mass planet. This result assumes that deviations of 5% can be reliably detected. If microlensing is caused predominantly by bulge stars, as has been suggested by Kiraga and Paczynski, the lensed stars should have larger extinction than other observed stars since they would preferentially be located at the far side of the Galactic bulge. Hence, spectroscopy of Galactic microlensing events may be used as a tool for studying the kinematics and extinction effects in the Galactic bulge. The spectroscopic work in this project involved using Kurucz model spectra to create theoretical extinction effects for various spectral classes towards the Galactic centre. These extinction effects were then used to interpret spectroscopic data taken with the 3.6 m ESO telescope. These data consist of a sample of microlensed stars towards the Galactic bulge and are used to derive the extinction offsets of the lensed sources with respect to the average population; a measurement of the fraction of bulge-bulge lensing is also made.
Hence, it is shown statistically that the microlensed sources are generally located on the far side of the Galactic bulge. Measurements of the radial velocities of these sources are used to determine the kinematic properties of the far side of the Galactic bulge.

  5. The life-cycle of upper-tropospheric jet streams identified with a novel data segmentation algorithm

    NASA Astrophysics Data System (ADS)

    Limbach, S.; Schömer, E.; Wernli, H.

    2010-09-01

    Jet streams are prominent features of the upper-tropospheric atmospheric flow. Through the thermal wind relationship these regions with intense horizontal wind speed (typically larger than 30 m/s) are associated with pronounced baroclinicity, i.e., with regions where extratropical cyclones develop due to baroclinic instability processes. Individual jet streams are non-stationary elongated features that can extend over more than 2000 km in the along-flow and 200-500 km in the across-flow direction, respectively. Their lifetime can vary between a few days and several weeks. In recent years, feature-based algorithms have been developed that allow compiling synoptic climatologies and typologies of upper-tropospheric jet streams based upon objective selection criteria and climatological reanalysis datasets. In this study a novel algorithm to efficiently identify jet streams using an extended region-growing segmentation approach is introduced. This algorithm iterates over a 4-dimensional field of horizontal wind speed from ECMWF analyses and decides at each grid point whether all prerequisites for a jet stream are met. In a single pass the algorithm keeps track of all adjacencies of these grid points and creates the 4-dimensional connected segments associated with each jet stream. In addition to the detection of these sets of connected grid points, the algorithm analyzes the development over time of the distinct 3-dimensional features each segment consists of. Important events in the development of these features, for example mergings and splittings, are detected and analyzed on a per-grid-point and per-feature basis. The output of the algorithm consists of the actual sets of grid-points augmented with information about the particular events, and of the so-called event graphs, which are an abstract representation of the distinct 3-dimensional features and events of each segment. 
This technique provides comprehensive information about the frequency of upper-tropospheric jet streams, their preferred regions of genesis, merging, splitting, and lysis, and statistical information about their size, amplitude and lifetime. The presentation will introduce the technique, provide example visualizations of the time evolution of the identified 3-dimensional jet stream features, and present results from a first multi-month "climatology" of upper-tropospheric jets. In the future, the technique can be applied to longer datasets, for instance reanalyses and output from global climate model simulations - and provide detailed information about key characteristics of jet stream life cycles.
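The region-growing idea behind the segmentation can be illustrated with a minimal 2-D sketch (the study operates on 4-D wind-speed fields from ECMWF analyses and additionally tracks merging and splitting events; the function name, the 30 m/s threshold argument, and the 4-connectivity used here are illustrative assumptions, not the authors' code):

```python
from collections import deque

def label_jet_regions(speed, threshold=30.0):
    """Label connected regions where wind speed exceeds `threshold` (m/s).

    `speed` is a 2-D list (e.g. lat x lon); regions are grown with
    4-connectivity. Returns a label grid (0 = background) and the
    number of regions found.
    """
    rows, cols = len(speed), len(speed[0])
    labels = [[0] * cols for _ in range(rows)]
    n = 0
    for i in range(rows):
        for j in range(cols):
            if speed[i][j] >= threshold and labels[i][j] == 0:
                n += 1  # seed a new region and flood-fill it
                q = deque([(i, j)])
                labels[i][j] = n
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        v, u = y + dy, x + dx
                        if (0 <= v < rows and 0 <= u < cols
                                and speed[v][u] >= threshold
                                and labels[v][u] == 0):
                            labels[v][u] = n
                            q.append((v, u))
    return labels, n
```

Extending the neighborhood to include the time dimension turns the same flood fill into the single-pass 4-D segmentation the abstract describes.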

  6. Reply to comment on "Direct evidence of ancient shock metamorphism at the site of the 1908 Tunguska event" by Vannucchi et al. (Earth Planet. Sci. Lett. 409 (2015) 168-174)

    NASA Astrophysics Data System (ADS)

    Vannucchi, Paola; Morgan, Jason P.

    2015-04-01

Our paper (Vannucchi et al., 2015) focuses on geologic evidence for shock metamorphism found in the epicentral region of the 1908 Tunguska event. None of the currently proposed bolide explanations for the 1908 event can produce the shock pressures indicated by the geological evidence described in Vannucchi et al. (2015). Had the 1908 event generated these pressures over the epicentral region, an observable crater should also have formed. The comment by Melott and Overholt discusses the possibility that a 1908 cometary bolide strike at Tunguska cannot be excluded on the basis of the absence of a detectable 14C increase at this site. They dispute the findings of a recent study by Liu et al. (2014) that an East Asian comet impact recorded by eyewitness accounts in 773 AD coincided with a detectable 14C increase in regional South China Sea corals that grew at that time. Their point, whether true or not, is fairly peripheral to our study because the bolide hypothesis for the 1908 Tunguska event, whatever the nature of the bolide, does not provide a viable explanation for the geological evidence of shock metamorphism found at the 1908 Tunguska site. Furthermore, as we discuss in our paper, the probability of a prior large impact-shock event having occurred at the site of the 1908 event is extremely low, suggesting that a terrestrial shock-generating mechanism may be linked to the resolution of the Tunguska enigma. Our preferred resolution is that a terrestrial hyper-explosive gas release event, a Verneshot (Morgan et al., 2004), created the large shock event during the emplacement of the Siberian Traps. In this scenario, the 1908 Tunguska event was due to a much smaller gas burst that re-used the lithospheric weakness created by the ancient Verneshot.
Melott and Overholt's discussion regarding the existence and size of regional and global 14C anomalies related to cometary impacts seems, therefore, to be better addressed in response to the work of Liu et al. (2014), as appears to be done in a paper and preprint that Melott and Overholt self-cite in their comment.

  7. Centenary of the Battle of Vimy (France, 1917): Preserving the Memory of the Great War Through 3D Recording of the Maison Blanche Souterraine

    NASA Astrophysics Data System (ADS)

    Murtiyoso, A.; Grussenmeyer, P.; Guillemin, S.; Prilaux, G.

    2017-08-01

The Battle of Vimy Ridge was a military engagement between the Canadian Corps and the German Empire during the Great War (1914-1918). In this battle, Canadian troops fought as a single unit and won the day. It marked an important point in Canadian history as a nation. The year 2017 marks the centenary of this battle. In commemoration of this event, the Pas-de-Calais Departmental Council financed a 3D recording mission for one of the underground tunnels (souterraines) used as a refuge by Canadian soldiers in the weeks prior to the battle. A combination of Terrestrial Laser Scanner (TLS) and close-range photogrammetry techniques was employed in order to document not only the souterraine, but also the various carvings and graffiti created by the soldiers on its walls. The resulting point clouds were registered to the French national geodetic system, and then meshed and textured in order to create a precise 3D model of the souterraine. In this paper, the workflow followed during the project, as well as several results, will be discussed. In the end, the resulting 3D model was used to create derivative products such as maps, section profiles, and virtual visit videos. The latter help the dissemination of the 3D information and thus aid in the preservation of the memory of the Great War for Canada.

  8. Characterizing Student Experiences in Physics Competitions: The Power of Emotions

    NASA Astrophysics Data System (ADS)

    Moll, Rachel F.; Nashon, S.; Anderson, D.

    2006-12-01

Low enrolment and motivation are key issues in physics education, and the affective dimension of learning has recently been studied for evidence of its influence on student attitudes towards physics. Physics Olympics competitions are a novel context for stimulating intense emotional experiences. In this study, one team of students and their teacher were interviewed and observed prior to and during the event to characterize their emotions and determine the connections between their experiences and their learning and attitudes/motivation towards physics. Results showed that certain types of events stimulated strong emotions of frustration and ownership, and that students' attitudes were that physics is fun, diverse, and relevant. Analysis of these themes indicated that the nature of the emotions generated was connected to the students' attitudes towards physics. This finding points to the potential and value of informal and novel contexts in creating strong positive emotions, which have a strong influence on student attitudes towards physics.

  9. Shooter position estimation with muzzle blast and shockwave measurements from separate locations

    NASA Astrophysics Data System (ADS)

    Grasing, David

    2016-05-01

There are two acoustical events associated with small arms fire: the muzzle blast (created by the bullet being expelled from the barrel of the weapon) and the shockwave (created by bullets that exceed the speed of sound). Assuming the ballistics of a round are known, the times and directions of arrival of the acoustic events furnish sufficient information to determine the origin of the shot. Existing methods tacitly assume that a single sensor makes the measurements of the times and directions of arrival. If the sensor is located past the point where the bullet goes transonic, or if the sensor is far off the axis of the shot line, single-sensor localization becomes highly inaccurate due to the ill-conditioning of the localization problem. In this paper, a more general approach is taken which allows for localization from measurements made at separate locations. There are considerable advantages to this approach, the most noteworthy of which is the improvement in localization accuracy due to the improvement in the conditioning of the problem. Additional benefits include: the potential to localize in cases where a single sensor has insufficient information, the furnishing of high-quality initialization to data fusion algorithms, and the potential to identify the round from a set of possible rounds.
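As a toy illustration of why measurements from separate locations improve conditioning, two direction-of-arrival bearings from spatially separated sensors already fix a position by triangulation. The function and flat 2-D geometry below are hypothetical simplifications; the paper's method also exploits arrival times and round ballistics:

```python
import math

def intersect_bearings(p1, az1, p2, az2):
    """Estimate a source position from two DOA bearings (radians,
    measured counterclockwise from +x) observed at sensor positions
    p1 and p2. Solves p1 + t1*d1 = p2 + t2*d2 for the crossing point
    of the two bearing rays.
    """
    d1 = (math.cos(az1), math.sin(az1))
    d2 = (math.cos(az2), math.sin(az2))
    # 2x2 linear system [d1 | -d2] [t1, t2]^T = p2 - p1, via Cramer's rule
    det = d1[0] * (-d2[1]) - (-d2[0]) * d1[1]
    if abs(det) < 1e-12:
        raise ValueError("bearings are parallel; no unique intersection")
    bx, by = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (bx * (-d2[1]) - (-d2[0]) * by) / det
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])
```

When the two sensors subtend a wide angle at the source, `det` is far from zero and the solution is well conditioned; a single off-axis sensor corresponds to the near-parallel, ill-conditioned case.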

  10. Earthquake recording at the Stanford DAS Array with fibers in existing telecomm conduits

    NASA Astrophysics Data System (ADS)

    Biondi, B. C.; Martin, E. R.; Yuan, S.; Cole, S.; Karrenbach, M. H.

    2017-12-01

The Stanford Distributed Acoustic Sensing Array (SDASA-1) has been continuously recording seismic data since September 2016 on 2.5 km of single-mode fiber optics in existing telecommunications conduits under Stanford's campus. The array is figure-eight shaped and roughly 600 m along its widest side, with a channel spacing of roughly 8 m. This array is easy to maintain and is nonintrusive, making it well suited to urban environments, but it sacrifices some cable-to-ground coupling compared to more traditional seismometers. We have been testing its utility for earthquake recording, active seismic, and ambient noise interferometry. This talk will focus on earthquake observations. We will show comparisons between the strain rates measured throughout the DAS array and the particle velocities measured at the nearby Jasper Ridge Seismic Station (JRSC). In some of these events, we will point out directionality features specific to DAS that can require slight modifications in data processing. We also compare the repeatability of DAS and JRSC recordings of blasts from a nearby quarry. Using existing earthquake databases, we have created a small catalog of DAS earthquake observations by pulling records of over 700 Northern California events spanning Sep. 2016 to Jul. 2017 from both the DAS data and JRSC. On these events we have tested common array methods for earthquake detection and location, including beamforming and STA/LTA analysis in time and frequency. We have analyzed these events to approximate thresholds on what distances and magnitudes are clearly detectable by the DAS array. Further analysis should be done on detectability with methods tailored to small events (for example, template matching). In creating this catalog, we have developed open-source software, available for free download, that can manage large sets of continuous seismic data files (both existing files and files as they stream in).
This software can both interface with existing earthquake networks and efficiently extract earthquake recordings from many continuous recordings saved on the user's machines.
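The STA/LTA detection mentioned above compares a short-term average of signal energy to a long-term average and triggers when the ratio exceeds a threshold. A minimal sketch might look like the following (the function name, windowing convention, and re-arming logic are assumptions; production detectors, such as the recursive STA/LTA in ObsPy, differ in detail):

```python
def sta_lta(signal, nsta, nlta, threshold):
    """Return sample indices where the STA/LTA energy ratio first
    crosses `threshold`. `nsta`/`nlta` are window lengths in samples;
    the LTA window ends where the STA window begins.
    """
    energy = [x * x for x in signal]
    # prefix sums give O(1) window averages
    pre = [0.0]
    for e in energy:
        pre.append(pre[-1] + e)
    triggers = []
    armed = True  # re-arm only after the ratio falls below threshold
    for i in range(nlta + nsta, len(signal) + 1):
        sta = (pre[i] - pre[i - nsta]) / nsta
        lta = (pre[i - nsta] - pre[i - nsta - nlta]) / nlta
        ratio = sta / lta if lta > 0 else 0.0
        if ratio >= threshold and armed:
            triggers.append(i - nsta)  # approximate onset sample
            armed = False
        elif ratio < threshold:
            armed = True
    return triggers
```

For example, a quiet trace with a burst of high amplitude starting at sample 100 triggers once, near the burst onset.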

  11. Understanding survival analysis: Kaplan-Meier estimate.

    PubMed

    Goel, Manish Kumar; Khanna, Pardeep; Kishore, Jugal

    2010-10-01

The Kaplan-Meier estimate is one of the best options for measuring the fraction of subjects living for a certain amount of time after treatment. In clinical trials or community trials, the effect of an intervention is assessed by measuring the number of subjects who survived or were saved by that intervention over a period of time. The time from a defined starting point to the occurrence of a given event, for example death, is called the survival time, and the analysis of such group data is called survival analysis. This analysis can be complicated when subjects under study are uncooperative and refuse to remain in the study, when some subjects do not experience the event or die before the end of the study (although they would have experienced it or died had observation continued), or when we lose touch with them midway through the study. We label these situations censored observations. The Kaplan-Meier estimate is the simplest way of computing survival over time in spite of all these difficulties associated with subjects or situations. The survival curve can be created assuming various situations. It involves computing the probability of the event occurring at a certain point in time and multiplying these successive probabilities by any earlier computed probabilities to get the final estimate. This can be calculated for two groups of subjects, along with the statistical significance of the difference in their survival. The method can be used in Ayurveda research when comparing two drugs and looking at the survival of subjects.
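The step-by-step multiplication of survival probabilities described above can be sketched directly (a minimal illustration handling censored subjects; statistical packages such as `lifelines` provide vetted implementations with confidence intervals):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    `times`: follow-up time for each subject; `events`: 1 if the event
    (e.g. death) occurred at that time, 0 if the subject was censored.
    Returns a list of (time, survival_probability) steps at event times.
    """
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = n_at_t = 0
        # gather all subjects with this time (events and censorings)
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            n_at_t += 1
            i += 1
        if deaths:
            # multiply by the conditional survival at time t
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= n_at_t  # censored subjects leave the risk set too
    return curve
```

With times `[1, 2, 3, 4, 5]` and event flags `[1, 1, 0, 1, 0]`, the curve steps to 0.8 at t=1, 0.6 at t=2, and 0.3 at t=4; the censored subjects at t=3 and t=5 shrink the risk set without producing a step.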

  12. Compilation of Earthquakes from 1850-2007 within 200 miles of the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N. Seth Carpenter

    2010-07-01

    An updated earthquake compilation was created for the years 1850 through 2007 within 200 miles of the Idaho National Laboratory. To generate this compilation, earthquake catalogs were collected from several contributing sources and searched for redundant events using the search criteria established for this effort. For all sets of duplicate events, a preferred event was selected, largely based on epicenter-network proximity. All unique magnitude information for each event was added to the preferred event records and these records were used to create the compilation referred to as “INL1850-2007”.
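A redundant-event search of the kind described can be sketched as grouping events whose origin times and epicenters fall within chosen windows (the thresholds, record fields, and keep-the-earliest preference rule below are hypothetical illustrations; the compilation's actual criteria, including selection by epicenter-network proximity, are not reproduced here):

```python
import math

def dedupe_catalogs(events, dt_max=30.0, dist_max_km=50.0):
    """Group likely duplicate events reported by multiple catalogs.

    Each event is a dict with 'time' (seconds), 'lat', 'lon', 'mag',
    and 'source'. Two events are treated as duplicates when their
    origin times differ by <= dt_max seconds and their epicenters by
    <= dist_max_km. The first event of each group (in time order) is
    kept as the preferred record and collects all magnitudes.
    """
    def dist_km(a, b):
        # small-distance flat-earth approximation
        dlat = (a['lat'] - b['lat']) * 111.2
        dlon = (a['lon'] - b['lon']) * 111.2 * math.cos(math.radians(a['lat']))
        return math.hypot(dlat, dlon)

    preferred = []
    for ev in sorted(events, key=lambda e: e['time']):
        for p in preferred:
            if (abs(ev['time'] - p['time']) <= dt_max
                    and dist_km(ev, p) <= dist_max_km):
                # duplicate: attach its magnitude to the preferred record
                p.setdefault('all_mags', [p['mag']]).append(ev['mag'])
                break
        else:
            preferred.append(dict(ev))
    return preferred
```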

  13. Disaster planning: the basics of creating a burn mass casualty disaster plan for a burn center.

    PubMed

    Kearns, Randy D; Conlon, Kathe M; Valenta, Andrea L; Lord, Graydon C; Cairns, Charles B; Holmes, James H; Johnson, Daryhl D; Matherly, Annette F; Sawyer, Dalton; Skarote, Mary Beth; Siler, Sean M; Helminiak, Radm Clare; Cairns, Bruce A

    2014-01-01

In 2005, the American Burn Association published burn disaster guidelines. This work recognized that local and state assets are the most important resources in the initial 24- to 48-hour management of a burn disaster. Historical experience suggests there is ample opportunity to improve local and state preparedness for a major burn disaster. This review focuses on the basics of developing a burn surge disaster plan for a mass casualty event. In the event of a disaster, burn centers must recognize their place in the context of local and state disaster plan activation. Planning for a burn center takes three forms: institutional/intrafacility, interfacility/intrastate, and interstate/regional. Priorities for a burn disaster plan include coordination, communication, triage, plan activation (trigger point), surge, and regional capacity. The capacity and capability of the plan should be modeled and exercised to determine limitations and identify breaking points. When there is more than one burn center in a given state or jurisdiction, close coordination and communication between the burn centers are essential for a successful response. Burn surge mass casualty planning at the facility and specialty planning levels, including a state burn surge disaster plan, must have interface points with governmental plans. Local, state, and federal governmental agencies have key roles and responsibilities in a burn mass casualty disaster. This work includes a framework and critical concepts any burn disaster planning effort should consider when developing future plans.

  14. Let's Talk About Water: Film Screenings as an Entrée to Water Science

    NASA Astrophysics Data System (ADS)

    Hooper, R. P.; Lilienfeld, L.; Arrigo, J.

    2011-12-01

"Let's Talk about Water" is a film symposium designed to bring together experts and the public to talk about the complex water issues facing society. The format of the event is quite simple: a panel of experts and the audience view a water documentary (such as "FLOW", "Liquid Assets", or "Gasland") together, and an extended moderated discussion between the panel and the audience follows the film. Properly handled, this simple format can be very effective. A film creates a context of subject and language for the discussion--it gets the audience and the panel on the same page. The moderators must actively manage the discussion: challenging the panelists with follow-up questions, asking questions to simplify the language an expert is using, and passing a question among panelists to bring out different points of view. The panelists are provided with the film in advance to view and, most importantly, meet the day before the event to discuss the film. This makes for a much more convivial discussion at the event. We have found that these discussions can easily be sustained for 90 to 120 minutes with active audience participation. This format has been applied at college campuses with a target audience of lower-level undergraduates. Student clubs are engaged to help with publicity before the event and to assist with registration and ushering during the event. Appropriate classes offer extra credit for student attendance to ensure a strong turnout. A Hollywood film ("Chinatown" in southern California, "A Civil Action" in Boston) is shown on campus during the week preceding the event to help advertise it. The event itself is typically held on a Saturday with a morning screening of the film. The audience is provided with index cards and pencils to write down questions they have about the film. A lunch is provided during which the questions are organized and used to initiate different discussion themes.
The discussion begins with points raised by the movie (are these issues real? Do they apply here? What are the scientific, engineering, and policy solutions to these problems?) and then segues into a discussion about career opportunities in the water sector. Our past events at UC Irvine and at UMass Boston have been successful in attracting large audiences and have been viewed positively by attendees.

  15. Reconstruction of improvised explosive device blast loading to personnel in the open

    NASA Astrophysics Data System (ADS)

    Wiri, Suthee; Needham, Charles

    2016-05-01

Significant advances in reconstructing attacks by improvised explosive devices (IEDs) and other blast events are reported. A high-fidelity three-dimensional computational fluid dynamics tool, called Second-order Hydrodynamic Automatic Mesh Refinement Code, was used for the analysis. Computer-aided design models for subjects or vehicles in the scene accurately represent the geometries of objects in the blast field. A wide range of scenario types and blast exposure levels were reconstructed, including free-field blast, the enclosed space of a vehicle cabin, IED attack on a vehicle, buried charges, recoilless rifle operation, rocket-propelled grenade attack, and missile attack, with single- or multiple-subject exposure to pressure levels from ~27.6 kPa (~4 psi) to greater than 690 kPa (>100 psi). To create a full 3D pressure time-resolved reconstruction of a blast event for injury and blast exposure analysis, a combination of intelligence data and Blast Gauge data can be used to reconstruct an actual in-theatre blast event. The methodology to reconstruct an event and the "lessons learned" from multiple reconstructions in open space are presented. The analysis uses records of blast pressure at discrete points, and the output is a spatial and temporal blast load distribution for all personnel involved.

  16. Real time validation of GPS TEC precursor mask for Greece

    NASA Astrophysics Data System (ADS)

    Pulinets, Sergey; Davidenko, Dmitry

    2013-04-01

Earlier studies of pre-earthquake ionospheric variations established that, for any specific site, these variations show a definite stability in their temporal behavior within a time interval of a few days before the seismic shock. This self-similarity (characteristic of phenomena observed close to the critical point of a system) permits us to consider these variations a good candidate for a short-term precursor. A physical mechanism for GPS TEC variations before earthquakes has been developed within the framework of the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model. Given the different tectonic structures and different source mechanisms of earthquakes in different regions of the globe, every site has its own individual pre-earthquake behavior, which creates an individual "imprint" on the ionosphere at every given point. It is precisely this so-called "mask" of ionospheric variability before an earthquake at a given point that makes it possible to detect anomalous behavior of the ionospheric electron concentration not only by statistical processing but also by applying pattern recognition techniques, which facilitates the automatic recognition of short-term ionospheric precursors of earthquakes. Such a precursor mask was created using the GPS TEC variations around the times of 9 earthquakes with magnitudes from M6.0 to M6.9 that took place in Greece in the interval 2006-2011. The major anomaly revealed in the relative deviation of the vertical TEC was a positive anomaly appearing at ~04 PM UT one day before the seismic shock and lasting nearly 12 hours, until ~04 AM UT. To validate this approach it was decided to check the mask in real-time monitoring of earthquakes in Greece with magnitude greater than 4.5, starting from 1 December 2012.
During this period (until 9 January 2013), 4 seismic shocks were registered, including the largest one, M5.7, on 8 January. For all of them the mask confirmed its validity, and the 6 December event was predicted in advance.

  17. LMJ Points Plus v2.6

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kertesz, Vilmos

Short summary of the software's functionality: • built-in scan feature to acquire an optical image of the surface to be analyzed • click-and-point selection of points of interest on the surface • support for standalone autosampler/HPLC/MS operation: creating independent batch files, after points of interest are selected, for LEAPShell (autosampler control software from Leap Technologies) and Analyst® (mass spectrometry (MS) software from AB Sciex) • support for integrated autosampler/HPLC/MS operation: creating one batch file for all instruments controlled by Analyst® after points of interest are selected • creating heatmaps of analytes of interest from the collected MS files in a hands-off fashion

  18. The Compressed Baryonic Matter experiment at FAIR

    NASA Astrophysics Data System (ADS)

    Höhne, Claudia

    2018-02-01

The CBM experiment will investigate highly compressed baryonic matter created in A+A collisions at the new FAIR research center. With a beam energy range up to 11 AGeV for the heaviest nuclei at the SIS 100 accelerator, CBM will investigate the QCD phase diagram in the intermediate range, i.e. at moderate temperatures but high net-baryon densities. This intermediate range of the QCD phase diagram is of particular interest, because a first-order phase transition ending in a critical point and possibly new high-density phases of strongly interacting matter are expected. In this range of the QCD phase diagram only exploratory measurements have been performed so far. CBM, as a next-generation, high-luminosity experiment, will substantially improve our knowledge of matter created in this region of the QCD phase diagram and characterize its properties by measuring rare probes such as multi-strange hyperons, dileptons or charm, but also event-by-event fluctuations of conserved quantities and the collective flow of identified particles. The experimental preparations, with special focus on hadronic observables and strangeness, are presented in terms of detector development, feasibility studies and fast track reconstruction. Preparations are progressing well, such that CBM will be ready at FAIR start. As some of the detectors will be ready earlier, they will be used as upgrades or extensions of already running experiments, allowing for a rich physics program prior to FAIR start.

  19. Predicting Long-term Ischemic Events Using Routine Clinical Parameters in Patients with Coronary Artery Disease: The OPT-CAD Risk Score.

    PubMed

    Han, Yaling; Chen, Jiyan; Qiu, Miaohan; Li, Yi; Li, Jing; Feng, Yingqing; Qiu, Jian; Meng, Liang; Sun, Yihong; Tao, Guizhou; Wu, Zhaohui; Yang, Chunyu; Guo, Jincheng; Pu, Kui; Chen, Shaoliang; Wang, Xiaozeng

    2018-06-05

The prognosis of patients with coronary artery disease (CAD) at hospital discharge varies considerably, and the post-discharge risk of ischemic events remains a concern. However, risk prediction tools to identify the risk of ischemia for these patients have not yet been reported. We sought to develop a scoring system for predicting long-term ischemic events in CAD patients receiving antiplatelet therapy that would support appropriate personalized decision-making for these patients. In this prospective Optimal antiPlatelet Therapy for Chinese patients with Coronary Artery Disease (OPT-CAD, NCT01735305) registry, a total of 14,032 patients with CAD receiving at least one antiplatelet agent were enrolled from 107 centers across China from January 2012 to March 2014. The risk scoring system was developed in a derivation cohort (the initial 10,000 patients enrolled in the database) using a logistic regression model and was subsequently tested in a validation cohort (the last 4,032 patients). Points in the risk score were assigned based on the multivariable odds ratio of each factor. Ischemic events were defined as the composite of cardiac death, myocardial infarction, or stroke. Ischemic events occurred in 342 (3.4%) patients in the derivation cohort and 160 (4.0%) patients in the validation cohort during 1-year follow-up. The OPT-CAD score, ranging from 0 to 257 points, consists of 10 independent risk factors: age (0-71 points), heart rate (0-36 points), hypertension (0-20 points), prior myocardial infarction (16 points), prior stroke (16 points), renal insufficiency (21 points), anemia (19 points), low ejection fraction (22 points), positive cardiac troponin (23 points), and ST-segment deviation (13 points). In predicting 1-year ischemic events, the areas under the receiver operating characteristic curve were 0.73 and 0.72 in the derivation and validation cohorts, respectively.
The incidences of ischemic events in low- (0-90 points), medium- (91-150 points), and high-risk (≥151 points) patients were 1.6%, 5.5%, and 15.0%, respectively. Compared to the GRACE score, the OPT-CAD score showed better discrimination in predicting ischemic events and all-cause mortality (ischemic events: 0.72 vs 0.65; all-cause mortality: 0.79 vs 0.72; both P<0.001). Among CAD patients, a risk score based on 10 baseline clinical variables performed better than the GRACE risk score in predicting long-term ischemic events. However, further research is needed to assess the value of the OPT-CAD score in guiding the management of antiplatelet therapy for patients with CAD. This article is protected by copyright. All rights reserved.
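The risk strata quoted in the abstract can be expressed as a small helper (a sketch based only on the cut-offs and observed event rates reported above; it does not reproduce the study's scoring of the 10 individual factors, and the function name is illustrative):

```python
def opt_cad_risk(total_points):
    """Map an OPT-CAD total score (0-257) to the reported risk strata:
    low (0-90), medium (91-150), high (>=151), returning the stratum
    name and the observed 1-year ischemic event rate for that stratum.
    """
    if not 0 <= total_points <= 257:
        raise ValueError("score out of range")
    if total_points <= 90:
        return ("low", 0.016)      # 1.6% observed event rate
    if total_points <= 150:
        return ("medium", 0.055)   # 5.5%
    return ("high", 0.150)         # 15.0%
```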

  20. Dynamic Creation of Social Networks for Syndromic Surveillance Using Information Fusion

    NASA Astrophysics Data System (ADS)

    Holsopple, Jared; Yang, Shanchieh; Sudit, Moises; Stotz, Adam

To enhance the effectiveness of health care, many medical institutions have started transitioning to electronic health and medical records and sharing these records between institutions. The large amount of complex and diverse data makes it difficult to identify and track relationships and trends, such as disease outbreaks, from the data points. INFERD (Information Fusion Engine for Real-Time Decision-Making) is an information fusion tool that dynamically correlates and tracks event progressions. This paper presents a methodology that utilizes the efficient and flexible structure of INFERD to create social networks representing progressions of disease outbreaks. Individual symptoms are treated as features, allowing multiple hypotheses to be tracked and analyzed for effective and comprehensive syndromic surveillance.

  1. Integrating Low-Cost Mems Accelerometer Mini-Arrays (mama) in Earthquake Early Warning Systems

    NASA Astrophysics Data System (ADS)

    Nof, R. N.; Chung, A. I.; Rademacher, H.; Allen, R. M.

    2016-12-01

Current operational Earthquake Early Warning Systems (EEWS) acquire data with networks of single seismic stations and compute source parameters assuming earthquakes to be point sources. For large events, the point-source assumption leads to an underestimation of magnitude, and the use of single stations leads to large uncertainties in the locations of events outside the network. We propose the use of mini-arrays to improve EEWS. Mini-arrays have the potential to: (a) estimate reliable hypocentral locations by beamforming (FK-analysis) techniques; (b) characterize the rupture dimensions and account for finite-source effects, leading to more reliable estimates for large magnitudes. Previously, the high price of multiple seismometers has made creating arrays cost-prohibitive. However, we propose setting up mini-arrays of a new seismometer based on a low-cost (<$150), high-performance MEMS accelerometer around conventional seismic stations. The expected benefits of such an approach include decreasing alert times, improving real-time shaking predictions and mitigating false alarms. We use low-resolution 14-bit Quake Catcher Network (QCN) data collected during the Rapid Aftershock Mobilization Program (RAMP) in Christchurch, NZ following the M7.1 Darfield earthquake in September 2010. As the QCN network was so dense, we were able to use small sub-arrays of up to ten sensors spread over a maximum area of 1.7 x 2.2 km² to demonstrate our approach and to solve for the back-azimuth (BAZ) of two events (Mw4.7 and Mw5.1) with less than ±10° error. We will also present the new 24-bit device details, benchmarks, and real-time measurements.

  2. Sky Fest: A Model of Successful Scientist Participation in E/PO

    NASA Astrophysics Data System (ADS)

    Dalton, H.; Shipp, S. S.; Shaner, A. J.; LaConte, K.; Shupla, C. B.

    2014-12-01

    Participation in outreach events is an easy way for scientists to get involved with E/PO and reach many people with minimal time commitment. At the Lunar and Planetary Institute (LPI) in Houston, Texas, the E/PO team holds Sky Fest outreach events several times a year. These events each have a science content theme and include several activities for children and their parents, night sky viewing through telescopes, and scientist presentations. LPI scientists have the opportunity to participate in Sky Fest events either by helping lead an activity or by giving the scientist presentation (a short lecture and/or demonstration). Scientists are involved in at least one preparation meeting before the event. This allows them to ask questions, understand what activity they will be leading, and learn the key points that they should be sharing with the public, as well as techniques for effectively teaching members of the public about the event topic. During the event, each activity is run by one E/PO specialist and one scientist, enabling the scientist to learn about effective E/PO practices from the E/PO specialist and the E/PO specialist to get more science information about the event topic. E/PO specialists working together with scientists at stations provides a more complete, richer experience for event participants. Surveys of event participants have shown that interacting one-on-one with scientists is often one of their favorite parts of the events. Interviews with scientists indicated that they enjoyed Sky Fest because there was very little time involved on their parts outside of the actual event; the activities were created and/or chosen by the E/PO professionals, and setup for the events was completed before they arrived. They also enjoyed presenting their topic to people without a background in science, and who would not have otherwise sought out the information that was presented.

  3. Triggering Mechanism for Neutron Induced Single-Event Burnout in Power Devices

    NASA Astrophysics Data System (ADS)

    Shoji, Tomoyuki; Nishida, Shuichi; Hamada, Kimimori

    2013-04-01

    Cosmic ray neutrons can trigger catastrophic failures in power devices. It has been reported that parasitic transistor action causes single-event burnout (SEB) in power metal-oxide-semiconductor field-effect transistors (MOSFETs) and insulated gate bipolar transistors (IGBTs). However, power diodes do not have an inherent parasitic transistor. In this paper, we describe for the first time, using transient device simulation, the mechanism triggering SEB in power diodes. Initially, electron-hole pairs created by incident recoil ions generate a transient current, which increases the electron density in the vicinity of the n-/n+ boundary. The space charge effect of these carriers increases the strength of the electric field at the n-/n+ boundary. Finally, the onset of impact ionization at the n-/n+ boundary can trigger SEB. Furthermore, this failure is closely related to diode secondary breakdown. It was clarified that impact ionization at the n-/n+ boundary is the key point of the mechanism triggering SEB in power devices.

  4. The dynamics of evolving beliefs, concerns, emotions, and behavioral avoidance following 9/11: a longitudinal analysis of representative archival samples.

    PubMed

    McArdle, Shelly C; Rosoff, Heather; John, Richard S

    2012-04-01

    September 11 created a natural experiment that enables us to track the psychological effects of a large-scale terror event over time. The archival data came from 8,070 participants of 10 ABC and CBS News polls collected from September 2001 until September 2006. Six questions investigated emotional, behavioral, and cognitive responses to the events of September 11 over a five-year period. We found that heightened responses after September 11 dissipated and reached a plateau at various points in time over a five-year period. We also found that emotional, cognitive, and behavioral reactions were moderated by age, sex, political affiliation, and proximity to the attack. Both emotional and behavioral responses returned to a normal state after one year, whereas cognitively-based perceptions of risk were still diminishing as late as September 2006. These results provide insight into how individuals will perceive and respond to future similar attacks. © 2012 Society for Risk Analysis.

  5. Modeling hard clinical end-point data in economic analyses.

    PubMed

    Kansal, Anuraag R; Zheng, Ying; Palencia, Roberto; Ruffolo, Antonio; Hass, Bastian; Sorensen, Sonja V

    2013-11-01

    The availability of hard clinical end-point data, such as that on cardiovascular (CV) events among patients with type 2 diabetes mellitus, is increasing, and as a result there is growing interest in using hard end-point data of this type in economic analyses. This study investigated published approaches for modeling hard end-points from clinical trials and evaluated their applicability in health economic models with different disease features. A review of cost-effectiveness models of interventions in clinically significant therapeutic areas (CV diseases, cancer, and chronic lower respiratory diseases) was conducted in PubMed and Embase using a defined search strategy. Only studies integrating hard end-point data from randomized clinical trials were considered. For each study included, clinical input characteristics and modeling approach were summarized and evaluated. A total of 33 articles (23 CV, eight cancer, two respiratory) were accepted for detailed analysis. Decision trees, Markov models, discrete event simulations, and hybrids were used. Event rates were incorporated either as constant rates, time-dependent risks, or risk equations based on patient characteristics. Risks dependent on time and/or patient characteristics were used where major event rates were >1%/year in models with fewer health states (<7). Models of infrequent events, or models with numerous health states, generally used constant event rates. The detailed modeling information and terminology varied, sometimes requiring interpretation. Key considerations for cost-effectiveness models incorporating hard end-point data include the frequency and characteristics of the relevant clinical events and how the trial data are reported. When event risk is low, simplification of both the model structure and event rate modeling is recommended. When events are common, such as in high-risk populations, more detailed modeling approaches, including individual simulations or explicitly time-dependent event rates, are more appropriate to reflect the trial data accurately.
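    The distinction drawn above between constant event rates and time-dependent risks can be sketched as follows (a minimal illustration; the function names, the 2%/year rate, and the linear hazard are hypothetical, not taken from the reviewed models). A constant annual rate is converted to a per-cycle transition probability under the usual exponential (constant-hazard) assumption, while a time-dependent risk lets the hazard vary with time since baseline:

```python
import math

def cycle_probability(annual_rate, cycle_years=1.0):
    """Per-cycle event probability from a constant annual event rate,
    assuming a constant hazard (exponential waiting times)."""
    return 1.0 - math.exp(-annual_rate * cycle_years)

def time_dependent_probability(hazard, t, cycle_years=1.0):
    """Per-cycle probability when the hazard varies with time since baseline.
    `hazard` is any function returning an instantaneous annual rate; the
    integral over the cycle is approximated with the midpoint rule."""
    mid = t + cycle_years / 2.0
    return 1.0 - math.exp(-hazard(mid) * cycle_years)

# Constant 2%/year event rate over a 1-year Markov cycle:
p_const = cycle_probability(0.02)

# Hypothetical increasing hazard: 2%/year at baseline, +0.1%/year thereafter,
# evaluated for the cycle starting at year 5:
p_t5 = time_dependent_probability(lambda t: 0.02 + 0.001 * t, t=5.0)
```

    A rate-to-probability conversion of this kind is a common building block in Markov cost-effectiveness models; the choice between the two forms mirrors the trade-off the abstract describes.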

  6. Superposed ruptile deformational events revealed by field and VOM structural analysis

    NASA Astrophysics Data System (ADS)

    Kumaira, Sissa; Guadagnin, Felipe; Keller Lautert, Maiara

    2017-04-01

    Virtual outcrop models (VOM) are becoming an important tool in the analysis of geological structures because they make it possible to obtain the geometry, and in some cases kinematic aspects, of the analyzed structures in a three-dimensional photorealistic space. These data are used to gain quantitative information on deformational features which, coupled with numeric models, can assist in understanding deformational processes. Old basement units commonly register superposed deformational events, either ductile or ruptile, along their evolution. The Porongos Belt, located in southern Brazil, has a complex deformational history, registering at least five ductile and ruptile deformational events. In this study, we present a structural analysis of a quarry in the Porongos Belt, coupling field and VOM structural information to understand the processes involved in the last two deformational events. Field information was acquired using traditional structural methods for the analysis of ruptile structures, such as descriptions, drawings, acquisition of orientation vectors, and kinematic analysis. The VOM was created with the image-based modeling method, through photogrammetric data acquisition and orthorectification. Photographs were acquired using a Sony a3500 camera; a total of 128 photographs were taken from ca. 10-20 m from the outcrop in different orientations. Thirty-two control point coordinates were acquired using a combination of RTK dGPS surveying and total station work, providing a precision of a few millimeters in x, y, and z. Photographs were imported into the PhotoScan software to create a 3D dense point cloud with a structure-from-motion algorithm, which was triangulated and textured to generate the VOM. The VOM was georeferenced (oriented and scaled) using the ground control points, and later analyzed in the OpenPlot software to extract structural information. Data were imported into the Win-Tensor software to obtain tensor orientations, and into the Move software to process and interpret geometrical and kinematic data. Planar and linear structural orientations and kinematic indicators revealed the superposition of three deformational events: i) compressive, ii) transtensional, and iii) extensional paleostress regimes. The compressive regime was related to radial to pure compression with a N-S horizontal maximum compression vector. This stress regime corresponds mainly to the development of dextral tension fractures and NE-SW reverse faults. The transtensional regime has NW-SE sub-horizontal extension, NE-SW horizontal compression, and sub-vertical intermediate tensors, generating mainly shear fractures by reactivation of the metamorphic foliation (anisotropy), NE-SW reverse faults, and NE-vertical veins and gashes. The extensional regime, of strike-slip type, presents NE-SW sub-horizontal extension and a NW-SE-trending sub-vertical maximum compression vector. Structures related to this regime are sub-vertical tension gashes, conjugate fractures, and NW-SE normal faults. Cross-cutting relations show that compression was followed by transtension, which reactivated the ductile foliation, and in the last stage extension dominated. The most important findings show that: i) local stress fields can modify the expected geometry, and ii) anisotropy developed by previous structures controls the nucleation of new fractures and reactivations. The use of field data integrated in a VOM has great potential for analogues of structured reservoirs.

  7. Market-based control mechanisms for patient safety

    PubMed Central

    Coiera, E; Braithwaite, J

    2009-01-01

    A new model is proposed for enhancing patient safety using market-based control (MBC), inspired by successful approaches to environmental governance. Emissions trading, enshrined in the Kyoto protocol, set a carbon price and created a carbon market—is it possible to set a patient safety price and let the marketplace find ways of reducing clinically adverse events? To “cap and trade,” a regulator would need to establish system-wide and organisation-specific targets, based on the cost of adverse events, create a safety market for trading safety credits and then police the market. Organisations are given a clear policy signal to reduce adverse event rates, are told by how much, but are free to find mechanisms best suited to their local needs. The market would inevitably generate novel ways of creating safety credits, and accountability becomes hard to evade when adverse events are explicitly measured and accounted for in an organisation’s bottom line. PMID:19342522

  8. A new method for estimating the probable maximum hail loss of a building portfolio based on hailfall intensity determined by radar measurements

    NASA Astrophysics Data System (ADS)

    Aller, D.; Hohl, R.; Mair, F.; Schiesser, H.-H.

    2003-04-01

    Extreme hailfall can cause massive damage to building structures. For the insurance and reinsurance industry it is essential to estimate the probable maximum hail loss of their portfolio. The probable maximum loss (PML) is usually defined with a return period of 1 in 250 years. Statistical extrapolation has a number of critical points, as historical hail loss data are usually only available from some events while insurance portfolios change over the years. At the moment, footprints are derived from historical hail damage data. These footprints (mean damage patterns) are then moved over a portfolio of interest to create scenario losses. However, damage patterns of past events are based on the specific portfolio that was damaged during that event and can be considerably different from the current spread of risks. A new method for estimating the probable maximum hail loss to a building portfolio is presented. It is shown that footprints derived from historical damage are different from footprints of hail kinetic energy calculated from radar reflectivity measurements. Based on the relationship between radar-derived hail kinetic energy and hail damage to buildings, scenario losses can be calculated. A systematic motion of the hail kinetic energy footprints over the underlying portfolio creates a loss set. It is difficult to estimate the return periods of losses calculated by moving footprints derived from historical damage over a portfolio. To determine the return periods of the hail kinetic energy footprints over Switzerland, 15 years of radar measurements and 53 years of agricultural hail losses are available. Based on these data, return periods of several types of hailstorms were derived for different regions in Switzerland. The loss set is combined with the return periods of the event set to obtain an exceedance frequency curve, which can be used to derive the PML.
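    The final step described above, combining a loss set with event return periods into an exceedance frequency curve and reading off the PML, can be sketched as follows (a minimal illustration; the function names and the toy loss/frequency numbers are assumptions, not values from the study). Each scenario loss carries the annual frequency of the footprint that produced it:

```python
def exceedance_curve(events):
    """events: list of (loss, annual_frequency) pairs from the scenario set.
    Returns (loss, cumulative annual frequency of equal-or-greater losses),
    sorted by descending loss."""
    pts = sorted(events, key=lambda e: -e[0])
    out, cum = [], 0.0
    for loss, freq in pts:
        cum += freq
        out.append((loss, cum))
    return out

def pml(events, return_period=250.0):
    """Largest loss whose exceedance frequency reaches 1/return_period."""
    target = 1.0 / return_period
    curve = exceedance_curve(events)
    # Walk from the largest loss downward until the target frequency is met.
    for loss, freq in curve:
        if freq >= target:
            return loss
    return curve[-1][0] if curve else 0.0
```

    With a toy loss set such as `[(500.0, 0.002), (200.0, 0.01), (50.0, 0.1)]` (losses in arbitrary monetary units), the 1-in-250-year PML is the largest loss whose cumulative frequency reaches 1/250 = 0.004 per year.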

  9. The Record Los Angeles Heat Event of September 2010: 1. Synoptic-Scale-Meso-β-Scale Analyses of Interactive Planetary Wave Breaking, Terrain- and Coastal-Induced Circulations

    NASA Astrophysics Data System (ADS)

    Kaplan, Michael L.; Tilley, Jeffrey S.; Hatchett, Benjamin J.; Smith, Craig M.; Walston, Joshua M.; Shourd, Kacie N.; Lewis, John M.

    2017-10-01

    On 27 September 2010 the Los Angeles Civic Center reached its all-time record maximum temperature of 45°C before 1330 local daylight time with several other regional stations observing all-time record breaking heat early in that afternoon. This record event is associated with a general circulation pattern predisposed to hemispheric wave breaking. Three days before the event, wave breaking organizes complex terrain- and coastal-induced processes that lead to isentropic surface folding into the Los Angeles Basin. The first wave break occurs over the western two thirds of North America leading to trough elongation across the southwestern U.S. Collocated with this trough is an isentropic potential vorticity filament that is the locus of a thermally indirect circulation central to warming and associated thickness increases and ridging westward across the Great Basin. In response to this circulation, two subsynoptic wave breaks are triggered along the Pacific coast. The isentropic potential vorticity filament is coupled to the breaking waves and the interaction produces a subsynoptic low-pressure center and a deep vortex aloft over the southeastern California desert. This coupling leads to advection of an elevated mixed layer over Point Conception the night before the record-breaking heat that creates a coastally trapped low-pressure area southwest of Los Angeles. The two low-pressure centers create a low-level pressure gradient and east-southeasterly jet directed offshore over the Los Angeles Basin by sunrise on 27 September. This allows the advection of low-level warm air from the inland terrain toward the coastally trapped disturbance and descending circulation resulting in record heating.

  10. Rapid Gorge Formation in an Artificially Created Waterfall

    NASA Astrophysics Data System (ADS)

    Anton, L.; Mather, A. E.; Stokes, M.; Munoz Martin, A.

    2014-12-01

    A number of studies have examined rates of gorge formation, knickpoint retreat, and the controls on those rates via bedrock erodibility, the effectiveness of bedrock erosion mechanisms, and the role of hillslope processes. Most findings are based on conceptual/empirical models or long-term landscape analysis, but studies of recent quantifiable events are scarce yet highly valuable. Here we present expert eyewitness accounts and a quantitative survey of large and rapid fluvial erosion events that occurred over an artificially created waterfall at a spillway mouth. In 6 years a ~270 m long, ~100 m deep and ~100 to 160 m wide canyon was carved, and ~1.58 x 10^6 m3 of granite bedrock was removed from the spillway site. Available flow data indicate that the erosion took place under unremarkable flood discharge conditions. The analysis of historic topographic maps enables the reconstruction of the former topography and successive erosion events, enabling the quantification of bedrock erosion amounts and rates. Analysis of bedrock erodibility and discontinuity patterns demonstrates that the bedrock is mechanically strong, and that similar rock strength and fracture patterns are found throughout the region. It is apparent that structural pre-conditioning, through fracture density and orientation in relation to flow and slope direction, is of paramount importance in the gorge development. The presented example provides an exceptional opportunity for studying the evolution of a bedrock canyon and for precisely measuring the rate of bedrock channel erosion over a six-year period. Results illustrate the highly episodic nature of the erosion and highlight several key observations on the adjustability of bedrock rivers. The observations have implications for the efficiency of bedrock erosion and raise important questions about incision rates, driving mechanisms, and timescale assumptions in models of landscape change.

  11. Using NLM exhibits and events to engage library users and reach the community.

    PubMed

    Auten, Beth; Norton, Hannah F; Tennant, Michele R; Edwards, Mary E; Stoyan-Rosenzweig, Nina; Daley, Matthew

    2013-01-01

    In an effort to reach out to library users and make the library a more relevant, welcoming place, the University of Florida's Health Science Center Library hosted exhibits from the National Library of Medicine's (NLM) Traveling Exhibition Program. From 2010 through 2012, the library hosted four NLM exhibits and created event series for each. Through reflection and use of a participant survey, lessons were learned concerning creating relevant programs, marketing events, and forming new partnerships. Each successive exhibit added events and activities to address different audiences. A survey of libraries that have hosted NLM exhibits highlights lessons learned at those institutions.

  12. Using NLM Exhibits and Events to Engage Library Users and Reach the Community

    PubMed Central

    Auten, Beth; Norton, Hannah F.; Tennant, Michele R.; Edwards, Mary E.; Stoyan-Rosenzweig, Nina; Daley, Matthew

    2013-01-01

    In an effort to reach out to library users and make the library a more relevant, welcoming place, the University of Florida’s Health Science Center Library hosted exhibits from the National Library of Medicine’s (NLM) Traveling Exhibition Program. From 2010 through 2012, the library hosted four NLM exhibits and created event series for each. Through reflection and use of a participant survey, lessons were learned concerning creating relevant programs, marketing events, and forming new partnerships. Each successive exhibit added events and activities to address different audiences. A survey of libraries that have hosted NLM exhibits highlights lessons learned at those institutions. PMID:23869634

  13. Creating Non-Believed Memories for Recent Autobiographical Events

    PubMed Central

    Clark, Andrew; Nash, Robert A.; Fincham, Gabrielle; Mazzoni, Giuliana

    2012-01-01

    A recent study showed that many people spontaneously report vivid memories of events that they do not believe to have occurred [1]. In the present experiment we tested for the first time whether, after powerful false memories have been created, debriefing might leave behind nonbelieved memories for the fake events. In Session 1 participants imitated simple actions, and in Session 2 they saw doctored video-recordings containing clips that falsely suggested they had performed additional (fake) actions. As in earlier studies, this procedure created powerful false memories. In Session 3, participants were debriefed and told that specific actions in the video were not truly performed. Beliefs and memories for all critical actions were tested before and after the debriefing. Results showed that debriefing undermined participants' beliefs in fake actions, but left behind residual memory-like content. These results indicate that debriefing can leave behind vivid false memories which are no longer believed, and thus we demonstrate for the first time that the memory of an event can be experimentally dissociated from the belief in the event's occurrence. These results also confirm that belief in and memory for an event can be independently-occurring constructs. PMID:22427927

  14. Photoelectric effect from observer's mathematics point of view

    NASA Astrophysics Data System (ADS)

    Khots, Boris; Khots, Dmitriy

    2014-12-01

    When we consider and analyze physical events with the purpose of creating corresponding models we often assume that the mathematical apparatus used in modeling is infallible. In particular, this relates to the use of infinity in various aspects and the use of Newton's definition of a limit in analysis. We believe that is where the main problem lies in contemporary study of nature. This work considers Physical aspects in a setting of arithmetic, algebra, geometry, analysis, topology provided by Observer's Mathematics (see www.mathrelativity.com). Certain results and communications pertaining to solution of these problems are provided. In particular, we prove the following Theorems, which give Observer's Mathematics point of view on Einstein photoelectric effect theory and Lamb-Scully and Hanbury-Brown-Twiss experiments: Theorem 1. There are some values of light intensity where anticorrelation parameter A ∈ [0,1). Theorem 2. There are some values of light intensity where anticorrelation parameter A = 1. Theorem 3. There are some values of light intensity where anticorrelation parameter A > 1.

  15. Adapting End Host Congestion Control for Mobility

    NASA Technical Reports Server (NTRS)

    Eddy, Wesley M.; Swami, Yogesh P.

    2005-01-01

    Network layer mobility allows transport protocols to maintain connection state, despite changes in a node's physical location and point of network connectivity. However, some congestion-controlled transport protocols are not designed to deal with these rapid and potentially significant path changes. In this paper we demonstrate several distinct problems that mobility-induced path changes can create for TCP performance. Our premise is that mobility events indicate path changes that require re-initialization of congestion control state at both connection end points. We present the application of this idea to TCP in the form of a simple solution (the Lightweight Mobility Detection and Response algorithm, which has been proposed in the IETF), and examine its effectiveness. In general, we find that the deficiencies presented are both relatively easily and painlessly fixed using this solution. We also find that this solution has the counter-intuitive property of being both more friendly to competing traffic, and simultaneously more aggressive in utilizing newly available capacity, than unmodified TCP.

  16. Interactions between Point Bar Growth and Bank Erosion on a Low Sinuosity Meander Bend in an Ephemeral Channel: Insights from Repeat Topographic Surveys and Numerical Modeling

    NASA Astrophysics Data System (ADS)

    Ursic, M.; Langendoen, E. J.

    2017-12-01

    Interactions between point bar growth, bank migration, and hydraulics on meandering rivers are complicated and not well understood. For ephemeral streams, rapid fluctuations in flow further complicate studying and understanding these interactions. This study seeks to answer the following 'cause-and-effect' question: Does point bar morphologic adjustment determine where bank erosion occurs (for example, through topographic steering of the flow), or does local bank retreat determine where accretion/erosion occurs on the point bar, or do bank erosion and point bar morphologic adjustment co-evolve? Further, is there a response time between the 'cause-and-effect' processes, and what variables determine its magnitude and duration? In an effort to answer these questions for an ephemeral stream, a dataset of forty-eight repeat topographic surveys over a ten-year period (1996-2006) of a low sinuosity bend within the Goodwin Creek Experimental Watershed, located near Batesville, MS, was utilized in conjunction with continuous discharge measurements to correlate flow variability with erosional and depositional zones, spatially and temporally. Hydraulically, the bend is located immediately downstream of a confluence with a major tributary. Supercritical flumes on both the primary and tributary channels just upstream of the confluence provide continuous measured discharges to the bend over the survey period. In addition, water surface elevations were continuously measured at the upstream and downstream ends of the bend. No spatial correlation trends could be discerned between reach-scale bank retreat, point bar morphologic adjustment, and flow discharge. Because detailed flow patterns were not available, the two-dimensional computer model Telemac2D was used to provide these details. The model was calibrated and validated for a set of runoff events for which more detailed flow data were available. Telemac2D simulations were created for each topographic survey period. Flows greater than baseflow were combined to create contiguous hydrographs for each survey period. Statistical examination of local flow variability and morphological changes throughout the bend will be conducted and presented.

  17. High speed point derivative microseismic detector

    DOEpatents

    Uhl, J.E.; Warpinski, N.R.; Whetten, E.B.

    1998-06-30

    A high speed microseismic event detector constructed in accordance with the present invention uses a point derivative comb to quickly and accurately detect microseismic events. Compressional and shear waves impinging upon microseismic receiver stations disposed to collect waves are converted into digital data and analyzed using a point derivative comb including assurance of quiet periods prior to declaration of microseismic events. If a sufficient number of quiet periods have passed, the square of a two point derivative of the incoming digital signal is compared to a trip level threshold exceeding the determined noise level to declare a valid trial event. The squaring of the derivative emphasizes the differences between noise and signal, and the valid event is preferably declared when the trip threshold has been exceeded over a temporal comb width to realize a comb over a given time period. Once a trial event has been declared, the event is verified through a spatial comb, which applies the temporal event comb to additional stations. The detector according to the present invention quickly and accurately detects initial compressional waves indicative of a microseismic event which typically exceed the ambient cultural noise level by a small amount, and distinguishes the waves from subsequent larger amplitude shear waves. 9 figs.
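    A minimal sketch of the temporal part of the detection logic described above (the spatial comb across additional stations is omitted; all parameter names and default values are assumptions, not taken from the patent). The squared two-point derivative is compared against a trip level, a trigger is considered only after a run of quiet samples, and the trip level must be exceeded across the whole comb width before a trial event is declared:

```python
def detect_event(samples, trip_level, quiet_len=50, quiet_level=None, comb_width=5):
    """Return the index (in derivative space) at which a trial event is
    declared, or None. `samples` is the digitized receiver signal."""
    if quiet_level is None:
        quiet_level = trip_level / 10.0  # assumed noise threshold
    # Squared two-point derivative emphasizes signal over noise.
    d2 = [(samples[i + 1] - samples[i]) ** 2 for i in range(len(samples) - 1)]
    quiet = 0
    i = 0
    while i < len(d2):
        if d2[i] < quiet_level:
            quiet += 1
            i += 1
            continue
        # Only consider a trigger after enough quiet samples have passed.
        if quiet >= quiet_len and d2[i] >= trip_level:
            window = d2[i:i + comb_width]
            # The comb: the trip level must be exceeded over the full width.
            if len(window) == comb_width and all(v >= trip_level for v in window):
                return i
        quiet = 0
        i += 1
    return None
```

    In the patented detector the declared trial event would then be verified against other receiver stations (the spatial comb) before being accepted.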

  18. High speed point derivative microseismic detector

    DOEpatents

    Uhl, James Eugene; Warpinski, Norman Raymond; Whetten, Ernest Blayne

    1998-01-01

    A high speed microseismic event detector constructed in accordance with the present invention uses a point derivative comb to quickly and accurately detect microseismic events. Compressional and shear waves impinging upon microseismic receiver stations disposed to collect waves are converted into digital data and analyzed using a point derivative comb including assurance of quiet periods prior to declaration of microseismic events. If a sufficient number of quiet periods have passed, the square of a two point derivative of the incoming digital signal is compared to a trip level threshold exceeding the determined noise level to declare a valid trial event. The squaring of the derivative emphasizes the differences between noise and signal, and the valid event is preferably declared when the trip threshold has been exceeded over a temporal comb width to realize a comb over a given time period. Once a trial event has been declared, the event is verified through a spatial comb, which applies the temporal event comb to additional stations. The detector according to the present invention quickly and accurately detects initial compressional waves indicative of a microseismic event which typically exceed the ambient cultural noise level by a small amount, and distinguishes the waves from subsequent larger amplitude shear waves.

  19. Employee reactions and adjustment to euthanasia-related work: identifying turning-point events through retrospective narratives.

    PubMed

    Reeve, Charlie L; Spitzmuller, Christiane; Rogelberg, Steven G; Walker, Alan; Schultz, Lisa; Clark, Olga

    2004-01-01

    This study used a retrospective narrative procedure to examine the critical events that influence reactions and adjustment to euthanasia-related work of 35 employees who have stayed in the animal care and welfare field for at least 2 years. The study analyzed adjustment trajectory graphs and interview notes to identify turning-point events that spurred either a positive or negative change in shelter workers' psychological well-being. Analysis of the identified turning-point events revealed 10 common event themes that have implications for a range of work, personnel, and organizational practices. The article discusses implications for shelter, employee, and animal welfare.

  20. EPA Office of Water (OW): 303(d) Listed Impaired Waters NHDPlus Indexed Dataset

    EPA Pesticide Factsheets

    The 303(d) Listed Impaired Waters program system provides impaired water data and impaired water features reflecting river segments, lakes, and estuaries designated under Section 303(d) of the Clean Water Act. Each State will establish Total Maximum Daily Loads (TMDLs) for these waters. Note that the CWA Section 303(d) list of impaired waters does not represent waters that are impaired but have an EPA-approved TMDL established, impaired waters for which other pollution control mechanisms are in place and expected to attain water quality standards, or waters impaired as a result of pollution that is not caused by a pollutant. Therefore, the Impaired Waters layers do not represent all impaired waters reported in a state's Integrated Report, but only the waters on a state's approved 303(d) list. For more information regarding impaired waters refer to EPA's Integrated Reporting Guidance at: http://water.epa.gov/lawsregs/lawsguidance/cwa/tmdl/guidance.cfm. 303(d) waterbodies are coded onto NHDPlus v2.1 flowline and waterbody features to create line, area, and point events. In addition to NHDPlus reach indexed data there may also be custom event data (point, line, or polygon) that are not associated with NHDPlus and are in an EPA standard format that is compatible with EPA's Reach Address Database. These custom features are used to represent locations of 303(d) waterbodies that are not represented well in NHDPlus.

  1. History of Late-Notice HIEs

    NASA Technical Reports Server (NTRS)

    Frakes, P.

    2016-01-01

    A question was raised: are we seeing more late-notice events in recent months? Two definitions of late-notice were used to compare data. Definition 1: the event has at least one data point between TCA-4 days and TCA-2 days where the Pc was below 1E-7, or there were no data points in that timeframe, and the event has at least one data point between TCA-2 days and TCA where the Pc was at least 1E-4. Definition 2: the event has at least one data point between TCA-4 days and TCA-2 days where the Pc was below 1E-5, or there were no data points in that timeframe, and the event has at least one data point between TCA-2 days and TCA where the Pc was at least 1E-4. The case studies that were examined all fall within the criteria for both definitions: Terra vs. 38192, TCA 24 JUN 2015; Aura vs. 89477, TCA 29 AUG 2015; Terra vs. 37131, TCA 19 DEC 2015; GPM vs. 28685, TCA 5 SEP 2015.
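    The two late-notice screening definitions described above can be sketched as a small classifier. This is an illustrative reconstruction only; the data representation (a list of `(days_before_TCA, Pc)` pairs) and function names are assumptions, not part of the original analysis tooling.

```python
# Sketch of the two late-notice definitions: an early window (TCA-4d to TCA-2d)
# that is quiet (Pc below a cap, or no data at all) followed by a late window
# (TCA-2d to TCA) with at least one Pc >= 1e-4.
def is_late_notice(points, early_pc_cap):
    """points: list of (days_before_TCA, Pc); early_pc_cap: 1e-7 (def. 1) or 1e-5 (def. 2)."""
    early = [pc for d, pc in points if 2 <= d <= 4]   # TCA-4d .. TCA-2d
    late = [pc for d, pc in points if 0 <= d < 2]     # TCA-2d .. TCA
    early_quiet = (not early) or any(pc < early_pc_cap for pc in early)
    late_high = any(pc >= 1e-4 for pc in late)
    return early_quiet and late_high

def classify(points):
    """Apply both definitions to one conjunction event's Pc history."""
    return {"def1": is_late_notice(points, 1e-7),
            "def2": is_late_notice(points, 1e-5)}
```

    An event whose early-window Pc sits between the two caps (e.g. 1E-6) is late-notice under definition 2 but not definition 1, which is exactly the distinction the two definitions probe.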

  2. A picture is worth a thousand lies: using false photographs to create false childhood memories.

    PubMed

    Wade, Kimberley A; Garry, Maryanne; Read, J Don; Lindsay, D Stephen

    2002-09-01

    Because image-enhancing technology is readily available, people are frequently exposed to doctored images. However, in prior research on how adults can be led to report false childhood memories, subjects have typically been exposed to personalized and detailed narratives describing false events. Instead, we exposed 20 subjects to a false childhood event via a fake photograph and imagery instructions. Over three interviews, subjects thought about a photograph showing them on a hot air balloon ride and tried to recall the event by using guided-imagery exercises. Fifty percent of the subjects created complete or partial false memories. The results bear on ways in which false memories can be created and also have practical implications for those involved in clinical and legal settings.

  3. X Marks the Spot: Creating and Managing a Single Service Point to Improve Customer Service and Maximize Resources

    ERIC Educational Resources Information Center

    Venner, Mary Ann; Keshmiripour, Seti

    2016-01-01

    This article will describe how merging service points in an academic library is an opportunity to improve customer service and utilize staffing resources more efficiently. Combining service points provides libraries with the ability to create a more positive library experience for patrons by minimizing the ping-pong effect for assistance. The…

  4. Topological events on the lines of circular polarization in nonparaxial vector optical fields.

    PubMed

    Freund, Isaac

    2017-02-01

    In nonparaxial vector optical fields, the following topological events are shown to occur in apparent violation of charge conservation: as one translates the observation plane along a line of circular polarization (a C line), the points on the line (C points) are seen to change not only the signs of their topological charges, but also their handedness, and, at turning points on the line, paired C points with the same topological charge and opposite handedness are seen to nucleate. These counter-intuitive events cannot occur in paraxial fields.

  5. Predicting the possibility of not yet observed situations as higher goal of space environment standards.

    NASA Astrophysics Data System (ADS)

    Nymmik, Rikho

    Space environment models are intended to describe the quantitative behavior of the natural space environment. Usually, they are constructed by generalizing some set of experimental data that is characteristic of the conditions in place during the measurement period. Such models often merely state and postulate realities of the past. A typical example of this point of view is the situation around extreme SEP events. For decades, models of such events have been based on the largest occurrences observed, whose features were measured by instruments with reliability that was not always analyzed. It is obvious that this approach does not agree with reality, because any new extreme event conflicts with it. It follows that space environment models cannot be created from numerical observed data alone when such data change in time or are probabilistic in nature. The model's goal is not only to describe the average environment characteristics but also to predict extreme ones. Such a prediction can only result from analyzing the causes that drive environment change and taking them into account in model parameters. In this report we present an analysis of the radiation environment formed by solar-generated high-energy particles. Progress and failures of SEP event modeling attempts are also shown and analyzed.

  6. Conscious experience and episodic memory: hippocampus at the crossroads.

    PubMed

    Behrendt, Ralf-Peter

    2013-01-01

    If an instance of conscious experience of the seemingly objective world around us could be regarded as a newly formed event memory, much as an instance of mental imagery has the content of a retrieved event memory, and if, therefore, the stream of conscious experience could be seen as evidence for ongoing formation of event memories that are linked into episodic memory sequences, then unitary conscious experience could be defined as a symbolic representation of the pattern of hippocampal neuronal firing that encodes an event memory - a theoretical stance that may shed light into the mind-body and binding problems in consciousness research. Exceedingly detailed symbols that describe patterns of activity rapidly self-organizing, at each cycle of the θ rhythm, in the hippocampus are instances of unitary conscious experience that jointly constitute the stream of consciousness. Integrating object information (derived from the ventral visual stream and orbitofrontal cortex) with contextual emotional information (from the anterior insula) and spatial environmental information (from the dorsal visual stream), the hippocampus rapidly forms event codes that have the informational content of objects embedded in an emotional and spatiotemporally extending context. Event codes, formed in the CA3-dentate network for the purpose of their memorization, are not only contextualized but also allocentric representations, similarly to conscious experiences of events and objects situated in a seemingly objective and observer-independent framework of phenomenal space and time. 
Conscious perception, creating the spatially and temporally extending world that we perceive around us, is likely to be evolutionarily related to more fleeting and seemingly internal forms of conscious experience, such as autobiographical memory recall, mental imagery, including goal anticipation, and to other forms of externalized conscious experience, namely dreaming and hallucinations; and evidence pointing to an important contribution of the hippocampus to these conscious phenomena will be reviewed.

  7. Event Discrimination Using Seismoacoustic Catalog Probabilities

    NASA Astrophysics Data System (ADS)

    Albert, S.; Arrowsmith, S.; Bowman, D.; Downey, N.; Koch, C.

    2017-12-01

    Presented here are three seismoacoustic catalogs from various years and locations throughout Utah and New Mexico. To create these catalogs, we combine seismic and acoustic events detected and located using different algorithms. Seismoacoustic events are formed based on similarity of origin time and location. Following seismoacoustic fusion, the data is compared against ground truth events. Each catalog contains events originating from both natural and anthropogenic sources. By creating these seismoacoustic catalogs, we show that the fusion of seismic and acoustic data leads to a better understanding of the nature of individual events. The probability of an event being a surface blast given its presence in each seismoacoustic catalog is quantified. We use these probabilities to discriminate between events from natural and anthropogenic sources. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.
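    The catalog-probability discrimination described above can be illustrated as a simple frequency estimate against ground truth. This is a hedged sketch, not the authors' method: the data structures and function name are hypothetical, and the real analysis fuses seismic and acoustic detections before this step.

```python
# Estimate P(surface blast | event appears in a seismoacoustic catalog) as the
# fraction of catalog events that match ground-truth surface blasts.
def blast_probability(catalog_events, ground_truth_blasts):
    """catalog_events: iterable of event IDs; ground_truth_blasts: set of blast IDs."""
    events = list(catalog_events)
    if not events:
        return 0.0
    blasts = sum(1 for ev in events if ev in ground_truth_blasts)
    return blasts / len(events)
```

    Comparing this probability across the seismic-only, acoustic-only, and fused catalogs is one way to quantify how much the seismoacoustic fusion sharpens source discrimination.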

  8. 77 FR 51951 - Special Local Regulation for Marine Events; Temporary Change of Dates for Recurring Marine Events...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-28

    ... Events in the Fifth Coast Guard District, Poquoson Seafood Festival Workboat Races, Back River; Poquoson... recurring marine event in the Fifth Coast Guard District. This event is the Poquoson Seafood Festival... city's annual seafood festival. A special local regulation is effective annually to create a safety...

  9. The structural basis for function in diamond-like carbon binding peptides.

    PubMed

    Gabryelczyk, Bartosz; Szilvay, Géza R; Linder, Markus B

    2014-07-29

    The molecular structural basis for the function of specific peptides that bind to diamond-like carbon (DLC) surfaces was investigated. For this, a competition assay that provided a robust way of comparing relative affinities of peptide variants was set up. Point mutations of specific residues resulted in significant effects, but it was shown that the chemical composition of the peptide was not sufficient to explain peptide affinity. More significantly, rearrangements in the sequence indicated that the binding is a complex recognition event that is dependent on the overall structure of the peptide. The work demonstrates the unique properties of peptides for creating functionality at interfaces via noncovalent binding for potential applications in, for example, nanomaterials, biomedical materials, and sensors.

  10. Position sensitive detection of nuclear radiation mediated by non equilibrium phonons at low temperatures

    NASA Astrophysics Data System (ADS)

    Pröbst, F.; Peterreins, Th.; Feilitzsch, F. v.; Kraus, H.

    1990-03-01

    Many experiments in nuclear and particle physics would benefit from the development of a device capable of detecting non-ionizing events with a low energy threshold. In this context, we report on experimental tests of a detector based on the registration of nonequilibrium phonons. The device is composed of a silicon single crystal (size: 20×10×3 mm³) and of an array of superconducting tunnel junctions evaporated onto the surface of the crystal. The junctions serve as sensors for phonons created by absorption of nuclear radiation in the crystal. We show how pulse height analysis and the investigation of time differences between correlated pulses in different junctions can be used to obtain information about the point of absorption.

  11. An Offline-Online Android Application for Hazard Event Mapping Using WebGIS Open Source Technologies

    NASA Astrophysics Data System (ADS)

    Olyazadeh, Roya; Jaboyedoff, Michel; Sudmeier-Rieux, Karen; Derron, Marc-Henri; Devkota, Sanjaya

    2016-04-01

    Nowadays, Free and Open Source Software (FOSS) plays an important role in better understanding and managing disaster risk reduction around the world. National and local governments, NGOs, and other stakeholders are increasingly seeking and producing data on hazards. Most hazard event inventories and land use mapping are based on remote sensing data, with little ground truthing, creating difficulties depending on the terrain and accessibility. Open Source WebGIS tools offer an opportunity for quicker and easier ground truthing of critical areas in order to analyse hazard patterns and triggering factors. This study presents a secure mobile-map application for hazard event mapping using Open Source WebGIS technologies such as the Postgres database, PostGIS, Leaflet, Cordova, and PhoneGap. The objectives of this prototype are: 1. an offline-online Android mobile application with advanced geospatial visualisation; 2. easy collection and storage of hazard event information; 3. centralized data storage accessible by all services (smartphone, standard web browser); 4. improved data management through active participation in hazard event mapping and storage. This application has been implemented as a low-cost, rapid, and participatory method for recording impacts from hazard events and includes geolocation (GPS data and Internet), visualizing maps with overlays of satellite images, viewing uploaded images and events as cluster points, and drawing and adding event information. The data can be recorded in the offline (Android device) or online version (all browsers) and uploaded through the server whenever internet is available. All events and records can be visualized by an administrator and made public after approval. Different user levels can be defined to access the data for communicating the information.
This application was tested for landslides in post-earthquake Nepal but can be used for any other type of hazards such as flood, avalanche, etc. Keywords: Offline, Online, WebGIS Open source, Android, Hazard Event Mapping

  12. Creating a Web-accessible, point-of-care, team-based information system (PoinTIS): the librarian as publisher.

    PubMed

    Burrows, S C; Moore, K M; Lemkau, H L

    2001-04-01

    The Internet has created new opportunities for librarians to develop information systems that are readily accessible at the point of care. This paper describes the multiyear process used to justify, fund, design, develop, promote, and evaluate a rehabilitation prototype of a point-of-care, team-based information system (PoinTIS) and train health care providers to use this prototype for their spinal cord injury and traumatic brain injury patient care and education activities. PoinTIS is a successful model for librarians in the twenty-first century to serve as publishers of information created or used by their parent organizations and to respond to the opportunities for information dissemination provided by recent technological advances.

  13. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    NASA Technical Reports Server (NTRS)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying numbers of expensive resources and the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation. DES software was exploited for probabilistic event variation. A roll-up process was developed to reuse models and model elements in other, less-detailed models.
The DES team continues to innovate and expand DES capabilities to address KSC's planning needs.
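    The definition of discrete event simulation given above (state changes only at discrete points in time, driven by an event queue) can be sketched in a few lines. This is a generic, minimal illustration under assumed names; it is not the KSC/Constellation tooling, which layered probabilistic durations and resource models on top of this core loop.

```python
import heapq

# Minimal discrete event simulation core: events carry a timestamp, the
# simulation clock jumps from one event time to the next, and the system
# "state" here is just a processing log.
def run_simulation(events):
    """events: list of (time, label) pairs; returns events processed in time order."""
    queue = list(events)
    heapq.heapify(queue)          # priority queue ordered by event time
    clock = 0.0
    log = []
    while queue:
        t, label = heapq.heappop(queue)
        clock = max(clock, t)     # the clock only advances, never rewinds
        log.append((clock, label))
    return log
```

    A real campaign model would push follow-on events back into the queue from inside the loop (e.g. a "vehicle stacked" event scheduling a "pad rollout" event), with durations drawn from distributions estimated via methods such as Delphi.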

  14. Individual Events as a Laboratory for Argument: Analogues for Limited Preparation Events.

    ERIC Educational Resources Information Center

    Kay, Jack

    To better serve as a laboratory for argument, individual events competition should represent analogues of "real world" argumentation/communication situations. The individual events laboratory must fulfill a pedagogical function, and should also "create" knowledge about argumentation strategies, specific fields of argument, and…

  15. Building Partnerships through Classroom-Based Events

    ERIC Educational Resources Information Center

    Zacarian, Debbie; Silverstone, Michael

    2017-01-01

    Building partnerships with families can be a challenge, especially in ethnically diverse classrooms. In this article, the authors describe how to create such partnerships with three kinds of classroom events: community-building events that deepen social relationships and make families feel welcome; curriculum showcase events that give families a…

  16. High speed point derivative microseismic detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uhl, J.E.; Warpinski, N.R.; Whetten, E.B.

    A high speed microseismic event detector constructed in accordance with the present invention uses a point derivative comb to quickly and accurately detect microseismic events. Compressional and shear waves impinging upon microseismic receiver stations disposed to collect waves are converted into digital data and analyzed using a point derivative comb including assurance of quiet periods prior to declaration of microseismic events. If a sufficient number of quiet periods have passed, the square of a two point derivative of the incoming digital signal is compared to a trip level threshold exceeding the determined noise level to declare a valid trial event. The squaring of the derivative emphasizes the differences between noise and signal, and the valid event is preferably declared when the trip threshold has been exceeded over a temporal comb width to realize a comb over a given time period. Once a trial event has been declared, the event is verified through a spatial comb, which applies the temporal event comb to additional stations. The detector according to the present invention quickly and accurately detects initial compressional waves indicative of a microseismic event which typically exceed the ambient cultural noise level by a small amount, and distinguishes the waves from subsequent larger amplitude shear waves. 9 figs.
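    The temporal part of the trigger described in this abstract can be sketched as follows. This is an interpretive illustration of the squared two-point-derivative comb, not the patented implementation; all parameter names are hypothetical, and the spatial comb (cross-station verification) is omitted.

```python
# Declare a trial event when the squared two-point derivative exceeds the trip
# level over a comb width of consecutive samples, after enough quiet samples.
def detect_trial_event(samples, trip_level, comb_width, quiet_required):
    """Return the sample index where a trial event is declared, or None."""
    quiet = 0   # count of consecutive-enough quiet (sub-threshold) derivatives
    run = 0     # count of consecutive supra-threshold derivatives (the comb)
    for i in range(1, len(samples)):
        d2 = (samples[i] - samples[i - 1]) ** 2   # squared two-point derivative
        if d2 > trip_level:
            if quiet >= quiet_required:           # quiet period must precede trigger
                run += 1
                if run >= comb_width:
                    return i
        else:
            quiet += 1
            run = 0
    return None
```

    Squaring the derivative, as the abstract notes, widens the gap between low-amplitude noise and the sharp onset of a compressional wave.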

  17. Self-delivered misinformation - Merging the choice blindness and misinformation effect paradigms.

    PubMed

    Stille, Lotta; Norin, Emelie; Sikström, Sverker

    2017-01-01

    Choice blindness is the failure to detect a discrepancy between a choice and its outcome. The misinformation effect occurs when the recollection of an event changes because new, misleading information about the event is received. The purpose of this study was to merge the choice blindness and misinformation effect paradigms, and thus examine whether choice blindness can be created for individuals' recollections of a witnessed event, and whether this will affect their later recollections of the event. Thus, as a way of delivering misinformation the participants ostensibly became their own source of the misleading information. The participants watched a short film and filled out a questionnaire about events shown in the film. Some of their answers were then manipulated using reattachable stickers, which allowed alteration of their original answers. The participants gave justifications for their manipulated choices, and later their recollection of the original event was tested through another questionnaire. Choice blindness was created for a majority of the participants. A majority of the choice blind participants later changed their reported recollection of the event in line with the manipulations, whereas only a small minority of the participants in the control condition changed their recollection. This study provides new information about the misinformation effect, suggesting that this effect also can occur when misinformation is given immediately following presentation of the original stimuli, and about choice blindness and its effects on the recollections of events. The results suggest that memory blindness can be created when people inadvertently supply themselves with misleading information about an event, causing a change in their recollection.

  18. Self-delivered misinformation - Merging the choice blindness and misinformation effect paradigms

    PubMed Central

    Stille, Lotta; Norin, Emelie; Sikström, Sverker

    2017-01-01

    Choice blindness is the failure to detect a discrepancy between a choice and its outcome. The misinformation effect occurs when the recollection of an event changes because new, misleading information about the event is received. The purpose of this study was to merge the choice blindness and misinformation effect paradigms, and thus examine whether choice blindness can be created for individuals’ recollections of a witnessed event, and whether this will affect their later recollections of the event. Thus, as a way of delivering misinformation the participants ostensibly became their own source of the misleading information. The participants watched a short film and filled out a questionnaire about events shown in the film. Some of their answers were then manipulated using reattachable stickers, which allowed alteration of their original answers. The participants gave justifications for their manipulated choices, and later their recollection of the original event was tested through another questionnaire. Choice blindness was created for a majority of the participants. A majority of the choice blind participants later changed their reported recollection of the event in line with the manipulations, whereas only a small minority of the participants in the control condition changed their recollection. This study provides new information about the misinformation effect, suggesting that this effect also can occur when misinformation is given immediately following presentation of the original stimuli, and about choice blindness and its effects on the recollections of events. The results suggest that memory blindness can be created when people inadvertently supply themselves with misleading information about an event, causing a change in their recollection. PMID:28273151

  19. Bifurcations and degenerate periodic points in a three dimensional chaotic fluid flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L. D., E-mail: lachlan.smith@monash.edu; CSIRO Mineral Resources, Clayton, Victoria 3800; Rudman, M.

    2016-05-15

    Analysis of the periodic points of a conservative periodic dynamical system uncovers the basic kinematic structure of the transport dynamics and identifies regions of local stability or chaos. While elliptic and hyperbolic points typically govern such behaviour in 3D systems, degenerate (parabolic) points also play an important role. These points represent a bifurcation in local stability and Lagrangian topology. In this study, we consider the ramifications of the two types of degenerate periodic points that occur in a model 3D fluid flow. (1) Period-tripling bifurcations occur when the local rotation angle associated with elliptic points is reversed, creating a reversal in the orientation of associated Lagrangian structures. Even though a single unstable point is created, the bifurcation in local stability has a large influence on local transport and the global arrangement of manifolds, as the unstable degenerate point has three stable and three unstable directions, similar to hyperbolic points, and occurs at the intersection of three hyperbolic periodic lines. The presence of period-tripling bifurcation points indicates regions of both chaos and confinement, with the extent of each depending on the nature of the associated manifold intersections. (2) The second type of bifurcation occurs when periodic lines become tangent to local or global invariant surfaces. This bifurcation creates both saddle–centre bifurcations, which can create both chaotic and stable regions, and period-doubling bifurcations, which are a common route to chaos in 2D systems. We provide conditions for the occurrence of these tangent bifurcations in 3D conservative systems, as well as constraints on the possible types of tangent bifurcation that can occur based on topological considerations.

  20. American Heart Association's Life's Simple 7: Avoiding Heart Failure and Preserving Cardiac Structure and Function.

    PubMed

    Folsom, Aaron R; Shah, Amil M; Lutsey, Pamela L; Roetker, Nicholas S; Alonso, Alvaro; Avery, Christy L; Miedema, Michael D; Konety, Suma; Chang, Patricia P; Solomon, Scott D

    2015-09-01

    Many people may underappreciate the role of lifestyle in avoiding heart failure. We estimated whether greater adherence in middle age to American Heart Association's Life's Simple 7 guidelines—on smoking, body mass, physical activity, diet, cholesterol, blood pressure, and glucose—is associated with lower lifetime risk of heart failure and greater preservation of cardiac structure and function in old age. We studied the population-based Atherosclerosis Risk in Communities Study cohort of 13,462 adults ages 45-64 years in 1987-1989. From the 1987-1989 risk factor measurements, we created a Life's Simple 7 score (range 0-14, giving 2 points for ideal, 1 point for intermediate, and 0 points for poor components). We identified 2218 incident heart failure events using surveillance of hospital discharge and death codes through 2011. In addition, in 4855 participants free of clinical cardiovascular disease in 2011-2013, we performed echocardiography from which we quantified left ventricular hypertrophy and diastolic dysfunction. One in four participants (25.5%) developed heart failure through age 85 years. Yet, this lifetime heart failure risk was 14.4% for those with a middle-age Life's Simple 7 score of 10-14 (optimal), 26.8% for a score of 5-9 (average), and 48.6% for a score of 0-4 (inadequate). Among those with no clinical cardiovascular event, the prevalence of left ventricular hypertrophy in late life was approximately 40% as common, and diastolic dysfunction was approximately 60% as common, among those with an optimal middle-age Life's Simple 7 score, compared with an inadequate score. Greater achievement of American Heart Association's Life's Simple 7 in middle age is associated with a lower lifetime occurrence of heart failure and greater preservation of cardiac structure and function. Copyright © 2015 Elsevier Inc. All rights reserved.
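    The scoring scheme described above (seven components, each rated 2 for ideal, 1 for intermediate, 0 for poor, summed to a 0-14 score and binned into the study's three categories) can be sketched directly. The function and key names below are illustrative, not from the study's analysis code.

```python
# Compute a Life's Simple 7 score and the category bins used in the abstract:
# 10-14 optimal, 5-9 average, 0-4 inadequate.
def simple7_score(components):
    """components: dict mapping each of the 7 factors to a rating in {0, 1, 2}."""
    assert len(components) == 7, "expects exactly the seven Simple 7 components"
    assert all(v in (0, 1, 2) for v in components.values())
    score = sum(components.values())
    if score >= 10:
        category = "optimal"
    elif score >= 5:
        category = "average"
    else:
        category = "inadequate"
    return score, category
```

    Under the study's findings, the three categories corresponded to lifetime heart failure risks of roughly 14.4%, 26.8%, and 48.6%, respectively.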

  1. TITAN: inference of copy number architectures in clonal cell populations from tumor whole-genome sequence data

    PubMed Central

    Roth, Andrew; Khattra, Jaswinder; Ho, Julie; Yap, Damian; Prentice, Leah M.; Melnyk, Nataliya; McPherson, Andrew; Bashashati, Ali; Laks, Emma; Biele, Justina; Ding, Jiarui; Le, Alan; Rosner, Jamie; Shumansky, Karey; Marra, Marco A.; Gilks, C. Blake; Huntsman, David G.; McAlpine, Jessica N.; Aparicio, Samuel

    2014-01-01

    The evolution of cancer genomes within a single tumor creates mixed cell populations with divergent somatic mutational landscapes. Inference of tumor subpopulations has been disproportionately focused on the assessment of somatic point mutations, whereas computational methods targeting evolutionary dynamics of copy number alterations (CNA) and loss of heterozygosity (LOH) in whole-genome sequencing data remain underdeveloped. We present a novel probabilistic model, TITAN, to infer CNA and LOH events while accounting for mixtures of cell populations, thereby estimating the proportion of cells harboring each event. We evaluate TITAN on idealized mixtures, simulating clonal populations from whole-genome sequences taken from genomically heterogeneous ovarian tumor sites collected from the same patient. In addition, we show in 23 whole genomes of breast tumors that the inference of CNA and LOH using TITAN critically informs population structure and the nature of the evolving cancer genome. Finally, we experimentally validated subclonal predictions using fluorescence in situ hybridization (FISH) and single-cell sequencing from an ovarian cancer patient sample, thereby recapitulating the key modeling assumptions of TITAN. PMID:25060187

  2. Determination of Plasma Screening Effects for Thermonuclear Reactions in Laser-generated Plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Yuanbin; Pálffy, Adriana, E-mail: yuanbin.wu@mpi-hd.mpg.de, E-mail: Palffy@mpi-hd.mpg.de

    Due to screening effects, nuclear reactions in astrophysical plasmas may behave differently than in the laboratory. The possibility to determine the magnitude of these screening effects in colliding laser-generated plasmas is investigated theoretically, having as a starting point a proposed experimental setup with two laser beams at the Extreme Light Infrastructure facility. A laser pulse interacting with a solid target produces a plasma through the Target Normal Sheath Acceleration scheme, and this rapidly streaming plasma (ion flow) impacts a secondary plasma created by the interaction of a second laser pulse on a gas jet target. We model this scenario here and calculate the reaction events for the astrophysically relevant reaction 13C(4He, n)16O. We find that it should be experimentally possible to determine the plasma screening enhancement factor for fusion reactions by detecting the difference in reaction events between two scenarios of ion flow interacting with the plasma target and a simple gas target. This provides a way to evaluate nuclear reaction cross-sections in stellar environments and can significantly advance the field of nuclear astrophysics.

  3. Determination of Plasma Screening Effects for Thermonuclear Reactions in Laser-generated Plasmas

    NASA Astrophysics Data System (ADS)

    Wu, Yuanbin; Pálffy, Adriana

    2017-03-01

    Due to screening effects, nuclear reactions in astrophysical plasmas may behave differently than in the laboratory. The possibility to determine the magnitude of these screening effects in colliding laser-generated plasmas is investigated theoretically, having as a starting point a proposed experimental setup with two laser beams at the Extreme Light Infrastructure facility. A laser pulse interacting with a solid target produces a plasma through the Target Normal Sheath Acceleration scheme, and this rapidly streaming plasma (ion flow) impacts a secondary plasma created by the interaction of a second laser pulse on a gas jet target. We model this scenario here and calculate the reaction events for the astrophysically relevant reaction 13C(4He, n)16O. We find that it should be experimentally possible to determine the plasma screening enhancement factor for fusion reactions by detecting the difference in reaction events between two scenarios of ion flow interacting with the plasma target and a simple gas target. This provides a way to evaluate nuclear reaction cross-sections in stellar environments and can significantly advance the field of nuclear astrophysics.

  4. Development of a Two-Wheel Contingency Mode for the MAP Spacecraft

    NASA Technical Reports Server (NTRS)

    Starin, Scott R.; ODonnell, James R., Jr.; Bauer, Frank H. (Technical Monitor)

    2002-01-01

    In the event of a failure of one of MAP's three reaction wheel assemblies (RWAs), it is not possible to achieve three-axis, full-state attitude control using the remaining two wheels. Hence, two of the attitude control algorithms implemented on the MAP spacecraft will no longer be usable in their current forms: Inertial Mode, used for slewing to and holding inertial attitudes, and Observing Mode, which implements the nominal dual-spin science mode. This paper describes the effort to create a complete strategy for using software algorithms to cope with an RWA failure. The discussion of the design process will be divided into three main subtopics: performing orbit maneuvers to reach and maintain an orbit about the second Earth-Sun libration point in the event of an RWA failure, completing the mission using a momentum-bias two-wheel science mode, and developing a new thruster-based mode for adjusting the inertially fixed momentum bias. In this summary, the philosophies used in designing these changes are shown; the full paper will supplement these with algorithm descriptions and testing results.

  5. Finding Every Root of a Broad Class of Real, Continuous Functions in a Given Interval

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.; Wolgast, Paul A.

    2011-01-01

    One of the most pervasive needs within the Deep Space Network (DSN) Metric Prediction Generator (MPG) view period event generation is that of finding solutions to given occurrence conditions. While the general form of an equation expresses equivalence between its left-hand and right-hand expressions, the traditional treatment of the subject subtracts the two sides, leaving an expression of the form f(x) = 0. Values of the independent variable x satisfying this condition are roots, or solutions. Generally speaking, there may be no solutions, a unique solution, multiple solutions, or a continuum of solutions to a given equation. In particular, all view period events are modeled as zero crossings of various metrics; for example, the time at which the elevation of a spacecraft reaches its maximum value, as viewed from a Deep Space Station (DSS), is found by locating the point at which the derivative of the elevation function becomes zero. Moreover, each event type may have several occurrences within a given time interval of interest. For example, a spacecraft in a low Moon orbit will experience several possible occultations per day, each of which must be located in time. The MPG is charged with finding all specified event occurrences that take place within a given time interval (or pass), without any special clues from operators as to when they may occur, for the entire spectrum of missions undertaken by the DSN. For each event type, the event metric function is a known form that can be computed for any instant within the interval. A method has been created that enables a mathematical root finder to locate all roots of an arbitrary continuous function within a given interval, subject to very lenient, parameterized assumptions. One assumption is that adjacent roots are separated by at least a given amount, xGuard. 
Any point whose function value is less than ef in magnitude is considered to be a root, and the function values at distances xGuard away from a root are larger than ef, unless there is another root located in this vicinity. A root is considered found if, during iteration, two root candidates differ by less than a pre-specified ex, and the optimum cubic polynomial matching the function at the endpoints and at two interior points (to within a relative error fraction L at its midpoint) is reliable in indicating whether the function has extrema within the interval. The robustness of this method depends solely on choosing these four parameters that control the search. The roots of discontinuous functions were also found, but with degraded performance.
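The guarded scan-and-refine strategy described above can be sketched in a few lines. This is an illustrative simplification, not the MPG implementation: the cubic-polynomial extremum check is omitted, bisection stands in for the iterative refinement, and the names `f`, `x_guard`, `eps_f`, and `eps_x` merely mirror the parameters named in the abstract.

```python
import math

def find_all_roots(f, a, b, x_guard, eps_f=1e-12, eps_x=1e-10):
    """Find all roots of a continuous f on [a, b], assuming adjacent roots
    are separated by at least x_guard.  Sample at spacing <= x_guard/2 so
    no sign change can hide inside a cell, then refine by bisection."""
    n = max(2, int(math.ceil(2 * (b - a) / x_guard)))
    xs = [a + i * (b - a) / n for i in range(n + 1)]
    roots = []

    def add(r):
        # deduplicate candidates closer than the guard distance
        if all(abs(r - q) > x_guard / 2 for q in roots):
            roots.append(r)

    for x0, x1 in zip(xs, xs[1:]):
        f0, f1 = f(x0), f(x1)
        if abs(f0) < eps_f:          # grid point is already a root
            add(x0)
            continue
        if f0 * f1 < 0:              # bracketed sign change: bisect
            lo, hi = x0, x1
            while hi - lo > eps_x:
                mid = 0.5 * (lo + hi)
                if f(lo) * f(mid) <= 0:
                    hi = mid
                else:
                    lo = mid
            add(0.5 * (lo + hi))
    if abs(f(b)) < eps_f:
        add(b)
    return sorted(roots)
```

For example, `find_all_roots(math.sin, 0.5, 10.0, 0.5)` recovers the three roots of sine in that interval, near pi, 2*pi, and 3*pi.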

  6. The Role of Social Networking Sites in Creating Moral Crisis and the Role of the University in Confronting It from the View Point of Qassim University Faculty Members

    ERIC Educational Resources Information Center

    Al-Smadi, Hend Sam'an Ibrahim

    2017-01-01

    The study aimed at recognizing the effect of social networking sites (henceforth SNSs) in creating moral crisis and the role of the university in confronting it from the viewpoint of faculty members at Qassim University. Two tests were constructed; the first included 29 items developed to identify the role of SNSs in creating moral…

  7. Psychiatric Mental Health Leadership at the Tipping Point.

    PubMed

    Delaney, Kathleen R.

    2015-05-01

    Currently, the United States health care system is responding to the Patient Protection and Affordable Care Act (PPACA) and the vision it contains for health care transformation. Along with sweeping changes in service delivery and payment structures, health care reform has championed concepts such as patient-centered care, integrated care, and wellness. Although these are not new ideas, their adaptation, in both ideology and service design, has been accelerated in the context of reform. Indeed, they are reaching a tipping point: the point where ideas gain wide acceptance and become influential trends. Although psychiatric mental health (PMH) nurses have been active in wellness, patient-centered care, and integrated care, at the current time they seem to be situated peripherally to these national trends. An increased presence of PMH nurses would facilitate their contribution to the development of these concepts within service structures and interventions. To increase knowledge and appreciation of PMH nurses' practice and unique perspective on these issues, leaders are needed who will connect and effectively communicate PMH nursing efforts to the broader health care arena. This article outlines the events that created a context for these three concepts (patient-centered care, wellness, and integrated care), suggests why they have reached a tipping point, and discusses the need for a greater PMH nursing presence in the national dialog and the role of nursing leaders in facilitating these connections.

  8. Design Tools for Reconfigurable Hardware in Orbit (RHinO)

    NASA Technical Reports Server (NTRS)

    French, Mathew; Graham, Paul; Wirthlin, Michael; Larchev, Gregory; Bellows, Peter; Schott, Brian

    2004-01-01

    The Reconfigurable Hardware in Orbit (RHinO) project is focused on creating a set of design tools that facilitate and automate design techniques for reconfigurable computing in space, using SRAM-based field-programmable gate array (FPGA) technology. These tools leverage an established FPGA design environment and focus primarily on space-effects mitigation and power optimization. The project is creating software to automatically test and evaluate the single-event-upset (SEU) sensitivities of an FPGA design and insert mitigation techniques. Extensions to the tool suite will also allow evolvable algorithm techniques to reconfigure around single-event-latchup (SEL) events. In the power domain, tools are being created for dynamic power visualization and optimization. Thus, this technology seeks to enable the use of reconfigurable hardware in orbit, via an integrated design tool suite aiming to reduce the risk, cost, and design time of multimission reconfigurable space processors using SRAM-based FPGAs.

  9. Creating a High-Touch Recruitment Event: Utilizing Faculty to Recruit and Yield Students

    ERIC Educational Resources Information Center

    Freed, Lindsey R.; Howell, Leanne L.

    2018-01-01

    The following article describes the planning and implementation of a university student recruitment event that produced a high (new) student yield. Detailed descriptions of how staff and faculty worked together to plan and implement this event are described.

  10. Turning Points in Even Start Programs. Occasional Paper #4.

    ERIC Educational Resources Information Center

    Rasinski, Timothy; Padak, Nancy

    To investigate the initial experiences of the various Even Start programs, a project developed a survey that was sent to program coordinators in Ohio. It asked open-ended questions to get descriptions and perceptions of situations that preceded turning point events and the turning point events themselves. Data from eight programs highlighted their…

  11. EPA Office of Water (OW): 303(d) Listed Impaired Waters NHDPlus Indexed Dataset

    EPA Pesticide Factsheets

    The 303(d) Listed Impaired Waters program system provides impaired water data and impaired water features reflecting river segments, lakes, and estuaries designated under Section 303(d) of the Clean Water Act. Each State will establish Total Maximum Daily Loads (TMDLs) for these waters. Note that the CWA Section 303(d) list of impaired waters does not represent waters that are impaired but have an EPA-approved TMDL established, impaired waters for which other pollution control mechanisms are in place and expected to attain water quality standards, or waters impaired as a result of pollution that is not caused by a pollutant. Therefore, the Impaired Waters layers do not represent all impaired waters reported in a state's Integrated Report, but only the waters comprising a state's approved 303(d) list. For more information regarding impaired waters, refer to EPA's Integrated Reporting Guidance at: http://water.epa.gov/lawsregs/lawsguidance/cwa/tmdl/guidance.cfm. 303(d) waterbodies are coded onto NHDPlus v2.1 flowline and waterbody features to create line, area, and point events. In addition to NHDPlus reach indexed data, there may also be custom event data (point, line, or polygon) that are not associated with NHDPlus and are in an EPA standard format that is compatible with EPA's Reach Address Database. These custom features are used to represent locations of 303(d) waterbodies that are not represented well in NHDPlus. R2GIS selected out the Region 2 extent plus a one

  12. Point-of-Care Programming for Neuromodulation: A Feasibility Study Using Remote Presence.

    PubMed

    Mendez, Ivar; Song, Michael; Chiasson, Paula; Bustamante, Luis

    2013-01-01

    The expansion of neuromodulation and its indications has resulted in hundreds of thousands of patients with implanted devices worldwide. Because all patients require programming, this growth has created a heavy burden on neuromodulation centers and patients. Remote point-of-care programming may provide patients with real-time access to neuromodulation expertise in their communities. The aims were to test the feasibility of remotely programming a neuromodulation device using a remote-presence robot and to determine the ability of an expert programmer to telementor a nonexpert in programming the device. A remote-presence robot (RP-7) was used for remote programming. Twenty patients were randomly assigned to either conventional programming or a robotic session. The expert remotely mentored 10 nurses with no previous experience to program the devices of patients assigned to the remote-presence sessions. Accuracy of programming, adverse events, and satisfaction scores were assessed for all participants. There was no difference in the accuracy or clinical outcomes of programming between the standard and remote-presence sessions. No adverse events occurred in any session. The patients, nurses, and the expert programmer expressed high satisfaction with the remote-presence sessions. This study establishes the proof-of-principle that remote programming of neuromodulation devices using telepresence, with expert telementoring of an individual with no previous experience to accurately program a device, is feasible. We envision a time in the future when patients with implanted devices will have real-time access to neuromodulation expertise from the comfort of their own home.

  14. Halitosis: Current concepts on etiology, diagnosis and management

    PubMed Central

    Kapoor, Uditi; Sharma, Gaurav; Juneja, Manish; Nagpal, Archna

    2016-01-01

    Halitosis, or oral malodor, is an offensive odor originating from the oral cavity, leading to anxiety and psychosocial embarrassment. A patient with halitosis is most likely to contact a primary care practitioner for diagnosis and management. With proper diagnosis, identification of the etiology, and timely referrals, steps can be taken to create a successful individualized therapeutic approach for each patient seeking assistance. It is important to highlight the necessity of an interdisciplinary approach to the treatment of halitosis in order to prevent misdiagnosis or unnecessary treatment. The literature on halitosis, especially randomized clinical trials, is scarce, and additional studies are required. This article succinctly focuses on developing a systematic flow of events leading to the best management of halitosis from the primary care practitioner's point of view. PMID:27095913

  15. Microsoft Producer: A Software Tool for Creating Multimedia PowerPoint[R] Presentations

    ERIC Educational Resources Information Center

    Leffingwell, Thad R.; Thomas, David G.; Elliott, William H.

    2007-01-01

    Microsoft[R] Producer[R] is a powerful yet user-friendly PowerPoint companion tool for creating on-demand multimedia presentations. Instructors can easily distribute these presentations via compact disc or streaming media over the Internet. We describe the features of the software, system requirements, and other required hardware. We also describe…

  16. Communicating Volcanic Hazards in the North Pacific

    NASA Astrophysics Data System (ADS)

    Dehn, J.; Webley, P.; Cunningham, K. W.

    2014-12-01

    For over 25 years, effective hazard communication has been key to mitigating volcanic hazards in the North Pacific. These hazards are omnipresent, with a large event happening in Alaska every few years to a decade, and in many cases with little or no warning (e.g. Kasatochi and Okmok in 2008). Here a useful hazard mitigation strategy has been built on (1) a large database of historic activity from many datasets, (2) an operational alert system with graduated levels of concern, (3) scenario planning, and (4) routine checks and communication with emergency managers and the public. These baseline efforts are then enhanced in times of crisis with coordinated talking points, targeted studies and public outreach. Scientists naturally tend to target other scientists as their audience, whereas in monitoring hazards that may occur only on yearly to decadal timescales, details can distract from the essential information. Creating talking points and practicing public communication can help make hazard response a part of the culture. Promoting situational awareness and familiarity can relieve indecision and concern at the time of a crisis.

  17. 'Faking til you make it': social capital accumulation of individuals on low incomes living in contrasting socio-economic neighbourhoods and its implications for health and wellbeing.

    PubMed

    Browne-Yung, Kathryn; Ziersch, Anna; Baum, Fran

    2013-05-01

    People on low incomes living in low socio-economic neighbourhoods have poorer health than those living in advantaged neighbourhoods. To explore neighbourhood effects on health and social capital creation, the experiences of low-income people living in contrasting socio-economic neighbourhoods were compared, in order to examine how low-income status and differing levels of neighbourhood resources contributed to perceived health and wellbeing. Quantitative and qualitative data were analysed: survey data from 601 individuals living in contrasting socio-economic areas and in-depth interviews with a new sample of 24 individuals on low incomes. The study was guided by Bourdieu's theory of practice, which examines how social inequalities are created and reproduced through the relationship between individuals' varying resources of economic, social and cultural capital. This included an examination of individual life histories, cultural distinction and how social positions are reproduced. Participants' accounts of their early life experience showed how parental socio-economic position and socially patterned events taking place across the life course created different opportunities for social network creation, choice of neighbourhood and levels of resources available throughout life, all of which can influence health and wellbeing. Defining poverty by whether an individual or household has sufficient income at a particular point in time was an inadequate measure of disadvantage. This static measure of 'low income' as a category disguised the different ways in which disadvantage was experienced or, conversely, how life course events could mitigate the impact of low income. 
This study found that the resources necessary to create social capital, such as cultural capital and the ability to network socially, differed according to the socio-economic status of the neighbourhood, and that living in an advantaged area does not automatically guarantee access to potentially beneficial social networks. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Vallerani Micro-Catchment Infiltration Dynamics and Erosion from Simulated Rainfall and Concentrated Flow

    NASA Astrophysics Data System (ADS)

    Founds, M. J.; McGwire, K.; Weltz, M.

    2017-12-01

    Critical research gaps remain in rangeland hydrology concerning the impact of conservation practices on erosion and the subsequent mobilization of dissolved solids to streams. This study develops the scientific foundation needed to better understand how a restoration strategy using a Vallerani plow can be optimized to minimize erosion from rainfall impact and concentrated flow. The Vallerani system has been proposed for use in the Upper Colorado River Basin (UCRB), where rapidly eroding rangelands contribute high salt loads to the Colorado River at significant economic cost. The poster presentation will document the findings from a series of physical rainfall and concentrated-flow simulations taking place at an experimental site northeast of Reno, NV in early August. A Walnut Gulch rainfall simulator is used to apply variable-intensity and variable-duration rainfall events to micro-catchment structures created by the Vallerani plow. The erosion and deposition caused by simulated rainfall will be captured from multi-angle photography using structure from motion (SfM) to create sub-centimeter 3-D models between each rainfall event. A rill simulator will also be used to apply large volumes of concentrated flow to Vallerani micro-catchments, testing the point at which their infiltration capacity is exceeded and the micro-catchments are overtopped. This information is important for spacing structures on a given hillslope so that the chance of failure is minimized. Measurements of saturated hydraulic conductivity and sorptivity from a Guelph permeameter will be compared to the experimental results in order to develop an efficient method for surveying new terrain for treatment with the Vallerani plow. 
The effect of micro-catchments on surface flow and erosion will eventually be incorporated into the process-based Rangeland Hydrology and Erosion Model (RHEM) to create a tool that provides decision makers with quantitative estimates of potential reductions in erosion when using the Vallerani System to restore highly erosive rangelands within the UCRB.

  19. A comparison of the conditional inference survival forest model to random survival forests based on a simulation study as well as on two applications with time-to-event data.

    PubMed

    Nasejje, Justine B; Mwambi, Henry; Dheda, Keertan; Lesosky, Maia

    2017-07-28

    Random survival forest (RSF) models have been identified as alternatives to the Cox proportional hazards model for analysing time-to-event data. These methods, however, have been criticised for the bias that results from favouring covariates with many split-points, and conditional inference forests for time-to-event data have therefore been suggested. Conditional inference forests (CIF) are known to correct the bias in RSF models by separating the search for the best covariate to split on from the search for the best split point of the selected covariate. In this study, we compare the random survival forest model to the conditional inference forest (CIF) model using twenty-two simulated time-to-event datasets. We also analysed two real time-to-event datasets. The first dataset is based on the survival of children under five years of age in Uganda and consists of categorical covariates, most with more than two levels (many split-points). The second dataset is based on the survival of patients with extensively drug-resistant tuberculosis (XDR TB) and consists mainly of categorical covariates with two levels (few split-points). The study findings indicate that the conditional inference forest model is superior to random survival forest models for time-to-event data whose covariates have many split-points, based on bootstrap cross-validated estimates of the integrated Brier score. However, conditional inference forests perform comparably to random survival forest models for time-to-event data whose covariates have fewer split-points. Although survival forests are promising methods for analysing time-to-event data, it is important to choose the forest model based on the nature of the covariates in the dataset in question.
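The split-point bias the abstract refers to can be seen in a toy simulation: against a pure-noise outcome, an exhaustive greedy split search (the CART-style search used inside RSF trees) tends to favour the covariate offering more candidate cut points, simply because the maximum over more candidates is larger by chance. The sketch below is illustrative only; it uses an ordered-level threshold split and a variance-reduction criterion rather than any actual RSF implementation.

```python
import random

def best_split_gain(x_levels, y):
    """Try every threshold split on level index and return the best
    variance-reduction gain (a stand-in for a greedy tree split search)."""
    n = len(y)
    mean = sum(y) / n
    total = sum((v - mean) ** 2 for v in y)
    best = 0.0
    for cut in set(x_levels):
        left = [y[i] for i in range(n) if x_levels[i] <= cut]
        right = [y[i] for i in range(n) if x_levels[i] > cut]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((v - ml) ** 2 for v in left)
               + sum((v - mr) ** 2 for v in right))
        best = max(best, total - sse)
    return best

random.seed(0)
wins_many, trials = 0, 200
for _ in range(trials):
    n = 60
    y = [random.gauss(0, 1) for _ in range(n)]        # pure-noise outcome
    x_few = [random.randrange(2) for _ in range(n)]   # 2 levels: 1 cut point
    x_many = [random.randrange(12) for _ in range(n)] # 12 levels: 11 cut points
    if best_split_gain(x_many, y) > best_split_gain(x_few, y):
        wins_many += 1
print(wins_many / trials)
```

Even though neither covariate carries any signal, the many-level covariate wins the split competition in well over half the trials, which is exactly the bias that conditional inference forests are designed to remove by testing covariate association before searching for a cut point.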

  20. Symposia in undergraduate medical education: tailoring training in competencies to students' needs.

    PubMed

    Reefman, Karin; Daelmans, Hester E M; Klumpers, Ursula M H; Croiset, Gerda

    2017-12-01

    In mastering competencies, it is a challenge to create training sessions which acknowledge individual students' needs and are logistically feasible in the medical master's program. Symposia were implemented in the medical master's program to provide knowledge and training of skills in a number of topics, providing a positive contribution to students' competencies and personal development. Each symposium contained a morning and afternoon program, structured around medical and societal themes addressing various competencies and covering current national and international events. Alternating interactive teaching methods were used. Students were asked to rate each daypart program on a 5-point Likert scale in terms of both teaching methods and content, and to comment on the best aspects of the symposium as well as areas for improvement. Scores higher than 3.5 were interpreted as a predominantly favourable outcome. In 2016, 10 symposia were organized with an average of 108 attendees and a response rate of 63% (1,366 completed questionnaires). Mean overall scores on 'teaching methods' and 'usefulness for professional development' were 3.8 and 3.7, respectively. The overall results corresponded with a high level of student appreciation. Symposia offer a podium for training students in subject matter and competencies that is greatly appreciated. Using alternating interactive teaching methods, symposia are structured around medical and societal themes and adjusted to the latest developments and current events in healthcare. By allowing students to select the symposia they would like to participate in, a tailor-made medical master's program in competencies is created.

  1. Manual for automatic generation of finite element models of spiral bevel gears in mesh

    NASA Technical Reports Server (NTRS)

    Bibel, G. D.; Reddy, S.; Kumar, A.

    1994-01-01

    The goal of this research is to develop computer programs that generate finite element models suitable for 3D contact analysis of face-milled spiral bevel gears in mesh. A pinion tooth and a gear tooth are created and put in mesh. Two programs, Points.f and Pat.f, perform the analysis. Points.f is based on the equation of meshing for spiral bevel gears. It uses machine tool settings to solve for an N x M mesh of points on each of the four surfaces: pinion concave and convex, and gear concave and convex. Points.f creates the file POINTS.OUT, an ASCII file containing the N x M points for each surface. (N is the number of node points along the length of the tooth, and M is the number along the height.) Pat.f reads POINTS.OUT and creates the file tl.out, a series of PATRAN input commands. In addition to the mesh density on the tooth face, user-specified variables include the number of finite elements through the thickness and the number of finite elements along the tooth full fillet. A full fillet is assumed to exist for both the pinion and the gear.
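In spirit, the Points.f step reduces to sampling a tooth surface on an N x M grid of nodes. A minimal Python sketch is shown below; the real program solves the equation of meshing from machine-tool settings, which is not reproduced here, so a hypothetical toy parametric surface stands in, and all function names are illustrative.

```python
import math

def sample_surface(surface, n_len, m_ht):
    """Return an n_len x m_ht grid of (x, y, z) points by sampling a
    parametric surface(u, v) with u, v in [0, 1] -- a stand-in for
    solving the equation of meshing as Points.f does."""
    pts = []
    for i in range(n_len):           # nodes along the tooth length
        u = i / (n_len - 1)
        row = []
        for j in range(m_ht):        # nodes along the tooth height
            v = j / (m_ht - 1)
            row.append(surface(u, v))
        pts.append(row)
    return pts

def toy_tooth_surface(u, v):
    """Hypothetical stand-in surface: a shallow conical patch."""
    r = 1.0 + 0.5 * u
    theta = -0.2 + 0.4 * v
    return (r * math.cos(theta), r * math.sin(theta), 0.3 * u)
```

Writing the resulting grid out row by row, one surface after another, would then mirror the POINTS.OUT file that Pat.f consumes to emit PATRAN commands.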

  2. Analysis of brand personality to involve event involvement and loyalty: A case study of Jakarta Fashion Week 2017

    NASA Astrophysics Data System (ADS)

    Nasution, A. H.; Rachmawan, Y. A.

    2018-04-01

    Fashion trends worldwide change extremely fast, and fashion has become part of people's lifestyles. Fashion week events in several regions can serve as a measure of current fashion trends. Indonesia hosts a fashion week event called Jakarta Fashion Week (JFW), which aims to show fashion trends to people who want to improve their fashion style. People will join events that create involvement for them, and hence will come to those events again and again. An annual and continuous event is important for creating loyalty among the people involved in it, supporting the organizer in staging the next event, saving a large share of the marketing budget, and creating a higher-quality event. This study aims to determine the effect of the five brand personality dimensions on event involvement and loyalty at Jakarta Fashion Week (JFW). The study uses a quantitative confirmatory method with Structural Equation Modeling (SEM) analysis. The sample consists of 150 respondents who participated in Jakarta Fashion Week 2017. Results show a significant effect of the five brand personality dimensions on the three dimensions of event involvement and on loyalty, while one dimension of event involvement, personal self-expression, had no effect on loyalty.

  3. Creative Copper Crests

    ERIC Educational Resources Information Center

    Knab, Thomas

    2011-01-01

    In this article, the author discusses how to create an art activity that would link the computer-created business cards of fourth-grade students with an upcoming school-wide medieval event. Creating family crests from copper foil would be a great connection, since they, like business cards, are an individual's way to identify themselves to others.…

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aartsen, M. G.; Abraham, K.; Ackermann, M.

    Observation of a point source of astrophysical neutrinos would be a “smoking gun” signature of a cosmic-ray accelerator. While IceCube has recently discovered a diffuse flux of astrophysical neutrinos, no localized point source has been observed. Previous IceCube searches for point sources in the southern sky were restricted by either an energy threshold above a few hundred TeV or poor neutrino angular resolution. Here we present a search for southern sky point sources with greatly improved sensitivities to neutrinos with energies below 100 TeV. By selecting charged-current ν_μ interactions inside the detector, we reduce the atmospheric background while retaining efficiency for astrophysical neutrino-induced events reconstructed with sub-degree angular resolution. The new event sample covers three years of detector data and leads to a factor of 10 improvement in sensitivity to point sources emitting below 100 TeV in the southern sky. No statistically significant evidence of point sources was found, and upper limits are set on neutrino emission from individual sources. A posteriori analysis of the highest-energy (∼100 TeV) starting event in the sample found that this event alone represents a 2.8 σ deviation from the hypothesis that the data consist only of atmospheric background.

  5. The Participative Design of an Endoscopy Facility using Lean 3P

    PubMed Central

    Smith, Iain

    2016-01-01

    In the UK, bowel cancer is the second largest cancer killer. Diagnosing people earlier can save lives but demand for endoscopies is increasing and this can put pressure on waiting times. To address this challenge, an endoscopy unit in North East England decided to improve their facilities to increase capacity and create environments that improve the experience of users. This presented a significant opportunity for step change improvement but also a problem in terms of creating designs that meet user requirements whilst addressing structural or space constraints. The Lean design process known as ‘3P' (standing for the production preparation process) was utilised as a participative design strategy to engage stakeholders in the design of the new department. This involved a time-out workshop (or 3P event) in which Lean and participative design tools were utilised to create an innovative design based on ‘point of delivery' (POD) principles. The team created a design that demonstrated an increase in treatment room capacity by 25% and bed capacity by 70% whilst reducing travel distance for patients by 25.8% and staff by 27.1%. This was achieved with an increase in available space of only 13%. The Lean 3P method provided a structured approach for corporate and clinical staff to work together with patient representatives as cross-functional teams. This participative approach facilitated communication and learning between stakeholders about care processes and personal preferences. Lean 3P therefore appears to be a promising approach to improving the healthcare facilities design process to meet user requirements. PMID:27493744

  6. East Lancashire Hospital Trust creates an open culture paving the way for service improvement 'Below ten thousand'.

    PubMed

    Tomlinson, Robert

    2018-05-01

    Reacting to a never event is difficult and often embarrassing for the staff involved. East Lancashire Hospitals NHS Trust has demonstrated that treating staff with respect after a never event creates an open culture that encourages problem solving and service improvement. The approach has allowed learning to be shared and paved the way for the trust to be the first in the UK to launch the patient-centric behavioural noise reduction strategy 'Below ten thousand'.

  7. Specific analysis of the recent rockfall activity in the southeast face of the Piz Lischana (Engadin Valley, Graubünden, Switzerland)

    NASA Astrophysics Data System (ADS)

    Büsing, Susanna; Guerin, Antoine; Derron, Marc-Henri; Jaboyedoff, Michel; Phillips, Marcia

    2016-04-01

    The study of permafrost is attracting more and more researchers because the warming observed in the Alps since the beginning of the last century is causing changes in active-layer depth and in the thermal state of this climate indicator. In mountain regions, permafrost degradation is becoming critical for the whole population, since slopes and rock walls are being destabilized, increasing risk for infrastructure and the inhabitants of mountain valleys. To better anticipate the triggering of future events, it is necessary to improve understanding of the relation between permafrost thaw and slope instabilities. A rockfall of about 7000 m3 occurred in the upper part of the southeast face of the Piz Lischana (3105 m), in the Engadin Valley (Graubünden, Switzerland), around noon on 31 July 2011. Luckily, this event was filmed, and ice could be observed on the failure plane after analysis of the images. In September 2014, in the same area, another rockfall of 2340 m3 occurred along a prominent open fracture that had been apparent since the failure of the rock mass in 2011. In order to characterize and analyze these two events, three high-density 3D point clouds were made using Structure from Motion (SfM) and LiDAR, one before and two after the September 2014 rockfall. For this purpose, 120 photos were taken during a helicopter flight in July 2014 to produce the first SfM point cloud, and more than 400 terrestrial photos were taken at the end of September to produce the second SfM point cloud. In July 2015 a third point cloud was created from three LiDAR scans, taken from two different positions. The point clouds were georeferenced with a 2 m resolution digital elevation model and compared to each other in order to calculate the volume of the rockfalls. A detailed structural analysis of the two rockfalls was made and compared to the geological structures of the whole southeast face. 
The structural analysis also improved understanding of the failure mechanisms of the past events and allowed better assessment of the probability of future rockfalls. Furthermore, valuable information about the velocity of the failure mechanisms could be extracted from the July 2011 video using a Particle Image Velocimetry method (Matlab script developed by Thielicke and Stamhuis, 2014). These results, combined with analyses of potential triggering factors (permafrost, freeze-thaw cycles, thermomechanical processes, rainfall, radiation, glacier decompression and seismicity), show that many of them contributed to destabilization. The particular structural situation appears to have led to the failure at Piz Lischana, but the analyses also highlight the influence of permafrost. This study also provided the opportunity to compare the LiDAR and SfM techniques directly. The point clouds were analyzed in terms of general quality, mesh quality, instrumental noise, point density on different discontinuities, structural analysis, and kinematic tests. Results show that SfM also permits detailed structural analysis and that, with well-chosen parameters, the quality of the SfM data approaches that of the LiDAR data. However, several factors (focal length, variation of distance to the object, image resolution) may increase the uncertainty of the photo alignment. This study confirms that coupling the two techniques is feasible and yields reliable results, and that SfM is a low-cost option for monitoring rock summits subject to permafrost thaw.
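The volume calculation from pre- and post-event point clouds can be sketched with a simple nearest-neighbor (cloud-to-cloud) distance computation. This synthetic example assumes SciPy is available and is a simplified stand-in for the study's actual change-detection workflow:

```python
# Sketch: cloud-to-cloud distances between pre- and post-event point clouds,
# a simple proxy for the change detection described above. The clouds,
# failure geometry, and footprint here are illustrative, not the study's data.
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_distances(reference, compared):
    """Nearest-neighbor distance from each compared point to the reference cloud."""
    tree = cKDTree(reference)
    distances, _ = tree.query(compared, k=1)
    return distances

# Synthetic example: a flat 'rock face' with a 2 m retreat over a 3 m x 3 m area
rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(2000, 2))
pre = np.column_stack([xy, np.zeros(len(xy))])       # z = 0 everywhere
post = pre.copy()
failed = (post[:, 0] < 3) & (post[:, 1] < 3)         # detached block footprint
post[failed, 2] = -2.0                               # surface retreats 2 m

d = cloud_to_cloud_distances(pre, post)
mean_retreat = -post[failed, 2].mean()               # 2.0 m
volume = mean_retreat * 9.0                          # retreat x 9 m2 footprint
```

Real workflows (e.g. M3C2-style comparisons) account for surface normals, roughness, and registration error rather than raw point-to-point distances.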

  8. Automatic recovery of aftershock sequences at the International Data Centre: from concept to pipeline

    NASA Astrophysics Data System (ADS)

    Kitov, I.; Bobrov, D.; Rozhkov, M.

    2016-12-01

Aftershocks of larger earthquakes represent an important source of information on the distribution and evolution of stresses and deformations in pre-seismic, co-seismic, and post-seismic phases. For the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO), the largest aftershock sequences are also a challenge for automatic and interactive processing. The highest rate of events recorded by two or more seismic stations of the International Monitoring System (IMS) from a relatively small aftershock area may reach hundreds per hour (e.g. Sumatra 2004 and Tohoku 2011). Moreover, there are thousands of reflected/refracted phases per hour with azimuth and slowness within the uncertainty limits of the first P-waves. Misassociation of these later phases, both regular and site-specific, as first P-waves results in the creation of numerous false event hypotheses in the automatic IDC pipeline. In turn, interactive review of such false hypotheses is a direct waste of analysts' resources. Waveform cross correlation (WCC) is a powerful tool to separate coda phases from actual P-wave arrivals and to fully exploit the repetitive character of waveforms generated by events close in space. Array seismic stations of the IMS enhance the performance of WCC in two important respects: they reduce the detection threshold and effectively suppress arrivals from all sources except master events. An IDC-specific aftershock tool has been developed and merged with the standard IDC pipeline. The tool includes several procedures: creation of master events consisting of waveform templates at ten or more IMS stations; cross correlation (CC) of real-time waveforms with these templates; association of arrivals detected on the CC traces into event hypotheses; building events matching IDC quality criteria; and resolution of conflicts between event hypotheses created by neighboring master events. The final cross-correlation standard event list (XSEL) is the starting point of interactive analysis.
Because global monitoring of underground nuclear tests draws on historical and synthetic data, each aftershock sequence can also be tested for a CTBT violation, with large earthquakes regarded as a potential evasion scenario.
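The core of the WCC detector, a sliding normalized cross-correlation of a master-event template against continuous data, can be sketched on synthetic single-channel data (station geometry, array stacking, and IDC quality criteria are omitted):

```python
# Sketch of template matching by normalized cross-correlation, the basic
# operation behind the WCC approach described above. Synthetic data only.
import numpy as np

def normalized_cc(trace, template):
    """Sliding normalized cross-correlation of template against trace."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    out = np.empty(len(trace) - n + 1)
    for i in range(len(out)):
        w = trace[i:i + n]
        s = w.std()
        out[i] = 0.0 if s == 0 else np.dot((w - w.mean()) / s, t) / n
    return out

rng = np.random.default_rng(1)
template = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100))   # 5 Hz wavelet
trace = 0.1 * rng.standard_normal(2000)                     # background noise
trace[700:800] += template                                  # buried 'aftershock'

cc = normalized_cc(trace, template)
detection = int(np.argmax(cc))      # expected near sample 700
```

In production systems the correlation traces from many array elements and stations are stacked before detection, which is what suppresses non-master sources.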

  9. Using wireless sensor networks to improve understanding of rain-on-snow events across the Sierra Nevada

    NASA Astrophysics Data System (ADS)

    Maurer, T.; Avanzi, F.; Oroza, C.; Malek, S. A.; Glaser, S. D.; Bales, R. C.; Conklin, M. H.

    2017-12-01

We use data gathered from Wireless Sensor Networks (WSNs) between 2008 and 2017 to investigate the temporal/spatial patterns of rain-on-snow events in three river basins of California's Sierra Nevada. Rain-on-snow transitions occur across a broad elevation range (several hundred meters), both between storms and within a given storm, creating an opportunity to use spatially and temporally dense data to forecast and study them. WSNs collect snow depth, meteorological, soil moisture, and soil temperature data across relatively dense sensor clusters. Ten to twelve measurement nodes per cluster are placed across 1-km2 areas in locations representative of snow patterns at larger scales. Combining precipitation and snow data from snow-pillow and climate stations with an estimation of dew-point temperature from WSNs, we determine the frequency, timing, and geographic extent of rain-on-snow events. We compare these results to WSN data to evaluate the impact of rain-on-snow events on snowpack energy balance, density, and depth as well as on soil moisture. Rain-on-snow events are compared to dry warm-weather days to identify the relative importance of rain and radiation as the primary energy input to the snowpack for snowmelt generation. An intercomparison of rain-on-snow events for the WSNs in the Feather, American, and Kings River basins captures the behavior across a 2° latitudinal range of the Sierra Nevada. Rain-on-snow events are potentially a more important streamflow generation mechanism in the lower-elevation Feather River basin. Snowmelt response to rain-on-snow events changes throughout the wet season, with later events resulting in more melt due to isothermal snow conditions, coarser grain size, and more-homogeneous snow stratigraphy. Regardless of snowmelt response, rain-on-snow events tend to result in decreasing snow depth and a corresponding increase in snow density.
Our results demonstrate that strategically placed WSNs can provide the necessary data at high temporal resolution to investigate how hydrologic responses evolve in both space and time, data not available from operational networks.
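The detection step described above, which combines a precipitation record with a dew-point criterion, can be sketched as a simple per-hour classifier. The 0 degC dew-point threshold and the minimum precipitation and snow-depth cutoffs are illustrative assumptions, not the study's calibrated values:

```python
# Sketch: flagging rain-on-snow hours from dew-point temperature and snow
# presence, in the spirit of the event detection described above.
# The 0 degC threshold and the toy records are illustrative assumptions.
def is_rain_on_snow(dew_point_c, precip_mm, snow_depth_m,
                    dew_point_threshold_c=0.0, min_precip_mm=1.0,
                    min_snow_depth_m=0.1):
    """True when precipitation likely falls as rain onto an existing snowpack."""
    raining = precip_mm >= min_precip_mm and dew_point_c > dew_point_threshold_c
    snow_on_ground = snow_depth_m >= min_snow_depth_m
    return raining and snow_on_ground

# Hourly records: (dew point degC, precip mm, snow depth m)
hours = [(-3.0, 2.0, 0.8),   # snowfall onto snowpack -> not ROS
         (1.5, 3.0, 0.8),    # rain onto snowpack     -> ROS
         (2.0, 0.0, 0.8),    # warm but dry           -> not ROS
         (2.0, 4.0, 0.0)]    # rain, no snowpack      -> not ROS
flags = [is_rain_on_snow(*h) for h in hours]
```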

  10. HOTS: A Hierarchy of Event-Based Time-Surfaces for Pattern Recognition.

    PubMed

    Lagorce, Xavier; Orchard, Garrick; Galluppi, Francesco; Shi, Bertram E; Benosman, Ryad B

    2017-07-01

This paper describes novel event-based spatio-temporal features called time-surfaces and how they can be used to create a hierarchical event-based pattern recognition architecture. Unlike existing hierarchical architectures for pattern recognition, the presented model relies on a time oriented approach to extract spatio-temporal features from the asynchronously acquired dynamics of a visual scene. These dynamics are acquired using biologically inspired frameless asynchronous event-driven vision sensors. Similarly to cortical structures, subsequent layers in our hierarchy extract increasingly abstract features using increasingly large spatio-temporal windows. The central concept is to use the rich temporal information provided by events to create contexts in the form of time-surfaces which represent the recent temporal activity within a local spatial neighborhood. We demonstrate that this concept can robustly be used at all stages of an event-based hierarchical model. First layer feature units operate on groups of pixels, while subsequent layer feature units operate on the output of lower level feature units. We report results on a previously published 36 class character recognition task and a four class canonical dynamic card pip task, achieving near 100 percent accuracy on each. We introduce a new seven class moving face recognition task, achieving 79 percent accuracy.
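The time-surface concept can be sketched in a few lines: keep the timestamp of the most recent event at each pixel and, for each incoming event, read out an exponentially decayed patch around it. The grid size, neighborhood radius, and decay constant tau below are illustrative choices, not the paper's parameters:

```python
# Sketch of a single time-surface: for an event at (x, y, t), the exponentially
# decayed time since the most recent event at each pixel of a local patch.
import numpy as np

def time_surface(last_ts, x, y, t, radius=2, tau=50e-3):
    """Time-surface patch around event (x, y, t) from per-pixel last timestamps."""
    patch = last_ts[y - radius:y + radius + 1, x - radius:x + radius + 1]
    return np.exp(-(t - patch) / tau)

h, w = 8, 8
last_ts = np.full((h, w), -np.inf)     # pixels with no events map to 0
events = [(3, 3, 0.000), (4, 3, 0.010), (3, 4, 0.020)]  # (x, y, t) in seconds
for x, y, t in events:
    surface = time_surface(last_ts, x, y, t)   # context seen by this event
    last_ts[y, x] = t                          # then record the event

final = time_surface(last_ts, 3, 3, 0.030)     # 5x5 patch centered on (3, 3)
```

In the full HOTS architecture, patches like `final` are clustered into feature prototypes, and each layer's output events feed larger, slower time-surfaces in the next layer.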

  11. An alternative for presenting interactive dynamic data sets in electronic presentations: a scrollable flash movie loop.

    PubMed

    Yam, Chun-Shan

    2007-11-01

    The purpose of this article is to describe an alternative for creating scrollable movie loops for electronic presentations including PowerPoint. The alternative provided in this article enables academic radiologists to present scrollable movie loops in PowerPoint. The scrolling capability is created using Flash ActionScript. A Flash template with the required ActionScript code is provided. Users can simply download the template and follow the step-by-step demonstration to create scrollable movie loops. No previous ActionScript programming knowledge is necessary.

  12. Local dynamic nuclear polarization using quantum point contacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wald, K.R.; Kouwenhoven, L.P.; McEuen, P.L.

    1994-08-15

    We have used quantum point contacts (QPCs) to locally create and probe dynamic nuclear polarization (DNP) in GaAs heterostructures in the quantum Hall regime. DNP is created via scattering between spin-polarized Landau level electrons and the Ga and As nuclear spins, and it leads to hysteresis in the dc transport characteristics. The nuclear origin of this hysteresis is demonstrated by nuclear magnetic resonance (NMR). Our results show that QPCs can be used to create and probe local nuclear spin populations, opening up new possibilities for mesoscopic NMR experiments.

  13. Rupture characteristics of the three M ∼ 4.7 (1992-1994) Parkfield earthquakes

    USGS Publications Warehouse

    Fletcher, Jon Peter B.; Spudich, Paul A.

    1998-01-01

Slip on the San Andreas fault was determined for three M ∼ 4.7 earthquakes using a tomographic inverse system [Beroza and Spudich, 1988] to invert seismic source time functions (STFs) from S waves. STFs were obtained by deconvolving mainshock accelerograms by those from collocated smaller earthquakes. Accelerograms were from the U.S. Geological Survey Parkfield Small Aperture Array (UPSAR) and from a distributed array of digital accelerometer stations at Parkfield. Eight or nine STFs are used in each of the three inversions. STFs are typically symmetrical pulses with a duration of about 0.3–0.5 s. In the inversion, mainshock rise time was set to 0.05 s, and we allowed the rupture time to vary slightly from a constant rupture velocity of approximately 0.85β. Rupture for all three events, which are located in or close to the Middle Mountain preparation zone or box (MMB), quickly reaches a local maximum in slip and then propagates outward to peaks, ridges, or plateaus in the slip distribution. Slip for the October 20, 1992, event (located just inside the southern edge of the MMB) propagates from an initial spike north and updip along a curving ridge for about 2 km. The initial spike continued to grow in the November 14, 1993, event (located north of the October 20, 1992, event just beneath the hypocenter of the 1966 Parkfield earthquake), which shows little directivity, although there is a smaller patch of slip updip and to the south. In contrast, rupture for the December 20, 1994, event (located just south of the October 20, 1992, event) propagated north and slightly updip, creating a rough plateau in slip a few kilometers on a side. Directivity for this event also is to the north. Directivity for all three events points in the approximate direction of the 1966 hypocenter. Small pulses, which comprise a coda, are found on the STFs for several seconds after the initial impulsive event.
Several tests based on the assumption that the average of all STFs from UPSAR for each event is an estimate of the true slip at the source suggest that the codas in the STFs are S waves from a long-duration source rather than uncorrected site response. An initiation phase is found on the array average for the November 14, 1993, and December 20, 1994, events. These precursory phases are the result of a spike in slip at the hypocenter. A value of 2.4–4 mm is obtained for Dc, the slip-weakening distance, by interpreting the initial spike as a critical patch. The few aftershocks for the October 20, 1992, event are distributed to the north and updip of the mainshock, but the November 14, 1993, event had a strong burst of aftershock activity that propagated to the north of its hypocenter at roughly the same depth. Aftershocks of the December 20, 1994, event are mostly updip. The November 14, 1993, event had the simplest slip distribution, appeared to be the most impulsive, and had the most active aftershock sequence and the greatest depth. If the eventual Parkfield earthquake initiates near the 1966 hypocenter, then the directivity of the three events studied here will have pointed to it. However, it is certainly possible that both the initiation of characteristic Parkfield shocks and the directivity of smaller events are controlled by fault properties on a larger scale such as by fault bends or jogs.
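The deconvolution that produces the STFs can be illustrated with water-level spectral division, a common stabilized form of this operation. This is a synthetic sketch, not the authors' tomographic code:

```python
# Sketch of empirical Green's function deconvolution by water-level spectral
# division: recover a source time function from a 'mainshock' record and a
# collocated smaller event. All signals below are synthetic.
import numpy as np

def waterlevel_deconvolve(mainshock, egf, water_level=0.01):
    """Estimate the source time function by stabilized spectral division."""
    n = len(mainshock)
    M = np.fft.rfft(mainshock, n)
    G = np.fft.rfft(egf, n)
    power = np.abs(G) ** 2
    floor = water_level * power.max()          # 'water level' spectral floor
    return np.fft.irfft(M * np.conj(G) / np.maximum(power, floor), n)

# Synthetic check: a known 30-sample boxcar STF convolved with an 'EGF' wavelet
n = 256
egf = np.zeros(n)
egf[10:20] = np.hanning(10)                    # smooth stand-in wavelet
true_stf = np.zeros(n)
true_stf[:30] = 1.0
mainshock = np.fft.irfft(np.fft.rfft(true_stf) * np.fft.rfft(egf), n)
est = waterlevel_deconvolve(mainshock, egf, water_level=1e-4)
```

The water level prevents division by near-zero spectral amplitudes of the small-event record at the cost of slightly smoothing the recovered STF.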

  14. Trial Implementation of a Secure Application Using Ten15

    DTIC Science & Technology

    1991-03-01

    requcrd here. boi suaosin,!y the system would sot allow tis Thusa Ptu pes Event was sed instead which cold he choicedl When the systen, is improved...tuie:ModutelRoPtI III Let createjoatnat - Use ( dataitore, rct,.pset. date tine) In Late,:Fhcepti porn Event. Nu S : ptpets Event)) Letilnt event: Event.* (who...datastore, and a journal is created using Createlivar ts deliver a new pvar with parent root..Pset. Finally journal in initialised to contoa the porn Event

  15. Effects of organizational safety practices and perceived safety climate on PPE usage, engineering controls, and adverse events involving liquid antineoplastic drugs among nurses.

    PubMed

    DeJoy, David M; Smith, Todd D; Woldu, Henok; Dyal, Mari-Amanda; Steege, Andrea L; Boiano, James M

    2017-07-01

Antineoplastic drugs pose risks to the healthcare workers who handle them. This fact notwithstanding, adherence to safe handling guidelines remains inconsistent and often poor. This study examined the effects of pertinent organizational safety practices and perceived safety climate on the use of personal protective equipment, engineering controls, and adverse events (spill/leak or skin contact) involving liquid antineoplastic drugs. Data for this study came from the 2011 National Institute for Occupational Safety and Health (NIOSH) Health and Safety Practices Survey of Healthcare Workers, which included a sample of approximately 1,800 nurses who had administered liquid antineoplastic drugs during the past seven days. Regression modeling was used to examine predictors of personal protective equipment use, engineering controls, and adverse events involving antineoplastic drugs. Approximately 14% of nurses reported experiencing an adverse event while administering antineoplastic drugs during the previous week. Usage of recommended engineering controls and personal protective equipment was quite variable. Usage of both was better in non-profit and government settings, when workers were more familiar with safe handling guidelines, and when perceived management commitment to safety was higher. Usage was poorer in the absence of specific safety handling procedures. The odds of adverse events increased with the number of antineoplastic drug treatments and when antineoplastic drugs were administered more days of the week. The odds of such events were significantly lower when the use of engineering controls and personal protective equipment was greater and when more precautionary measures were in place. Greater levels of management commitment to safety and perceived risk were also related to lower odds of adverse events.
These results point to the value of implementing a comprehensive health and safety program that utilizes available hazard controls and effectively communicates and demonstrates the importance of safe handling practices. Such actions also contribute to creating a positive safety climate.

  16. Optimized breeding strategies for multiple trait integration: II. Process efficiency in event pyramiding and trait fixation.

    PubMed

    Peng, Ting; Sun, Xiaochun; Mumm, Rita H

    2014-01-01

Multiple trait integration (MTI) is a multi-step process of converting an elite variety/hybrid for value-added traits (e.g. transgenic events) through backcross breeding. From a breeding standpoint, MTI involves four steps: single event introgression, event pyramiding, trait fixation, and version testing. This study explores the feasibility of marker-aided backcross conversion of a target maize hybrid for 15 transgenic events in the light of the overall goal of MTI: recovering equivalent performance in the finished hybrid conversion along with reliable expression of the value-added traits. Building on results from optimized single event introgression (Peng et al., Optimized breeding strategies for multiple trait integration: I. Minimizing linkage drag in single event introgression, Mol Breed, 2013), which produced single event conversions of recurrent parents (RPs) with ≤8 cM of residual non-recurrent parent (NRP) germplasm and ~1 cM of NRP germplasm in the 20 cM regions flanking the event, this study focused on optimizing process efficiency in the second and third steps of MTI: event pyramiding and trait fixation. Using computer simulation and probability theory, we aimed to (1) fit an optimal breeding strategy for pyramiding eight events into the female RP and seven into the male RP, and (2) identify optimal breeding strategies for trait fixation to create a 'finished' conversion of each RP homozygous for all events. In addition, next-generation seed needs were taken into account for a practical approach to process efficiency. Building on work by Ishii and Yonezawa (Optimization of the marker-based procedures for pyramiding genes from multiple donor lines: I. Schedule of crossing between the donor lines. Crop Sci 47:537-546, 2007a), a symmetric crossing schedule for event pyramiding was devised for stacking eight (seven) events in a given RP.
Options for trait fixation breeding strategies considered selfing and doubled haploid approaches to achieve homozygosity as well as seed chipping and tissue sampling approaches to facilitate genotyping. With selfing approaches, two generations of selfing rather than one were utilized for trait fixation (i.e. 'F2 enrichment' as per Bonnett et al. in Strategies for efficient implementation of molecular markers in wheat breeding. Mol Breed 15:75-85, 2005) to eliminate bottlenecking due to extremely low frequencies of desired genotypes in the population. Efficiency indicators such as total number of plants grown across generations, total number of marker data points, total number of generations, number of seeds sampled by seed chipping, number of plants requiring tissue sampling, and number of pollinations (i.e. selfing and crossing) were considered in comparisons of breeding strategies. A breeding strategy involving seed chipping and a two-generation selfing approach (SC + SELF) was determined to be the most efficient in terms of time to market and resource requirements. Doubled haploidy may have limited utility in trait fixation for MTI under the defined breeding scenario. This outcome paves the way for optimizing the last step in the MTI process, version testing, which involves hybridization of female and male RP conversions to create versions of the converted hybrid for performance evaluation and possible commercial release.
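The 'extremely low frequencies of desired genotypes' that motivate F2 enrichment follow from simple probability. A back-of-the-envelope sketch assuming unlinked loci (the paper's simulations model far more detail):

```python
# Sketch of the population-size arithmetic behind trait fixation: probability
# of an F2 individual homozygous for the event allele at all n loci, and the
# population needed to recover at least one with 95% confidence. Illustrative
# only; assumes unlinked loci and no selection.
import math

def p_all_homozygous_f2(n_loci):
    """P(homozygous for the event allele at all n unlinked loci) in an F2."""
    return 0.25 ** n_loci

def plants_needed(p, confidence=0.95):
    """Smallest population with P(at least one desired genotype) >= confidence."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p))

p8 = p_all_homozygous_f2(8)    # 8 stacked events in one recurrent parent
n8 = plants_needed(p8)         # roughly 196,000 F2 plants, which is why a
                               # single selfing generation bottlenecks
```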

  17. A numerical approach to model and predict the energy absorption and crush mechanics within a long-fiber composite crush tube

    NASA Astrophysics Data System (ADS)

    Pickett, Leon, Jr.

Past research has conclusively shown that long fiber structural composites possess superior specific energy absorption characteristics as compared to steel and aluminum structures. However, destructive physical testing of composites is very costly and time consuming. As a result, numerical solutions are desirable as an alternative to experimental testing. Up until this point, very little numerical work has been successful in predicting the energy absorption of composite crush structures. This research investigates the ability to use commercially available numerical modeling tools to approximate the energy absorption capability of long-fiber composite crush tubes. This study is significant because it provides a preliminary analysis of the suitability of LS-DYNA to numerically characterize the crushing behavior of a dynamic axial impact crushing event. Composite crushing theory suggests that there are several crushing mechanisms occurring during a composite crush event. This research evaluates the capability and suitability of employing LS-DYNA to simulate the dynamic crush event of an E-glass/epoxy cylindrical tube. The model employed is the composite "progressive failure model", a much more limited failure model when compared to the failure events that naturally occur in experiments. This numerical model employs (1) matrix cracking, (2) compression, and (3) fiber breakage failure modes only. The motivation for the work comes from the need to reduce the significant cost associated with experimental trials. This research chronicles preliminary efforts to better understand the mechanics essential to this goal. The immediate goal is to provide deeper understanding of a composite crush event and ultimately create a viable alternative to destructive testing of composite crush tubes.
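The specific energy absorption metric invoked above reduces to the integral of the load-displacement curve divided by the crushed mass. A minimal sketch with made-up numbers:

```python
# Sketch: specific energy absorption (SEA) from a load-displacement record of
# a crush test, the headline metric behind the comparison above. The force
# plateau and tube mass here are invented for illustration.
import numpy as np

def specific_energy_absorption(force_n, displacement_m, crushed_mass_kg):
    """SEA = absorbed energy / crushed mass; trapezoidal integration of F dx."""
    energy_j = np.sum(0.5 * (force_n[1:] + force_n[:-1]) * np.diff(displacement_m))
    return energy_j / crushed_mass_kg

displacement = np.linspace(0.0, 0.05, 101)   # 50 mm of crush
force = np.full_like(displacement, 40e3)     # idealized 40 kN crush plateau
force[0] = 0.0                               # zero load at initiation
sea = specific_energy_absorption(force, displacement, crushed_mass_kg=0.04)
# energy ~ 40 kN x 0.05 m ~ 2 kJ  ->  SEA ~ 50 kJ/kg
```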

  18. Colloid-facilitated mobilization of metals by freeze-thaw cycles.

    PubMed

    Mohanty, Sanjay K; Saiers, James E; Ryan, Joseph N

    2014-01-21

    The potential of freeze-thaw cycles to release colloids and colloid-associated contaminants into water is unknown. We examined the effect of freeze-thaw cycles on the mobilization of cesium and strontium in association with colloids in intact cores of a fractured soil, where preferential flow paths are prevalent. Two intact cores were contaminated with cesium and strontium. To mobilize colloids and metal cations sequestered in the soil cores, each core was subjected to 10 intermittent wetting events separated by 66 h pauses. During the first five pauses, the cores were dried at room temperature, and during last five pauses, the cores were subjected to 42 h of freezing followed by 24 h of thawing. In comparison to drying, freeze-thaw cycles created additional preferential flow paths through which colloids, cesium, and strontium were mobilized. The wetting events following freeze-thaw intervals mobilized about twice as many colloids as wetting events following drying at room temperature. Successive wetting events following 66 h of drying mobilized similar amounts of colloids; in contrast, successive wetting events after 66 h of freeze-thaw intervals mobilized greater amounts of colloids than the previous one. Drying and freeze-thaw treatments, respectively, increased and decreased the dissolved cesium and strontium, but both treatments increased the colloidal cesium and strontium. Overall, the freeze-thaw cycles increased the mobilization of metal contaminants primarily in association with colloids through preferential flow paths. These findings suggest that the mobilization of colloid and colloid-associated contaminants could increase when temperature variations occur around the freezing point of water. Thus, climate extremes have the potential to mobilize contaminants that have been sequestered in the vadose zone for decades.

  19. Linking Nurses' Clinical Leadership to Patient Care Quality: The Role of Transformational Leadership and Workplace Empowerment.

    PubMed

    Boamah, Sheila

    2018-03-01

Background While improving patient safety requires strong nursing leadership, there has been little empirical research that has examined the mechanisms by which leadership influences patient safety outcomes. Aim To test a model examining relationships among transformational leadership, structural empowerment, staff nurse clinical leadership, and nurse-assessed adverse patient outcomes. Methods A cross-sectional survey was conducted with a randomly selected sample of 378 registered nurses working in direct patient care in acute care hospitals across Ontario, Canada. Structural equation modeling was used to test the hypothesized model. Results The model had an acceptable fit, and all paths were significant. Transformational leadership was significantly associated with decreased adverse patient outcomes through structural empowerment and staff nurse clinical leadership. Discussion This study highlights the importance of transformational leadership in creating empowering practice environments that foster high-quality care. The findings indicate that a more complete understanding of what drives desired patient outcomes requires a focus on how to empower nurses and foster clinical leadership practices at the point of care. Conclusion In planning safety strategies, managers must demonstrate transformational leadership behaviors in order to modify the work environment to create better defenses for averting adverse events.

  20. Native point defects in MoS2 and their influences on optical properties by first principles calculations

    NASA Astrophysics Data System (ADS)

    Saha, Ashim Kumar; Yoshiya, Masato

    2018-03-01

Stability of native point defect species and optical properties are quantitatively examined through first principles calculations in order to identify possible native point defect species in MoS2 and their influences on electronic structures and the resultant optical properties. Possible native point defect species are identified as functions of thermodynamic environment and the location of the Fermi level in MoS2. It is found that sulphur vacancies can be introduced more easily than other point defect species and will create impurity levels both in the bandgap and in the valence band. Additionally, antisite Mo and/or Mo vacancies can be created depending on the chemical potential of sulphur, both of which will also create impurity levels in the bandgap and in the valence band. These impurity levels result in pronounced photon absorption in the visible light region, though each of these point defects alone has limited impact on the optical properties provided its concentration remains low. Thus, care must be taken when MoS2 is intentionally doped with impurities to avoid unwanted modification of its optical properties. These impurity levels may enable further exploitation of photovoltaic energy conversion at longer wavelengths.
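Identifying defects "as functions of thermodynamic environment and Fermi level" rests on standard defect formation-energy bookkeeping; the abstract does not give the formula, so the sketch below uses the conventional expression with placeholder energies, not the paper's computed values:

```python
# Sketch of the standard formation-energy bookkeeping behind defect stability
# diagrams: E_f = E[defective] - E[perfect] - sum(n_i * mu_i) + q(E_F + E_VBM).
# All numerical values below are placeholders, not DFT results.
def formation_energy(e_defect, e_perfect, added, removed, mu, charge=0.0,
                     e_fermi=0.0, e_vbm=0.0):
    """Formation energy of a defect given chemical potentials mu (eV/atom)."""
    e = e_defect - e_perfect
    e -= sum(mu[s] for s in added)       # atoms added to create the defect
    e += sum(mu[s] for s in removed)     # atoms removed (e.g. a vacancy)
    e += charge * (e_fermi + e_vbm)      # charged-defect electron reservoir
    return e

# Placeholder energies (eV): a sulphur vacancy under S-rich vs S-poor conditions
mu_s_rich = {"S": -4.0}
mu_s_poor = {"S": -5.5}
v_s_rich = formation_energy(-92.0, -98.0, [], ["S"], mu_s_rich)
v_s_poor = formation_energy(-92.0, -98.0, [], ["S"], mu_s_poor)
# A lower sulphur chemical potential (S-poor growth) lowers the vacancy
# formation energy, i.e. sulphur vacancies form more easily.
```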

  1. Neural Activity during Encoding Predicts False Memories Created by Misinformation

    ERIC Educational Resources Information Center

    Okado, Yoko; Stark, Craig E. L.

    2005-01-01

    False memories are often demonstrated using the misinformation paradigm, in which a person's recollection of a witnessed event is altered after exposure to misinformation about the event. The neural basis of this phenomenon, however, remains unknown. The authors used fMRI to investigate encoding processes during the viewing of an event and…

  2. A Typology of Language-Brokering Events in Dual-Language Immersion Classrooms

    ERIC Educational Resources Information Center

    Coyoca, Anne Marie; Lee, Jin Sook

    2009-01-01

    This paper examines language-brokering events to better understand how children utilize their linguistic resources to create spaces where the coexistence of two languages can enable or restrict understanding and learning of academic content for themselves and others. An analysis of the structure of language-brokering events reveals that different…

  3. ADESSA: A Real-Time Decision Support Service for Delivery of Semantically Coded Adverse Drug Event Data

    PubMed Central

    Duke, Jon D.; Friedlin, Jeff

    2010-01-01

Evaluating medications for potential adverse events is a time-consuming process, typically involving manual lookup of information by physicians. This process can be expedited by CDS systems that support dynamic retrieval and filtering of adverse drug events (ADEs), but such systems require a source of semantically coded ADE data. We created a two-component system that addresses this need. First we created a natural language processing application which extracts adverse events from Structured Product Labels and generates a standardized ADE knowledge base. We then built a decision support service that consumes a Continuity of Care Document and returns a list of patient-specific ADEs. Our database currently contains 534,125 ADEs from 5602 product labels. An NLP evaluation of 9529 ADEs showed recall of 93% and precision of 95%. On a trial set of 30 CCDs, the system provided adverse event data for 88% of drugs and returned these results in an average of 620 ms. PMID:21346964
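The decision-support lookup, screening a patient's medication list against a semantically coded ADE knowledge base, can be sketched as a simple filter. The drug names, codes, and reactions below are invented stand-ins, not entries from the real database:

```python
# Sketch: screen a medication list against a toy coded ADE knowledge base,
# in the spirit of the decision support service described above.
# All drug names, codes, and reactions are invented placeholders.
ade_knowledge_base = {
    "drug_a": [("C001", "haemorrhage"), ("C002", "nausea")],
    "drug_b": [("C003", "lactic acidosis")],
}

def patient_ades(medications, knowledge_base):
    """Return {drug: [(code, reaction), ...]} for drugs with known ADEs."""
    return {drug: knowledge_base[drug]
            for drug in medications if drug in knowledge_base}

meds_from_ccd = ["drug_a", "drug_c"]   # notionally parsed from a CCD
alerts = patient_ades(meds_from_ccd, ade_knowledge_base)
```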

  4. Public Outreach Guerilla Style: Just Add Science to Existing Events

    NASA Astrophysics Data System (ADS)

    Gelderman, Richard

    2016-01-01

    We report on a campaign to use the visual appeal of astronomy as a gateway drug to inject public outreach into settings where people aren't expecting an encounter with science. Our inspiration came from the team at guerillascience.org, who have earned a reputation for creating, at sites around the world, "experiences and events that are unexpected, thought-provoking, but, above all, that delight and entertain." Our goal is to insert astronomy into existing festivals of music, culture, and art; county and state fairs; sporting events; and local farmer's markets. With volunteers and near-zero budgets, we have been able to meaningfully engage with audience members who would never willingly attend an event advertised as science related. By purposefully relating astronomy to the non-science aspects of the event that caused the audience members to attend, new learning experiences are created that alter the often negative pre-conceived notions about science that many of them held before our encounter.

  5. Surface Deformation Associated with the 1983 Borah Peak Earthquake Measured from Digital Surface Model Differencing

    NASA Astrophysics Data System (ADS)

    Reitman, N. G.; Briggs, R.; Gold, R. D.; DuRoss, C. B.

    2015-12-01

    Post-earthquake, field-based assessments of surface displacement commonly underestimate offsets observed with remote sensing techniques (e.g., InSAR, image cross-correlation) because they fail to capture the total deformation field. Modern earthquakes are readily characterized by comparing pre- and post-event remote sensing data, but historical earthquakes often lack pre-event data. To overcome this challenge, we use historical aerial photographs to derive pre-event digital surface models (DSMs), which we compare to modern, post-event DSMs. Our case study focuses on resolving on- and off-fault deformation along the Lost River fault that accompanied the 1983 M6.9 Borah Peak, Idaho, normal-faulting earthquake. We use 343 aerial images from 1952-1966 and vertical control points selected from National Geodetic Survey benchmarks measured prior to 1983 to construct a pre-event point cloud (average ~ 0.25 pts/m2) and corresponding DSM. The post-event point cloud (average ~ 1 pt/m2) and corresponding DSM are derived from WorldView 1 and 2 scenes processed with NASA's Ames Stereo Pipeline. The point clouds and DSMs are coregistered using vertical control points, an iterative closest point algorithm, and a DSM coregistration algorithm. Preliminary results of differencing the coregistered DSMs reveal a signal spanning the surface rupture that is consistent with tectonic displacement. Ongoing work is focused on quantifying the significance of this signal and error analysis. We expect this technique to yield a more complete understanding of on- and off-fault deformation patterns associated with the Borah Peak earthquake along the Lost River fault and to help improve assessments of surface deformation for other historical ruptures.
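The final differencing step can be sketched as follows; the arrays, the stable-terrain mask, and the median-offset correction are a simplified stand-in for the iterative closest point and DSM coregistration algorithms named above:

```python
# Sketch of DSM differencing with a residual vertical-bias correction
# estimated over stable (off-fault) terrain. Synthetic surfaces only.
import numpy as np

def dsm_difference(pre, post, stable_mask):
    """Post-minus-pre elevation change, corrected for a vertical datum offset."""
    raw = post - pre
    bias = np.median(raw[stable_mask])   # residual coregistration error
    return raw - bias

pre = np.zeros((100, 100))
post = np.full((100, 100), 0.30)         # 0.3 m uncorrected datum shift
post[40:60, :] += 1.5                    # 1.5 m of fault-related offset
stable = np.ones((100, 100), dtype=bool)
stable[30:70, :] = False                 # exclude the deforming zone

dz = dsm_difference(pre, post, stable)
# dz ~ 0 on stable ground, ~1.5 m across the scarp
```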

  6. Evaluating the use of different precipitation datasets in simulating a flood event

    NASA Astrophysics Data System (ADS)

    Akyurek, Z.; Ozkaya, A.

    2016-12-01

    Floods caused by convective storms in mountainous regions are sensitive to the temporal and spatial variability of rainfall. Space-time estimates of rainfall from weather radar, satellites, and numerical weather prediction models can help represent the pattern of the rainfall, albeit with some inaccuracy, so there is a strong need to evaluate the performance and limitations of these estimates in hydrology. This study provides a comparison of gauge, radar, satellite (Hydro-Estimator (HE)) and numerical weather prediction model (Weather Research and Forecasting (WRF)) precipitation datasets during an extreme flood event (22.11.2014) lasting 40 hours in Samsun, Turkey. Hourly rainfall data from 13 ground observation stations were used in the analyses. The event, with a peak discharge of 541 m3/sec, created flooding downstream of the Terme Basin. Comparisons were performed in two parts. First, the analyses were performed in an areal and point-based manner. Second, a semi-distributed hydrological model was used to assess the accuracy of the rainfall datasets in simulating river flows for the flood event. Kalman filtering was used to correct the bias of the radar rainfall data against the gauge measurements. Radar, gauge, corrected radar, HE, and WRF rainfall data were used as model inputs. Generally, the HE product underestimates the cumulative rainfall amounts at all stations; the radar data also underestimate the cumulative amounts but remain consistent. In the mean statistics, WRF performs better than the HE product at almost all stations but worse than the radar dataset. Point comparisons indicated that the radar rainfall estimates capture the temporal trend of the rainfall well but underestimate the maximum values; relative to the cumulative gauge values, radar underestimated the cumulative rainfall amount by 32%. In contrast to the other datasets, the bias of WRF is positive because it overestimates the rainfall forecasts. Radar-based flow predictions demonstrated good potential for successful hydrological modeling, and flow predictions obtained from bias-corrected radar rainfall values produced higher peak flows than those obtained from the raw radar data.
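
    The bias-correction step can be sketched as a scalar Kalman filter on the hourly gauge-radar difference; the random-walk state model, the additive (rather than multiplicative mean-field) bias, and the variances q and r are illustrative choices, not the study's settings:

```python
def kalman_bias_correction(radar, gauge, q=0.01, r=0.25):
    """Correct hourly radar rainfall against gauge measurements by
    tracking an additive mean-field bias with a scalar Kalman filter.

    The state x_k (gauge minus radar, in mm/h) is modelled as a random
    walk with process variance q; each hourly difference is treated as
    a noisy observation of the bias with measurement variance r."""
    x, p = 0.0, 1.0  # initial bias estimate and its variance
    corrected = []
    for z_radar, z_gauge in zip(radar, gauge):
        p = p + q                 # predict: random-walk variance growth
        z = z_gauge - z_radar     # observed bias this hour
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update bias estimate
        p = (1.0 - k) * p
        corrected.append(z_radar + x)
    return corrected

# Radar persistently 1 mm/h low: the corrected series drifts toward the gauge.
corrected = kalman_bias_correction([1.0] * 20, [2.0] * 20)
```

    A recursive filter like this adapts as the bias changes over the event, which is why the corrected series can recover peak flows that raw radar underestimates.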

  7. Transcription as a Threat to Genome Integrity.

    PubMed

    Gaillard, Hélène; Aguilera, Andrés

    2016-06-02

    Genomes undergo different types of sporadic alterations, including DNA damage, point mutations, and genome rearrangements, that constitute the basis for evolution. However, these changes may occur at high levels as a result of cell pathology and trigger genome instability, a hallmark of cancer and a number of genetic diseases. In the last two decades, evidence has accumulated that transcription constitutes an important natural source of DNA metabolic errors that can compromise the integrity of the genome. Transcription can create the conditions for high levels of mutations and recombination by its ability to open the DNA structure and remodel chromatin, making it more accessible to DNA-damaging agents, and by its ability to become a barrier to DNA replication. Here we review the molecular basis of such events from a mechanistic perspective with particular emphasis on the role of transcription as a genome instability determinant.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

    When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

  9. Hope, despair and transformation: Climate change and the promotion of mental health and wellbeing

    PubMed Central

    Fritze, Jessica G; Blashki, Grant A; Burke, Susie; Wiseman, John

    2008-01-01

    Background This article aims to provide an introduction to emerging evidence and debate about the relationship between climate change and mental health. Discussion and Conclusion The authors argue that: i) the direct impacts of climate change such as extreme weather events will have significant mental health implications; ii) climate change is already impacting on the social, economic and environmental determinants of mental health with the most severe consequences being felt by disadvantaged communities and populations; iii) understanding the full extent of the long term social and environmental challenges posed by climate change has the potential to create emotional distress and anxiety; and iv) understanding the psycho-social implications of climate change is also an important starting point for informed action to prevent dangerous climate change at individual, community and societal levels. PMID:18799005

  10. Supercomputing meets seismology in earthquake exhibit

    ScienceCinema

    Blackwell, Matt; Rodger, Arthur; Kennedy, Tom

    2018-02-14

    When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.

  11. Electromagnetic Modeling of the Passive Stabilization Loop at EAST

    NASA Astrophysics Data System (ADS)

    Ji, Xiang; Song, Yuntao; Wu, Songtao; Wang, Zhibin; Shen, Guang; Liu, Xufeng; Cao, Lei; Zhou, Zibo; Peng, Xuebing; Wang, Chenghao

    2012-09-01

    A passive stabilization loop (PSL) has been designed and manufactured in order to enhance the control of vertical instability and accommodate the new stage for high-performance plasma at EAST. Eddy currents are induced by vertical displacement events (VDEs) and disruptions, which can produce a magnetic field to control the vertical instability of the plasma on a short timescale. A finite element model is created and meshed using ANSYS software. Based on the simulation of plasma VDEs and disruptions, the distribution and decay curve of the eddy currents on the PSL are obtained. The largest eddy current is 200 kA and the stress is 68 MPa at the outer current bridge, which is the weakest point of the PSL because of the eddy currents and the magnetic fields. The analysis results provide supporting data for the structural design.

  12. Narrative event boundaries, reading times, and expectation.

    PubMed

    Pettijohn, Kyle A; Radvansky, Gabriel A

    2016-10-01

    During text comprehension, readers create mental representations of the described events, called situation models. When new information is encountered, these models must be updated or new ones created. Consistent with the event indexing model, previous studies have shown that when readers encounter an event shift, reading times often increase. However, such increases are not consistently observed. This paper addresses this inconsistency by examining the extent to which reading-time differences observed at event shifts reflect an unexpectedness in the narrative rather than processes involved in model updating. In two reassessments of prior work, event shifts known to increase reading time were rated as less expected, and expectedness ratings significantly predicted reading time. In three new experiments, participants read stories in which an event shift was or was not foreshadowed, thereby influencing expectedness of the shift. Experiment 1 revealed that readers do not expect event shifts, but foreshadowing eliminates this. Experiment 2 showed that foreshadowing does not affect identification of event shifts. Finally, Experiment 3 found that, although reading times increased when an event shift was not foreshadowed, they were not different from controls when it was. Moreover, responses to memory probes were slower following an event shift regardless of foreshadowing, suggesting that situation model updating had taken place. Overall, the results support the idea that previously observed reading time increases at event shifts reflect, at least in part, a reader's unexpected encounter with a shift rather than an increase in processing effort required to update a situation model.

  13. 3 Ways.

    ERIC Educational Resources Information Center

    Bostick, Darla F.; And Others.

    1982-01-01

    Describes one classroom art activity and two events celebrating Youth Art Month. In the activity, junior high school students created collages and then made drawings from the collages. The events included student art demonstrations in shopping malls and other experiential art workshops. (AM)

  14. PowerPoint Workshop for Teachers[TM].

    ERIC Educational Resources Information Center

    Caughlin, Janet

    This guide for teachers to the Microsoft PowerPoint multimedia presentation program begins with a section that introduces what PowerPoint is and why teachers should use it, Windows 95/98 basics, Macintosh basics, getting started, PowerPoint toolbars, and presentation tips. The next section discusses learning PowerPoint, including creating a…

  15. SU(2) x U(1) vacuum and the Centauro events

    NASA Technical Reports Server (NTRS)

    Kazanas, D.; Balasubrahmanyan, V. K.; Streitmatter, R. E.

    1985-01-01

    It is proposed that the fireballs invoked to explain the Centauro events are bubbles of a metastable superdense state of nuclear matter, created in high-energy (E ~ 10^15 eV) cosmic ray collisions at the top of the atmosphere. If these bubbles are created with a Lorentz factor gamma ~ 10 in their CM frame, the objections against the origin of these events in cosmic ray interactions are overcome. A relationship then emerges between their lifetime, tau, and the threshold energy for bubble formation, E_th, which appears to be insensitive to the value of tau, with E_th always close to 10^15 eV. Finally, it is speculated that these bubbles might be manifestations of the SU(2) x U(1) false vacuum excited in these collisions. The absence of neutral pions in the Centauro events is then explained by the decay modes of these excitations.

  16. Evolution of a plant-specific copper chaperone family for chloroplast copper homeostasis

    DOE PAGES

    Blaby-Haas, Crysten E.; Padilla-Benavides, Teresita; Stübe, Roland; ...

    2014-12-02

    Metallochaperones traffic copper (Cu+) from its point of entry at the plasma membrane to its destination. In plants, one destination is the chloroplast, which houses plastocyanin, a Cu-dependent electron transfer protein involved in photosynthesis. In this paper, we present a previously unidentified Cu+ chaperone that evolved early in the plant lineage by an alternative-splicing event of the pre-mRNA encoding the chloroplast P-type ATPase in Arabidopsis 1 (PAA1). In several land plants, recent duplication events created a separate chaperone-encoding gene coincident with loss of alternative splicing. The plant-specific Cu+ chaperone delivers Cu+ with specificity for PAA1, which is flipped in the envelope relative to prototypical bacterial ATPases, compatible with a role in Cu+ import into the stroma and consistent with the canonical catalytic mechanism of these enzymes. The ubiquity of the chaperone suggests conservation of this Cu+-delivery mechanism and provides a unique snapshot into the evolution of a Cu+ distribution pathway. Finally, we also provide evidence for an interaction between PAA2, the Cu+-ATPase in thylakoids, and the Cu+ chaperone for Cu/Zn superoxide dismutase (CCS), uncovering a Cu+ network that has evolved to fine-tune Cu+ distribution.

  17. 802.11 Wireless Infrastructure To Enhance Medical Response to Disasters

    PubMed Central

    Arisoylu, Mustafa; Mishra, Rajesh; Rao, Ramesh; Lenert, Leslie A.

    2005-01-01

    802.11 (WiFi) is a well established network communications protocol that has wide applicability in civil infrastructure. This paper describes research that explores the design of 802.11 networks enhanced to support data communications in disaster environments. The focus of these efforts is to create network infrastructure to support operations by Metropolitan Medical Response System (MMRS) units and Federally-sponsored regional teams that respond to mass casualty events caused by a terrorist attack with chemical, biological, nuclear or radiological weapons or by a hazardous materials spill. In this paper, we describe an advanced WiFi-based network architecture designed to meet the needs of MMRS operations. This architecture combines a Wireless Distribution System for peer-to-peer multihop connectivity between access points with flexible and shared access to multiple cellular backhauls for robust connectivity to the Internet. The architecture offers a high bandwidth data communications infrastructure that can penetrate into buildings and structures while also supporting commercial off-the-shelf end-user equipment such as PDAs. It is self-configuring and is self-healing in the event of a loss of a portion of the infrastructure. Testing of prototype units is ongoing. PMID:16778990

  18. TITAN: inference of copy number architectures in clonal cell populations from tumor whole-genome sequence data.

    PubMed

    Ha, Gavin; Roth, Andrew; Khattra, Jaswinder; Ho, Julie; Yap, Damian; Prentice, Leah M; Melnyk, Nataliya; McPherson, Andrew; Bashashati, Ali; Laks, Emma; Biele, Justina; Ding, Jiarui; Le, Alan; Rosner, Jamie; Shumansky, Karey; Marra, Marco A; Gilks, C Blake; Huntsman, David G; McAlpine, Jessica N; Aparicio, Samuel; Shah, Sohrab P

    2014-11-01

    The evolution of cancer genomes within a single tumor creates mixed cell populations with divergent somatic mutational landscapes. Inference of tumor subpopulations has been disproportionately focused on the assessment of somatic point mutations, whereas computational methods targeting evolutionary dynamics of copy number alterations (CNA) and loss of heterozygosity (LOH) in whole-genome sequencing data remain underdeveloped. We present a novel probabilistic model, TITAN, to infer CNA and LOH events while accounting for mixtures of cell populations, thereby estimating the proportion of cells harboring each event. We evaluate TITAN on idealized mixtures, simulating clonal populations from whole-genome sequences taken from genomically heterogeneous ovarian tumor sites collected from the same patient. In addition, we show in 23 whole genomes of breast tumors that the inference of CNA and LOH using TITAN critically informs population structure and the nature of the evolving cancer genome. Finally, we experimentally validated subclonal predictions using fluorescence in situ hybridization (FISH) and single-cell sequencing from an ovarian cancer patient sample, thereby recapitulating the key modeling assumptions of TITAN. © 2014 Ha et al.; Published by Cold Spring Harbor Laboratory Press.

  19. MMS observations of magnetic reconnection signatures of dissipating ion inertial-scale flux ropes associated with dipolarization events

    NASA Astrophysics Data System (ADS)

    Poh, G.; Slavin, J. A.; Lu, S.; Le, G.; Cassak, P.; Eastwood, J. P.; Ozturk, D. S.; Zou, S.; Nakamura, R.; Baumjohann, W.; Russell, C. T.; Gershman, D. J.; Giles, B. L.; Pollock, C.; Moore, T. E.; Torbert, R. B.; Burch, J. L.

    2017-12-01

    The formation of flux ropes is thought to be an integral part of the reconnection process and may have important consequences for the onset and subsequent rate of reconnection in the tail. Earthward flows, i.e., bursty bulk flows (BBFs), generate dipolarization fronts (DFs) as they interact with the closed magnetic flux in their path. Global hybrid simulations and THEMIS observations have shown that earthward-moving flux ropes can undergo magnetic reconnection with the near-Earth dipole field in the downtail region between the Near-Earth Neutral Line and the near-Earth dipole field to create DF-like signatures. In this study, we analyzed sequential "chains" of earthward-moving, ion-scale flux ropes embedded within DFs observed during the first MMS tail season. MMS high-resolution plasma measurements indicate that these earthward flux ropes embedded in DFs have a mean bulk flow velocity and diameter of ~250 km/s and ~1000 km (~2-3 ion inertial lengths, λi), respectively. Magnetic reconnection signatures preceding the flux rope/DF encounter were also observed. As the southward-pointing magnetic field in the leading edge of the flux rope reconnects with the northward-pointing geomagnetic field, the characteristic quadrupolar Hall magnetic field in the ion diffusion region and electron outflow jets in the north-south direction are observed. Our results strongly suggest that the earthward-moving flux ropes brake and gradually dissipate due to magnetic reconnection with the near-Earth magnetic field. We have also examined the occurrence rate of these dissipating flux rope/DF events as a function of downtail distance.

  20. Revealing stellar brightness profiles by means of microlensing fold caustics

    NASA Astrophysics Data System (ADS)

    Dominik, M.

    2004-09-01

    With a handful of measurements of limb-darkening coefficients, galactic microlensing has already proven to be a powerful technique for studying atmospheres of distant stars. Survey campaigns such as OGLE-III are capable of providing ~10 suitable target stars per year that undergo microlensing events involving passages over the caustic created by a binary lens, which last from a few hours to a few days and allow us to resolve the stellar atmosphere by frequent broad-band photometry. For a caustic exit lasting 12 h and a photometric precision of 1.5 per cent, a moderate sampling interval of 30 min (corresponding to ~25-30 data points) is sufficient for providing a reliable measurement of the linear limb-darkening coefficient Γ with an uncertainty of ~8 per cent, which reduces to ~3 per cent for a reduced sampling interval of 6 min for the surroundings of the end of the caustic exit. While some additional points over the remaining parts of the light curve are highly valuable, a denser sampling in these regions provides little improvement. Unless an accuracy of less than 5 per cent is desired, limb-darkening coefficients for several filters can be obtained or observing time can be spent on other targets during the same night. The adoption of an inappropriate stellar brightness profile as well as the effect of acceleration between source and caustic yield distinguishable characteristic systematics in the model residuals. Acceleration effects are unlikely to affect the light curve significantly for most events, although a free acceleration parameter blurs the limb-darkening measurement if the passage duration cannot be accurately determined.
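
    The Γ parametrisation of the linear limb-darkening law used here, I(μ) ∝ 1 - Γ(1 - (3/2)μ) with μ = cos ϑ, is constructed so that the disk-integrated flux is independent of Γ. A minimal numerical check of that property (the quadrature scheme and function names are mine, not the paper's):

```python
def limb_darkened_profile(mu, gamma):
    """Linear limb-darkening law in the Gamma parametrisation common in
    microlensing work: I(mu) proportional to 1 - Gamma*(1 - 1.5*mu),
    where mu = cos(theta) is the cosine of the emergent angle."""
    return 1.0 - gamma * (1.0 - 1.5 * mu)

def total_flux(gamma, n=100_000):
    """Disk-integrated flux, F proportional to the integral of
    I(mu)*mu over mu in [0, 1], evaluated with a midpoint rule.
    The parametrisation makes this independent of Gamma."""
    s = 0.0
    for k in range(n):
        mu = (k + 0.5) / n
        s += limb_darkened_profile(mu, gamma) * mu
    return s / n
```

    Decoupling the total flux from Γ is what lets caustic-crossing photometry constrain the brightness profile shape without trading off against the source's overall brightness.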

  1. What can you bring to the table during a hazard event? It's more than exchanging business cards before a crisis

    NASA Astrophysics Data System (ADS)

    Reddy, C. M.

    2015-12-01

    When a hazard event occurs, it creates an intersection of interests with a wide range of stakeholders with different roles, agendas, responsibilities, metrics of success, time scales, training, experience, comfort levels for uncertainty, and expectations to be involved. It is a cultural melting pot that affords little time for a rich and fruitful melded product to develop. I will argue and present that in my field of oil spills, there have been great strides to overcome this challenge by having these different groups meet and learn about each other. While it is one thing to have "exchanged business cards" prior to a crisis, what do you do when the "response" is greeted with academic scientists who want to help? I will argue that you can distill this other challenge to two simple points: (1) the responders need to communicate what they need, what resources they have, how fast they need answers, and what level of certainty they will accept and (2) academics need to understand that responders have little time for a 60-slide PowerPoint presentation on a potentially unproven approach, a long list of publications, and a discussion on H-factors. Simply, academic researchers need to provide a frank and brief description on what they can do, the logistics it demands, and a willingness to be part of a team that simply wants to "put out the fire", save lives, and reduce damages with the potential for little reward from academia's perspective. It can take eight years to get tenure in academia, but a responder at a hazard event may only have eight hours to make a decision with whatever intelligence they have. And that's the rub: wanting to help is just not enough. Academics have to show that you can make a timely difference for the response. In this talk, I will expand on this distinction and provide potential solutions.

  2. Multivariate Statistical Modelling of Drought and Heat Wave Events

    NASA Astrophysics Data System (ADS)

    Manning, Colin; Widmann, Martin; Vrac, Mathieu; Maraun, Douglas; Bevaqua, Emanuele

    2016-04-01

    Compound extreme events are a combination of two or more contributing events which in themselves may not be extreme but through their joint occurrence produce an extreme impact. Compound events are noted in the latest IPCC report as an important type of extreme event that has so far been given little attention. As part of the CE:LLO project (Compound Events: muLtivariate statisticaL mOdelling) we are developing a multivariate statistical model to gain an understanding of the dependence structure of certain compound events. One focus of this project is the interaction between drought and heat wave events. Soil moisture has both a local and a non-local effect on the occurrence of heat waves, as it strongly controls the latent heat flux and thus the transfer of sensible heat to the atmosphere. These processes can create a feedback whereby a heat wave may be amplified or suppressed by the soil moisture preconditioning and, vice versa, the heat wave may in turn affect soil conditions. An aim of this project is to capture this dependence in order to correctly describe the joint probabilities of these conditions and the resulting probability of their compound impact. We will show an application of Pair Copula Constructions (PCCs) to study the aforementioned compound event. PCCs allow, in theory, for the formulation of multivariate dependence structures in any dimension, where the PCC is a decomposition of a multivariate distribution into a product of bivariate components modelled using copulas. 
A copula is a multivariate distribution function that allows one to model the dependence structure of given variables separately from their marginal behaviour. We first look at the structure of soil moisture drought over the whole of France using the SAFRAN dataset between 1959 and 2009. Soil moisture is represented using the Standardised Precipitation Evapotranspiration Index (SPEI). Drought characteristics are computed at the grid-point scale, where drought conditions are identified as those with an SPEI value below -1.0. We model the multivariate dependence structure of drought events defined by certain characteristics and compute return levels of these events. We initially find that drought characteristics such as duration, mean SPEI and the maximum area contiguous to a grid point all have positive correlations, though the degree to which they are correlated can vary considerably in space. A spatial representation of return levels may then provide insight into the areas most prone to drought conditions. As a next step, we analyse the dependence structure between soil moisture conditions preceding the onset of a heat wave and the heat wave itself.
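
    As a minimal illustration of why the dependence structure matters for compound events, the sketch below uses a single bivariate Gaussian copula (a simpler stand-in for the pair-copula constructions described above; the thresholds, the correlation value, and the function names are all illustrative) to show that positive dependence inflates the probability of joint extremes relative to independence:

```python
from scipy.stats import multivariate_normal, norm

def joint_exceedance(u, v, rho):
    """P(U > u, V > v) for uniform marginals coupled by a bivariate
    Gaussian copula with correlation rho. For compound drought-heat
    events, U and V would be probability-transformed soil-moisture
    (SPEI) and temperature variables."""
    x, y = norm.ppf(u), norm.ppf(v)
    mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
    # Survival probability by inclusion-exclusion:
    # P(X > x, Y > y) = 1 - F_X(x) - F_Y(y) + F_XY(x, y)
    return 1.0 - norm.cdf(x) - norm.cdf(y) + mvn.cdf([x, y])

p_indep = joint_exceedance(0.9, 0.9, 0.0)  # independence: 0.1 * 0.1 = 0.01
p_dep = joint_exceedance(0.9, 0.9, 0.6)    # positive dependence inflates this
```

    The same separation of marginals from dependence is what PCCs exploit in higher dimensions, with a copula per pair of variables.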

  3. Surface Modeling and Grid Generation for Iced Airfoils (SmaggIce)

    NASA Technical Reports Server (NTRS)

    Hammond, Brandy M.

    2004-01-01

    Many of the troubles associated with problem solving are alleviated when there is a model that can be used to represent the problem. Through the Advanced Graphics and Visualization (G-VIS) Laboratory and other facilities located within the Research Analysis Center, the Computer Services Division (CSD) is able to develop and maintain programs and software that allow for the modeling of various situations. For example, the Icing Research Branch is devoted to investigating the effect of ice that forms on the wings and other airfoils of airplanes in flight. While tests that physically generate ice and wind on airfoils are run in the laboratories and wind tunnels on site, it would be beneficial if most of the preliminary work could be done outside of the lab. Therefore, individuals from within CSD have collaborated with Icing Research to create SmaggIce. This software allows users to create ice patterns on clean airfoils or open files containing a variety of icing situations, manipulate and measure these forms, generate, divide, and merge grids around these elements for more explicit analysis, and specify and rediscretize subcurves. With a projected completion date of Summer 2005, the majority of the focus of the SmaggIce team is on user functionality and error handling. My primary responsibility is to test the Graphical User Interface (GUI) in SmaggIce in order to ensure usability and verify the expected results of the events (buttons, menus, etc.) within the program. However, there is no standardized, systematic way to test all the possible combinations or permutations of events, not to mention unsolicited events such as errors. Moreover, scripted tests, if not done properly and with a view towards inevitable revision, can introduce more apparent errors within the software and in effect become useless whenever the developers of the program make a slight change in the way a specific process is executed. 
My task therefore requires a brief yet intense study of GUI coverage criteria and the creation of algorithms for GUI test implementation. Nevertheless, there are still heavily graphical features of SmaggIce that must be either corrected or redesigned before its release. A particular feature of SmaggIce is the ability to smooth curves created by control points that form an arbitrary shape into something more amenable to gridding (while maintaining the integrity of the data). This is done with a mathematical model known as Non-Uniform Rational B-Spline (NURBS) curves. The existing NURBS code is written in FORTRAN-77 with static arrays for holding information. My new assignment is to allow for dynamic memory allocation within the code and to make it possible for the developers to call functions from the NURBS code from C.
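
    The curve smoothing described above rests on B-spline evaluation. A minimal sketch of the Cox-de Boor recursion that underlies NURBS curves (the rational weighting by control-point weights and the actual FORTRAN-77 routines are omitted; the names here are mine):

```python
def bspline_basis(i, p, t, knots):
    """Evaluate the i-th B-spline basis function of degree p at
    parameter t via the Cox-de Boor recursion, given a non-decreasing
    knot vector. A NURBS curve additionally weights these bases by
    control-point weights; only the non-rational basis is shown."""
    if p == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    d1 = knots[i + p] - knots[i]
    if d1 > 0.0:
        left = (t - knots[i]) / d1 * bspline_basis(i, p - 1, t, knots)
    d2 = knots[i + p + 1] - knots[i + 1]
    if d2 > 0.0:
        right = (knots[i + p + 1] - t) / d2 * bspline_basis(i + 1, p - 1, t, knots)
    return left + right

# Clamped cubic knot vector: 6 basis functions over the interval [0, 3].
knots = [0.0, 0.0, 0.0, 0.0, 1.0, 2.0, 3.0, 3.0, 3.0, 3.0]
```

    Each basis function is nonzero only over p+1 knot spans, which is why edits to one control point deform the curve only locally, a property that matters when smoothing measured ice shapes without destroying the data elsewhere.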

  4. The power of living things: Living memorials as therapeutic landscapes

    Treesearch

    Heather L. McMillen; Lindsay K. Campbell; Erika S. Svendsen

    2017-01-01

    In response to the events of 11 September 2001 (9/11), many communities came together to create living memorials. Many living memorials were established near the crash sites, but others were created across the United States from urban to rural areas, with designs ranging from entire forests to single trees. They were created by surviving family members, supporters of...

  5. Detection of events of public health importance under the international health regulations: a toolkit to improve reporting of unusual events by frontline healthcare workers.

    PubMed

    MacDonald, Emily; Aavitsland, Preben; Bitar, Dounia; Borgen, Katrine

    2011-09-21

    The International Health Regulations (IHR (2005)) require countries to notify WHO of any event which may constitute a public health emergency of international concern. This notification relies on reports of events occurring at the local level reaching the national public health authorities. By June 2012 WHO member states are expected to have implemented the capacity to "detect events involving disease or death above expected levels for the particular time and place" on the local level and report essential information to the appropriate level of public health authority. Our objective was to develop tools to assist European countries improve the reporting of unusual events of public health significance from frontline healthcare workers to public health authorities. We investigated obstacles and incentives to event reporting through a systematic literature review and expert consultations with national public health officials from various European countries. Multi-day expert meetings and qualitative interviews were used to gather experiences and examples of public health event reporting. Feedback on specific components of the toolkit was collected from healthcare workers and public health officials throughout the design process. Evidence from 79 scientific publications, two multi-day expert meetings and seven qualitative interviews stressed the need to clarify concepts and expectations around event reporting in European countries between the frontline and public health authorities. An analytical framework based on three priority areas for improved event reporting (professional engagement, communication and infrastructure) was developed and guided the development of the various tools. We developed a toolkit adaptable to country-specific needs that includes a guidance document for IHR National Focal Points and nine tool templates targeted at clinicians and laboratory staff: five awareness campaign tools, three education and training tools, and an implementation plan. 
The toolkit emphasizes what to report, the reporting process and the need for follow-up, supported by real examples. This toolkit addresses the importance of mutual exchange of information between frontline healthcare workers and public health authorities. It may potentially increase frontline healthcare workers' awareness of their role in the detection of events of public health concern, improve communication channels and contribute to creating an enabling environment for event reporting. However, the effectiveness of the toolkit will depend on the national body responsible for dissemination and training.

  6. Dynamic temperature and humidity environmental profiles: impact for future emergency and disaster preparedness and response.

    PubMed

    Ferguson, William J; Louie, Richard F; Tang, Chloe S; Paw U, Kyaw Tha; Kost, Gerald J

    2014-02-01

    During disasters and complex emergencies, environmental conditions can adversely affect the performance of point-of-care (POC) testing. Knowledge of these conditions can help device developers and operators understand the significance of the temperature and humidity limits necessary for use of POC devices. First responders will benefit from improved performance for on-site decision making. The objective was to create dynamic temperature and humidity profiles that can be used to assess the environmental robustness of POC devices, reagents, and other resources (e.g., drugs), and thereby improve preparedness. Surface temperature and humidity data from the National Climatic Data Center (Asheville, North Carolina USA) were obtained, median hourly temperature and humidity were calculated, and mathematically stretched profiles were then created to include extreme highs and lows. Profiles were created for: (1) Banda Aceh, Indonesia at the time of the 2004 Tsunami; (2) New Orleans, Louisiana USA just before and after Hurricane Katrina made landfall in 2005; (3) Springfield, Massachusetts USA for an ambulance call during the month of January 2009; (4) Port-au-Prince, Haiti following the 2010 earthquake; (5) Sendai, Japan for the March 2011 earthquake and tsunami, with comparison to the colder month of January 2011; (6) New York, New York USA after Hurricane Sandy made landfall in 2012; and (7) a 24-hour rescue from Hawaii USA to the Marshall Islands. Profiles were validated by randomly selecting 10 days and determining whether (1) temperature and humidity points fell inside the profiles and (2) daily variations were encompassed. Mean kinetic temperature (MKT) was also assessed for each profile. Profiles accurately modeled conditions during emergency and disaster events and enclosed 100% of maximum and minimum temperature and humidity points. Daily variations also were represented well, with 88.6% (62/70) of temperature readings and 71.1% (54/70) of relative humidity readings falling within diurnal patterns. 
Days not represented well primarily had continuously high humidity. Mean kinetic temperature was useful for severity ranking. Simulating temperature and humidity conditions clearly reveals operational challenges encountered during disasters and emergencies. Understanding of environmental stresses and MKT leads to insights regarding operational robustness necessary for safe and accurate use of POC devices and reagents. Rescue personnel should understand these principles before performing POC testing in adverse environments.
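The severity ranking above relies on mean kinetic temperature, which weights warm excursions more heavily than an arithmetic mean via an Arrhenius-type relation. A minimal sketch follows; the function name and the default activation energy (~83.144 kJ/mol, a value commonly used for MKT) are illustrative assumptions, not taken from the study:

```python
import math

def mean_kinetic_temperature(temps_k, delta_h=83144.0, r=8.3144):
    """MKT (kelvin) of a series of temperatures (kelvin).

    MKT = (dH/R) / -ln( mean of exp(-dH/(R*T_i)) ); because exp() rises
    steeply with T, hot periods dominate the result, which is why MKT is
    useful for ranking how punishing an environmental profile is.
    """
    n = len(temps_k)
    s = sum(math.exp(-delta_h / (r * t)) for t in temps_k) / n
    return (delta_h / r) / (-math.log(s))
```

For a constant-temperature profile MKT reduces exactly to that temperature; for a fluctuating profile it exceeds the arithmetic mean, capturing the disproportionate effect of hot spells on reagents.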

  7. Practitioner Expertise: Creating Quality within the Daily Tumble of Events in Youth Settings

    ERIC Educational Resources Information Center

    Larson, Reed W.; Rickman, Aimee N.; Gibbons, Colleen M.; Walker, Kathrin C.

    2009-01-01

    Practitioners in youth settings experience life on the ground as a tumble of events, shaped by a confluence of youth needs, institutional expectations, and other inputs. The quality of the setting is determined in part by practitioners' expertise in shaping and responding to these events. The situations that arise in practice, and how staff…

  8. Students' Use of Languaging in Rewriting Events from "The Things They Carried"

    ERIC Educational Resources Information Center

    Beach, Richard

    2017-01-01

    This article describes high school students' responses to events in the novel, "The Things They Carried," leading to their collaborative rewriting to create their own narrative versions of these events. It draws on "enactivist" theory of languaging, an approach to language that focuses on its use as social actions to enact and…

  9. Hydrologic data summary for the St. Lucie River Estuary, Martin and St. Lucie Counties, Florida, 1998-2001

    USGS Publications Warehouse

    Byrne, Michael J.; Patino, Eduardo

    2004-01-01

    A hydrologic analysis was made at three canal sites and four tidal sites along the St. Lucie River Estuary in southeastern Florida from 1998 to 2001. The data included for analysis are stage, 15-minute flow, salinity, water temperature, turbidity, and suspended-solids concentration. During the period of record, the estuary experienced a drought, major storm events, and high-water discharge from Lake Okeechobee. Flow mainly occurred through the South Fork of the St. Lucie River; however, when flow increased through control structures along the C-23 and C-24 Canals, the North Fork was a larger than usual contributor of total freshwater inflow to the estuary. At one tidal site (Steele Point), the majority of flow was southward toward the St. Lucie Inlet; at a second tidal site (Indian River Bridge), the majority of flow was northward into the Indian River Lagoon. Large-volume stormwater discharge events greatly affected the St. Lucie River Estuary. Increased discharge typically was accompanied by salinity decreases that resulted in water becoming and remaining fresh throughout the estuary until the discharge events ended. Salinity in the estuary usually returned to prestorm levels within a few days after the events. Turbidity decreased and salinity began to increase almost immediately when the gates at the control structures closed. Salinity ranged from less than 1 to greater than 35 parts per thousand during the period of record (1998-2001), and typically varied by several parts per thousand during a tidal cycle. Suspended-solids concentrations were observed at one canal site (S-80) and two tidal sites (Speedy Point and Steele Point) during a discharge event in April and May 2000. Results suggest that most deposition of suspended solids occurs between S-80 and Speedy Point. The turbidity data collected also support this interpretation. 
The ratio of inorganic to organic suspended-solids concentration observed at S-80, Speedy Point, and Steele Point during the discharge event indicates that most flocculation of suspended solids occurs between Speedy Point and Steele Point.

  10. Automation of radiation treatment planning : Evaluation of head and neck cancer patient plans created by the Pinnacle3 scripting and Auto-Planning functions.

    PubMed

    Speer, Stefan; Klein, Andreas; Kober, Lukas; Weiss, Alexander; Yohannes, Indra; Bert, Christoph

    2017-08-01

    Intensity-modulated radiotherapy (IMRT) techniques are now standard practice. IMRT or volumetric-modulated arc therapy (VMAT) allow treatment of the tumor while simultaneously sparing organs at risk. Nevertheless, treatment plan quality still depends on the physicist's individual skills, experience, and personal preferences. It would therefore be advantageous to automate the planning process. This possibility is offered by the Pinnacle3 treatment planning system (Philips Healthcare, Hamburg, Germany) via its scripting language or Auto-Planning (AP) module. AP module results were compared to in-house scripts and manually optimized treatment plans for standard head and neck cancer plans. Multiple treatment parameters were scored to judge plan quality (100 points = optimum plan). Patients were initially planned manually by different physicists and re-planned using scripts or AP. Script-based head and neck plans achieved a mean of 67.0 points and were, on average, superior to manually created plans (59.1 points) and AP plans (62.3 points). Moreover, they are characterized by reproducibility and a lower standard deviation of treatment parameters. Even less experienced staff are able to create at least a good starting point for further optimization in a short time. However, for particular plans, experienced planners perform even better than scripts or AP. Experienced-user input is needed when setting up scripts or AP templates for the first time. Moreover, some minor drawbacks exist, such as the increase in monitor units (+35.5% for scripted plans). On average, automatically created plans are superior to manually created treatment plans. For particular plans, experienced physicists were able to perform better than scripts or AP; thus, the benefit is greatest when time is short or staff inexperienced.

  11. Impact of High-Reliability Education on Adverse Event Reporting by Registered Nurses.

    PubMed

    McFarland, Diane M; Doucette, Jeffrey N

    Adverse event reporting is one strategy to identify risks and improve patient safety, but, historically, adverse events are underreported by registered nurses (RNs) because of fear of retribution and blame. A program was provided on high reliability to examine whether education would impact RNs' willingness to report adverse events. Although the findings were not statistically significant, they demonstrated a positive impact on adverse event reporting and support the need to create a culture of high reliability.

  12. Flow topology of rare back flow events and critical points in turbulent channels and toroidal pipes

    NASA Astrophysics Data System (ADS)

    Chin, C.; Vinuesa, R.; Örlü, R.; Cardesa, J. I.; Noorani, A.; Schlatter, P.; Chong, M. S.

    2018-04-01

    A study of the back flow events and critical points in the flow through a toroidal pipe at friction Reynolds number Reτ ≈ 650 is performed and compared with the results in a turbulent channel flow at Reτ ≈ 934. The statistics and topological properties of the back flow events are analysed and discussed. Conditionally-averaged flow fields in the vicinity of the back flow event are obtained, and the results for the torus show a similar streamwise wall-shear stress topology, which varies considerably for the spanwise wall-shear stress when compared to the channel flow. The comparison between the toroidal pipe and channel flows also shows fewer back flow events and critical points in the torus. This cannot be attributed solely to differences in Reynolds number; it is an effect of the secondary flow present in the torus, which convects momentum from the inner to the outer bend through the core of the pipe and back from the outer to the inner bend along the pipe walls. In the region around the critical points, the skin-friction streamlines and vorticity lines exhibit similar flow characteristics, with a node and saddle pair for both flows. These results indicate that back flow events and critical points are genuine features of wall-bounded turbulence, and are not artifacts of specific boundary or inflow conditions in simulations and/or measurement uncertainties in experiments.

  13. 14 CFR 437.59 - Key flight-safety event limitations.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... suborbital rocket's instantaneous impact point, including its expected dispersion, is over an unpopulated or... rocket engine, (2) Any staging event, or (3) Any envelope expansion. (b) A permittee must conduct each reusable suborbital rocket flight so that the reentry impact point does not loiter over a populated area. ...

  14. 14 CFR 437.59 - Key flight-safety event limitations.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... suborbital rocket's instantaneous impact point, including its expected dispersion, is over an unpopulated or... rocket engine, (2) Any staging event, or (3) Any envelope expansion. (b) A permittee must conduct each reusable suborbital rocket flight so that the reentry impact point does not loiter over a populated area. ...

  15. 14 CFR 437.59 - Key flight-safety event limitations.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... suborbital rocket's instantaneous impact point, including its expected dispersion, is over an unpopulated or... rocket engine, (2) Any staging event, or (3) Any envelope expansion. (b) A permittee must conduct each reusable suborbital rocket flight so that the reentry impact point does not loiter over a populated area. ...

  16. 14 CFR 437.59 - Key flight-safety event limitations.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... suborbital rocket's instantaneous impact point, including its expected dispersion, is over an unpopulated or... rocket engine, (2) Any staging event, or (3) Any envelope expansion. (b) A permittee must conduct each reusable suborbital rocket flight so that the reentry impact point does not loiter over a populated area. ...

  17. 14 CFR 437.59 - Key flight-safety event limitations.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... suborbital rocket's instantaneous impact point, including its expected dispersion, is over an unpopulated or... rocket engine, (2) Any staging event, or (3) Any envelope expansion. (b) A permittee must conduct each reusable suborbital rocket flight so that the reentry impact point does not loiter over a populated area. ...

  18. American Heart Association’s Life’s Simple 7: Avoiding Heart Failure and Preserving Cardiac Structure and Function

    PubMed Central

    Folsom, Aaron R.; Shah, Amil M.; Lutsey, Pamela L.; Roetker, Nicholas S.; Alonso, Alvaro; Avery, Christy L.; Miedema, Michael D.; Konety, Suma; Chang, Patricia P.; Solomon, Scott D.

    2015-01-01

    BACKGROUND Many people may underappreciate the role of lifestyle in avoiding heart failure. We estimated whether greater adherence in middle age to the American Heart Association’s Life’s Simple 7 guidelines (on smoking, body mass, physical activity, diet, cholesterol, blood pressure, and glucose) is associated with lower lifetime risk of heart failure and greater preservation of cardiac structure and function in old age. METHODS We studied the population-based Atherosclerosis Risk in Communities Study cohort of 13,462 adults aged 45-64 years in 1987-89. From the 1987-89 risk factor measurements, we created a Life’s Simple 7 score (range 0-14, giving 2 points for ideal, 1 point for intermediate, and 0 points for poor components). We identified 2,218 incident heart failure events using surveillance of hospital discharge and death codes through 2011. In addition, in 4,855 participants free of clinical cardiovascular disease in 2011-13, we performed echocardiography, from which we quantified left ventricular hypertrophy and diastolic dysfunction. RESULTS One in four participants (25.5%) developed heart failure through age 85. Yet, this lifetime heart failure risk was 14.4% for those with a middle-age Life’s Simple 7 score of 10-14 (optimal), 26.8% for a score of 5-9 (average), and 48.6% for a score of 0-4 (inadequate). Among those with no clinical cardiovascular event, the prevalence of left ventricular hypertrophy in late life was approximately 40% as common, and diastolic dysfunction was approximately 60% as common, among those with an optimal middle-age Life’s Simple 7 score compared with an inadequate score. CONCLUSIONS Greater achievement of the American Heart Association’s Life’s Simple 7 in middle age is associated with a lower lifetime occurrence of heart failure and greater preservation of cardiac structure and function. PMID:25908393
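The scoring rule the abstract describes (2 points for ideal, 1 for intermediate, 0 for poor, summed over seven components, range 0-14) and the study's bands (optimal 10-14, average 5-9, inadequate 0-4) can be written down directly; the function and component names below are illustrative, not the study's code:

```python
# Points per category, as stated in the abstract.
POINTS = {"ideal": 2, "intermediate": 1, "poor": 0}

# The seven Life's Simple 7 components named in the abstract.
COMPONENTS = ("smoking", "body_mass", "physical_activity", "diet",
              "cholesterol", "blood_pressure", "glucose")

def simple7_score(ratings):
    """Total score 0-14 from a dict mapping each component to a category."""
    return sum(POINTS[ratings[c]] for c in COMPONENTS)

def simple7_band(score):
    """Bands used in the study: optimal 10-14, average 5-9, inadequate 0-4."""
    if score >= 10:
        return "optimal"
    if score >= 5:
        return "average"
    return "inadequate"
```

For example, a participant rated "ideal" on every component scores 14 and lands in the optimal band, the group with the 14.4% lifetime heart failure risk reported above.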

  19. Co-designing for quality: Creating a user-driven tool to improve quality in youth mental health services.

    PubMed

    Hackett, Christina L; Mulvale, Gillian; Miatello, Ashleigh

    2018-04-29

    Although high quality mental health care for children and youth is a goal of many health systems, little is known about the dimensions of quality mental health care from users' perspectives. We engaged young people, caregivers and service providers to share experiences, which shed light on quality dimensions for youth mental health care. Using experience-based co-design (EBCD), we collected qualitative data from young people aged 16-24 with a mental disorder (n = 19), identified caregivers (n = 12) and service providers (n = 14) about their experiences with respect to youth mental health services. Experience data were collected using multiple approaches including interviews, a suite of online and smartphone applications (n = 22), and a co-design event (n = 16), and analysed to extract touch points. These touch points were used to prioritize and co-design a user-driven prototype of a questionnaire to provide feedback to service providers. Reports of service experiences from young people, caregivers and service providers were used to identify aspects of care quality at the following mental health service contact points: Access to mental health care; Transfer to/from hospital; Intake into hospital; Services provided; Assessment and treatment; Treatment environment; and Caregiver involvement in care. In some cases, low quality care was harmful to users and their caregivers. Young people co-designed a prototype of a user-driven feedback questionnaire to improve the quality of service experiences, which was supported by service providers and caregivers at the co-design event. By using EBCD to capture in-depth data regarding the experiences of young people, their caregivers and service providers, study participants have begun to establish a baseline for acceptable quality of mental health care for young people. © 2018 The Authors. Health Expectations published by John Wiley & Sons Ltd.

  20. One-shot Synesthesia.

    PubMed

    Kirschner, Alexandra; Nikolić, Danko

    2017-01-01

    Synesthesia is commonly thought to be a phenomenon of fixed associations between an outside inducer and a vivid concurrent experience. Hence, it has been proposed that synesthesia occurs due to additional connections in the brain with which synesthetes are born. Here we show that synesthesia can be a much richer and more flexible phenomenon, with a capability to creatively construct novel synesthetic experiences as events unfold in people's lives. We describe here cases of synesthetes who occasionally generate novel synesthetic experiences, called one-shot synesthesias. These synesthetic experiences seem to share all the properties of the classical synesthetic associations except that they occur extremely rarely, with people recalling only a few such events over a lifetime. It appears that these one-shots are not created at random but are instead responses to specific life events. We contrast the properties of those rare synesthetic events with other, more commonly known forms of synesthesia that also create novel synesthetic experiences, but at a high rate, sometimes creating novel experiences every few seconds. We argue that one-shot synesthesias indicate that synesthetic associations are by their nature not prewired at birth but are dynamically constructed through mental operations and according to the needs of a synesthetic mind. Our conclusions have implications for understanding the biological underpinnings of synesthesia and the role the phenomenon plays in the lives of people endowed with synesthetic capacities.

  1. Misconceptions of Synthetic Biology: Lessons from an Interdisciplinary Summer School

    NASA Technical Reports Server (NTRS)

    Verseux, Cyprien; Acevedo-Rocha, Carlos G.; Chizzolini, Fabio; Rothschild, Lynn J.

    2016-01-01

    In 2014, an international group of scholars from various fields analysed the "societal dimensions" of synthetic biology in an interdisciplinary summer school. Here, we report and discuss the biologists' observations on the general perception of synthetic biology by non-biologists who took part in this event. Most attendees mainly associated synthetic biology with contributions from the best-known public figures of the field, rarely mentioning other scientists. Media extrapolations of those contributions appeared to have created unrealistic expectations and irrelevant fears that were widely disconnected from the current research in synthetic biology. Another observation was that when debating developments in synthetic biology, semantics strongly mattered: depending on the terms used to present an application of synthetic biology, attendees reacted in radically different ways. For example, using the term "GMOs" (genetically modified organisms) rather than the term "genetic engineering" led to very different reactions. Stimulating debates also happened with participants having unanticipated points of view, for instance biocentrist ethicists who argued that engineered microbes should not be used for human purposes. Another communication challenge emerged from the connotations and inaccuracies surrounding the word "life", which impaired constructive debates, thus leading to misconceptions about the abilities of scientists to engineer or even create living organisms. Finally, it appeared that synthetic biologists tend to overestimate the knowledge of non-biologists, further affecting communication. The motivation and ability of synthetic biologists to communicate their work outside their research field needs to be fostered, notably towards policymakers who need a more accurate and technical understanding of the field to make informed decisions. 
Interdisciplinary events gathering scholars working in and around synthetic biology are an effective tool in addressing those issues.

  2. Comparative Dosimetric Estimates of a 25 keV Electron Micro-beam with three Monte Carlo Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mainardi, Enrico; Donahue, Richard J.; Blakely, Eleanor A.

    2002-09-11

    The calculations presented compare the performance of three Monte Carlo codes, PENELOPE-1999, MCNP-4C and PITS, for the evaluation of dose profiles from a 25 keV electron micro-beam traversing individual cells. The overall model of a cell is a water cylinder equivalent for the three codes, but with a different internal scoring geometry: hollow cylinders for PENELOPE and MCNP, whereas spheres are used for the PITS code. A cylindrical cell geometry with scoring volumes in the shape of hollow cylinders was initially selected for PENELOPE and MCNP because of its superior simulation of the actual shape and dimensions of a cell and for its improved computer-time efficiency compared to spherical internal volumes. Some of the transfer points and energy transfers that constitute a radiation track may actually fall in the space between spheres, which would be outside the spherical scoring volume. This internal geometry, along with the PENELOPE algorithm, drastically reduced the computer time when using this code compared with event-by-event Monte Carlo codes like PITS. This preliminary work has been important for addressing dosimetric estimates at low electron energies. It demonstrates that codes like PENELOPE can be used for dose evaluation, even with such small geometries and energies involved, which are far below the normal use for which the code was created. Further work (initiated in Summer 2002) is still needed, however, to create a user-code for PENELOPE that allows uniform comparison of exact cell geometries, integral volumes and also microdosimetric scoring quantities, a field where track-structure codes like PITS, written for this purpose, are believed to be superior.

  3. Construction and Updating of Event Models in Auditory Event Processing

    ERIC Educational Resources Information Center

    Huff, Markus; Maurer, Annika E.; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-01-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose changes in the sensory information to trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event…

  4. Assessing the Regional Frequency, Intensity, and Spatial Extent of Tropical Cyclone Rainfall

    NASA Astrophysics Data System (ADS)

    Bosma, C.; Wright, D.; Nguyen, P.

    2017-12-01

    While the strength of a hurricane is generally classified based on its wind speed, the unprecedented rainfall-driven flooding experienced in southeastern Texas during Hurricane Harvey clearly highlights the need for better understanding of the hazards associated with extreme rainfall from hurricanes and other tropical systems. In this study, we seek to develop a framework for describing the joint probabilistic and spatio-temporal properties of extreme rainfall from hurricanes and other tropical systems. Furthermore, we argue that commonly used terminology, such as the "500-year storm," fails to convey the true properties of tropical cyclone rainfall occurrences in the United States. To quantify the magnitude and spatial extent of these storms, a database consisting of hundreds of unique rainfall volumetric shapes (or "voxels") was created. Each voxel is a four-dimensional object, created by connecting, in both space and time, gridded rainfall observations from the daily, gauge-based NOAA CPC-Unified precipitation dataset. Individual voxels were then associated with concurrent tropical cyclone tracks from NOAA's HURDAT-2 archive to create distinct representations of the rainfall associated with every Atlantic tropical system making landfall over (or passing near) the United States since 1948. Using these voxels, a series of threshold-excess extreme value models were created to estimate the recurrence intervals of extreme tropical cyclone rainfall, both nationally and locally, for single and multi-day timescales. This voxel database also allows for the "indexing" of past events, placing recent extremes, such as the 50+ inches of rain observed during Hurricane Harvey, into a national context and emphasizing how rainfall totals that are rare at the point scale may be more frequent from a regional perspective.
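The voxel construction described above amounts to grouping rainfall grid cells that exceed a threshold and touch in space or time. A toy sketch under simplifying assumptions (a small day x row x column grid with 6-connectivity flood fill; the function name is hypothetical and the study's actual dataset, thresholds, and connectivity rules are not reproduced here):

```python
from collections import deque

def rain_voxels(grid, threshold):
    """Group cells with value > threshold into space-time 'voxels'.

    grid is indexed [day][row][col]; two wet cells belong to the same
    voxel if they are adjacent along the day, row, or column axis.
    Returns a list of voxels, each a list of (day, row, col) cells.
    """
    days, rows, cols = len(grid), len(grid[0]), len(grid[0][0])
    seen, voxels = set(), []
    for d in range(days):
        for r in range(rows):
            for c in range(cols):
                if grid[d][r][c] > threshold and (d, r, c) not in seen:
                    comp, queue = [], deque([(d, r, c)])
                    seen.add((d, r, c))
                    while queue:  # breadth-first flood fill
                        cd, cr, cc = queue.popleft()
                        comp.append((cd, cr, cc))
                        for dd, dr, dc in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                            nd, nr, nc = cd + dd, cr + dr, cc + dc
                            if (0 <= nd < days and 0 <= nr < rows
                                    and 0 <= nc < cols
                                    and (nd, nr, nc) not in seen
                                    and grid[nd][nr][nc] > threshold):
                                seen.add((nd, nr, nc))
                                queue.append((nd, nr, nc))
                    voxels.append(comp)
    return voxels
```

Summing rainfall over a voxel's cells would then give the event volume used to rank storms regionally rather than at a single gauge.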

  5. Optimal Synthesis of Compliant Mechanisms using Subdivision and Commercial FEA (DETC2004-57497)

    NASA Technical Reports Server (NTRS)

    Hull, Patrick V.; Canfield, Stephen

    2004-01-01

    The field of distributed-compliance mechanisms has seen significant work in developing suitable topology optimization tools for their design. These optimal design tools have grown out of the techniques of structural optimization. This paper will build on the previous work in topology optimization and compliant mechanism design by proposing an alternative design space parameterization through control points and adding another step to the process, that of subdivision. The control points allow a specific design to be represented as a solid model during the optimization process. The process of subdivision creates an additional number of control points that help smooth the surface (for example, a C^2-continuous surface, depending on the method of subdivision chosen), creating a manufacturable design free of some traditional numerical instabilities. Note that these additional control points do not add to the number of design parameters. This alternative parameterization and description as a solid model effectively and completely separates the design variables from the analysis variables during the optimization procedure. The motivation behind this work is to create an automated design tool from task definition to functional prototype created on a CNC or rapid-prototype machine. This paper will describe the proposed compliant mechanism design process and will demonstrate the procedure on several examples common in the literature.
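The abstract does not name its subdivision scheme, so as a hedged illustration of the general idea (subdivision inserts new points that smooth the shape without adding design parameters), here is Chaikin's classic corner-cutting scheme for an open 2-D polyline; it is a stand-in, not the paper's method:

```python
def chaikin(points, iterations=1):
    """Chaikin corner-cutting: each edge (p, q) is replaced by the two
    points 3/4*p + 1/4*q and 1/4*p + 3/4*q. Repeated application smooths
    the polygon (it converges to a quadratic B-spline curve) while the
    original control points remain the only free design parameters.
    """
    for _ in range(iterations):
        refined = []
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            refined.append((0.75 * x0 + 0.25 * x1, 0.75 * y0 + 0.25 * y1))
            refined.append((0.25 * x0 + 0.75 * x1, 0.25 * y0 + 0.75 * y1))
        points = refined
    return points
```

Each pass roughly doubles the number of points, mirroring the abstract's observation that subdivision adds control points for smoothness without enlarging the optimization's design vector.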

  6. Seismic catalog condensation with applications to multifractal analysis of South Californian seismicity

    NASA Astrophysics Data System (ADS)

    Kamer, Yavor; Ouillon, Guy; Sornette, Didier; Wössner, Jochen

    2014-05-01

    Latest advances in the instrumentation field have increased station coverage and lowered event detection thresholds. This has resulted in a vast increase in the number of located events each year. The abundance of data comes as a double-edged sword: while it facilitates more robust statistics and provides better confidence intervals, it also paralyzes computations whose execution times grow exponentially with the number of data points. In this study, we present a novel method that assesses the relative importance of each data point and reduces the size of datasets while preserving their information content. For a given seismic catalog, the goal is to express the same spatial probability density distribution with fewer data points. To achieve this, we exploit the fact that seismic catalogs are not optimally encoded. This coding deficiency is the result of sequential data entry, where new events are added without taking into account previous ones. For instance, if there are several events with identical parameters occurring at the same location, these could be grouped together rather than occupying the same memory space as if they were distinct events. Following this reasoning, the proposed condensation methodology is implemented by grouping all events according to their overall variance. Starting from the group with the highest variance (worst location uncertainty), each event is sampled by a number of sample points; these points are then used to determine which better-located events are able to express these probable locations with a higher likelihood. Based on these likelihood comparisons, weights from poorly located events are successively transferred to better located ones. As a result of the process, a large portion of the events (~30%) ends up with zero weights (thus being fully represented by events increasing their weights), while the information content (i.e., the sum of all weights) remains preserved. 
The resulting condensed catalog not only provides a more optimal encoding but is also regularized with respect to the local information quality. By investigating the locations of mass enrichment and depletion at different scales, we observe that the areas of increased mass are in good agreement with reported surface fault traces. We also conduct multifractal spatial analysis on condensed catalogs and investigate different spatial scaling regimes made clearer by reducing the effect of location uncertainty.
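As a toy illustration of the simplest case the abstract mentions (events with identical parameters sharing one weighted entry), the sketch below collapses duplicates while conserving total weight. The actual method goes much further, transferring weight between distinct events by likelihood comparison against location uncertainty, which is not reproduced here; all names are hypothetical:

```python
def condense_identical(events):
    """Merge events with identical (lat, lon, depth) into one weighted
    entry. The sum of weights (the 'information content' in the
    abstract's terms) is preserved exactly.
    """
    weights = {}
    for ev in events:
        key = (ev["lat"], ev["lon"], ev["depth"])
        weights[key] = weights.get(key, 0.0) + ev.get("w", 1.0)
    return [{"lat": lat, "lon": lon, "depth": depth, "w": w}
            for (lat, lon, depth), w in weights.items()]
```

Even this trivial step shows the invariant the full method maintains: fewer records, unchanged total weight, identical implied spatial density.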

  7. Guidelines for time-to-event end point definitions in sarcomas and gastrointestinal stromal tumors (GIST) trials: results of the DATECAN initiative (Definition for the Assessment of Time-to-event Endpoints in CANcer trials)†.

    PubMed

    Bellera, C A; Penel, N; Ouali, M; Bonvalot, S; Casali, P G; Nielsen, O S; Delannes, M; Litière, S; Bonnetain, F; Dabakuyo, T S; Benjamin, R S; Blay, J-Y; Bui, B N; Collin, F; Delaney, T F; Duffaud, F; Filleron, T; Fiore, M; Gelderblom, H; George, S; Grimer, R; Grosclaude, P; Gronchi, A; Haas, R; Hohenberger, P; Issels, R; Italiano, A; Jooste, V; Krarup-Hansen, A; Le Péchoux, C; Mussi, C; Oberlin, O; Patel, S; Piperno-Neumann, S; Raut, C; Ray-Coquard, I; Rutkowski, P; Schuetze, S; Sleijfer, S; Stoeckle, E; Van Glabbeke, M; Woll, P; Gourgou-Bourgade, S; Mathoulin-Pélissier, S

    2015-05-01

    The use of potential surrogate end points for overall survival, such as disease-free survival (DFS) or time-to-treatment failure (TTF), is increasingly common in randomized controlled trials (RCTs) in cancer. However, the definition of time-to-event (TTE) end points is rarely precise and lacks uniformity across trials. End point definition can impact trial results by affecting estimation of treatment effect and statistical power. The DATECAN initiative (Definition for the Assessment of Time-to-event End points in CANcer trials) aims to provide recommendations for definitions of TTE end points. We report guidelines for RCTs in sarcomas and gastrointestinal stromal tumors (GIST). We first carried out a literature review to identify TTE end points (primary or secondary) reported in publications of RCTs. An international multidisciplinary panel of experts proposed recommendations for the definitions of these end points. Recommendations were developed through a validated consensus method formalizing the degree of agreement among experts. Recommended guidelines for the definition of TTE end points commonly used in RCTs for sarcomas and GIST are provided for adjuvant and metastatic settings, including DFS, TTF, time to progression and others. Use of standardized definitions should facilitate comparison of trials' results, and improve the quality of trial design and reporting. These guidelines could be of particular interest to research scientists involved in the design, conduct, reporting or assessment of RCTs, such as investigators, statisticians, reviewers, editors or regulatory authorities. © The Author 2014. Published by Oxford University Press on behalf of the European Society for Medical Oncology. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  8. 3-D Deformation Field Of The 2010 El Mayor-Cucapah (Mexico) Earthquake From Matching Before To After Aerial Lidar Point Clouds

    NASA Astrophysics Data System (ADS)

    Hinojosa-Corona, A.; Nissen, E.; Arrowsmith, R.; Krishnan, A. K.; Saripalli, S.; Oskin, M. E.; Arregui, S. M.; Limon, J. F.

    2012-12-01

    The Mw 7.2 El Mayor-Cucapah earthquake (EMCE) of 4 April 2010 generated a ~110 km long, NW-SE trending rupture, with normal and right-lateral slip on the order of 2-3 m in the Sierra Cucapah, the northern half of the rupture, where the surface rupture has its most prominent expression. Vertical and horizontal surface displacements produced by the EMCE have been addressed separately by other authors using a variety of aerial and satellite remote sensing techniques. Slip variation along the fault and post-seismic scarp erosion and diffusion have been estimated in other studies using terrestrial LiDAR (TLS) on segments of the rupture. To complement these studies, we computed the 3D deformation field by comparing pre- to post-event point clouds from aerial LiDAR surveys. The pre-event LiDAR, with its lower point density (0.013-0.033 pts m-2), required filtering and post-processing before comparison with the denser (9-18 pts m-2), more accurate post-event dataset. The 3-dimensional surface displacement field was determined using an adaptation of the Iterative Closest Point (ICP) algorithm, implemented in the open-source Point Cloud Library (PCL). The LiDAR datasets are first split into a grid of windows, and for each one, ICP iteratively converges on the rigid-body transformation (comprising a translation and a rotation) that best aligns the pre- to post-event points. Testing on synthetic datasets perturbed with displacements of known magnitude showed that windows with dimensions of 100-200 m gave the best results for datasets with these densities. Here we present the deformation field with detailed displacements for segments of the surface rupture whose expression was recognized by ICP from the point cloud matching, mainly in the sparsely vegetated Sierra Cucapah, with the Borrego and Paso Superior fault segments the most prominent; there we are able to compare our results with values measured in the field and with TLS results reported in other works.
    Figure caption: Simulated EMCE displacement field for 2 m of right-lateral, normal (east block down) slip on the pre-event point cloud along the Borrego fault in the Sierra Cucapah; shaded DEM from the post-event point cloud as backdrop.
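    The windowed ICP procedure described above can be sketched in a few lines of NumPy/SciPy: for one window, iterate nearest-neighbor matching against the post-event cloud with an SVD-based (Kabsch) best-fit rigid transform. This is a minimal illustration under simplifying assumptions (static correspondence tree, no outlier rejection), not the PCL implementation the authors used.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp_rigid(src, dst, n_iter=20):
    """Iteratively align src (pre-event points) to dst (post-event points).

    Returns a rotation R (3x3) and translation t (3,) such that
    src @ R.T + t approximately equals dst.
    """
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(dst)          # nearest-neighbor lookup in the post-event cloud
    cur = src.copy()
    for _ in range(n_iter):
        # Pair each source point with its nearest post-event neighbor.
        _, idx = tree.query(cur)
        matched = dst[idx]
        # Best-fit rigid transform for these pairs via SVD (Kabsch method).
        mu_s, mu_d = cur.mean(axis=0), matched.mean(axis=0)
        H = (cur - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
        R_step = Vt.T @ D @ U.T
        t_step = mu_d - R_step @ mu_s
        cur = cur @ R_step.T + t_step
        # Compose the incremental transform with the accumulated one.
        R, t = R_step @ R, R_step @ t + t_step
    return R, t
```

    Run per window, the recovered translation is the local 3D surface displacement; the paper's synthetic tests correspond to applying a known transform to one cloud and checking recovery.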

  9. Low-Latency Line Tracking Using Event-Based Dynamic Vision Sensors

    PubMed Central

    Everding, Lukas; Conradt, Jörg

    2018-01-01

    In order to safely navigate and orient in their local surroundings, autonomous systems need to rapidly extract and persistently track visual features from the environment. While there are many algorithms tackling those tasks for traditional frame-based cameras, these have to deal with the fact that conventional cameras sample their environment with a fixed frequency: the same features have to be found in consecutive frames, and corresponding features then need to be matched using elaborate techniques, as any information between the two frames is lost. We introduce a novel method to detect and track line structures in data streams of event-based silicon retinae [also known as dynamic vision sensors (DVS)]. In contrast to conventional cameras, these biologically inspired sensors generate a quasi-continuous stream of vision information analogous to the information stream created by the ganglion cells in mammalian retinae. All pixels of a DVS operate asynchronously without a periodic sampling rate and emit a so-called DVS address event as soon as they perceive a luminance change exceeding an adjustable threshold. We use the high temporal resolution achieved by the DVS to track features continuously through time instead of only at fixed points in time. The focus of this work lies on tracking lines in a mostly static environment observed by a moving camera, a typical setting in mobile robotics. Since DVS events are mostly generated at object boundaries and edges, which in man-made environments often form lines, lines were chosen as the feature to track. Our method is based on detecting planes of DVS address events in x-y-t space and tracing these planes through time. It is robust against noise and runs in real time on a standard computer, hence it is suitable for low-latency robotics. The efficacy and performance are evaluated on real-world data sets showing artificial structures in an office building, using event data for tracking and frame data for ground-truth estimation from a DAVIS240C sensor. PMID:29515386
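    The geometric core of the method — a line swept through time by a moving camera generates a plane of events in x-y-t space — can be illustrated with a total-least-squares plane fit. This sketch assumes the events have already been clustered to one line and omits the plane-tracing step.

```python
import numpy as np

def fit_event_plane(events):
    """Fit a plane to DVS-style address events in (x, y, t) space.

    events: (N, 3) array of [x, y, t] rows. Returns (normal, centroid),
    defining the plane by normal . (p - centroid) = 0. The normal is the
    right-singular vector of the centered events with the smallest
    singular value (total least squares).
    """
    centroid = events.mean(axis=0)
    _, _, Vt = np.linalg.svd(events - centroid)
    return Vt[-1], centroid
```

    For example, a vertical line moving at constant speed v along x satisfies x - v*t = const, so the fitted normal should be parallel to (1, 0, -v).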

  10. Characterizing Mega-Earthquake Related Tsunami on Subduction Zones without Large Historical Events

    NASA Astrophysics Data System (ADS)

    Williams, C. R.; Lee, R.; Astill, S.; Farahani, R.; Wilson, P. S.; Mohammed, F.

    2014-12-01

    Due to recent large tsunami events (e.g., Chile 2010 and Japan 2011), the insurance industry is very aware of the importance of managing its exposure to tsunami risk. There are currently few tools available to help establish policies for managing and pricing tsunami risk globally. As a starting point and to help address this issue, Risk Management Solutions Inc. (RMS) is developing a global suite of tsunami inundation footprints. This dataset will include representations of historical events as well as a series of M9 scenarios on subduction zones that have not historically generated mega-earthquakes. The latter set is included to address concerns about the completeness of the historical record for mega-earthquakes, a concern that stems from the fact that the Tohoku, Japan earthquake was considerably larger than anything observed in the historical record. Characterizing the source and rupture pattern for subduction zones without historical events is a poorly constrained process. In many cases, the subduction zones can be segmented based on changes in the characteristics of the subducting slab or major ridge systems. For this project, the unit sources from the NOAA propagation database are utilized to leverage the basin-wide modeling included in this dataset. The length of the rupture is characterized based on subduction zone segmentation, and the slip per unit source can be determined based on the event magnitude (i.e., M9) and moment balancing. As these events have not occurred historically, there is little to constrain the slip distribution. Sensitivity tests on the potential rupture pattern have been undertaken comparing uniform slip to higher shallow-slip and tapered slip models. Subduction zones examined include the Makran Trench, the Lesser Antilles and the Hikurangi Trench. The ultimate goal is to create a series of tsunami footprints to help insurers understand their exposures at risk to tsunami inundation around the world.
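    The moment-balancing step for a uniform-slip scenario can be sketched with the standard Hanks-Kanamori moment-magnitude relation; the rupture dimensions and rigidity below are illustrative assumptions, not the segmentation actually used by RMS.

```python
def uniform_slip(mw, length_km, width_km, mu=3.0e10):
    """Average slip (m) for a uniform-slip rupture balancing seismic moment.

    Hanks & Kanamori: M0 = 10**(1.5*Mw + 9.05) N*m, and slip = M0 / (mu * A),
    where mu is rigidity (Pa) and A the rupture area (m^2).
    """
    m0 = 10 ** (1.5 * mw + 9.05)                 # seismic moment, N*m
    area = (length_km * 1e3) * (width_km * 1e3)  # rupture area, m^2
    return m0 / (mu * area)
```

    For an M9 scenario on a hypothetical 1000 km x 100 km rupture this gives roughly 12 m of average slip, which sensitivity tests then redistribute (uniform, shallow-peaked, or tapered) across the unit sources.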

  11. Using waveform cross correlation for automatic recovery of aftershock sequences

    NASA Astrophysics Data System (ADS)

    Bobrov, Dmitry; Kitov, Ivan; Rozhkov, Mikhail

    2017-04-01

    Aftershock sequences of the largest earthquakes are difficult to recover. There can be several hundred mid-sized aftershocks per hour within a few hundred km of each other recorded by the same stations. Moreover, these events generate thousands of reflected/refracted phases having azimuth and slowness close to those of the P-waves. Therefore, aftershock sequences with thousands of events represent a major challenge for automatic and interactive processing at the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Standard methods of detection and phase association do not use all the information contained in signals. As a result, wrong association of the first and later phases, both regular and site-specific, produces an enormous number of wrong event hypotheses and destroys valid event hypotheses in automatic IDC processing. In turn, the IDC analysts have to reject false hypotheses and recreate valid ones, wasting precious human resources. At the current level of IDC catalogue completeness, the method of waveform cross correlation (WCC) can resolve most of these detection and association problems by fully utilizing the similarity of waveforms generated by aftershocks. Array seismic stations of the International Monitoring System (IMS) can enhance the performance of the WCC method: they reduce station-specific detection thresholds, allow accurate estimates of signal attributes, including relative magnitude, and effectively suppress irrelevant arrivals. We have developed and tested a prototype of an aftershock tool matching all IDC processing requirements and merged it with the current IDC pipeline. This tool includes creation of master events consisting of real or synthetic waveform templates at ten or more IMS stations; cross correlation (CC) of real-time waveforms with these templates; association of arrivals detected on CC traces into event hypotheses; building of events matching the IDC quality criteria; and resolution of conflicts between event hypotheses created by neighboring master events. The final cross-correlation standard event list (XSEL) is a starting point for interactive analysis with standard tools. We present selected results for the biggest earthquakes, such as Sumatra 2004 and Tohoku 2011, as well as for several smaller events with hundreds of aftershocks. The sensitivity and resolution of the aftershock tool are demonstrated on the example of an mb=2.2 aftershock found after the September 9, 2016 DPRK test.
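    The template-matching core of the WCC method can be sketched as a normalized cross-correlation scan of a master-event template along a continuous waveform. This toy single-channel detector with a fixed threshold stands in for the multi-station, multi-template IDC pipeline described above.

```python
import numpy as np

def ncc_detect(trace, template, threshold=0.8):
    """Slide a master-event template along a waveform.

    Returns (offsets, cc): sample offsets where the normalized
    cross-correlation exceeds threshold, plus the full CC trace
    (values in [-1, 1]).
    """
    n = len(template)
    tpl = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        win = trace[i:i + n]
        std = win.std()
        # Zero-mean, unit-variance window -> correlation coefficient.
        cc[i] = 0.0 if std == 0 else np.dot(tpl, (win - win.mean()) / std)
    return np.flatnonzero(cc > threshold), cc
```

    Detections on the CC trace, rather than on the raw waveform, are what get associated into event hypotheses; production codes vectorize this scan and stack CC traces over array elements.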

  12. Astronomical aspects of cosmic threats: new problems and approaches to asteroid—comet hazard following the Chelyabinsk event of February 15, 2013

    NASA Astrophysics Data System (ADS)

    Shustov, B. M.; Shugarov, A. S.; Naroenkov, S. A.; Prokhorov, M. E.

    2015-10-01

    A new definition of hazardous celestial bodies (HCBs) is introduced, in which the lower limit of the size of an HCB is reduced to 10 m. A new definition for threatening and collisional orbits of HCBs is introduced. The main astronomical factors that must be taken into account when creating systems for the detection of HCBs are analyzed. The most important of these are the uniformity of the distribution of points (regions) for the appearance of HCBs on the celestial sphere in near-Earth space and the practical limit for the velocity of approach of an HCB of 20 km/s (for 90% of bodies). It is shown that the creation of a system for the nearby detection of asteroids and comets arriving from the daytime sky requires the use of a space-based system. A concept for such a system, in which one or several optical telescopes are placed in the vicinity of the libration point L1 of the Sun—Earth system, is developed. Preliminary plans for such a system, called the System for the Detection of Daytime Asteroids (SDDA), are briefly described.

  13. Discerning Trends in Performance Across Multiple Events

    NASA Technical Reports Server (NTRS)

    Slater, Simon; Hiltz, Mike; Rice, Craig

    2006-01-01

    Mass Data is a computer program that enables rapid, easy discernment of trends in performance data across multiple flights and ground tests. The program can perform Fourier analysis and other functions for the purposes of frequency analysis and trending of all variables. These functions facilitate identification of past use of diagnosed systems and of anomalies in such systems, and enable rapid assessment of related current problems. Many variables, whose computation usually requires extensive manual manipulation of raw downlist data, are automatically computed and made available to all users, eliminating the need for what would otherwise be an extensive amount of engineering analysis. Data from flight, ground test, and simulation are preprocessed and stored in one central location for instantaneous access and comparison for diagnostic and trending purposes. Rules are created so that an event log is generated for every flight, making it easy to locate information on similar maneuvers across many flights. The same rules can be created for test sets and simulations, and are searchable, so that information on like events is easily accessible.

  14. Simultaneous measurement of passage through the restriction point and MCM loading in single cells

    PubMed Central

    Håland, T. W.; Boye, E.; Stokke, T.; Grallert, B.; Syljuåsen, R. G.

    2015-01-01

    Passage through the Retinoblastoma protein (RB1)-dependent restriction point and the loading of minichromosome maintenance proteins (MCMs) are two crucial events in G1-phase that help maintain genome integrity. Deregulation of these processes can cause uncontrolled proliferation and cancer development. Both events have been extensively characterized individually, but their relative timing and inter-dependence remain less clear. Here, we describe a novel method to simultaneously measure MCM loading and passage through the restriction point. We exploit the fact that the RB1 protein is anchored in G1-phase but is released when hyper-phosphorylated at the restriction point. After extracting cells with salt and detergent before fixation, we can simultaneously measure, by flow cytometry, the loading of MCMs onto chromatin and RB1 binding, to determine the order of the two events in individual cells. We have used this method to examine the relative timing of the two events in human cells. Whereas in BJ fibroblasts released from G0-phase MCM loading started mainly after the restriction point, in a significant fraction of exponentially growing BJ and U2OS osteosarcoma cells MCMs were loaded in G1-phase with RB1 still anchored, demonstrating that MCM loading can also start before the restriction point. These results were supported by measurements in synchronized U2OS cells. PMID:26250117

  15. Trends in flash flood events versus convective precipitation in the Mediterranean region: The case of Catalonia

    NASA Astrophysics Data System (ADS)

    Llasat, Maria Carmen; Marcos, Raul; Turco, Marco; Gilabert, Joan; Llasat-Botija, Montserrat

    2016-10-01

    The aim of this paper is to analyse the potential relationship between flash flood events and convective precipitation in Catalonia, as well as any related trends. The paper starts with an overview of flash floods and their trends in the Mediterranean region, along with their associated factors, followed by the definition of, identification of, and trends in convective precipitation. After this introduction the paper focuses on the north-eastern Iberian Peninsula, for which there is a long-term precipitation series (since 1928) of 1-min precipitation from the Fabra Observatory, as well as a shorter (1996-2011) but more extensive precipitation series (43 rain gauges) of 5-min precipitation. Both series have been used to characterise the degree of convective contribution to rainfall, introducing the β parameter as the ratio of convective precipitation to total precipitation in any period. Information about flood events was obtained from the INUNGAMA database (a flood database created by the GAMA team), with the aim of finding any potential links to convective precipitation. These flood data were gathered using information on damage, where flood is treated as a multifactorial risk and where any trend or anomaly might have been caused by one or more factors affecting hazard, vulnerability or exposure. Trend analysis has shown an increase in flash flood events. The fact that no trends were detected in extreme values of precipitation on a daily scale, nor in the associated ETCCDI (Expert Team on Climate Change Detection and Indices) extreme index, could point to an increase in vulnerability, an increase in exposure, or changes in land use. However, the summer increase in convective precipitation was concentrated in less torrential events, which could partially explain this positive trend in flash flood events. The β parameter has also been used to characterise the type of flood event according to the features of the precipitation: the highest values correspond to short, local events, usually with daily β values above 0.5, while the minimum daily β threshold for catastrophic flash floods is 0.31.
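    A β-style ratio can be sketched from a 5-min rain-gauge series as follows. The 35 mm/h intensity threshold used to flag convective intervals is an illustrative assumption; the paper's exact convective criterion may differ.

```python
import numpy as np

def beta_parameter(rain_5min_mm, intensity_threshold_mm_h=35.0):
    """Fraction of total rainfall depth that fell at convective intensities.

    rain_5min_mm: rainfall depths per 5-min interval (mm). An interval is
    counted as convective when its equivalent hourly intensity
    (depth * 12) exceeds the threshold; beta = convective depth / total depth.
    """
    r = np.asarray(rain_5min_mm, dtype=float)
    convective = r[(r * 12.0) > intensity_threshold_mm_h]
    total = r.sum()
    return convective.sum() / total if total > 0 else 0.0
```

    A daily β near 1 then flags a short, intense local storm, while β near 0 indicates stratiform rain, matching the event typology described above.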

  16. 75 FR 57597 - Revised Proposal for Revisions to the Schedules of Civil Penalties for a Violation of a Federal...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-21

    ... just substantially increase the likelihood that one of these events could occur. Conversely, in this..., explosions, acts of God, and other events involving the operation of on-track equipment (standing or moving... following events, but does not create an imminent hazard of death or injury to individuals or cause an...

  17. Metapopulation viability of an endangered shorebird depends on dispersal and human-created habitats: Piping plovers (Charadrius melodus) and prairie rivers

    USGS Publications Warehouse

    Catlin, Daniel H.; Zeigler, Sara; Bomberger Brown, M.; Dinan, Lauren R.; Fraser, James D.; Hunt, Kelsi L.; Jorgensen, Joel G.

    2016-01-01

    We found that functional connectivity, as measured by the rate of dispersal among subpopulations, increased as a result of the high flow event in our study metapopulation. Plovers also increased reproductive output following this event. Although the study metapopulation had a low overall probability of extinction, metapopulation persistence depended on anthropogenically created habitats that provided a small but stable source of nesting habitat and dispersers through time. However, all subpopulations remained small, even if persistent, making them individually vulnerable to extinction through stochastic events. Given the highly dynamic nature of habitat availability in this system, maintaining several subpopulations within the metapopulation and stable sources of habitat will be critical, and this species will likely remain conservation-reliant.

  18. Method for the depth corrected detection of ionizing events from a co-planar grids sensor

    DOEpatents

    De Geronimo, Gianluigi [Syosset, NY; Bolotnikov, Aleksey E [South Setauket, NY; Carini, Gabriella [Port Jefferson, NY

    2009-05-12

    A method for the detection of ionizing events utilizing a co-planar grids sensor comprising a semiconductor substrate, cathode electrode, collecting grid and non-collecting grid. The semiconductor substrate is sensitive to ionizing radiation. A voltage less than 0 Volts is applied to the cathode electrode. A voltage greater than the voltage applied to the cathode is applied to the non-collecting grid. A voltage greater than the voltage applied to the non-collecting grid is applied to the collecting grid. The collecting-grid and non-collecting-grid signals are summed and subtracted, creating a sum and a difference, respectively. The difference is divided by the sum, creating a ratio. A gain coefficient factor is determined for each depth (the distance between the ionizing event and the collecting grid), whereby the difference between the collecting electrode and the non-collecting electrode multiplied by the corresponding gain coefficient is the depth-corrected energy of an ionizing event. Therefore, the energy of each ionizing event is the difference between the collecting grid and the non-collecting grid multiplied by the corresponding gain coefficient. The depth of the ionizing event can also be determined from the ratio.
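    The signal arithmetic described in the patent can be sketched as below. The gain lookup table keyed on the rounded difference/sum ratio is a hypothetical calibration for illustration, not values from the patent.

```python
def depth_corrected_event(collecting, noncollecting, gain_by_depth):
    """Depth-corrected energy for one ionizing event on a co-planar grid.

    The grid-signal difference carries the raw event energy, the
    difference/sum ratio encodes interaction depth, and a per-depth gain
    coefficient corrects for depth-dependent charge loss.
    gain_by_depth: hypothetical calibration mapping the ratio (rounded to
    one decimal) to its gain coefficient.
    """
    diff = collecting - noncollecting
    total = collecting + noncollecting
    ratio = diff / total                  # proxy for interaction depth
    gain = gain_by_depth[round(ratio, 1)]
    return gain * diff, ratio
```

    In practice the gain table would come from calibrating the sensor with a mono-energetic source at known depths.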

  19. Unconscious symmetrical inferences: A role of consciousness in event integration.

    PubMed

    Alonso, Diego; Fuentes, Luis J; Hommel, Bernhard

    2006-06-01

    Explicit and implicit learning have been attributed to different learning processes that create different types of knowledge structures. Consistent with that claim, our study provides evidence that people integrate stimulus events differently when consciously aware versus unaware of the relationship between the events. In a first, acquisition phase participants sorted words into two categories (A and B), which were fully predicted by task-irrelevant primes-the labels of two other, semantically unrelated categories (C and D). In a second, test phase participants performed a lexical decision task, in which all word stimuli stemmed from the previous prime categories (C and D) and the (now nonpredictive) primes were the labels of the previous target categories (A and B). Reliable priming effects in the second phase demonstrated that bidirectional associations between the respective categories had been formed in the acquisition phase (A<-->C and B<-->D), but these effects were found only in participants that were unaware of the relationship between the categories! We suggest that unconscious, implicit learning of event relationships results in the rather unsophisticated integration (i.e., bidirectional association) of the underlying event representations, whereas explicit learning takes the meaning of the order of the events into account, and thus creates unidirectional associations.

  20. Promoting Undergraduate Research through Integrative Learning

    ERIC Educational Resources Information Center

    Lewis, Elise C.

    2017-01-01

    Educators in higher education often seek innovative pedagogies to include in their classrooms. This article describes an integrative learning experience and details the planning, implementation, considerations, and benefits of creating a major-specific undergraduate research day. The event created an opportunity for students to gain confidence and…

  1. Assessing climate change and health vulnerability at the local level: Travis County, Texas.

    PubMed

    Prudent, Natasha; Houghton, Adele; Luber, George

    2016-10-01

    We created a measure to help assess population vulnerability to potential flooding and excessive heat events using health, built-environment and social factors. Through principal component analysis (PCA), we created non-weighted sum index scores of literature-reviewed social and built-environment characteristics. We created baseline poor-health measures using 1999-2005 age-adjusted cardiovascular and combined diabetes and hypertension mortality rates to correspond with the social-built environment indices. We mapped US Census block groups by linked age-adjusted mortality and a PCA-created social-built environment index. The goal was to measure flooding and excessive heat event vulnerability as proxies for population vulnerability to climate change for Travis County, Texas. This assessment identified communities where baseline poor health, social marginalisation and built-environment impediments intersected. Such assessments may assist targeted interventions and improve emergency preparedness in identified vulnerable communities, while fostering resilience through the focus of climate change adaptation policies at the local level. No claim to original US government works. Journal compilation © 2016 Overseas Development Institute.
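    A PCA-derived composite index of the kind described can be sketched with plain NumPy: z-score the block-group indicators, then project onto the leading eigenvector of their correlation matrix. The z-scoring and sign convention are illustrative choices, not necessarily the authors' exact procedure.

```python
import numpy as np

def vulnerability_index(X):
    """First-principal-component score as a composite vulnerability index.

    X: (n_block_groups, n_indicators) array of social/built-environment
    indicators. Columns are z-scored so no single indicator dominates,
    then projected onto the leading eigenvector of their covariance
    (= correlation, after z-scoring) matrix.
    """
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    cov = np.cov(Z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues ascending
    pc1 = eigvecs[:, -1]                     # leading component
    # Orient so that higher raw indicator values give a higher index.
    if pc1.sum() < 0:
        pc1 = -pc1
    return Z @ pc1
```

    Block groups can then be mapped by this index jointly with the age-adjusted mortality measure, as in the assessment described above.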

  2. SN 2009ip: CONSTRAINING THE LATEST EXPLOSION PROPERTIES BY ITS LATE-PHASE LIGHT CURVE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moriya, Takashi J., E-mail: moriyatk@astro.uni-bonn.de

    We constrain the explosion and circumstellar properties at the 2012b event of SN 2009ip based on its recently reported late-phase bolometric light curve. The ejected mass and explosion energy at the 2012b event are estimated as 0.01 M{sub ⊙} and 2 × 10{sup 49} erg, respectively. The circumstellar medium is assumed to have two components: an inner shell and an outer wind. The inner shell, which is likely created at the 2012a event, has a mass of 0.2 M{sub ⊙}. The outer wind is created by the wind mass loss before the 2012a mass ejection, and the progenitor is estimated to have had a mass-loss rate of about 0.1 M{sub ⊙} yr{sup −1} with a wind velocity of 550 km s{sup −1} before the 2012a event. The estimated explosion energy and ejected mass indicate that the 2012b event is not caused by a regular SN.

  3. On the significance of future trends in flood frequencies

    NASA Astrophysics Data System (ADS)

    Bernhardt, M.; Schulz, K.; Wieder, O.

    2015-12-01

    Floods are a significant threat for alpine headwater catchments and for their forelands. The formation of significant flood events is often coupled to processes occurring in the alpine zone; rain-on-snow events are just one example. The prediction of flood risks, or of trends in flood risk, is of major interest to people under direct threat, to policy and decision makers, and to insurance companies. Much research has been and is currently being done on detecting future trends in flood extremes or return periods. From a purely physical point of view, there is strong evidence that such trends exist. The central question, however, is whether trends in flood events or other extreme events can be detected from a statistical point of view and on the basis of the available data. This study investigates this question using different target parameters and long-term measurements.
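    Statistical trend detection in a series of annual flood counts or magnitudes is commonly done with the nonparametric Mann-Kendall test; the abstract does not name its method, so this is a generic sketch using the no-ties variance formula.

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test. Returns (S, z).

    S sums the signs of all pairwise later-minus-earlier differences;
    z is the continuity-corrected normal score (no-ties variance), so
    |z| > 1.96 indicates a significant monotonic trend at the 5% level.
    """
    n = len(series)
    s = sum((x_j > x_i) - (x_j < x_i)
            for i, x_i in enumerate(series)
            for x_j in series[i + 1:])
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

    Being rank-based, the test is robust to the heavy-tailed distributions typical of flood series, which is one reason it is a standard choice in hydrological trend studies.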

  4. Duluth Entertainment Convention Center (DECC) special events traffic flow study : traffic data analysis and signal timing coordination

    DOT National Transportation Integrated Search

    2003-06-01

    Following special events at the Duluth Entertainment Convention Center (DECC) (e.g., conventions, concerts, graduation ceremonies), high volumes of traffic exiting the DECC create substantial congestion at adjacent intersections. The purpose of this ...

  5. Mind the Noise When Identifying Computational Models of Cognition from Brain Activity.

    PubMed

    Kolossa, Antonio; Kopp, Bruno

    2016-01-01

    The aim of this study was to analyze how measurement error affects the validity of modeling studies in computational neuroscience. A synthetic validity test was created using simulated P300 event-related potentials as an example. The model space comprised four computational models of single-trial P300 amplitude fluctuations which differed in terms of complexity and dependency. The single-trial fluctuation of simulated P300 amplitudes was computed on the basis of one of the models, at various levels of measurement error and at various numbers of data points. Bayesian model selection was performed based on exceedance probabilities. At very low numbers of data points, the least complex model generally outperformed the data-generating model. Invalid model identification also occurred at low levels of data quality and under low numbers of data points if the winning model's predictors were closely correlated with the predictors from the data-generating model. Given sufficient data quality and numbers of data points, the data-generating model could be correctly identified, even against models which were very similar to the data-generating model. Thus, a number of variables affects the validity of computational modeling studies, and data quality and numbers of data points are among the main factors relevant to the issue. Further, the nature of the model space (i.e., model complexity, model dependency) should not be neglected. This study provided quantitative results which show the importance of ensuring the validity of computational modeling via adequately prepared studies. The accomplishment of synthetic validity tests is recommended for future applications. Beyond that, we propose to render the demonstration of sufficient validity via adequate simulations mandatory to computational modeling studies.
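    A miniature synthetic validity test of the kind described (simulate trials from a known model, add measurement noise, check whether model selection recovers the generator) can be sketched with BIC-based selection between linear models; the study itself used Bayesian model selection with exceedance probabilities, so this is a simplified stand-in.

```python
import numpy as np

def bic(y, X):
    """BIC of an ordinary-least-squares fit of y on design matrix X."""
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

def recover_model(noise_sd, n=200, seed=0):
    """Simulate single-trial amplitudes from a 'true' two-predictor model
    and report which candidate (1 or 2 predictors) BIC selects."""
    rng = np.random.default_rng(seed)
    x1, x2 = rng.standard_normal((2, n))
    y = 1.0 * x1 + 0.8 * x2 + noise_sd * rng.standard_normal(n)
    X1 = np.column_stack([np.ones(n), x1])        # too-simple model
    X2 = np.column_stack([np.ones(n), x1, x2])    # data-generating model
    return 2 if bic(y, X2) < bic(y, X1) else 1
```

    Sweeping `noise_sd` and `n` reproduces the qualitative finding: with clean, plentiful data the generator is recovered, while high noise or few trials biases selection toward the simpler model.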

  6. Point-Source Contributions to the Water Quality of an Urban Stream

    NASA Astrophysics Data System (ADS)

    Little, S. F. B.; Young, M.; Lowry, C.

    2014-12-01

    Scajaquada Creek, which runs through the heart of the city of Buffalo, is a prime example of the ways in which human intervention and local geomorphology can impact water quality and urban hydrology. Beginning in the 1920s, the Creek has been partially channelized and connected to Buffalo's combined sewer system (CSS). At Forest Lawn Cemetery, where this study takes place, Scajaquada Creek emerges from a 3.5-mile tunnel built to route stream flow under the city. Collocated with the tunnel outlet is a discharge point for Buffalo's CSS, combined sewer outlet (CSO) #53. It is at this point that runoff and sanitary sewage discharge regularly during rain events. Initially, this study endeavored to create a spatial and temporal picture of this portion of the Creek, monitoring such parameters as conductivity, dissolved oxygen, pH, temperature, and turbidity, in addition to measuring Escherichia coli (E. coli) concentrations. As expected, these factors responded directly to seasonality, local geomorphology, and distance from the point source (CSO #53), displaying an overall linear response. However, the addition of nitrate and phosphate testing to the study revealed an entirely separate signal from that previously observed. Concentrations of these parameters did not respond to location in the same manner as E. coli. Instead of decreasing with distance from the CSO, a distinct periodicity was observed, correlating with a series of outflow pipes lining the stream banks. It is hypothesized that the nitrate and phosphate occurring in this stretch of Scajaquada Creek originate not from the CSO, but from fertilizers used to maintain the lawns within the subwatershed. These results provide evidence of the complexity of water quality issues in urban streams arising from point- and nonpoint-source hydrologic inputs.

  7. The Year of the Solar System: An E/PO Community's Approach to Sharing Planetary Science

    NASA Astrophysics Data System (ADS)

    Shipp, S. S.; Boonstra, D.; Shupla, C.; Dalton, H.; Scalice, D.; Planetary Science E/Po Community

    2010-12-01

    YSS offers the opportunity to raise awareness, build excitement, and make connections with educators, students and the public about planetary science activities. The planetary science education and public outreach (E/PO) community is engaging and educating their audiences through ongoing mission and program activities. Based on discussion with partners, the community is presenting its products in the context of monthly thematic topics that are tied to the big questions of planetary science: how did the Sun’s family of planets and bodies originate and how have they evolved; and how did life begin and evolve on Earth, has it evolved elsewhere in our solar system, and what are characteristics that lead to the origins of life? Each month explores different compelling aspects of the solar system - its formation, volcanism, ice, life. Resources, activities, and events are interwoven in thematic context, and presented with ideas through which formal and informal educators can engage their audiences. The month-to-month themes place the big questions in a logical sequence of deepening learning experiences - and highlight mission milestones and viewing events. YSS encourages active participation and communication with its audiences. It includes nation-wide activities, such as a Walk Through the Solar System, held from October 2010 to March 2011, in which museums, libraries, science centers, schools, planetariums, amateur astronomers, and others are kicking off YSS by creating their own scale models of the solar system and sharing their events through online posting of pictures, video, and stories. YSS offers the E/PO community the opportunity to collaborate with each other and partners. The thematic approach leverages existing products, providing a home and allowing a “shelf life” that can outlast individual projects and missions. The broad themes highlight missions and programs multiple times. YSS also leverages existing online resources and social media. 
Hosted on the popular and long-lived Solar System Exploration website (http://solarsystem.nasa.gov/yss), multiple points of entry lead to YSS, ensuring sustained accessibility of thematic topics. Likewise, YSS is being shared through social media avenues of existing missions and programs, reaching a large audience without investment in building a fan-base on YSS-specific social media conduits. Create and share your own YSS event with the tools and resources offered on the website. Join the celebration!

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweezy, Jeremy Ed

    A photon next-event fluence estimator at a point has been implemented in the Monte Carlo Application Toolkit (MCATK). The next-event estimator provides an expected-value estimate of the flux at a point due to all source and collision events. An advantage of the next-event estimator over track-length estimators, which are normally employed in MCATK, is that flux estimates can be made in locations that have no random-walk particle tracks. The next-event estimator allows users to calculate radiographs and to estimate detector response outside of the modeled geometry. The next-event estimator is not yet accessible through the MCATK FlatAPI for C and Fortran. The next-event estimator in MCATK has been tested against MCNP6 using 5 suites of test problems. No issues were found in the MCATK implementation. One issue was found in the exclusion-radius approximation in MCNP6. The theory, implementation, and testing are described in this document.
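The expected-value idea described above can be sketched directly: each source or collision event contributes its emission probability toward the detector, attenuated along the straight path. The following is a minimal illustration, not MCATK code; it assumes isotropic emission in a homogeneous medium with total cross section `sigma_t`, and all names are hypothetical.

```python
import math


def next_event_point_flux(events, detector, sigma_t):
    """Next-event (expected-value) flux estimate at a detector point.

    Each event contributes weight * p_omega * exp(-sigma_t * r) / r**2,
    where r is the distance from the event to the detector. Emission is
    assumed isotropic, so the directional density p_omega = 1 / (4*pi),
    and the medium is assumed homogeneous (both are simplifications).
    """
    flux = 0.0
    for (x, y, z, weight) in events:
        r = math.dist((x, y, z), detector)
        if r == 0.0:
            continue  # the estimator is singular at the detector point itself
        attenuation = math.exp(-sigma_t * r)  # uncollided transmission
        flux += weight * attenuation / (4.0 * math.pi * r * r)
    return flux
```

The 1/r² factor diverges as an event approaches the detector point, which is why production codes apply an exclusion-radius treatment near the point, the approximation in which the abstract reports an MCNP6 issue.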

  9. Physical Habitat Characteristics on the North and South Forks of the Shenandoah River, VA in 2002-2007

    USGS Publications Warehouse

    Krstolic, Jennifer L.; Hayes, Donald C.

    2010-01-01

    Data were collected with the GeoXT Trimble GPS unit using ArcPad 6.1 (summer 2006-2007). Files were created within a geodatabase to build a data dictionary for use in ArcPad during data collection. Drop-down lists for habitat type, substrate, depth, width, length, and descriptions were included. Data files produced on the GeoXT were point shapefiles that could be checked back into the geodatabase and viewed as a layer. Points were gathered while canoeing along the South Fork Shenandoah River. Each location marked a change in meso-scale habitat type. GPS points were supplemented with GIS-derived points in areas where manual measurements were made. The points were used to generate a line coverage. This coverage represents physical habitat at a meso-scale (width of stream).

  10. Darapladib for preventing ischemic events in stable coronary heart disease.

    PubMed

    White, Harvey D; Held, Claes; Stewart, Ralph; Tarka, Elizabeth; Brown, Rebekkah; Davies, Richard Y; Budaj, Andrzej; Harrington, Robert A; Steg, P Gabriel; Ardissino, Diego; Armstrong, Paul W; Avezum, Alvaro; Aylward, Philip E; Bryce, Alfonso; Chen, Hong; Chen, Ming-Fong; Corbalan, Ramon; Dalby, Anthony J; Danchin, Nicolas; De Winter, Robbert J; Denchev, Stefan; Diaz, Rafael; Elisaf, Moses; Flather, Marcus D; Goudev, Assen R; Granger, Christopher B; Grinfeld, Liliana; Hochman, Judith S; Husted, Steen; Kim, Hyo-Soo; Koenig, Wolfgang; Linhart, Ales; Lonn, Eva; López-Sendón, José; Manolis, Athanasios J; Mohler, Emile R; Nicolau, José C; Pais, Prem; Parkhomenko, Alexander; Pedersen, Terje R; Pella, Daniel; Ramos-Corrales, Marco A; Ruda, Mikhail; Sereg, Mátyás; Siddique, Saulat; Sinnaeve, Peter; Smith, Peter; Sritara, Piyamitr; Swart, Henk P; Sy, Rody G; Teramoto, Tamio; Tse, Hung-Fat; Watson, David; Weaver, W Douglas; Weiss, Robert; Viigimaa, Margus; Vinereanu, Dragos; Zhu, Junren; Cannon, Christopher P; Wallentin, Lars

    2014-05-01

    Elevated lipoprotein-associated phospholipase A2 activity promotes the development of vulnerable atherosclerotic plaques, and elevated plasma levels of this enzyme are associated with an increased risk of coronary events. Darapladib is a selective oral inhibitor of lipoprotein-associated phospholipase A2. In a double-blind trial, we randomly assigned 15,828 patients with stable coronary heart disease to receive either once-daily darapladib (at a dose of 160 mg) or placebo. The primary end point was a composite of cardiovascular death, myocardial infarction, or stroke. Secondary end points included the components of the primary end point as well as major coronary events (death from coronary heart disease, myocardial infarction, or urgent coronary revascularization for myocardial ischemia) and total coronary events (death from coronary heart disease, myocardial infarction, hospitalization for unstable angina, or any coronary revascularization). During a median follow-up period of 3.7 years, the primary end point occurred in 769 of 7924 patients (9.7%) in the darapladib group and 819 of 7904 patients (10.4%) in the placebo group (hazard ratio in the darapladib group, 0.94; 95% confidence interval [CI], 0.85 to 1.03; P=0.20). There were also no significant between-group differences in the rates of the individual components of the primary end point or in all-cause mortality. Darapladib, as compared with placebo, reduced the rate of major coronary events (9.3% vs. 10.3%; hazard ratio, 0.90; 95% CI, 0.82 to 1.00; P=0.045) and total coronary events (14.6% vs. 16.1%; hazard ratio, 0.91; 95% CI, 0.84 to 0.98; P=0.02). In patients with stable coronary heart disease, darapladib did not significantly reduce the risk of the primary composite end point of cardiovascular death, myocardial infarction, or stroke. (Funded by GlaxoSmithKline; STABILITY ClinicalTrials.gov number, NCT00799903.).

  11. Nuclear reactions in shock wave front during supernova events

    NASA Technical Reports Server (NTRS)

    Lavrukhina, A. K.

    1985-01-01

    A new, isotopically anomalous component of Xe (Xe-X) was found in the carbonaceous chondrites. It is enriched in the light shielded isotopes (124Xe and 126Xe) and in the heavy nonshielded isotopes (134Xe and 136Xe). All characteristics of Xe-X can be explained by a model of nucleosynthesis of the Xe isotopes in a shock wave front passing through the He envelope during supernova events. The light isotopes are created by the p process and the heavy isotopes by the n process (slow r process). They were captured by high-temperature carbon grains condensing behind supernova shock waves.

  12. Gaining efficiency by centralising the corporate business resiliency process.

    PubMed

    Martinez, Robert

    2017-06-01

    Organisations have compiled many business continuity plans over the years in response to uncontrollable events and natural disasters. As the types of threats increase, even more plans are being created. Unfortunately, many corporations do not communicate the existence of these various plans outside of their centre of excellence. Creating a centralised oversight of your business resiliency process brings many benefits, including greater awareness, a bigger pool of expertise, common terminology and reducing the chances of redundant efforts. Having an overarching corporate response plan in place makes it possible to have high-level leadership trained and ready in case an extreme event occurs.

  13. Three-dimensional modeling in the study of subsidence in mining Acquaresi (Sardinia South - West) - Francesco Muntoni (1) Teresa Balvis (2) Paolo Bevilacqua (3) (1) Geological, Mining Park of Sardinia - Via Monteverdi, 16 09016 - Iglesias (2) freelance (3) Department of Engineering and Architecture - University of Trieste, Via Valerio 10 - Trieste

    NASA Astrophysics Data System (ADS)

    Muntoni, F.

    2013-12-01

    The effects of subsidence and subsequent landslides in mining areas are very frequent. This study examines the mining area of Acquaresi (south-west Sardinia), affected between 1991 and 2003 by major subsidence phenomena and consequent landslide events. The Acquaresi valley is particularly important, not only for its mines, but also for the geomorphological and morphotectonic evolution of its Paleozoic lithologies, which form a structure parallel to the coastline. To measure and analyse the morphostructural evolution of the area and of the mining works as a whole, it was considered appropriate to create a three-dimensional model allowing a synoptic view of the different information available. A model was created using elevation points extrapolated from the 1:10,000 Regional Technical Map, the 1:2,000 IGEA map, and the values of a detailed 1:500 survey of the study area. MicroStation CAD software was used to build a high-detail TIN, taking into account, where possible, spot heights, roads, major infrastructure, contour lines (main, intermediate and auxiliary), buildings and coastlines. The model was then draped with imagery obtained by integrating the 1:10,000 colour orthophotos of the Autonomous Region of Sardinia with 1:2,000 photos taken for IGEA SpA after the last landslide event. The use of aerial photographs at a scale similar to that of the cartography made it possible to achieve excellent results by superimposing the frames of the areas of interest on the models, with views consistent with the technical maps and a maximum error smaller than that of the reference mapping. 
Moreover, to emphasize the tectonic lineations, morphological aspects, and changes in landscape and environment, a three-dimensional model with high-detail 3D visualization was used. Starting from the Regional Technical Map, a DEM file was produced, an interpolation was then performed with a point layer containing separately recorded elevation values, and the orthophoto was superimposed on the 3D surface. A triangulated irregular network (TIN) terrain model was preferred to a regular GRID, because the TIN better exploits all the points present and identifiable in the territory. Using a TIN it was also possible to insert points recorded by GPS in the field to verify the detachment area of the landslide, thereby increasing the detail in the area of observation. The result is a noticeable jump in quality, moving from a two-dimensional to a three-dimensional display. The model obtained allows the outcrops of the different lithological structures to be easily located, facilitating the study and evaluation of recovery interventions.
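The advantage of a TIN over a regular GRID, noted above, is that every surveyed point can become a vertex; the elevation anywhere inside a facet is then a linear (barycentric) blend of its three vertices. A minimal sketch of that per-facet interpolation follows (the function name is hypothetical, and building the triangulation itself, e.g. by Delaunay triangulation, is omitted):

```python
def barycentric_interpolate(tri, p):
    """Interpolate elevation at a 2-D point p inside one TIN facet.

    tri is a triple of (x, y, z) vertices; p is (x, y). Returns the
    linearly interpolated elevation from barycentric weights, or None
    if p falls outside this facet.
    """
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = tri
    px, py = p
    # Barycentric weights from the standard 2x2 determinant formula.
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (px - x3) + (x3 - x2) * (py - y3)) / det
    w2 = ((y3 - y1) * (px - x3) + (x1 - x3) * (py - y3)) / det
    w3 = 1.0 - w1 - w2
    if min(w1, w2, w3) < 0.0:
        return None  # point lies outside this facet
    return w1 * z1 + w2 * z2 + w3 * z3
```

In a full TIN workflow one would locate the facet containing each query point first; GPS check points, as in the study, can simply be added as extra vertices and the triangulation rebuilt locally.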

  14. Ontology-Driven Business Modelling: Improving the Conceptual Representation of the REA Ontology

    NASA Astrophysics Data System (ADS)

    Gailly, Frederik; Poels, Geert

    Business modelling research is increasingly interested in exploring how domain ontologies can be used as reference models for business models. The Resource Event Agent (REA) ontology is a primary candidate for ontology-driven modelling of business processes because the REA point of view on business reality is close to the conceptual modelling perspective on business models. In this paper, Ontology Engineering principles are employed to reengineer REA in order to make it more suitable for ontology-driven business modelling. The new conceptual representation of REA that we propose uses a single representation formalism, includes a more complete domain axiomatization (containing definitions of concepts, concept relations and ontological axioms), and is proposed as a generic model that can be instantiated to create valid business models. The effects of these proposed improvements on REA-driven business modelling are demonstrated using a business modelling example.

  15. [A systemic risk analysis of hospital management processes by medical employees--an effective basis for improving patient safety].

    PubMed

    Sobottka, Stephan B; Eberlein-Gonska, Maria; Schackert, Gabriele; Töpfer, Armin

    2009-01-01

    Due to the knowledge gap that exists between patients and health care staff, the quality of medical treatment usually cannot be assessed reliably by patients. To optimize the safety of treatment-related processes of medical care, the medical staff needs to be actively involved in preventive and proactive quality management. Using voluntary, confidential and non-punitive systematic employee surveys, vulnerable topics and areas in patient care revealing preventable risks can be identified at an early stage. Preventive measures to continuously optimize treatment quality can be defined by creating a risk portfolio and a priority list of vulnerable topics. Whereas critical incident reporting systems are suitable for continuous risk assessment by detecting safety-relevant single events, employee surveys make it possible to conduct a systematic risk analysis of all treatment-related processes of patient care at any given point in time.

  16. Analysis of angle effect on particle flocculation in branch flow

    NASA Astrophysics Data System (ADS)

    Prasad, Karthik; Fink, Kathryn; Liepmann, Dorian

    2014-11-01

    Hollow-point microneedle drug delivery systems are known to be highly susceptible to blockage, owing to their very small structures. This problem has been especially noted when delivering suspended-particle solutions, such as vaccines. Attempts to reduce particle flocculation in such devices through surface treatments of the particles have been largely unsuccessful. Furthermore, the particle clog only forms at the mouths of the microneedle structures, leaving the downstream walls clear. This implies that the sudden change in length scales alters the hydrodynamic interactions, creating the conditions for particle flocculation. However, while it is known that particle flocculation occurs, the physics behind the event remains obscure. We utilize micro-PIV to observe how the occurrence and formation of particle flocculation change in relation to the angle encountered by particle-laden flow into microfluidic branch structures. The results offer the ability to optimize particle flocculation in MEMS devices, increasing device efficacy and longevity.

  17. Sporulation in the Budding Yeast Saccharomyces cerevisiae

    PubMed Central

    Neiman, Aaron M.

    2011-01-01

    In response to nitrogen starvation in the presence of a poor carbon source, diploid cells of the yeast Saccharomyces cerevisiae undergo meiosis and package the haploid nuclei produced in meiosis into spores. The formation of spores requires an unusual cell division event in which daughter cells are formed within the cytoplasm of the mother cell. This process involves the de novo generation of two different cellular structures: novel membrane compartments within the cell cytoplasm that give rise to the spore plasma membrane and an extensive spore wall that protects the spore from environmental insults. This article summarizes what is known about the molecular mechanisms controlling spore assembly with particular attention to how constitutive cellular functions are modified to create novel behaviors during this developmental process. Key regulatory points on the sporulation pathway are also discussed as well as the possible role of sporulation in the natural ecology of S. cerevisiae. PMID:22084423

  18. ARC-1994-AC94-0353-2

    NASA Image and Video Library

    1994-07-01

    Photo artwork composite by JPL. This depiction of comet Shoemaker-Levy 9 impacting Jupiter is shown from several perspectives. IMAGE A is shown from the perspective of Earth-based observers. IMAGE B shows the perspective from the Galileo spacecraft, which can observe the impact point directly. IMAGE C is shown from the Voyager 2 spacecraft, which may observe the event from its unique position at the outer reaches of the solar system. IMAGE D depicts a generic view from Jupiter's south pole. For visual appeal, most of the large cometary fragments are shown close to one another in this image. At the time of Jupiter impact, the fragments will be separated from one another by several times the distances shown. This image was created by D.A. Seal of JPL's Mission Design Section using orbital computations provided by P.W. Chodas and D.K. Yeomans of JPL's Navigation Section.

  19. Breaking the icosahedra in boron carbide

    PubMed Central

    Xie, Kelvin Y.; An, Qi; Sato, Takanori; Breen, Andrew J.; Ringer, Simon P.; Goddard, William A.; Cairney, Julie M.; Hemker, Kevin J.

    2016-01-01

    Findings of laser-assisted atom probe tomography experiments on boron carbide elucidate an approach for characterizing the atomic structure and interatomic bonding of molecules associated with extraordinary structural stability. The discovery of crystallographic planes in these boron carbide datasets substantiates that crystallinity is maintained to the point of field evaporation, and characterization of individual ionization events gives unexpected evidence of the destruction of individual icosahedra. Statistical analyses of the ions created during the field evaporation process have been used to deduce relative atomic bond strengths and show that the icosahedra in boron carbide are not as stable as anticipated. Combined with quantum mechanics simulations, this result provides insight into the structural instability and amorphization of boron carbide. The temporal, spatial, and compositional information provided by atom probe tomography makes it a unique platform for elucidating the relative stability and interactions of primary building blocks in hierarchically crystalline materials. PMID:27790982

  20. Predicting the occurrence of embolic events: an analysis of 1456 episodes of infective endocarditis from the Italian Study on Endocarditis (SEI).

    PubMed

    Rizzi, Marco; Ravasio, Veronica; Carobbio, Alessandra; Mattucci, Irene; Crapis, Massimo; Stellini, Roberto; Pasticci, Maria Bruna; Chinello, Pierangelo; Falcone, Marco; Grossi, Paolo; Barbaro, Francesco; Pan, Angelo; Viale, Pierluigi; Durante-Mangoni, Emanuele

    2014-04-29

    Embolic events are a major cause of morbidity and mortality in patients with infective endocarditis. We analyzed the database of the prospective cohort study SEI in order to identify factors associated with the occurrence of embolic events and to develop a scoring system for the assessment of the risk of embolism. We retrospectively analyzed 1456 episodes of infective endocarditis from the multicenter study SEI. Predictors of embolism were identified. Risk factors identified at multivariate analysis as predictive of embolism in left-sided endocarditis were used for the development of a risk score: 1 point was assigned to each risk factor (total risk score range: minimum 0 points; maximum 2 points). Three categories were defined by the score: low (0 points), intermediate (1 point), or high risk (2 points); the probability of embolic events per risk category was calculated for each day on treatment (day 0 through day 30). There were 499 episodes of infective endocarditis (34%) that were complicated by ≥ 1 embolic event. Most embolic events occurred early in the clinical course (first week of therapy: 15.5 episodes per 1000 patient days; second week: 3.7 episodes per 1000 patient days). In the total cohort, the factors associated with the occurrence of embolism at multivariate analysis were prosthetic valve localization (odds ratio, 1.84), right-sided endocarditis (odds ratio, 3.93), Staphylococcus aureus etiology (odds ratio, 2.23) and vegetation size ≥ 13 mm (odds ratio, 1.86). In left-sided endocarditis, Staphylococcus aureus etiology (odds ratio, 2.1) and vegetation size ≥ 13 mm (odds ratio, 2.1) were independently associated with embolic events; the 30-day cumulative incidence of embolism varied with risk score category (low risk, 12%; intermediate risk, 25%; high risk, 38%; p < 0.001). Staphylococcus aureus etiology and vegetation size are associated with an increased risk of embolism. 
In left-sided endocarditis, a simple scoring system, which combines etiology and vegetation size with time on antimicrobials, might contribute to a better assessment of the risk of embolism, and to a more individualized analysis of indications and contraindications for early surgery.
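The scoring rule reported above is simple enough to state directly in code. The following sketch (function name hypothetical) encodes the two left-sided risk factors and the 30-day cumulative incidences quoted in the abstract; it is an illustration of the published rule, not a clinical tool.

```python
def embolism_risk_score(s_aureus, vegetation_mm):
    """Point score for embolic risk in left-sided infective endocarditis,
    per the SEI analysis: 1 point for Staphylococcus aureus etiology and
    1 point for vegetation size >= 13 mm. Score 0 = low, 1 = intermediate,
    2 = high risk. Returns (score, category, 30-day cumulative incidence
    of embolism as reported for that category)."""
    score = int(bool(s_aureus)) + int(vegetation_mm >= 13)
    category = {0: "low", 1: "intermediate", 2: "high"}[score]
    incidence = {"low": 0.12, "intermediate": 0.25, "high": 0.38}[category]
    return score, category, incidence
```

The paper combines this score with time on antimicrobials, since most events occur in the first week of therapy; that time dimension is not modeled here.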

  1. Confidence intervals for the first crossing point of two hazard functions.

    PubMed

    Cheng, Ming-Yen; Qiu, Peihua; Tan, Xianming; Tu, Dongsheng

    2009-12-01

    The phenomenon of crossing hazard rates is common in clinical trials with time-to-event endpoints. Many methods have been proposed for testing equality of hazard functions against a crossing-hazards alternative. However, there have been relatively few approaches available in the literature for point or interval estimation of the crossing time point. The problem of constructing confidence intervals for the first crossing time point of two hazard functions is considered in this paper. After reviewing a recent procedure based on Cox proportional hazards modeling with Box-Cox transformation of the time to event, a nonparametric procedure using a kernel smoothing estimate of the hazard ratio is proposed. Both procedures are evaluated by Monte Carlo simulations and applied to two clinical trial datasets.
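A generic nonparametric sketch of the idea, not the authors' exact procedure: smooth each group's raw hazard estimates with a Gaussian kernel, then locate the first sign change of their difference (equivalently, where the smoothed hazard ratio crosses one). Function names and the interpolation step are illustrative assumptions.

```python
import math


def gaussian_smooth(times, values, grid, bandwidth):
    """Nadaraya-Watson kernel smoother with a Gaussian kernel."""
    out = []
    for g in grid:
        w = [math.exp(-0.5 * ((g - t) / bandwidth) ** 2) for t in times]
        sw = sum(w)
        out.append(sum(wi * v for wi, v in zip(w, values)) / sw)
    return out


def first_crossing(grid, h1, h2):
    """First point where smoothed hazards h1 and h2 (on `grid`) cross.

    A sign change of d = h1 - h2 between grid points is refined by
    linear interpolation; returns None if the curves never cross."""
    d = [a - b for a, b in zip(h1, h2)]
    for i in range(1, len(d)):
        if d[i - 1] == 0.0:
            return grid[i - 1]
        if d[i - 1] * d[i] < 0.0:  # sign change inside (grid[i-1], grid[i])
            frac = d[i - 1] / (d[i - 1] - d[i])
            return grid[i - 1] + frac * (grid[i] - grid[i - 1])
    return None
```

A confidence interval for the crossing point would then come from resampling (e.g. bootstrapping patients and repeating the estimate), which the sketch leaves out.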

  2. Developing points-based risk-scoring systems in the presence of competing risks.

    PubMed

    Austin, Peter C; Lee, Douglas S; D'Agostino, Ralph B; Fine, Jason P

    2016-09-30

    Predicting the occurrence of an adverse event over time is an important issue in clinical medicine. Clinical prediction models and associated points-based risk-scoring systems are popular statistical methods for summarizing the relationship between a multivariable set of patient risk factors and the risk of the occurrence of an adverse event. Points-based risk-scoring systems are popular amongst physicians as they permit a rapid assessment of patient risk without the use of computers or other electronic devices. The use of such points-based risk-scoring systems facilitates evidence-based clinical decision making. There is a growing interest in cause-specific mortality and in non-fatal outcomes. However, when considering these types of outcomes, one must account for competing risks whose occurrence precludes the occurrence of the event of interest. We describe how points-based risk-scoring systems can be developed in the presence of competing events. We illustrate the application of these methods by developing risk-scoring systems for predicting cardiovascular mortality in patients hospitalized with acute myocardial infarction. Code in the R statistical programming language is provided for the implementation of the described methods. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
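The usual recipe for turning regression coefficients into a points system (the Sullivan/Framingham approach) divides each coefficient by a chosen base unit and rounds to an integer; under competing risks, as in the paper, the coefficients would come from a Fine-Gray subdistribution hazard model rather than a Cox model. A minimal sketch with hypothetical coefficients (the paper itself provides R code; this Python fragment only illustrates the rounding step):

```python
def assign_points(betas, base_unit):
    """Sullivan-style points: divide each regression coefficient by a
    base unit B (often the coefficient for a reference increment, e.g.
    5 years of age) and round to the nearest integer. The betas here
    are hypothetical; with competing risks they would be estimated
    from a Fine-Gray subdistribution hazard model."""
    return {factor: round(beta / base_unit) for factor, beta in betas.items()}


def total_score(points, patient):
    """Sum the points for the risk factors a patient actually has."""
    return sum(points[f] for f, present in patient.items() if present)
```

The rounding is what makes the score computable at the bedside; the trade-off is a small loss of discrimination relative to the underlying regression model.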

  3. Discrete Event Simulation Model of the Polaris 2.1 Gamma Ray Imaging Radiation Detection Device

    DTIC Science & Technology

    2016-06-01

    …out China, Pakistan, and India as having a minimalist point of view with regards to nuclear weapons. For those in favor of this approach, he does… Referee event graph: the referee listens to the starts and stops of the mover and determines whether or not the Polaris has entered or exited the… are highlighted in Figure 17: Polaris start point, Polaris end point, Polaris original waypoints, Polaris ad hoc waypoints, Number of…

  4. Rhythm Patterns Interaction - Synchronization Behavior for Human-Robot Joint Action

    PubMed Central

    Mörtl, Alexander; Lorenz, Tamara; Hirche, Sandra

    2014-01-01

    Interactive behavior among humans is governed by the dynamics of movement synchronization in a variety of repetitive tasks. This requires the interaction partners to perform for example rhythmic limb swinging or even goal-directed arm movements. Inspired by that essential feature of human interaction, we present a novel concept and design methodology to synthesize goal-directed synchronization behavior for robotic agents in repetitive joint action tasks. The agents’ tasks are described by closed movement trajectories and interpreted as limit cycles, for which instantaneous phase variables are derived based on oscillator theory. Events segmenting the trajectories into multiple primitives are introduced as anchoring points for enhanced synchronization modes. Utilizing both continuous phases and discrete events in a unifying view, we design a continuous dynamical process synchronizing the derived modes. Inverse to the derivation of phases, we also address the generation of goal-directed movements from the behavioral dynamics. The developed concept is implemented to an anthropomorphic robot. For evaluation of the concept an experiment is designed and conducted in which the robot performs a prototypical pick-and-place task jointly with human partners. The effectiveness of the designed behavior is successfully evidenced by objective measures of phase and event synchronization. Feedback gathered from the participants of our exploratory study suggests a subjectively pleasant sense of interaction created by the interactive behavior. The results highlight potential applications of the synchronization concept both in motor coordination among robotic agents and in enhanced social interaction between humanoid agents and humans. PMID:24752212
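The continuous phase-synchronization dynamics described above can be illustrated with a generic coupled phase-oscillator (Kuramoto-type) model, in which each agent's phase is attracted toward the others'. This is a stand-in for the paper's controller, not its actual design; the function name and gains are hypothetical.

```python
import math


def phase_step(phases, omegas, coupling, dt):
    """One Euler step of coupled phase oscillators (Kuramoto-type):
    dphi_i/dt = omega_i + K * mean_j sin(phi_j - phi_i).
    `phases` and `omegas` are per-agent lists; returns the new phases
    wrapped to [0, 2*pi). A generic illustration of phase coupling,
    not the paper's event-enhanced synchronization scheme."""
    n = len(phases)
    new = []
    for phi, om in zip(phases, omegas):
        attract = sum(math.sin(pj - phi) for pj in phases) / n
        new.append((phi + dt * (om + coupling * attract)) % (2.0 * math.pi))
    return new
```

With identical natural frequencies and positive coupling, the phase difference between two agents decays toward zero, which is the qualitative behavior the robot exploits to fall in step with its human partner; the paper additionally anchors discrete trajectory events to the phase for its enhanced synchronization modes.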

  5. Correlation of Meiotic DSB Formation and Transcription Initiation Around Fission Yeast Recombination Hotspots.

    PubMed

    Yamada, Shintaro; Okamura, Mika; Oda, Arisa; Murakami, Hiroshi; Ohta, Kunihiro; Yamada, Takatomi

    2017-06-01

    Meiotic homologous recombination, a critical event for ensuring faithful chromosome segregation and creating genetic diversity, is initiated by programmed DNA double-strand breaks (DSBs) formed at recombination hotspots. Meiotic DSB formation is likely to be influenced by other DNA-templated processes including transcription, but how DSB formation and transcription interact with each other has not been well understood. In this study, we used fission yeast to investigate a possible interplay of these two events. A group of hotspots in fission yeast are associated with sequences similar to the cyclic AMP response element and activated by the ATF/CREB family transcription factor dimer Atf1-Pcr1. We first focused on one of those hotspots, ade6-3049, and Atf1. Our results showed that multiple transcripts, shorter than the ade6 full-length messenger RNA, emanate from a region surrounding the ade6-3049 hotspot. Interestingly, we found that the previously known recombination-activation region of Atf1 is also a transactivation domain, whose deletion affected DSB formation and short transcript production at ade6-3049. These results point to the possibility that the two events may be related to each other at ade6-3049. In fact, comparison of published maps of meiotic transcripts and hotspots suggested that hotspots are very often located close to meiotically transcribed regions. These observations therefore suggest that meiotic DSB formation in fission yeast may be connected to transcription of surrounding regions. Copyright © 2017 by the Genetics Society of America.

  6. The Use of Simulation to Reduce the Domain of "Black Swans" with Application to Hurricane Impacts to Power Systems.

    PubMed

    Berner, Christine L; Staid, Andrea; Flage, Roger; Guikema, Seth D

    2017-10-01

    Recently, the concept of black swans has gained increased attention in the fields of risk assessment and risk management. Different types of black swans have been suggested, distinguishing between unknown unknowns (nothing in the past can convincingly point to its occurrence), unknown knowns (known to some, but not to relevant analysts), or known knowns where the probability of occurrence is judged as negligible. Traditional risk assessments have been questioned, as their standard probabilistic methods may not be capable of predicting or even identifying these rare and extreme events, thus creating a source of possible black swans. In this article, we show how a simulation model can be used to identify previously unknown potentially extreme events that if not identified and treated could occur as black swans. We show that by manipulating a verified and validated model used to predict the impacts of hazards on a system of interest, we can identify hazard conditions not previously experienced that could lead to impacts much larger than any previous level of impact. This makes these potential black swan events known and allows risk managers to more fully consider them. We demonstrate this method using a model developed to evaluate the effect of hurricanes on energy systems in the United States; we identify hurricanes with potentially extreme impacts, storms well beyond what the historic record suggests is possible in terms of impacts. © 2016 Society for Risk Analysis.

  7. A homemade sand-volcano in a gassy alluvial plain (Medolla, Italy): when shallow drilling triggers violent degassing

    NASA Astrophysics Data System (ADS)

    Capaccioni, Bruno; Coltorti, Massimo; Todesco, Micol; Cremoni, Stefano; Di Giuseppe, Dario; Faccini, Barbara; Tessari, Umberto

    2017-04-01

    Sand volcanoes are remarkable geological features which form when shallow, water-saturated sand deposits are set in motion and reach the surface. This commonly occurs during earthquakes, as a result of liquefaction of waterlogged bodies, but some of these sand emissions are unrelated to seismic events. We present the case of a sand eruption triggered by a Cone Penetration Test (CPT) near Medolla (Italy), on the 10th of October 2014. A large amount of natural gas (CO2 and CH4) was erupted together with a mixture of water and sand, creating a sand volcano. The event was recorded and its evolution and final result were analyzed from several points of view. Our multidisciplinary approach involved morphological and sedimentological studies of the sand volcano, chemical and isotopic analysis of the discharged gases, repeated measurements of gas flux at the drill hole and of diffuse degassing in the surrounding area, and numerical modelling of the aquifer feeding the discharge. Our results suggest that a geyser discharging a mixture of gas and water, capable of building a sand volcano, requires the presence of a shallow pressurized reservoir (1.2 MPa) where water coexists with a small amount of exsolved gas (a volume fraction of 0.05). The violent degassing that occurred at Medolla confirms the role that a free gas phase may play in favoring the mobilization of liquid water and loose deposits, even in the absence of a seismic event.

  8. The Next Generation of NASA Night Sky Network: A Searchable Nationwide Database of Astronomy Events

    NASA Astrophysics Data System (ADS)

    Ames, Z.; Berendsen, M.; White, V.

    2010-08-01

    With support from NASA, the Astronomical Society of the Pacific (ASP) first developed the Night Sky Network (NSN) in 2004. The NSN was created in response to research conducted by the Institute for Learning Innovation (ILI) to determine what type of support amateur astronomers could use to increase the efficiency and extent of their educational outreach programs. Since its creation, the NSN has grown to include an online searchable database of toolkit resources, Presentation Skills Videos covering topics such as working with kids and how to answer difficult questions, and a searchable nationwide calendar of astronomy events that supports club organization. The features of the NSN have allowed the ASP to create a template that amateur science organizations might use to create a similar support network for their members and the public.

  9. Event-by-event PET image reconstruction using list-mode origin ensembles algorithm

    NASA Astrophysics Data System (ADS)

    Andreyev, Andriy

    2016-03-01

    There is a great demand for real-time or event-by-event (EBE) image reconstruction in emission tomography. Ideally, as soon as an event has been detected by the acquisition electronics, it would be used in the image reconstruction software. This would greatly speed up the image reconstruction, since most of the data would be processed and reconstructed while the patient is still undergoing the scan. Unfortunately, the current industry standard is that reconstruction of the image does not start until all the data for the current image frame have been acquired. Implementing EBE reconstruction for the MLEM family of algorithms is possible, but not straightforward, as multiple (computationally expensive) updates to the image estimate are required. In this work an alternative Origin Ensembles (OE) image reconstruction algorithm for PET imaging is converted to EBE mode, and it is investigated whether it is a viable alternative for real-time image reconstruction. In the OE algorithm all acquired events are seen as points located somewhere along the corresponding lines of response (LORs), together forming a point cloud. Iteratively, through a multitude of quasi-random shifts governed by the likelihood function, the point cloud converges to a reflection of the actual radiotracer distribution with a degree of accuracy similar to MLEM. New data can be naturally added into the point cloud. Preliminary results with simulated data show little difference between regular reconstruction and EBE mode, proving the feasibility of the proposed approach.
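The OE update described above can be caricatured in one dimension: each event is a point constrained to its LOR, and proposed shifts are accepted by a Metropolis-type rule that favors denser regions of the point cloud. The toy sketch below uses the commonly cited (n_new + 1)/n_old acceptance ratio for uniform detector sensitivity; that ratio, and all names, are assumptions of this sketch, not the paper's implementation.

```python
import random


def oe_update(positions, lors, image_bins, nbins, rng):
    """One sweep of a toy 1-D Origin Ensembles (OE) update.

    Each detected event is a point positions[i] in [0, 1), constrained to
    its line of response, here reduced to an interval lors[i] = (lo, hi).
    A random shift along the LOR is proposed and accepted with probability
    min(1, (n_new + 1) / n_old), where n_* are the point-cloud counts in
    the destination and origin bins. `image_bins` is the running histogram
    of the point cloud and must be consistent with `positions` on entry.
    New events (EBE mode) can simply be appended to both lists."""
    for i, (lo, hi) in enumerate(lors):
        old = positions[i]
        new = rng.uniform(lo, hi)
        b_old = min(int(old * nbins), nbins - 1)
        b_new = min(int(new * nbins), nbins - 1)
        if b_old == b_new:
            positions[i] = new  # same bin: density unchanged, always accept
            continue
        ratio = (image_bins[b_new] + 1) / image_bins[b_old]
        if rng.random() < min(1.0, ratio):
            image_bins[b_old] -= 1  # move this event's count between bins
            image_bins[b_new] += 1
            positions[i] = new
```

After many sweeps the histogram of the point cloud serves as the image estimate; the event-by-event property comes from the fact that appending a new event perturbs the ensemble only locally.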

  10. Imaging Fracking Zones by Microseismic Reverse Time Migration for Downhole Microseismic Monitoring

    NASA Astrophysics Data System (ADS)

    Lin, Y.; Zhang, H.

    2015-12-01

Hydraulic fracturing is an engineering tool used to create fractures in order to better recover oil and gas from low-permeability reservoirs. Because microseismic events are generally associated with fracture development, microseismic monitoring has been used to evaluate the fracking process. Microseismic monitoring generally relies on locating microseismic events to understand the spatial distribution of fractures. In multi-stage fracturing treatments, fractures created in earlier stages are strong scatterers in the medium and can induce strong scattered waves on the waveforms of microseismic events induced during later stages. In this study, we propose to take advantage of these microseismic scattered waves to image fracking zones using the seismic reverse time migration (RTM) method. For downhole microseismic monitoring, which involves installing a string of seismic sensors in a borehole near the injection well, the observation geometry is similar to a VSP (vertical seismic profile) system. For this reason, we adapt the VSP migration method from the common-shot gather to the common-event gather. Microseismic reverse time migration involves solving the wave equation both forward and backward in time for each microseismic event. At the current stage, the microseismic RTM is based on the 2D acoustic wave equation (Zhang and Sun, 2008), solved by the finite-difference method with a PML absorbing boundary condition applied to suppress reflections from the artificial boundaries. Additionally, we use local wavefield decomposition instead of the cross-correlation imaging condition to suppress imaging noise. To test the method, we create a synthetic dataset for a downhole microseismic monitoring system with multiple fracking stages. It shows that microseismic migration using an individual event is able to clearly reveal the fracture zone: the shorter the distance between the fractures and the microseismic event, the clearer the migration image. By summing the migration images for many events, the fracture development during the hydraulic fracturing treatment can be revealed more completely. The synthetic test shows that microseismic migration is able to characterize the fracturing zone along with the microseismic events. We will extend the method from 2D to 3D as well as from acoustic to elastic, and apply it to real microseismic data.
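The finite-difference time stepping at the heart of RTM can be sketched in one dimension. The parameters below are illustrative (not from the study); the block shows the second-order stencil that RTM runs forward in time for the source wavefield and, with the recorded data injected in reverse, backward in time for the receiver wavefield.

```python
# Minimal 1D constant-velocity acoustic finite-difference propagation:
# u_next = 2*u - u_prev + C^2 * (spatial second difference),
# where C = c*dt/dx is the Courant number (C <= 1 for stability).
N = 201                 # grid points
C = 1.0                 # Courant number (illustrative choice)
src = N // 2

u_prev = [0.0] * N
u_curr = [0.0] * N
u_curr[src] = 1.0       # initial point disturbance at the source

nsteps = 60
for _ in range(nsteps):
    u_next = [0.0] * N
    for i in range(1, N - 1):
        u_next[i] = (2.0 * u_curr[i] - u_prev[i]
                     + C * C * (u_curr[i + 1] - 2.0 * u_curr[i] + u_curr[i - 1]))
    u_prev, u_curr = u_curr, u_next

# The stencil moves information exactly one cell per step, so after 60 steps
# the wavefield is identically zero outside src +/- 60 (domain of dependence).
print("leading edge value:", u_curr[src - nsteps])
```

Backward propagation uses the same loop; a full RTM would additionally save the forward wavefield and apply an imaging condition at each step.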

  11. 78 FR 38039 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-25

    ... its estimates for the enforcement authority exclusive to it regarding the class of motor vehicle... on the same page, but in no event smaller than 12-point type, or if provided by electronic means... same page, but in no event smaller than 8-point type. The long notice shall begin with a heading in...

  12. SEE Transient Response of Crane Interpoint Single Output Point of Load DC-DC Converters

    NASA Technical Reports Server (NTRS)

    Sanders, Anthony B.; Chen, Dakai; Kim, Hak S.; Phan, Anthony M.

    2011-01-01

This study was undertaken to determine the single-event effect and transient susceptibility of the Crane Interpoint Maximum Flexible Power (MFP) Single Output Point of Load DC/DC converters, examining transient interruptions in the output signal as well as destructive and non-destructive events induced by exposure to a heavy ion beam.

  13. 78 FR 41300 - Special Local Regulations and Safety Zones; Marine Events in Captain of the Port Long Island...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-10

    ... the events start and finish points. The Coast Guard received information about the Riverhead Rocks Triathlon from the event sponsor, Event Power, on May 2, 2013. Event Power held the Riverhead Rocks... difficulty of rescheduling the early morning start of the swim event with the desired high tide cycle. While...

  14. Program Converts VAX Floating-Point Data To UNIX

    NASA Technical Reports Server (NTRS)

    Alves, Marcos; Chapman, Bruce; Chu, Eugene

    1996-01-01

VAX Floating Point to Host Floating Point Conversion (VAXFC) software converts non-ASCII VAX data files to the unformatted floating-point representation of a UNIX machine. It does this by reading the bytes, converting them bit by bit into floating-point numbers, and writing the results to another file. It is useful when data files created by a VAX computer must be used on other machines. Written in the C language.
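The conversion the abstract describes can be sketched as follows. This is not the VAXFC implementation (which is in C); it is a minimal Python decoder for the standard VAX F_floating layout: two 16-bit little-endian words in swapped order, a sign bit, an excess-128 exponent, and a hidden mantissa bit of 0.5.

```python
import struct

def vax_f_to_float(b: bytes) -> float:
    """Decode one 4-byte VAX F_floating value to a Python float."""
    w0, w1 = struct.unpack('<HH', b)   # two little-endian 16-bit words
    bits = (w0 << 16) | w1             # undo the PDP-11-style word swap
    sign = -1.0 if bits & 0x80000000 else 1.0
    exp = (bits >> 23) & 0xFF          # excess-128 exponent
    frac = bits & 0x7FFFFF             # 23 stored fraction bits
    if exp == 0:
        # true zero if sign clear; a set sign marks a VAX reserved operand
        return 0.0 if sign > 0 else float('nan')
    # hidden bit is 0.1 (binary), so the mantissa lies in [0.5, 1)
    return sign * (0.5 + frac / 2**24) * 2.0 ** (exp - 128)

print(vax_f_to_float(b'\x80\x40\x00\x00'))   # -> 1.0
```

A byte stream of such values can be decoded four bytes at a time and written back out with `struct.pack('<f', ...)` for the target machine.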

  15. Prediction of Intensity Change Subsequent to Concentric Eyewall Events

    NASA Astrophysics Data System (ADS)

    Mauk, Rachel Grant

Concentric eyewall events have been documented numerous times in intense tropical cyclones over the last two decades. During a concentric eyewall event, an outer (secondary) eyewall forms around the inner (primary) eyewall. Improved instrumentation on aircraft and satellites greatly increases the likelihood of detecting an event. Despite this increased ability to detect such events, forecasts of intensity changes during and after these events remain poor. When concentric eyewall events occur near land, accurate intensity-change predictions are especially critical to ensure proper emergency preparations and staging of recovery assets. A nineteen-year (1997-2015) database of concentric eyewall events is developed by analyzing microwave satellite imagery, aircraft- and land-based radar, and other published documents. Events are identified in both the North Atlantic and eastern North Pacific basins. TCs are categorized as single (1 event), serial (≥ 2 events), and super-serial (≥ 3 events). Key findings include distinct spatial patterns for single and serial Atlantic TCs, a broad seasonal distribution for eastern North Pacific TCs, and apparent ENSO-related variability in both basins. The intensity change subsequent to the concentric eyewall event is calculated from the HURDAT2 database at time points relative to the start and to the end of the event. Intensity change is then categorized as Weaken (≤ -10 kt), Maintain (±5 kt), and Strengthen (≥ 10 kt). The environmental conditions in which each event occurred are analyzed based on the SHIPS diagnostic files. Oceanic, dynamic, thermodynamic, and TC-status predictors are selected for testing in a multiple discriminant analysis procedure to determine which variables successfully discriminate the intensity-change category and the occurrence of additional concentric eyewall events. Intensity models are created for 12 h, 24 h, 36 h, and 48 h after the concentric eyewall events end. Leave-one-out cross validation is performed on each set of discriminators to generate classifications, which are then compared to observations. For each model, the top combinations achieve 80-95% overall accuracy in classifying TCs based on the environmental characteristics, although Maintain systems are frequently misclassified. The third part of this dissertation employs the Weather Research and Forecasting (WRF) model to further investigate concentric eyewall events. Two serial Atlantic concentric eyewall cases (Katrina 2005 and Wilma 2005) are selected from the original study set, and WRF simulations are performed using several model designs. Despite strong evidence from multiple sources that serial concentric eyewalls formed in both hurricanes, the WRF simulations did not produce identifiable concentric eyewall structures for Katrina, and produced only transient structures for Wilma. Possible reasons for the lack of concentric eyewall formation are discussed, including model resolution, microphysics, and data sources.
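The leave-one-out protocol used above can be sketched in a few lines. The data below are fabricated, and a simple nearest-centroid rule stands in for the multiple discriminant analysis of the study; the sketch only shows how each case is held out, the classifier is refit, and the held-out case is scored.

```python
# Leave-one-out cross validation of a classifier assigning each TC's
# post-event intensity-change category from environmental predictors.
# (hypothetical predictors: SST anomaly in degC, vertical wind shear in kt)
data = [
    ((1.2, 4.0), "Strengthen"), ((1.0, 5.0), "Strengthen"),
    ((0.9, 6.0), "Strengthen"), ((0.2, 11.0), "Maintain"),
    ((0.1, 12.0), "Maintain"), ((0.3, 10.0), "Maintain"),
    ((-0.8, 18.0), "Weaken"), ((-1.0, 20.0), "Weaken"),
    ((-0.7, 19.0), "Weaken"),
]

def centroid(rows):
    return (sum(r[0] for r in rows) / len(rows),
            sum(r[1] for r in rows) / len(rows))

def classify(x, centroids):
    return min(centroids, key=lambda c: (x[0] - centroids[c][0]) ** 2
                                        + (x[1] - centroids[c][1]) ** 2)

correct = 0
for i, (x, label) in enumerate(data):
    train = data[:i] + data[i + 1:]          # leave one case out
    cents = {cat: centroid([p for p, l in train if l == cat])
             for cat in {l for _, l in train}}
    if classify(x, cents) == label:
        correct += 1

print(f"LOO accuracy: {correct}/{len(data)}")   # -> LOO accuracy: 9/9
```

With real SHIPS predictors the classifier would be refit the same way once per held-out storm, which is what makes the 80-95% accuracies quoted above out-of-sample figures.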

  16. INNOVATIVE BIODIESEL PRODUCTION: A SOLUTION TO THE SCIENTIFIC, TECHNICAL, AND EDUCATIONAL CHALLENGES OF SUSTAINABILITY

    EPA Science Inventory

    Loyola's STEP students completed over 20 team projects: Developed a business plan for biodiesel production, created the LUC biodiesel website, created the Bio­shorts documentaries, tabled at environmental events, publicized and put on two Biodiesel Forums (2nd one pending,...

  17. 32 CFR 518.7 - FOIA terms defined.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... fees in the event of a waiver denial. Written requests may be received by postal service or other... maintained primarily for the convenience of an agency employee, and not distributed to other agency employees... Government service; private materials brought into, created, or received in the office that were not created...

  18. My White House Salute.

    ERIC Educational Resources Information Center

    Trede, Mildred

    1992-01-01

    The topic of Presidents and First Ladies provides a thematic approach to the teaching of writing skills and thinking skills in the academic areas of language arts, mathematics, social studies, and science. Activities include, among others, creating a graph of Presidential characteristics and creating a play about an event in one President's life.…

  19. Born from a flood: The Salton Sea and its story of survival

    DOE PAGES

    Tompson, Andrew F. B.

    2016-02-02

The Salton Sea is a terminal lake located at the deepest point of the topographically closed Salton Trough in southeastern California. It is currently the largest lake by area in the state. It was created by a flooding event along the Colorado River in 1905-1907, similar to the way historical floods over past centuries created ephemeral incarnations of ancient Lake Cahuilla in the same location. Its position at the center of today's Imperial Valley, a hot and arid locale home to some of the most productive irrigated agricultural lands in the United States, has ensured its ongoing survival through a delicate balance between agricultural runoff, its principal form of input, and vast evaporation losses. Nevertheless, its parallel role as a recreational resource and important wildlife habitat, established over its first century of existence, is threatened by increasing salinity, decreasing water quality, and reduced water allocations from the Colorado River that feeds the valley's agriculture. Furthermore, the Salton Sea faces an increasingly uncertain future that will be influenced by reduced water imports from the Colorado River, demands for additional water sources to support farming and energy industries in the valley, and needs to stabilize the lake salinity, maintain recreational resources, and preserve what have become important ecosystems and wildlife habitats.

  20. Born from a flood: The Salton Sea and its story of survival

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tompson, Andrew F. B.

The Salton Sea is a terminal lake located at the deepest point of the topographically closed Salton Trough in southeastern California. It is currently the largest lake by area in the state. It was created by a flooding event along the Colorado River in 1905-1907, similar to the way historical floods over past centuries created ephemeral incarnations of ancient Lake Cahuilla in the same location. Its position at the center of today's Imperial Valley, a hot and arid locale home to some of the most productive irrigated agricultural lands in the United States, has ensured its ongoing survival through a delicate balance between agricultural runoff, its principal form of input, and vast evaporation losses. Nevertheless, its parallel role as a recreational resource and important wildlife habitat, established over its first century of existence, is threatened by increasing salinity, decreasing water quality, and reduced water allocations from the Colorado River that feeds the valley's agriculture. Furthermore, the Salton Sea faces an increasingly uncertain future that will be influenced by reduced water imports from the Colorado River, demands for additional water sources to support farming and energy industries in the valley, and needs to stabilize the lake salinity, maintain recreational resources, and preserve what have become important ecosystems and wildlife habitats.

  1. Fusion of nonclinical and clinical data to predict human drug safety.

    PubMed

    Johnson, Dale E

    2013-03-01

    Adverse drug reactions continue to be a major cause of morbidity in both patients receiving therapeutics and in drug R&D programs. Predicting and possibly eliminating these adverse events remains a high priority in industry, government agencies and healthcare systems. With small molecule candidates, the fusion of nonclinical and clinical data is essential in establishing an overall system that creates a true translational science approach. Several new advances are taking place that attempt to create a 'patient context' mechanism early in drug research and development and ultimately into the marketplace. This 'life-cycle' approach has as its core the development of human-oriented, nonclinical end points and the incorporation of clinical knowledge at the drug design stage. The next 5 years should witness an explosion of what the author views as druggable and safe chemical space, pharmacosafety molecular targets and the most important aspect, an understanding of unique susceptibilities in patients developing adverse drug reactions. Our current knowledge of clinical safety relies completely on pharmacovigilance data from approved and marketed drugs, with a few exceptions of drugs failing in clinical trials. Massive data repositories now and soon to be available via cloud computing should stimulate a major effort in expanding our view of clinical drug safety and its incorporation into early drug research and development.

  2. Impact of Fission Neutron Energies on Reactor Antineutrino Spectra

    NASA Astrophysics Data System (ADS)

    Hermanek, Keith; Littlejohn, Bryce; Gustafson, Ian

    2017-09-01

Recent measurements of the reactor antineutrino spectrum (Double Chooz, RENO, and Daya Bay) have shown a discrepancy in the 5-7 MeV region when compared to current theoretical models (Vogel and Huber-Mueller). There are numerous theories pertaining to this antineutrino anomaly, including theories that point to new physics beyond the standard model. In the paper ``Possible Origins and Implications of the Shoulder in Reactor Neutrino Spectra'' by A. Hayes et al., explanations for this anomaly are suggested. One theory is that interactions from fast and epithermal incident neutrons are significant enough to create noticeably more events in the 5-7 MeV region. In our research, we used the Oklo software package created by Dan Dwyer, which generates ab initio antineutrino and beta-decay spectra from the standard fission-yield databases ENDF, JENDL, and JEFF, and the beta-decay transition database ENSDF-6. Using these databases as inputs, we show that, under reasonable assumptions, the contribution of fast and epithermal neutrons is less than 3% in the 5-7 MeV region. We also found that rare isotopes present in the beta-decay chains are not well measured and have no corresponding database information, and we studied their effect on the spectrum.
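The bookkeeping behind an ab initio ("summation") spectrum can be sketched as a yield-weighted sum over beta branches. The branch list below is invented, and the allowed shape neglects the Fermi function, normalization, and the electron mass, so this illustrates only the structure of the calculation, not real physics.

```python
# Toy summation-method sketch: aggregate antineutrino spectrum as a
# fission-yield-weighted sum of per-branch beta spectra.
# (cumulative fission yield, branching ratio, endpoint energy Q in MeV)
branches = [
    (0.06, 1.0, 3.0),
    (0.04, 0.7, 6.0),
    (0.04, 0.3, 8.0),
    (0.02, 1.0, 10.0),   # the hardest branch in this toy set
]

def shape(e_nu, q):
    """Toy allowed beta shape in the antineutrino energy (unnormalized)."""
    return e_nu ** 2 * (q - e_nu) ** 2 if 0.0 < e_nu < q else 0.0

def spectrum(e_nu):
    return sum(y * br * shape(e_nu, q) for y, br, q in branches)

# Only the Q = 10 MeV branch can contribute above 8 MeV; nothing beyond 10.
print(spectrum(6.5) > 0, spectrum(9.0) > 0, spectrum(10.5) == 0)
```

A real calculation like Oklo's iterates this over thousands of fission products with database yields and measured branch data, which is why missing information for rare isotopes matters.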

  3. ARTS. Accountability Reporting and Tracking System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, J.F.; Faccio, R.M.

ARTS is a micro-based prototype of the data elements, screens, and information-processing rules that apply to the Accountability Reporting Program. The system focuses on the Accountability Event: an occurrence of incurring avoidable costs. The system must be able to CRUD (Create, Retrieve, Update, Delete) instances of the Accountability Event. Additionally, the system must allow a review committee to update the `event record` with findings and determination information. Lastly, the system must allow financial representatives to perform a cost-reporting process.
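The CRUD requirements above can be sketched as a minimal in-memory store. Field names here are illustrative, since the actual ARTS data elements are not specified in the record; the committee update is modeled as an ordinary field update.

```python
# Minimal sketch of CRUD operations on an "Accountability Event" record
# store, plus the review-committee update described above.
class AccountabilityEventStore:
    def __init__(self):
        self._events = {}
        self._next_id = 1

    def create(self, description, avoidable_cost):
        eid = self._next_id
        self._next_id += 1
        self._events[eid] = {"description": description,
                             "avoidable_cost": avoidable_cost,
                             "findings": None}   # filled in by the committee
        return eid

    def retrieve(self, eid):
        return self._events[eid]

    def update(self, eid, **fields):
        self._events[eid].update(fields)

    def delete(self, eid):
        del self._events[eid]

store = AccountabilityEventStore()
eid = store.create("duplicate shipment", 1250.00)
store.update(eid, findings="avoidable; process gap in receiving")
print(store.retrieve(eid)["findings"])
store.delete(eid)
```

Cost reporting would then be a read-only pass over the stored events, summing `avoidable_cost` by whatever grouping the financial representatives need.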

  4. Immediate Adverse Events in Interventional Pain Procedures: A Multi-Institutional Study.

    PubMed

    Carr, Carrie M; Plastaras, Christopher T; Pingree, Matthew J; Smuck, Matthew; Maus, Timothy P; Geske, Jennifer R; El-Yahchouchi, Christine A; McCormick, Zachary L; Kennedy, David J

    2016-12-01

    Interventional procedures directed toward sources of pain in the axial and appendicular musculoskeletal system are performed with increasing frequency. Despite the presence of evidence-based guidelines for such procedures, there are wide variations in practice. Case reports of serious complications such as spinal cord infarction or infection from spine injections lack appropriate context and create a misleading view of the risks of appropriately performed interventional pain procedures. To evaluate adverse event rate for interventional spine procedures performed at three academic interventional spine practices. Quality assurance databases at three academic interventional pain management practices that utilize evidence-based guidelines [1] were interrogated for immediate complications from interventional pain procedures. Review of the electronic medical record verified or refuted the occurrence of a complication. Same-day emergency department transfers or visits were also identified by a records search. Immediate complication data were available for 26,061 consecutive procedures. A radiology practice performed 19,170 epidural steroid (primarily transforaminal), facet, sacroiliac, and trigger point injections (2006-2013). A physiatry practice performed 6,190 spine interventions (2004-2009). A second physiatry practice performed 701 spine procedures (2009-2010). There were no major complications (permanent neurologic deficit or clinically significant bleeding [e.g., epidural hematoma]) with any procedure. Overall complication rate was 1.9% (493/26,061). Vasovagal reactions were the most frequent event (1.1%). Nineteen patients (<0.1%) were transferred to emergency departments for: allergic reactions, chest pain, symptomatic hypertension, and a vasovagal reaction. This study demonstrates that interventional pain procedures are safely performed with extremely low immediate adverse event rates when evidence-based guidelines are observed. 
© 2016 American Academy of Pain Medicine. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  5. How a Country-Wide Seismological Network Can Improve Understanding of Seismicity and Seismic Hazard -- The Example of Bhutan

    NASA Astrophysics Data System (ADS)

    Hetényi, G.; Diehl, T.; Singer, J.; Kissling, E. H.; Clinton, J. F.; Wiemer, S.

    2015-12-01

The Eastern Himalayas are home to a seemingly complex seismo-tectonic evolution. The rate of instrumental seismicity is lower than the average along the orogen, and there is no record of large historical events, but both paleoseismology and GPS studies point to potentially large (M>8) earthquakes. Due to the lack of a permanent seismic monitoring system in the area, our current level of understanding is insufficient to create a reliable quantitative seismic hazard model for the region. Existing maps are based on questionable hypotheses and show major inconsistencies when compared to each other. Here we present results on national and regional scales from a 38-station broadband seismological network that we operated for almost 2 years in the Kingdom of Bhutan. A thorough, state-of-the-art analysis of local and regional earthquakes builds a comprehensive catalogue that reveals significantly (2-3 orders of magnitude) more events than detected by global networks. The seismotectonic analysis reveals new patterns of seismic activity as well as striking differences over relatively short distances within the Himalayas, only partly explained by surface observations such as geology. We compare a priori and a posteriori (BMC) magnitude-of-completeness maps and show that our network was able to detect all felt events during its operation. Some of these events could be felt at surprisingly large distances. Based on our experiment and experience, we draft the pillars on which a permanent seismological observatory for Bhutan could be constructed. Such a continuous monitoring system of seismic activity could then lead to a reliable quantitative seismic hazard model for Bhutan and the surrounding regions, and serve as a base to improve building codes and general preparedness.

  6. Coincidence in Time of the Imbrium Basin Impact and Apollo 15 KREEP Volcanic Flows: The Case for Impact-Induced Melting

    NASA Technical Reports Server (NTRS)

    Ryder, Graham

    1994-01-01

    On the Earth there is no firm evidence that impacts can induce volcanic activity. However, the Moon does provide a very likely example of volcanism induced by an immense impact: the Imbrium basin-forming event was immediately succeeded by a crustal partial melting event that released basalt flows characterized by K, rare-earth elements (REE), P, and other trace elements (KREEP) over a wide area creating the Apennine Bench Formation. Impact total melting is inconsistent with the chemistry and petrography of these Apollo 15 KREEP basalts, which are quite unlike the impact melts recognized at Taurus-Littrow as the products of the Serenitatis impact. The Imbrium impact and the KREEP volcanic events are indistinguishable in radiometric age, and thus the volcanism occurred less than about 20 Ma later than the impact (less than about 0.5% of lunar history). The sample record indicates that such KREEP volcanism had not occurred in the region prior to that time, and demonstrates that it never occurred again. Such coincidence in time implies a genetic relationship between the two events, and impact-induced partial melting or release appears to be the only feasible process. Nonetheless, the characteristics of the Apollo 15 KREEP basalts suggest large-degree crustal melting that is not easy to reconcile with the inability of lunar pressure release alone to induce partial melting unless the source was already almost at its melting point. The earliest history of the surface of the Earth, at a time of greater internal heat production and basin-forming impacts, could have been greatly influenced by impact-induced melting.

  7. High-frequency flux transfer events detected near Mercury

    NASA Astrophysics Data System (ADS)

    Schultz, Colin

    2013-01-01

The physical process that creates connections between the magnetic fields emanating from the Sun and a planet, known as magnetic reconnection, creates a portal through which solar plasma can penetrate the planetary magnetic field. The opening of these portals, known as flux transfer events (FTEs), takes place roughly every 8 minutes at Earth and spawns a rope of streaming plasma that is typically about half the radius of the Earth. Since as early as 1985, scientists analyzing Mariner 10 observations collected during its 1974-1975 flybys have known that FTEs also occur at Mercury. However, using measurements returned from the MErcury Surface, Space ENvironment, GEochemistry, and Ranging (MESSENGER) spacecraft now orbiting Mercury, Slavin et al. found that Mercury's flux transfer events are proportionally much larger, stronger, and more frequent than those at Earth.

  8. Satellite images of the September 2013 flood event in Lyons, Colorado

    USGS Publications Warehouse

    Cole, Christopher J.; Friesen, Beverly A.; Wilds, Stanley; Noble, Suzanne; Warner, Harumi; Wilson, Earl M.

    2013-01-01

The U.S. Geological Survey (USGS) Special Applications Science Center (SASC) produced an image base map showing high-resolution remotely sensed data over Lyons, Colorado, a city that was severely affected by the flood event that occurred throughout much of the Colorado Front Range in September 2013. The 0.5-meter WorldView-2 data products were created from imagery collected by DigitalGlobe on September 13 and September 24, 2013, during and following the flood event. The images shown on this map were created to support flood response efforts, specifically damage assessment and mitigation decisions. The raw, unprocessed imagery was orthorectified and pan-sharpened to enhance mapping accuracy and spatial resolution, and reproduced onto a cartographic base map. These maps are intended to provide a snapshot of post-flood ground conditions that may be useful to decisionmakers and the general public. The SASC also provided data processing and analysis support for other Colorado flood-affected areas by creating cartographic products, geo-corrected electro-optical and radar image mosaics, and GIS water cover files for use by the Colorado National Guard, the National Park Service, the U.S. Forest Service, and the flood response community. All products for this International Charter event were uploaded to the USGS Hazards Data Distribution System (HDDS) website (http://hdds.usgs.gov/hdds2/) for distribution.
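Pan-sharpening, one of the processing steps named above, can be illustrated per pixel. The sketch below uses the generic Brovey transform on made-up values; it is not the actual WorldView-2 workflow, which would operate on full resampled rasters.

```python
# Brovey-transform pan-sharpening for one pixel: scale each multispectral
# band by the ratio of the high-resolution panchromatic value to the mean
# of the bands, injecting the pan detail while preserving band ratios.
def brovey(r, g, b, pan):
    mean = (r + g + b) / 3.0
    scale = pan / mean if mean else 0.0
    return (r * scale, g * scale, b * scale)

# A multispectral pixel whose bands average 100, fused with a pan value of 120:
print([round(v, 6) for v in brovey(90.0, 100.0, 110.0, 120.0)])
# -> [108.0, 120.0, 132.0]
```

Applied across an image, the low-resolution bands are first resampled to the pan grid so the ratio is computed pixel by pixel.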

  9. One-shot Synesthesia

    PubMed Central

    Kirschner, Alexandra; Nikolić, Danko

    2017-01-01

    Abstract Synesthesia is commonly thought to be a phenomenon of fixed associations between an outside inducer and a vivid concurrent experience. Hence, it has been proposed that synesthesia occurs due to additional connections in the brain with which synesthetes are born. Here we show that synesthesia can be a much richer and more flexible phenomenon with a capability to creatively construct novel synesthetic experiences as events unfold in people’s lives. We describe here cases of synesthetes who occasionally generate novel synesthetic experience, called one-shot synesthesias. These synesthetic experiences seem to share all the properties with the classical synesthetic associations except that they occur extremely rarely, people recalling only a few events over the lifetime. It appears that these one-shots are not created at random but are instead responses to specific life events. We contrast the properties of those rare synesthetic events with other, more commonly known forms of synesthesia that also create novel synesthetic experiences, but at a high rate—sometimes creating novel experiences every few seconds. We argue that one-shot synesthesias indicate that synesthetic associations are by their nature not prewired at birth but are dynamically constructed through mental operations and according to the needs of a synesthetic mind. Our conclusions have implications for understanding the biological underpinnings of synesthesia and the role the phenomenon plays in the lives of people endowed with synesthetic capacities. PMID:29188078

  10. From daily to sub-daily time steps - Creating a high temporal and spatial resolution climate reference data set for hydrological modeling and bias-correction of RCM data

    NASA Astrophysics Data System (ADS)

    Willkofer, Florian; Wood, Raul R.; Schmid, Josef; von Trentini, Fabian; Ludwig, Ralf

    2016-04-01

The ClimEx project (Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec) focuses on the effects of climate change on hydro-meteorological extreme events and their implications for water management in Bavaria and Québec. It builds on the conjoint analysis of a large ensemble of the CRCM5, driven by 50 members of the CanESM2, and the latest information provided through the CORDEX initiative, to better assess the influence of natural climate variability and climatic change on the dynamics of extreme events. A critical point in the entire project is the preparation of a meteorological reference dataset with the temporal (1-6 h) and spatial (500 m) resolution required to evaluate hydrological extreme events in mesoscale river basins. For Bavaria, a first reference dataset (daily, 1 km) used for bias-correction of RCM data was created by combining raster-based data (E-OBS [1], HYRAS [2], MARS [3]) and interpolated station data using the meteorological interpolation schemes of the hydrological model WaSiM [4]. Apart from its coarse temporal and spatial resolution, this mosaic of different data sources is considered rather inconsistent and hence not suitable for modeling hydrological extreme events. Thus, the objective is to create a dataset with hourly data of temperature, precipitation, radiation, relative humidity, and wind speed, which is then used for bias-correction of the RCM data driving the hydrological modeling in the river basins. To that end, daily data are disaggregated to hourly time steps using the 'Method of fragments' approach [5], based on available training stations. The disaggregation chooses fragments of daily values from observed hourly datasets, based on similarities in magnitude and in the behavior of previous and subsequent events.
The choice of a certain reference station (hourly data, provision of fragments) for disaggregating daily station data (application of fragments) is crucial, and several methods will be tested to achieve a sound spatial interpolation. The entire methodology shall be applicable to existing or newly developed datasets.
References:
[1] Haylock, M.R., N. Hofstra, A.M.G. Klein Tank, E.J. Klok, P.D. Jones and M. New. A European daily high-resolution gridded dataset of surface temperature and precipitation. J. Geophys. Res. (Atmospheres) (2008), 113, D20119, doi:10.1029/2008JD10201.
[2] Rauthe, M., Steiner, H., Riediger, U., Mazurkiewicz, A. and A. Gratzki. A Central European precipitation climatology - Part I: Generation and validation of a high-resolution gridded daily data set (HYRAS). Meteorologische Zeitschrift (2013), 22/3, p. 238-256.
[3] MARS-AGRI4CAST. AGRI4CAST Interpolated Meteorological Data. http://mars.jrc.ec.europa.eu/mars/About-us/AGRI4CAST/Data-distribution/AGRI4CAST-Interpolated-Meteorological-Data, 2007, last accessed May 10th, 2013.
[4] Schulla, J. Model Description WaSiM - Water balance Simulation Model. 2015, available at: http://wasim.ch/en/products/wasim_description.htm.
[5] Sharma, A. and S. Srikanthan. Continuous Rainfall Simulation: A Nonparametric Alternative. 30th Hydrology and Water Resources Symposium, Launceston, Tasmania, 4-7 December, 2006.
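The method-of-fragments idea can be sketched minimally: a daily total is split into hourly values by borrowing the normalized diurnal pattern ("fragments") of an observed day. The station library below is invented, and similarity is reduced to closest daily total; the real method also matches the behavior of previous and subsequent days and selects among reference stations.

```python
# Toy 'Method of fragments' disaggregation of a daily total to 24 hourly values.

# Library of observed hourly precipitation days (24 values each, in mm).
observed_days = [
    [0] * 8 + [0.5, 1.0, 2.0, 2.5, 2.0, 1.0, 0.5, 0.5] + [0] * 8,  # 10 mm, daytime peak
    [1.0] * 24,                                                    # 24 mm, uniform
]

def disaggregate(daily_total):
    # pick the observed day with the most similar daily total
    best = min(observed_days, key=lambda d: abs(sum(d) - daily_total))
    total = sum(best)
    fragments = [v / total for v in best]      # normalized diurnal pattern
    return [daily_total * f for f in fragments]

hourly = disaggregate(12.0)    # nearest library day is the 10 mm day
print(round(sum(hourly), 6), round(max(hourly), 6))   # -> 12.0 3.0
```

By construction the hourly values sum back to the daily total, which is the property that makes the disaggregated series consistent with the daily reference dataset.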

  11. CONCENTRATED AMBIENT AIR POLLUTION CREATES OXIDATIVE STRESS IN CNS MICROGLIA.

    EPA Science Inventory

    Nanometer size particles carry free radical activity on their surface and can produce oxidative stress (OS)-mediated damage upon impact to target cells. The initiating event of phage cell activation (i.e., the oxidative burst) is unknown, although many proximal events have been i...

  12. The Top 10 Events Creating Gifted Education for the New Century.

    ERIC Educational Resources Information Center

    Roberts, Julia Link

    1999-01-01

    This article describes events that have shaped gifted education, including: deployment of Sputnik, the Marland Report, advocacy organizations, curriculum differentiation, brain research, gifted residential schools, the Richardson Study, the Javits Gifted and Talented Students Act, "National Excellence: A Case for Developing America's…

  13. The Chelyabinsk event

    NASA Astrophysics Data System (ADS)

    Borovička, Jiří

    2016-10-01

On February 15, 2013, at 3:20 UT, an asteroid about 19 meters in size and with a mass of 12,000 metric tons entered the Earth's atmosphere unexpectedly near the border of Kazakhstan and Russia. It was the largest confirmed Earth impactor since the Tunguska event in 1908. The body moved approximately westwards at a speed of 19 km s⁻¹, on a trajectory inclined 18 degrees to the surface, creating a fireball of steadily increasing brightness. Eleven seconds after the first sightings, the fireball reached its maximum brightness. At that point, it was located less than 40 km south of Chelyabinsk, a Russian city with a population of more than one million, at an altitude of 30 km. For people directly underneath, the fireball was 30 times brighter than the Sun. The cosmic body disrupted into fragments; the largest of them was visible for another five seconds before it disappeared at an altitude of 12.5 km, having decelerated to 3 km s⁻¹. Fifty-six seconds later, that ~600 kg fragment landed in Lake Chebarkul and created an 8-m-wide hole in the ice. Small meteorites landed in an area 80 km long and several km wide and caused no damage. The meteorites were classified as LL ordinary chondrites and were notable for the presence of two phases, light and dark. More material, however, remained in the atmosphere, forming a dust trail up to 2 km wide and extending along the fireball trajectory from altitudes of 18 to 70 km. The dust then circled the Earth within a few days and formed a ring around the northern hemisphere. In Chelyabinsk and its surroundings a very strong blast wave arrived 90-150 s after the fireball passage (depending on location). The wave was produced by the supersonic flight of the body and broke ~10% of windows in Chelyabinsk (~40% of buildings were affected). More than 1600 people were injured, mostly by broken glass. The whole event was well documented by video cameras, seismic and infrasonic records, and satellite observations.
The total energy was 500 kt TNT (2 × 10¹⁵ J).
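The quoted figures are internally consistent: a back-of-the-envelope kinetic-energy check, using only the mass and entry speed stated above (a sketch, not part of the original analysis), reproduces the reported total energy.

```python
# Kinetic-energy sanity check for the Chelyabinsk impactor,
# using the mass and entry speed quoted in the abstract.
KT_TNT_J = 4.184e12          # joules per kiloton of TNT (standard convention)

def kinetic_energy_joules(mass_kg: float, speed_m_s: float) -> float:
    """E = 1/2 m v^2 for a body of given mass and speed."""
    return 0.5 * mass_kg * speed_m_s ** 2

mass_kg = 12_000 * 1000.0    # 12,000 metric tons
speed_m_s = 19_000.0         # 19 km/s

e_j = kinetic_energy_joules(mass_kg, speed_m_s)
print(f"{e_j:.2e} J ~ {e_j / KT_TNT_J:.0f} kt TNT")
# ~2.17e15 J, i.e. roughly 500 kt of TNT, matching the reported values.
```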

  14. Post-Structural Methodology at the Quilting Point: Intercultural Encounters.

    PubMed

    Gillett, Grant

    Lacan's quilting point connects a network of signifiers with the lived world as a place of voices, memory, and adaptation "seen in" the mirror of language. Crossing cultures can obscure the ways we make sense of the world. Some planes of signification, in aiming to be universal in their knowledge (such as the natural sciences), try to track objects and events independent of our thoughts about them and the ways that signifiers may slide past each other. However, cross-structural comparison and the analysis of cross-cultural encounters cannot treat their objects of interest that way. Thus we need a theory and methodology that effectively connects the multilayered discourses of subjectivities from diverse cultures and allows triangulation between them in relation to points of shared experience. At such points we need a critical attitude toward our own framework and an openness to the uneasy reflective equilibrium that uncovers assumptions and modes of thinking that would otherwise hamper us. Quilting points are points where different discourses converge on a single event or set of events so as to mark "vertical" connections, allowing tentative alignments between ways of meaning so that we can begin to build real cross-cultural understanding.

  15. Autonomous Detection of Eruptions, Plumes, and Other Transient Events in the Outer Solar System

    NASA Astrophysics Data System (ADS)

    Bunte, M. K.; Lin, Y.; Saripalli, S.; Bell, J. F.

    2012-12-01

    The outer solar system abounds with visually stunning examples of dynamic processes such as eruptive events that jettison materials from satellites and small bodies into space. The most notable examples of such events are the prominent volcanic plumes of Io, the wispy water jets of Enceladus, and the outgassing of comet nuclei. We are investigating techniques that will allow a spacecraft to autonomously detect those events in visible images. This technique will allow future outer planet missions to conduct sustained event monitoring and automate prioritization of data for downlink. Our technique detects plumes by searching for concentrations of large local gradients in images. Applying a Scale Invariant Feature Transform (SIFT) to either raw or calibrated images identifies interest points for further investigation based on the magnitude and orientation of local gradients in pixel values. The interest points are classified as possible transient geophysical events when they share characteristics with similar features in user-classified images. A nearest neighbor classification scheme assesses the similarity of all interest points within a threshold Euclidean distance and classifies each according to the majority classification of other interest points. Thus, features marked by multiple interest points are more likely to be classified positively as events; isolated large plumes or multiple small jets are easily distinguished from a textured background surface due to the higher magnitude gradient of the plume or jet when compared with the small, randomly oriented gradients of the textured surface. We have applied this method to images of Io, Enceladus, and comet Hartley 2 from the Voyager, Galileo, New Horizons, Cassini, and Deep Impact EPOXI missions, where appropriate, and have successfully detected up to 95% of manually identifiable events that our method was able to distinguish from the background surface and surface features of a body. 
Dozens of distinct features are identifiable under a variety of viewing conditions and hundreds of detections are made in each of the aforementioned datasets. In this presentation, we explore the controlling factors in detecting transient events and discuss causes of success or failure due to distinct data characteristics. These include the level of calibration of images, the ability to differentiate an event from artifacts, and the variety of event appearances in user-classified images. Other important factors include the physical characteristics of the events themselves: albedo, size as a function of image resolution, and proximity to other events (as in the case of multiple small jets which feed into the overall plume at the south pole of Enceladus). A notable strength of this method is the ability to detect events that do not extend beyond the limb of a planetary body or are adjacent to the terminator or other strong edges in the image. The former scenario strongly influences the success rate of detecting eruptive events in nadir views.
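The detection scheme described above — flagging concentrations of large local gradients as interest points and classifying each by a majority vote over nearby user-labeled examples — can be sketched roughly as follows. This is a simplified stand-in (plain image gradients and a brute-force neighbor vote, not a full SIFT pipeline); all function names are illustrative.

```python
import numpy as np

def interest_points(img: np.ndarray, grad_thresh: float) -> np.ndarray:
    """Return (row, col) coordinates where the local gradient magnitude is large."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    return np.argwhere(mag > grad_thresh)

def classify_by_neighbors(points, labeled_pts, labels, radius):
    """Majority vote: each interest point takes the most common label among
    labeled exemplars within Euclidean distance `radius` (0 if none nearby)."""
    out = []
    for p in points:
        d = np.linalg.norm(labeled_pts - p, axis=1)
        near = labels[d <= radius]
        out.append(int(np.bincount(near).argmax()) if near.size else 0)
    return out

# Synthetic scene: dark limb/background with one bright "plume" column.
# The sharp edge produces large, coherently oriented gradients, while a
# textured background would yield only small, randomly oriented ones.
img = np.zeros((32, 32))
img[8:24, 16] = 1.0
pts = interest_points(img, grad_thresh=0.4)
```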

  16. 3D Modeling of Components of a Garden by Using Point Cloud Data

    NASA Astrophysics Data System (ADS)

    Kumazakia, R.; Kunii, Y.

    2016-06-01

    Laser measurement is currently applied to several tasks such as plumbing management, road investigation through mobile mapping systems, and elevation model utilization through airborne LiDAR. Effective laser measurement methods have been well-documented in civil engineering, but few attempts have been made to establish equally effective methods in landscape engineering. By using point cloud data acquired through laser measurement, the aesthetic landscaping of Japanese gardens can be enhanced. This study focuses on simple landscape simulations for pruning and rearranging trees as well as rearranging rocks, lanterns, and other garden features by using point cloud data. However, such simulations lack concreteness. Therefore, this study considers the construction of a library of garden features extracted from point cloud data. The library would serve as a resource for creating new gardens and simulating gardens prior to conducting repairs. Extracted garden features are imported as 3ds Max objects, and realistic 3D models are generated by using a material editor system. As further work toward the publication of a 3D model library, file formats for tree crowns and trunks should be adjusted. Moreover, reducing the size of created models is necessary. Models created using point cloud data are informative because simply shaped garden features such as trees are often seen in the 3D industry.

  17. Effects of bacterial pollution caused by a strong typhoon event and the restoration of a recreational beach: Transitions of fecal bacterial counts and bacterial flora in beach sand.

    PubMed

    Suzuki, Yoshihiro; Teranishi, Kotaro; Matsuwaki, Tomonori; Nukazawa, Kei; Ogura, Yoshitoshi

    2018-05-28

    To determine the effects of bacterial pollution associated with a strong typhoon event and to assess the restoration of the normal bacterial flora, we used conventional filtration methods and next-generation sequencing of 16S rRNA genes to analyze the transition of fecal and total bacterial counts in water and core sand samples collected from a recreational beach. Immediately after the typhoon event, Escherichia coli counts increased to 82 CFU/100 g in the surface beach sand. E. coli was detected from the surface down to sand 85 cm deep at the land-side point (10 m landward of the high-water line). However, E. coli disappeared within a month from the land-side point. The composition of the bacterial flora in the beach sand at the land point was directly influenced by the typhoon event. Pseudomonas was the most prevalent genus throughout the sand layers (0-102 cm deep) during the typhoon event. After 3 months, the population of Pseudomonas significantly decreased, and the predominant genus in the surface layer was Kaistobacter, although Pseudomonas remained the major genus in the 17- to 85-cm layer. When the beach conditions stabilized, the number of pollutant Pseudomonas among the 10 most abundant genera decreased to below the limit of detection. The bacterial population of the sand was subsequently restored to the most populous pre-event orders at the land point. A land-side beach, where users directly contact the sand, was significantly affected by bacterial pollution caused by a strong typhoon event. We show here that the normal bacterial flora of the surface sand was restored within 1 month. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Importance of correlations and fluctuations on the initial source eccentricity in high-energy nucleus-nucleus collisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alver, B.; Ballintijn, M.; Busza, W.

    2008-01-15

    In relativistic heavy-ion collisions, anisotropic collective flow is driven, event by event, by the initial eccentricity of the matter created in the nuclear overlap zone. Interpretation of the anisotropic flow data thus requires a detailed understanding of the effective initial source eccentricity of the event sample. In this paper, we investigate various ways of defining this effective eccentricity using the Monte Carlo Glauber (MCG) approach. In particular, we examine the participant eccentricity, which quantifies the eccentricity of the initial source shape by the major axes of the ellipse formed by the interaction points of the participating nucleons. We show that reasonable variation of the density parameters in the Glauber calculation, as well as variations in how matter production is modeled, do not significantly modify the already established behavior of the participant eccentricity as a function of collision centrality. Focusing on event-by-event fluctuations and correlations of the distributions of participating nucleons, we demonstrate that, depending on the achieved event-plane resolution, fluctuations in the elliptic flow magnitude v_2 lead to most measurements being sensitive to the root-mean-square rather than the mean of the v_2 distribution. Neglecting correlations among participants, we derive analytical expressions for the participant eccentricity cumulants as a function of the number of participating nucleons, N_part, keeping nonnegligible contributions up to O(1/N_part^3). We find that the derived expressions yield the same results as obtained from mixed-event MCG calculations, which remove the correlations stemming from the nuclear collision process. Most importantly, we conclude from the comparison with MCG calculations that the fourth-order participant eccentricity cumulant does not approach the spatial anisotropy obtained assuming a smooth nuclear matter distribution.
In particular, for the Cu+Cu system, these quantities deviate from each other by almost a factor of 2 over a wide range in centrality. This deviation reflects the essential role of participant spatial correlations in the interaction of two nuclei.
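The participant eccentricity discussed above has a standard definition in terms of the second moments of the participating nucleons' transverse positions; a minimal sketch (not the authors' code) is:

```python
import numpy as np

def participant_eccentricity(x: np.ndarray, y: np.ndarray) -> float:
    """Eccentricity of the participant zone from second moments of the
    participating-nucleon transverse positions:
    eps = sqrt((s_y^2 - s_x^2)^2 + 4*s_xy^2) / (s_y^2 + s_x^2)."""
    sx2 = np.var(x)                                   # sigma_x^2
    sy2 = np.var(y)                                   # sigma_y^2
    sxy = np.mean((x - x.mean()) * (y - y.mean()))    # sigma_xy
    return np.sqrt((sy2 - sx2) ** 2 + 4.0 * sxy ** 2) / (sy2 + sx2)

# Nucleons strung along a line are maximally eccentric (-> 1);
# a fourfold-symmetric configuration is round (-> 0).
line = participant_eccentricity(np.array([-1.0, 0.0, 1.0]), np.zeros(3))
ring = participant_eccentricity(np.array([1.0, -1.0, 0.0, 0.0]),
                                np.array([0.0, 0.0, 1.0, -1.0]))
```

In an event-by-event MCG calculation this function would be applied to each sampled set of participant coordinates, and the cumulants of the resulting distribution compared as in the paper.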

  19. Accuracy of open-source software segmentation and paper-based printed three-dimensional models.

    PubMed

    Szymor, Piotr; Kozakiewicz, Marcin; Olszewski, Raphael

    2016-02-01

    In this study, we aimed to verify the accuracy of models created with the help of open-source Slicer 3.6.3 software (Surgical Planning Lab, Harvard Medical School, Harvard University, Boston, MA, USA) and the Mcor Matrix 300 paper-based 3D printer. Our study focused on the accuracy of recreating the walls of the right orbit of a cadaveric skull. Cone beam computed tomography (CBCT) of the skull was performed (0.25-mm pixel size, 0.5-mm slice thickness). Acquired DICOM data were imported into Slicer 3.6.3 software, where segmentation was performed. A virtual model was created and saved as an .STL file and imported into Netfabb Studio professional 4.9.5 software. Three different virtual models were created by cutting the original file along three different planes (coronal, sagittal, and axial). All models were printed with a Selective Deposition Lamination Technology Matrix 300 3D printer using 80 gsm A4 paper. The models were printed so that their cutting plane was parallel to the paper sheets creating the model. Each model (coronal, sagittal, and axial) consisted of three separate parts (∼200 sheets of paper each) that were glued together to form a final model. The skull and created models were scanned with a three-dimensional (3D) optical scanner (Breuckmann smart SCAN) and were saved as .STL files. Comparisons of the orbital walls of the skull, the virtual model, and each of the three paper models were carried out with GOM Inspect 7.5SR1 software. Deviations measured between the models analysed were presented in the form of a colour-labelled map and covered with an evenly distributed network of points automatically generated by the software. An average of 804.43 ± 19.39 points for each measurement was created. Differences measured in each point were exported as a .csv file. The results were statistically analysed using Statistica 10, with statistical significance set at p < 0.05. 
The average number of points created on models for each measurement was 804.43 ± 19.39; however, deviation in some of the generated points could not be calculated, and those points were excluded from further calculations. From 94% to 99% of the measured absolute deviations were <1 mm. The mean absolute deviation between the skull and virtual model was 0.15 ± 0.11 mm, between the virtual and printed models was 0.15 ± 0.12 mm, and between the skull and printed models was 0.24 ± 0.21 mm. Using the optical scanner and specialized inspection software for measurements of accuracy of the created parts is recommended, as it allows one not only to measure 2-dimensional distances between anatomical points but also to perform more clinically suitable comparisons of whole surfaces. However, it requires specialized software and a very accurate scanner in order to be useful. Threshold-based, manually corrected segmentation of orbital walls performed with 3D Slicer software is accurate enough to be used for creating a virtual model of the orbit. The accuracy of the paper-based Mcor Matrix 300 3D printer is comparable to those of other commonly used 3-dimensional printers and allows one to create precise anatomical models for clinical use. The method of dividing the model into smaller parts and sticking them together seems to be quite accurate, although we recommend it only for creating small, solid models with as few parts as possible to minimize shift associated with gluing. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  20. The detection and analysis of point processes in biological signals

    NASA Technical Reports Server (NTRS)

    Anderson, D. J.; Correia, M. J.

    1977-01-01

    A pragmatic approach to the detection and analysis of discrete events in biomedical signals is taken. Examples from both clinical and basic research are provided. Introductory sections discuss not only discrete events which are easily extracted from recordings by conventional threshold detectors but also events embedded in other information carrying signals. The primary considerations are factors governing event-time resolution and the effects limits to this resolution have on the subsequent analysis of the underlying process. The analysis portion describes tests for qualifying the records as stationary point processes and procedures for providing meaningful information about the biological signals under investigation. All of these procedures are designed to be implemented on laboratory computers of modest computational capacity.

  1. The 1989 Solar Maximum Mission event list

    NASA Technical Reports Server (NTRS)

    Dennis, B. R.; Licata, J. P.; Tolbert, A. K.

    1992-01-01

    This document contains information on solar burst and transient activity observed by the Solar Maximum Mission (SMM) during 1989 pointed observations. Data from the following SMM experiments are included: (1) Gamma Ray Spectrometer, (2) Hard X-Ray Burst Spectrometer, (3) Flat Crystal Spectrometer, (4) Bent Crystal Spectrometer, (5) Ultraviolet Spectrometer Polarimeter, and (6) Coronagraph/Polarimeter. Correlative optical, radio, and Geostationary Operational Environmental Satellite (GOES) X-ray data are also presented. Where possible, bursts or transients observed in the various wavelengths were grouped into discrete flare events identified by unique event numbers. Each event carries a qualifier denoting the quality or completeness of the observations. Spacecraft pointing coordinates and flare site angular displacement values from sun center are also included.

  2. The 1988 Solar Maximum Mission event list

    NASA Technical Reports Server (NTRS)

    Dennis, B. R.; Licata, J. P.; Tolbert, A. K.

    1992-01-01

    Information on solar burst and transient activity observed by the Solar Maximum Mission (SMM) during 1988 pointed observations is presented. Data from the following SMM experiments are included: (1) gamma ray spectrometer; (2) hard x ray burst spectrometer; (3) flat crystal spectrometers; (4) bent crystal spectrometer; (5) ultraviolet spectrometer polarimeter; and (6) coronagraph/polarimeter. Correlative optical, radio, and Geostationary Operational Environmental Satellite (GOES) x ray data are also presented. Where possible, bursts or transients observed in the various wavelengths were grouped into discrete flare events identified by unique event numbers. Each event carries a qualifier denoting the quality or completeness of the observation. Spacecraft pointing coordinates and flare site angular displacement values from sun center are also included.

  3. The 1984 - 1987 Solar Maximum Mission event list

    NASA Technical Reports Server (NTRS)

    Dennis, B. R.; Licata, J. P.; Nelson, J. J.; Tolbert, A. K.

    1992-01-01

    Information on solar burst and transient activity observed by the Solar Maximum Mission (SMM) during 1984-1987 pointed observations is presented. Data from the following SMM experiments are included: (1) gamma ray spectrometer; (2) hard x-ray burst spectrometer; (3) flat crystal spectrometer; (4) bent crystal spectrometer; (5) ultraviolet spectrometer polarimeter; and (6) coronagraph/polarimeter. Correlative optical, radio, and Geostationary Operational Environmental Satellite (GOES) x ray data are also presented. Where possible, bursts or transients observed in the various wavelengths were grouped into discrete flare events identified by unique event numbers. Each event carries a qualifier denoting the quality or completeness of the observations. Spacecraft pointing coordinates and flare site angular displacement values from sun center are also included.

  4. Continental-Scale Estimates of Runoff Using Future Climate ...

    EPA Pesticide Factsheets

    Recent runoff events have had serious repercussions for both natural ecosystems and human infrastructure. Understanding how shifts in storm event intensities are expected to change runoff responses is valuable for local, regional, and landscape planning. To address this challenge, relative changes in runoff using predicted future climate conditions were estimated over different biophysical areas for the CONterminous U.S. (CONUS). Runoff was estimated using the Curve Number (CN) method developed by the USDA Soil Conservation Service (USDA, 1986). A seamless gridded dataset representing a CN for existing land use/land cover (LULC) across the CONUS was used along with two different storm event grids created specifically for this effort. The two storm event grids represent a 2- and a 100-year, 24-hour storm event under current climate conditions. The storm event grids were generated using a compilation of county-scale Texas USGS Intensity-Duration-Frequency (IDF) data (provided by William Asquith, USGS, Lubbock, Texas), and NOAA Atlas-2 and NOAA Atlas-14 gridded data sets. Future CN runoff was predicted using extreme storm event grids created using a method based on Kao and Ganguly (2011), in which precipitation extremes reflect changes in the saturated water vapor pressure of the atmosphere in response to temperature changes. The Clausius-Clapeyron relationship establishes that the total water vapor mass of fully saturated air increases with increasing temperature, leading to
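The SCS Curve Number relation referenced above (USDA, 1986) computes runoff depth Q from storm precipitation P and a tabulated CN; a minimal sketch in the method's customary units (inches) is:

```python
def scs_runoff_inches(p_in: float, cn: float, ia_ratio: float = 0.2) -> float:
    """SCS Curve Number runoff: S = 1000/CN - 10, Ia = ia_ratio * S,
    Q = (P - Ia)^2 / (P - Ia + S) when P > Ia, else 0.  Units: inches."""
    s = 1000.0 / cn - 10.0    # potential maximum retention after runoff begins
    ia = ia_ratio * s         # initial abstraction (conventional 0.2*S)
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in - ia + s)

# Example: a 4-inch, 24-hour storm on CN = 80 ground
q = scs_runoff_inches(4.0, 80.0)   # S = 2.5 in, Ia = 0.5 in -> Q ~ 2.04 in
```

Applied cell by cell to a gridded CN layer and a gridded design-storm depth, this yields the kind of relative-runoff grids the abstract describes.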

  5. Pointing Sets the Stage for Learning Language--and Creating Language

    ERIC Educational Resources Information Center

    Goldin-Meadow, Susan

    2007-01-01

    Tomasello, Carpenter, and Liszkowski (2007) have argued that pointing gestures do much more than single out objects in the world. Pointing gestures function as part of a system of shared intentionality even at early stages of development. As such, pointing gestures form the platform on which linguistic communication rests, paving the way for later…

  6. Some Social Dimensions of Entrepreneurship.

    ERIC Educational Resources Information Center

    Shapero, Albert

    Data from a wide range of disciplines can be used to create a paradigm describing the factors that enter into the creation of entrepreneurial events. Throughout the literature, entrepreneurial events are defined in terms of operational factors, such as initiative taking, bringing together resources, management, relative autonomy, and risk taking.…

  7. Meet the Teacher

    ERIC Educational Resources Information Center

    Kirker, Sara Schmickle

    2008-01-01

    This article describes how to create the life-size teacher portraits that are displayed during an annual "Meet the Teacher" event held to introduce students and families to the facility and staff of the Apple Glen Elementary School in Bentonville, Arkansas. Several months prior to this event, students are asked to closely observe their…

  8. Policy Expansion of School Choice in the American States

    ERIC Educational Resources Information Center

    Wong, Kenneth K.; Langevin, Warren E.

    2007-01-01

    This research study explores the policy expansion of school choice within the methodological approach of event history analysis. The first section provides a comparative overview of state adoption of public school choice laws. After creating a statistical portrait of the contemporary landscape for school choice, the authors introduce event history…

  9. The role of domain expertise and judgment in dealing with unexpected events

    NASA Astrophysics Data System (ADS)

    Kochan, Janeen Adrion

    Unexpected events, particularly those creating surprise, interrupt ongoing mental and behavioral processes, creating an increased potential for unwanted outcomes to the situation. Human reactions to unexpected events vary. One can hypothesize a number of reasons for this variation, including level of domain expertise, previous experience with similar events, emotional connotation, and the contextual surround of the event. Whereas interrupting ongoing activities and focusing attention temporarily on a surprising event may be a useful evolutionary response to a threatening situation, the same process may be maladaptive in today's highly dynamic world. The purpose of this study was to investigate how different aspects of expertise affected one's ability to detect and react to an unexpected event. It was hypothesized that there were two general types of expertise, domain expertise and judgment (Hammond, 2000), which influenced one's performance in dealing with an unexpected event. The goal of the research was to parse out the relative contribution of domain expertise, so the role of judgment could be revealed. The research questions for this study were: (a) Can we identify specific knowledge and skills which enhance one's ability to deal with unexpected events? (b) Are these skills "automatically" included in domain expertise? (c) How does domain expertise improve or impair one's reaction and response to unexpected events? (d) What role does judgment play in responding to surprise? The general hypothesis was that good judgment would influence the process of surprise at different stages and in different ways than would domain expertise. The conclusions from this research indicated that good judgment had a significant positive effect in helping pilots deal with unexpected events. This was most pronounced when domain expertise was low.

  10. Accuracy Assessment of a Complex Building 3d Model Reconstructed from Images Acquired with a Low-Cost Uas

    NASA Astrophysics Data System (ADS)

    Oniga, E.; Chirilă, C.; Stătescu, F.

    2017-02-01

    Nowadays, Unmanned Aerial Systems (UASs) are a widely used acquisition technique for creating building 3D models, providing a high number of images at very high resolution, or video sequences, in a very short time. Since low-cost UASs are preferred, the accuracy of a building 3D model created using these platforms must be evaluated. To this end, the dean's office building of the Faculty of "Hydrotechnical Engineering, Geodesy and Environmental Engineering" of Iasi, Romania, was chosen; it is a complex-shaped building with a roof formed of two hyperbolic paraboloids. Seven points were placed on the ground around the building, three of them being used as GCPs, while the remaining four served as check points (CPs) for accuracy assessment. Additionally, the coordinates of 10 natural CPs representing the building's characteristic points were measured with a Leica TCR 405 total station. The building 3D model was created as a point cloud automatically generated from the digital images acquired with the low-cost UAS, using image matching algorithms in different software packages: 3DF Zephyr, Visual SfM, PhotoModeler Scanner and Drone2Map for ArcGIS. Except for the PhotoModeler Scanner software, the interior and exterior orientation parameters were determined simultaneously by solving a self-calibrating bundle adjustment. Based on the UAS point clouds automatically generated by the above-mentioned software and on GNSS data, the parameters of the east-side hyperbolic paraboloid were calculated using the least squares method and statistical blunder detection. Then, in order to assess the accuracy of the building 3D model, several comparisons were made for the facades and the roof against reference data considered to have minimal errors: a TLS mesh for the facades and a GNSS mesh for the roof.
Finally, the front facade of the building was created in 3D based on its characteristic points using the PhotoModeler Scanner software, resulting in a CAD (Computer Aided Design) model. The results showed the high potential of using low-cost UASs for building 3D model creation; moreover, if the building 3D model is created based on its characteristic points, the accuracy is significantly improved.
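Fitting a hyperbolic paraboloid to roof points by least squares, as described above, reduces to a linear problem in the quadric coefficients. A generic sketch (not the authors' implementation, and without their statistical blunder detection) is:

```python
import numpy as np

def fit_quadric(x, y, z):
    """Least-squares fit of z = a*x^2 + b*y^2 + c*x*y + d*x + e*y + f.
    A hyperbolic paraboloid (saddle) corresponds to a and b of opposite
    sign, or to a dominant cross term c."""
    A = np.column_stack([x**2, y**2, x * y, x, y, np.ones_like(x)])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coeffs

# Recover a known saddle z = x^2 - y^2 from synthetic "roof" points.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 200)
y = rng.uniform(-1, 1, 200)
z = x**2 - y**2
a, b, c, d, e, f = fit_quadric(x, y, z)
```

With real point clouds, residuals from this fit are what a blunder-detection step would screen before accepting the surface parameters.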

  11. [Guidelines for the management of point-of care testing nonconformities according to the EN ISO 22870].

    PubMed

    Houlbert, C; Annaix, V; Szymanowicz, A; Vassault, A; Guimont, M C; Pernet, P

    2012-02-01

    In this paper, guidelines are proposed to fulfill the requirements of the EN ISO 22870 standard regarding the management of point-of-care testing (POCT) nonconformities. In the first part, the main nonconformities that may affect POCT are described, and means for their resolution and for the control of adverse events are proposed. In the second part, we propose recommendations for the case of unavailability of a point-of-care testing device, from the occurrence of the adverse event to the restarting of the device.

  12. High spatial resolution detection of low-energy electrons using an event-counting method, application to point projection microscopy

    NASA Astrophysics Data System (ADS)

    Salançon, Evelyne; Degiovanni, Alain; Lapena, Laurent; Morin, Roger

    2018-04-01

    An event-counting method using a two-microchannel plate stack in a low-energy electron point projection microscope is implemented. A detector spatial resolution of 15 μm, i.e., the distance between first-neighbor microchannels, is demonstrated. This leads to a microscope resolution that is 7 times better. Compared to previous work with neutrons [Tremsin et al., Nucl. Instrum. Methods Phys. Res., Sect. A 592, 374 (2008)], the large number of detection events achieved with electrons shows that the local response of the detector is mainly governed by the angle between the hexagonal structures of the two microchannel plates. Using this method in point projection microscopy offers the prospect of working with a greater source-object distance (350 nm instead of 50 nm), advancing toward atomic resolution.

  13. A Bivariate return period for levee failure monitoring

    NASA Astrophysics Data System (ADS)

    Isola, M.; Caporali, E.

    2017-12-01

    Levee breaches are strongly linked with the interaction processes among water, soil and structure, so many factors affect breach development. One of the main factors is the hydraulic load, characterized by intensity and duration, i.e. by the flood event hydrograph. Levee design is generally based on the magnitude of the hydraulic load, without considering fatigue failure due to load duration. Moreover, in many cases levee breaches are caused by floods of magnitude lower than the design flood. In order to improve flood risk management strategies, we built a procedure based on a multivariate statistical analysis of flood peak and volume, together with an analysis of past levee failure events. In particular, to define the probability of occurrence of the hydraulic load on a levee, a bivariate copula model is used to obtain the joint distribution of flood peak and volume. The flood peak expresses the load magnitude, while the volume expresses the stress over time. We consider the annual flood peak and the corresponding volume. The volume is given by the hydrograph area between the beginning and the end of the event. The beginning of the event is identified as an abrupt rise of the discharge by more than 20%. The end is identified as the point from which the receding limb is characterized by baseflow, using a nonlinear reservoir algorithm as the baseflow separation technique. On this basis, with the aim of defining warning thresholds, we consider past levee failure events and their bivariate return period (BTr), compared with estimates from a traditional univariate model. Discharge data from 30 hydrometric stations on the Arno River in Tuscany, Italy, for the period 1995-2016 are analysed. A database of levee failure events, recording for each event the location as well as the failure mode, is also created.
The events were registered in the period 2000-2014 by the EEA (European Environment Agency), the Italian Civil Protection and ISPRA (the Italian National Institute for Environmental Protection and Research). Only two levee failure events, which occurred in the Era River sub-basin, have been detected and analysed. The return periods estimated with the univariate flood-peak model are greater than 2 and 5 years, while the BTr values are greater than 25 and 30 years, respectively.
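The bivariate "AND" return period can be written directly from the copula of peak and volume. The abstract does not name the copula family, so the Gumbel copula below is an illustrative assumption, not the authors' choice:

```python
import math

def gumbel_copula(u: float, v: float, theta: float) -> float:
    """Gumbel copula C(u, v); theta >= 1, with theta = 1 giving independence."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def and_return_period(u: float, v: float, theta: float, mu: float = 1.0) -> float:
    """'AND' bivariate return period: mean interarrival time mu divided by
    P(peak > p AND volume > w) = 1 - u - v + C(u, v),
    where u and v are the marginal non-exceedance probabilities."""
    return mu / (1.0 - u - v + gumbel_copula(u, v, theta))

# Independence check: u = v = 0.9 and theta = 1 give
# P = 1 - 0.9 - 0.9 + 0.81 = 0.01, i.e. a 100-year joint event.
t_indep = and_return_period(0.9, 0.9, theta=1.0)
```

Positive dependence (theta > 1) makes simultaneous exceedance more likely, so the joint "AND" return period shortens relative to the independent case, which is why univariate and bivariate estimates can diverge as in the Era River events.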

  14. Failure Forecasting in Triaxially Stressed Sandstones

    NASA Astrophysics Data System (ADS)

    Crippen, A.; Bell, A. F.; Curtis, A.; Main, I. G.

    2017-12-01

    Precursory signals to fracturing events have been observed to follow power-law accelerations in spatial, temporal, and size distributions leading up to catastrophic failure. In previous studies this behavior was modeled using Voight's relation for a geophysical precursor in order to perform `hindcasts' by solving for failure onset time. However, performing this analysis in retrospect creates a bias, as we know an event happened and when it happened, and we can search the data for precursors accordingly. We aim to remove this retrospective bias, thereby allowing us to make failure forecasts in real time in a rock deformation laboratory. We triaxially compressed water-saturated 100 mm sandstone cores (Pc = 25 MPa, Pp = 5 MPa, σ = 1.0E-5 s⁻¹) to the point of failure while monitoring strain rate, differential stress, AEs, and continuous waveform data. Here we compare the current `hindcast' methods on synthetic and our real laboratory data. We then apply these techniques to increasing fractions of the data sets to observe the evolution of the failure forecast time with precursory data. We discuss these results as well as our plan to mitigate false positives and minimize errors for real-time application. Real-time failure forecasting could revolutionize the field of hazard mitigation of brittle failure processes by allowing non-invasive monitoring of civil structures, volcanoes, and possibly fault zones.
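The `hindcast' step above typically uses the inverse-rate linearization of Voight's relation: for exponent α = 2 the inverse precursor rate decays linearly to zero at the failure time, so a straight-line fit forecasts it. A minimal sketch on synthetic data (illustrative only, not the authors' code):

```python
import numpy as np

def forecast_failure_time(t: np.ndarray, rate: np.ndarray) -> float:
    """Inverse-rate failure forecast method (Voight's relation, alpha = 2):
    fit 1/rate = a*t + b and return the zero-crossing time t_f = -b / a."""
    a, b = np.polyfit(t, 1.0 / rate, 1)
    return -b / a

# Synthetic AE-rate acceleration diverging at t_f = 100 s.
t_f_true = 100.0
t = np.linspace(0.0, 90.0, 50)
rate = 1.0 / (t_f_true - t)          # hyperbolic acceleration toward failure
t_f_est = forecast_failure_time(t, rate)
```

Applying the fit to growing fractions of the record, as the abstract describes, shows how the forecast converges (or fails to) as failure approaches.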

  15. Monitoring risk: post marketing surveillance and signal detection.

    PubMed

    Dart, Richard C

    2009-12-01

    The primary goal of postmarketing surveillance is to provide information for risk assessment of a drug. Drugs affecting the central nervous system form a unique group of products for surveillance because they are often misused, abused, and diverted. These medications include opioid analgesics, stimulants, sedative-hypnotics, muscle relaxants, anticonvulsants, and other drug classes. Their adverse events are difficult to monitor because the perpetrator often attempts to conceal the misuse, abuse, and diversion of the product. A postmarketing surveillance system for prescription drugs of abuse in the U.S. should include product-specific information that is accurate, immediately available, and geographically specific, and should cover all areas of the country. Most producers of branded opioid analgesic products have created systems that measure abuse from multiple vantage points: criminal justice, treatment professionals, susceptible patient populations, and acute health events. In the past, the U.S. government did not establish similar requirements for the same products produced by generic manufacturers. However, the Food and Drug Administration Amendments Act of 2007 extends to generic opioid analgesic products by requiring rigorous surveillance and risk management for all products containing potent opioid drugs. While general risk management guidance has been developed by the FDA, more specific analyses and guidance are needed to improve surveillance methodology for drugs that are misused, abused, or diverted.

  16. Who Will Be the First? Creating a Just Community in the Kindergarten.

    ERIC Educational Resources Information Center

    Aufenanger, Stefan

    Rules for selecting participants of games are the starting point for creating a just community among preschool-aged children. At first glance, it seems impossible to create a just community among young children whose social perspective-taking is undifferentiated and egocentric, and whose sense of positive justice is similarly limited. However,…

  17. Playing with Process: Video Game Choice as a Model of Behavior

    ERIC Educational Resources Information Center

    Waelchli, Paul

    2010-01-01

    Popular culture experience in video games creates avenues to practice information literacy skills and model research in a real-world setting. Video games create a unique popular culture experience where players can invest dozens of hours on one game, create characters to identify with, organize skill sets and plot points, collaborate with people…

  18. On Mathematical Proving

    NASA Astrophysics Data System (ADS)

    Stefaneas, Petros; Vandoulakis, Ioannis M.

    2015-12-01

    This paper outlines a logical representation of certain aspects of the process of mathematical proving that are important from the point of view of Artificial Intelligence. Our starting point is the concept of proof-event, or proving, introduced by Goguen, instead of the traditional concept of mathematical proof. The reason behind this choice is that, in contrast to the traditional static concept of mathematical proof, proof-events are understood as processes, which enables their use in Artificial Intelligence in contexts in which problem-solving procedures and strategies are studied. We represent proof-events as problem-centered spatio-temporal processes by means of the language of the calculus of events, which adequately captures certain temporal aspects of proof-events (i.e., that they have history and form sequences of proof-events evolving in time). Further, we suggest a "loose" semantics for proof-events by means of Kolmogorov's calculus of problems. Finally, we present the intended interpretations of our logical model in the fields of automated theorem proving and Web-based collective proving.

  19. Characteristic of the postseismic deformation following the 2011 Sanriku-Oki earthquake (Mw 7.2) by comparing the 1989 and 1992 Sanriku-Oki events

    NASA Astrophysics Data System (ADS)

    Ohta, Yusaku; Hino, Ryota; Ariyoshi, Keisuke; Matsuzawa, Toru; Mishina, Masaaki; Sato, Tadahiro; Tachibana, Kenji; Demachi, Tomotsugu; Miura, Satoshi

    2013-04-01

    The March 11, 2011, moment magnitude (Mw) 9.0 Tohoku earthquake (hereafter referred to as the mainshock) generated a large tsunami, which caused devastating damage and the loss of more than 15,800 lives. On March 9, 2011 at 2:45 (UTC), an M7.3 interplate earthquake (hereafter referred to as the foreshock) occurred ~45 km northeast of the epicenter of the Mw9.0 mainshock. The focal mechanism estimated by the National Research Institute for Earth Science and Disaster Prevention (NIED) indicates reverse fault motion with a west-northwest to east-southeast compression axis. This foreshock preceded the 2011 Tohoku earthquake by 51 h. Kato et al. [Science, 2012] pointed out aftershock migration after the foreshock along the trench axis toward the epicenter of the Mw9.0 mainshock on the basis of an earthquake catalog, which was created using a waveform correlation technique. They also estimated the aseismic slip amount by repeating-earthquake analysis. Ohta et al. [GRL, 2012] proposed a coseismic and postseismic afterslip model of the foreshock based on a GPS network and ocean bottom pressure gauge sites. The estimated coseismic slip and afterslip areas show complementary spatial distributions. The slip amount for the afterslip is roughly consistent with that determined by the repeating-earthquake analysis carried out by Kato et al. [2012]. Ohta et al. [2012] also pointed out that a volumetric strainmeter time series suggests that this event advanced with a rapid decay time constant compared with other typical large earthquakes. To verify whether this behavior is exceptional, we investigated the postseismic deformation characteristics following the 1989 and 1992 Sanriku-Oki earthquakes, which occurred 100-150 km north of the epicenter of the 2011 Sanriku-Oki event. We used the four-component extensometer of Tohoku University at Miyako (39.59N, 141.98E) on the Sanriku coast for these events. 
To extract the characteristics of the postseismic deformation, we fitted a logarithmic function. The estimated decay time constant was relatively small compared with typical interplate earthquakes, in a similar fashion to the 2011 Sanriku-Oki event. Our result suggests that the short decay time of the postseismic deformation is characteristic of this region. The exact reason for the short decay time of these afterslips is unclear at present, but it was possibly controlled by the frictional properties on the plate interface, especially the effective normal stress controlled by fluid.
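The logarithmic fit used to extract the decay time constant can be sketched as follows; the displacement series, amplitude, and decay constant here are hypothetical stand-ins for the extensometer records, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_decay(t, a, tau):
    """Postseismic displacement model: u(t) = a * ln(1 + t / tau)."""
    return a * np.log(1.0 + t / tau)

# Hypothetical afterslip time series (t in days, u in mm)
t = np.linspace(0.1, 100.0, 300)
u_obs = log_decay(t, 2.0, 5.0)  # synthetic, noise-free observations

# Recover amplitude a and decay time constant tau from the data
(a_fit, tau_fit), _ = curve_fit(log_decay, t, u_obs, p0=(1.0, 1.0))
print(round(tau_fit, 3))  # 5.0
```

A comparatively small fitted tau corresponds to the short decay times reported for the Sanriku events.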

  20. ALP-RISK, a smartphone app for collecting data on geomorphic phenomena at high altitude in the Mont Blanc region

    NASA Astrophysics Data System (ADS)

    Ravanel, Ludovic; Deline, Philip

    2014-05-01

    A network of observers (mountain guides, hut keepers, and mountaineers) has been developed since 2005 for the Mont Blanc massif in order to acquire data on rockfalls in permafrost-affected rock walls. This network, fully operational since 2007, is based on observation sheets or oral communications and has documented nearly 350 events with volumes between 100 and 45,000 m3. Their analysis confirmed, and helped to better understand, the role of permafrost degradation as the main triggering factor. To i) reinforce this network, ii) facilitate its observation work, and iii) extend it both in space (the whole Mont Blanc region, or eventually the whole western Alps) and thematically (all sudden glacial and periglacial phenomena), the Alp-Risk app has been created in the framework of the Alcotra PrévRisk Mont-Blanc project. The latter (2011-13) has been developed to improve the prevention of individual and collective natural hazards around the Mont Blanc massif. The app was created for iPhone and Android devices in three languages (French, English, and Italian) and allows users to transmit data on natural hazards in the high mountains (snow and ice avalanches, rockfalls and landslides, moraine destabilization, water pocket outburst floods, torrential floods, and others) as intuitively and quickly as possible to practitioners (observations available directly in the app via a web interface), scientists, and possibly local managers. Alp-Risk thus constitutes a new step for participatory science in the Mont Blanc region.

  1. Testing the shorter and variable recurrence interval hypothesis along the Cholame segment of the San Andreas Fault

    NASA Astrophysics Data System (ADS)

    Williams, A.; Arrowsmith, R.; Rockwell, T. K.; Akciz, S. O.; Grant Ludwig, L.

    2016-12-01

    The Cholame segment of the San Andreas Fault interacts with the Parkfield segment to the northwest, with its creep and M6 earthquakes, and with the locked Carrizo segment to the southeast. Although offset reconstructions exist for this 75 km reach, rupture behavior is poorly characterized, limiting seismic hazard evaluation. Here we present new paleoseismic results from two fault-perpendicular, 26 m long trenches connected by a 15 m long fault-parallel trench. The site is located south of the Parkfield segment, 20 km southeast of Highway 46. Site geomorphology is characterized by several 50 m offset drainages northwest of the trenches, small shutter ridges and sag ponds, and alluvial fans crossing the fault. Fault zone stratigraphy consists of alternating finely bedded sands, silts, and gravels, and bioturbated soil horizons. The strata record 3-4 earthquakes and possible deformation post-1857, similar to the LY4 site 38 km southeast. E4, E3, and the most recent earthquake (MRE) are well supported by evidence of decreasing vertical offset up-sequence, capped fissure fill, and colluvial wedges, which create small horst and graben structures. Units display vertical offsets ranging from 60 cm at the base to 12 cm near the MRE horizon, small colluvial wedges, and sag deposits within the 4 m wide fault zone. E2, the penultimate event, is less certain, supported only by decreasing offset in the stratigraphic sequence. The E4 event horizon is a gradational clayey silt sag deposit capped by discontinuous gravel, 18 cm at its thickest point and extending 4.8 m across the fault zone. The E3 and E2 event horizons are capped by thin-bedded silty clay and bounded by discontinuous burn horizons. The MRE horizon extends 6 m across the main fault zone and consists of a silty clay sag deposit capped by very fine, bedded sand and coarse gravel, 22 cm at its thickest point and overlying a burn horizon. 
If the MRE is indeed the 1857 event, it has significant potential for correlation with the high-quality rupture records at Bidart (70 km southeast) and Frazier Mountain (180 km southeast). This site contains abundant detrital charcoal in many of the units, and burn horizons at or near event horizons provide great potential for bracketing the ages of these paleoearthquakes.

  2. Point pattern analysis applied to flood and landslide damage events in Switzerland (1972-2009)

    NASA Astrophysics Data System (ADS)

    Barbería, Laura; Schulte, Lothar; Carvalho, Filipe; Peña, Juan Carlos

    2017-04-01

    Damage caused by meteorological and hydrological extreme events depends on many factors: not only on hazard, but also on exposure and vulnerability. In order to reach a better understanding of the relation of these complex factors, their spatial pattern, and the underlying processes, the spatial dependency between values of damage recorded at sites of different distances can be investigated by point pattern analysis. For the Swiss flood and landslide damage database (1972-2009), first steps of point pattern analysis have been carried out. The most severe events have been selected (severe, very severe, and catastrophic, according to the GEES classification; a total of 784 damage points) and Ripley's K-test and L-test have been performed, amongst others. For this purpose, R's spatstat library has been used. The results confirm that the damage points present a statistically significant clustered pattern, which could be connected to the prevalence of damage near watercourses and also to the rainfall distribution of each event, together with other factors. On the other hand, bivariate analysis shows there is no segregated pattern depending on process type: flood/debris flow vs. landslide. This close relation points to a coupling between slope and fluvial processes, connectivity between small- and middle-size catchments, and the influence of the spatial distribution of precipitation, temperature (snow melt and snow line), and other predisposing factors such as soil moisture, land cover, and environmental conditions. Therefore, further studies will investigate the relationship between the spatial pattern and one or more covariates, such as elevation, distance from watercourse, or land use. The final goal will be to fit a regression model to the data, so that the adjusted model predicts the intensity of the point process as a function of the above-mentioned covariates.
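    The study used R's spatstat; as a rough illustration only, a naive Ripley's K estimator (without spatstat's edge corrections) applied to a synthetic pattern of complete spatial randomness (CSR) can be sketched in Python as:

```python
import numpy as np

def ripley_k(points, r, area):
    """Naive Ripley's K estimate (no edge correction):
    K(r) = area * (ordered pairs within r) / (n * (n - 1))."""
    n = len(points)
    d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
    pairs_within = ((d > 0) & (d <= r)).sum()
    return area * pairs_within / (n * (n - 1))

rng = np.random.default_rng(0)
pts = rng.random((500, 2))      # CSR pattern on the unit square
k = ripley_k(pts, 0.1, 1.0)
# Under CSR, K(r) is close to pi * r^2 (the naive estimator is biased
# slightly low near the boundary); clustered data would give larger K.
print(round(k, 4))
```

Significance in practice is judged against simulation envelopes of such CSR patterns, which is what spatstat's K-test machinery automates.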

  3. Reflecting on the Great Black Migration by Creating a Newspaper

    ERIC Educational Resources Information Center

    Hines, Angela

    2008-01-01

    This article describes the ways in which the author guided her third- and fourth-grade students in the use of historical fiction and primary and secondary sources (letters, historical newspapers, census data, photos) to think and write critically about provocative historical events. In creating their own newspaper, students learned to summarize…

  4. Creating an Effective Newsletter

    ERIC Educational Resources Information Center

    Shackelford, Ray; Griffis, Kurt

    2006-01-01

    Newsletters are an important resource or form of media. They offer a cost-effective way to keep people informed, as well as to promote events and programs. Production of a newsletter makes an excellent project, relevant to real-world communication, for technology students. This article presents an activity on how to create a short newsletter. The…

  5. Creating Micro-Videos to Demonstrate Technology Learning

    ERIC Educational Resources Information Center

    Frydenberg, Mark; Andone, Diana

    2016-01-01

    Short videos, also known as micro-videos, have emerged as a platform for sharing ideas, experiences, and life events on online social networks. This paper shares preliminary results of a study involving students from two universities who created six-second videos using the Vine mobile app to explain or illustrate technology concepts. An analysis…

  6. Novel Method of Storing and Reconstructing Events at Fermilab E-906/SeaQuest Using a MySQL Database

    NASA Astrophysics Data System (ADS)

    Hague, Tyler

    2010-11-01

    Fermilab E-906/SeaQuest is a fixed target experiment at Fermi National Accelerator Laboratory. We are investigating the antiquark asymmetry in the nucleon sea. By examining the ratio of the Drell-Yan cross sections of proton-proton and proton-deuterium collisions, we can determine the asymmetry ratio. An essential step in the development of the analysis software is updating the event reconstruction to modern software tools. We are doing this in a unique way, performing the majority of the calculations within an SQL database. Using a MySQL database allows us to take advantage of off-the-shelf software without sacrificing ROOT compatibility, and to avoid network bottlenecks with server-side data selection. From our raw data we create stubs, or partial tracks, at each station, which are pieced together to create full tracks. Our reconstruction process uses dynamically created SQL statements to analyze the data. These SQL statements create tables that contain the final reconstructed tracks as well as intermediate values. This poster will explain the reconstruction process and how it is being implemented.
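    The stub-joining step can be illustrated with a toy relational sketch. The schema, column names, and matching criterion below are invented for illustration and are not the experiment's actual tables; sqlite3 stands in for MySQL:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical schema: one partial track ("stub") per row, per station
cur.execute("CREATE TABLE stubs(event INT, station INT, x REAL, slope REAL)")
cur.executemany("INSERT INTO stubs VALUES (?, ?, ?, ?)", [
    (1, 1, 0.0, 0.5),   # station-1 stub
    (1, 2, 5.0, 0.5),   # station-2 stub consistent with it -> a track
    (1, 2, 9.0, -0.3),  # inconsistent station-2 stub
])

# Piece stubs into tracks inside SQL: extrapolate each station-1 stub
# 10 units downstream and keep station-2 stubs that line up with it
cur.execute("""
    SELECT a.event, a.x, b.x
    FROM stubs AS a JOIN stubs AS b
      ON a.event = b.event AND a.station = 1 AND b.station = 2
    WHERE ABS(a.x + 10.0 * a.slope - b.x) < 0.1
""")
tracks = cur.fetchall()
print(tracks)  # [(1, 0.0, 5.0)]
```

Keeping the matching inside the database, as the abstract describes, means only candidate tracks cross the network rather than every raw hit.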

  7. Effects of extended-release niacin with laropiprant in high-risk patients.

    PubMed

    Landray, Martin J; Haynes, Richard; Hopewell, Jemma C; Parish, Sarah; Aung, Theingi; Tomson, Joseph; Wallendszus, Karl; Craig, Martin; Jiang, Lixin; Collins, Rory; Armitage, Jane

    2014-07-17

    Patients with evidence of vascular disease are at increased risk for subsequent vascular events despite effective use of statins to lower the low-density lipoprotein (LDL) cholesterol level. Niacin lowers the LDL cholesterol level and raises the high-density lipoprotein (HDL) cholesterol level, but its clinical efficacy and safety are uncertain. After a prerandomization run-in phase to standardize the background statin-based LDL cholesterol-lowering therapy and to establish participants' ability to take extended-release niacin without clinically significant adverse effects, we randomly assigned 25,673 adults with vascular disease to receive 2 g of extended-release niacin and 40 mg of laropiprant or a matching placebo daily. The primary outcome was the first major vascular event (nonfatal myocardial infarction, death from coronary causes, stroke, or arterial revascularization). During a median follow-up period of 3.9 years, participants who were assigned to extended-release niacin-laropiprant had an LDL cholesterol level that was an average of 10 mg per deciliter (0.25 mmol per liter as measured in the central laboratory) lower and an HDL cholesterol level that was an average of 6 mg per deciliter (0.16 mmol per liter) higher than the levels in those assigned to placebo. Assignment to niacin-laropiprant, as compared with assignment to placebo, had no significant effect on the incidence of major vascular events (13.2% and 13.7% of participants with an event, respectively; rate ratio, 0.96; 95% confidence interval [CI], 0.90 to 1.03; P=0.29). 
Niacin-laropiprant was associated with an increased incidence of disturbances in diabetes control that were considered to be serious (absolute excess as compared with placebo, 3.7 percentage points; P<0.001) and with an increased incidence of diabetes diagnoses (absolute excess, 1.3 percentage points; P<0.001), as well as increases in serious adverse events associated with the gastrointestinal system (absolute excess, 1.0 percentage point; P<0.001), musculoskeletal system (absolute excess, 0.7 percentage points; P<0.001), skin (absolute excess, 0.3 percentage points; P=0.003), and unexpectedly, infection (absolute excess, 1.4 percentage points; P<0.001) and bleeding (absolute excess, 0.7 percentage points; P<0.001). Among participants with atherosclerotic vascular disease, the addition of extended-release niacin-laropiprant to statin-based LDL cholesterol-lowering therapy did not significantly reduce the risk of major vascular events but did increase the risk of serious adverse events. (Funded by Merck and others; HPS2-THRIVE ClinicalTrials.gov number, NCT00461630.).

  8. Prediction of Early Recurrent Thromboembolic Event and Major Bleeding in Patients With Acute Stroke and Atrial Fibrillation by a Risk Stratification Schema: The ALESSA Score Study.

    PubMed

    Paciaroni, Maurizio; Agnelli, Giancarlo; Caso, Valeria; Tsivgoulis, Georgios; Furie, Karen L; Tadi, Prasanna; Becattini, Cecilia; Falocci, Nicola; Zedde, Marialuisa; Abdul-Rahim, Azmil H; Lees, Kennedy R; Alberti, Andrea; Venti, Michele; Acciarresi, Monica; D'Amore, Cataldo; Mosconi, Maria Giulia; Cimini, Ludovica Anna; Procopio, Antonio; Bovi, Paolo; Carletti, Monica; Rigatelli, Alberto; Cappellari, Manuel; Putaala, Jukka; Tomppo, Liisa; Tatlisumak, Turgut; Bandini, Fabio; Marcheselli, Simona; Pezzini, Alessandro; Poli, Loris; Padovani, Alessandro; Masotti, Luca; Vannucchi, Vieri; Sohn, Sung-Il; Lorenzini, Gianni; Tassi, Rossana; Guideri, Francesca; Acampa, Maurizio; Martini, Giuseppe; Ntaios, George; Karagkiozi, Efstathia; Athanasakis, George; Makaritsis, Kostantinos; Vadikolias, Kostantinos; Liantinioti, Chrysoula; Chondrogianni, Maria; Mumoli, Nicola; Consoli, Domenico; Galati, Franco; Sacco, Simona; Carolei, Antonio; Tiseo, Cindy; Corea, Francesco; Ageno, Walter; Bellesini, Marta; Colombo, Giovanna; Silvestrelli, Giorgio; Ciccone, Alfonso; Scoditti, Umberto; Denti, Licia; Mancuso, Michelangelo; Maccarrone, Miriam; Orlandi, Giovanni; Giannini, Nicola; Gialdini, Gino; Tassinari, Tiziana; De Lodovici, Maria Luisa; Bono, Giorgio; Rueckert, Christina; Baldi, Antonio; D'Anna, Sebastiano; Toni, Danilo; Letteri, Federica; Giuntini, Martina; Lotti, Enrico Maria; Flomin, Yuriy; Pieroni, Alessio; Kargiotis, Odysseas; Karapanayiotides, Theodore; Monaco, Serena; Baronello, Mario Maimone; Csiba, Laszló; Szabó, Lilla; Chiti, Alberto; Giorli, Elisa; Del Sette, Massimo; Imberti, Davide; Zabzuni, Dorjan; Doronin, Boris; Volodina, Vera; Michel, Patrik; Vanacker, Peter; Barlinn, Kristian; Pallesen, Lars-Peder; Kepplinger, Jessica; Bodechtel, Ulf; Gerber, Johannes; Deleu, Dirk; Melikyan, Gayane; Ibrahim, Faisal; Akhtar, Naveed; Gourbali, Vanessa; Yaghi, Shadi

    2017-03-01

    This study was designed to derive and validate a score to predict early ischemic events and major bleedings after an acute ischemic stroke in patients with atrial fibrillation. The derivation cohort consisted of 854 patients with acute ischemic stroke and atrial fibrillation included in prospective series between January 2012 and March 2014. Older age (hazard ratio, 1.06 for each additional year; 95% confidence interval, 1.00-1.11) and severe atrial enlargement (hazard ratio, 2.05; 95% confidence interval, 1.08-2.87) were predictors of ischemic outcome events (stroke, transient ischemic attack, and systemic embolism) at 90 days from acute stroke. Small lesions (≤1.5 cm) were inversely correlated with both major bleeding (hazard ratio, 0.39; P=0.03) and ischemic outcome events (hazard ratio, 0.55; 95% confidence interval, 0.30-1.00). We assigned 2 points to age ≥80 years and 1 point to age 70 to 79 years; 1 point to an ischemic index lesion >1.5 cm; and 1 point to severe atrial enlargement (ALESSA score). A logistic regression with the receiver-operating characteristic graph procedure (C statistic) showed an area under the curve of 0.697 (0.632-0.763; P=0.0001) for ischemic outcome events and 0.585 (0.493-0.678; P=0.10) for major bleedings. The validation cohort consisted of 994 patients included in prospective series between April 2014 and June 2016. Logistic regression with the receiver-operating characteristic graph procedure showed an area under the curve of 0.646 (0.529-0.763; P=0.009) for ischemic outcome events and 0.407 (0.275-0.540; P=0.14) for hemorrhagic outcome events. In acute stroke patients with atrial fibrillation, high ALESSA scores were associated with a high risk of ischemic events but not of major bleedings. © 2017 American Heart Association, Inc.
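    The point assignments described above can be written as a small helper; a minimal sketch (the function and argument names are ours, not from the study):

```python
def alessa_score(age, lesion_gt_15mm, severe_atrial_enlargement):
    """ALESSA score: 2 points for age >= 80, 1 point for age 70-79;
    1 point for an ischemic index lesion > 1.5 cm;
    1 point for severe atrial enlargement."""
    score = 0
    if age >= 80:
        score += 2
    elif age >= 70:
        score += 1
    if lesion_gt_15mm:
        score += 1
    if severe_atrial_enlargement:
        score += 1
    return score

print(alessa_score(82, True, False))  # 3
```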

  9. Comparison of computation time and image quality between full-parallax 4G-pixels CGHs calculated by the point cloud and polygon-based method

    NASA Astrophysics Data System (ADS)

    Nakatsuji, Noriaki; Matsushima, Kyoji

    2017-03-01

    Full-parallax high-definition CGHs composed of more than a billion pixels have so far been created only by the polygon-based method because of its high performance. However, GPUs recently allow us to generate CGHs much faster from point clouds. In this paper, we measure the computation time of object fields for full-parallax high-definition CGHs, which are composed of 4 billion pixels and reconstruct the same scene, using the point-cloud method on a GPU and the polygon-based method on a CPU. In addition, we compare the optical and simulated reconstructions of the CGHs created by these techniques to verify the image quality.

  10. Predicting the occurrence of embolic events: an analysis of 1456 episodes of infective endocarditis from the Italian Study on Endocarditis (SEI)

    PubMed Central

    2014-01-01

    Background Embolic events are a major cause of morbidity and mortality in patients with infective endocarditis. We analyzed the database of the prospective cohort study SEI in order to identify factors associated with the occurrence of embolic events and to develop a scoring system for the assessment of the risk of embolism. Methods We retrospectively analyzed 1456 episodes of infective endocarditis from the multicenter study SEI. Predictors of embolism were identified. Risk factors identified in multivariate analysis as predictive of embolism in left-sided endocarditis were used for the development of a risk score: 1 point was assigned to each risk factor (total risk score range: minimum 0 points; maximum 2 points). Three categories were defined by the score: low (0 points), intermediate (1 point), or high risk (2 points); the probability of embolic events per risk category was calculated for each day on treatment (day 0 through day 30). Results There were 499 episodes of infective endocarditis (34%) that were complicated by ≥ 1 embolic event. Most embolic events occurred early in the clinical course (first week of therapy: 15.5 episodes per 1000 patient-days; second week: 3.7 episodes per 1000 patient-days). In the total cohort, the factors associated with the occurrence of embolism in multivariate analysis were prosthetic valve localization (odds ratio, 1.84), right-sided endocarditis (odds ratio, 3.93), Staphylococcus aureus etiology (odds ratio, 2.23), and vegetation size ≥ 13 mm (odds ratio, 1.86). In left-sided endocarditis, Staphylococcus aureus etiology (odds ratio, 2.1) and vegetation size ≥ 13 mm (odds ratio, 2.1) were independently associated with embolic events; the 30-day cumulative incidence of embolism varied with risk score category (low risk, 12%; intermediate risk, 25%; high risk, 38%; p < 0.001). Conclusions Staphylococcus aureus etiology and vegetation size are associated with an increased risk of embolism. 
In left-sided endocarditis, a simple scoring system, which combines etiology and vegetation size with time on antimicrobials, might contribute to a better assessment of the risk of embolism, and to a more individualized analysis of indications and contraindications for early surgery. PMID:24779617
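The left-sided risk score described above can be expressed as a small helper; a sketch under the stated point assignments (names are ours, not from the study):

```python
def embolism_risk(staph_aureus, vegetation_ge_13mm):
    """Left-sided endocarditis embolism score (SEI study): 1 point for
    Staphylococcus aureus etiology, 1 point for vegetation >= 13 mm.
    Returns the score and its risk category."""
    score = int(staph_aureus) + int(vegetation_ge_13mm)
    category = {0: "low", 1: "intermediate", 2: "high"}[score]
    return score, category

print(embolism_risk(True, True))  # (2, 'high')
```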

  11. A modeling framework for investment planning in interdependent infrastructures in multi-hazard environments.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nathanael J. K.; Gearhart, Jared Lee; Jones, Dean A.

    Currently, much of protection planning is conducted separately for each infrastructure and hazard. Limited funding requires a balance of expenditures between terrorism and natural hazards based on potential impacts. This report documents the results of a Laboratory Directed Research & Development (LDRD) project that created a modeling framework for investment planning in interdependent infrastructures focused on multiple hazards, including terrorism. To develop this framework, three modeling elements were integrated: natural hazards, terrorism, and interdependent infrastructures. For natural hazards, a methodology was created for specifying events consistent with regional hazards. For terrorism, we modeled the terrorists' actions based on assumptions regarding their knowledge, goals, and target identification strategy. For infrastructures, we focused on predicting post-event performance due to specific terrorist attacks and natural hazard events, tempered by appropriate infrastructure investments. We demonstrate the utility of this framework with various examples, including the protection of electric power, roadway, and hospital networks.

  12. MAGNETIC NULL POINTS IN KINETIC SIMULATIONS OF SPACE PLASMAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olshevsky, Vyacheslav; Innocenti, Maria Elena; Cazzola, Emanuele

    2016-03-01

    We present a systematic attempt to study magnetic null points and the associated magnetic energy conversion in kinetic particle-in-cell simulations of various plasma configurations. We address three-dimensional simulations performed with the semi-implicit kinetic electromagnetic code iPic3D in different setups: variations of a Harris current sheet, dipolar and quadrupolar magnetospheres interacting with the solar wind, and a relaxing turbulent configuration with multiple null points. Spiral nulls are more likely to be created in space plasmas: in all our simulations except the lunar magnetic anomaly (LMA) and quadrupolar mini-magnetosphere setups, the number of spiral nulls prevails over the number of radial nulls by a factor of 3–9. We show that magnetic nulls often do not indicate the regions of intensive energy dissipation. Energy dissipation events caused by topological bifurcations at radial nulls are rather rare and short-lived. The so-called X-lines formed by the radial nulls in the Harris current sheet and LMA simulations are rather stable and do not exhibit any energy dissipation. Energy dissipation is more powerful in the vicinity of spiral nulls enclosed by magnetic flux ropes with strong currents at their axes (their cross sections resemble 2D magnetic islands). These null lines, reminiscent of Z-pinches, efficiently dissipate magnetic energy due to secondary instabilities such as the two-stream or kinking instability, accompanied by changes in magnetic topology. Current enhancements accompanied by spiral nulls may signal magnetic energy conversion sites in the observational data.

  13. Effect of starting point formation on the crystallization of amorphous silicon films by flash lamp annealing

    NASA Astrophysics Data System (ADS)

    Sato, Daiki; Ohdaira, Keisuke

    2018-04-01

    We succeed in the crystallization of hydrogenated amorphous silicon (a-Si:H) films by flash lamp annealing (FLA) at a low fluence by intentionally creating starting points that trigger explosive crystallization (EC). We confirm that a locally thickened a-Si region can induce the crystallization of a-Si films. A periodic wavy structure is observed on the surface of the polycrystalline silicon (poly-Si) on and near the thick parts, which is a clear indication of the emergence of EC. Creating locally thickened a-Si parts can thus be effective for controlling the starting point of crystallization by FLA and can realize the crystallization of a-Si with high reproducibility. We also compare the effects of creating thick parts at the center and along the edge of the substrates; a thick part along the edge of the substrates leads to the initiation of crystallization at a lower fluence.

  14. Creating a mobile subject guide to improve access to point-of-care resources for medical students: a case study

    PubMed Central

    Boruff, Jill T; Bilodeau, Edward

    2012-01-01

    Question: Can a mobile optimized subject guide facilitate medical student access to mobile point-of-care tools? Setting: The guide was created at a library at a research-intensive university with six teaching hospital sites. Objectives: The team created a guide facilitating medical student access to point-of-care tools directly on mobile devices to provide information allowing them to access and set up resources with little assistance. Methods: Two librarians designed a mobile optimized subject guide for medicine and conducted a survey to test its usefulness. Results: Web analytics and survey results demonstrate that the guide is used and the students are satisfied. Conclusion: The library will continue to use the subject guide as its primary means of supporting mobile devices. It remains to be seen if the mobile guide facilitates access for those who do not need assistance and want direct access to the resources. Internet access in the hospitals remains an issue. PMID:22272160

  15. Creating a mobile subject guide to improve access to point-of-care resources for medical students: a case study.

    PubMed

    Boruff, Jill T; Bilodeau, Edward

    2012-01-01

    Can a mobile optimized subject guide facilitate medical student access to mobile point-of-care tools? The guide was created at a library at a research-intensive university with six teaching hospital sites. The team created a guide facilitating medical student access to point-of-care tools directly on mobile devices to provide information allowing them to access and set up resources with little assistance. Two librarians designed a mobile optimized subject guide for medicine and conducted a survey to test its usefulness. Web analytics and survey results demonstrate that the guide is used and the students are satisfied. The library will continue to use the subject guide as its primary means of supporting mobile devices. It remains to be seen if the mobile guide facilitates access for those who do not need assistance and want direct access to the resources. Internet access in the hospitals remains an issue.

  16. Creating stimuli for the study of biological-motion perception.

    PubMed

    Dekeyser, Mathias; Verfaillie, Karl; Vanrie, Jan

    2002-08-01

    In the perception of biological motion, the stimulus information is confined to a small number of lights attached to the major joints of a moving person. Despite this drastic degradation of the stimulus information, the human visual apparatus organizes the swarm of moving dots into a vivid percept of a moving biological creature. Several techniques have been proposed to create point-light stimuli: placing dots at strategic locations on photographs or films, video recording a person with markers attached to the body, computer animation based on artificial synthesis, and computer animation based on motion-capture data. A description is given of the technique we are currently using in our laboratory to produce animated point-light figures. The technique is based on a combination of motion capture and three-dimensional animation software (Character Studio, Autodesk, Inc., 1998). Some of the advantages of our approach are that the same actions can be shown from any viewpoint, that point-light versions, as well as versions with a full-fleshed character, can be created of the same actions, and that point lights can indicate the center of a joint (thereby eliminating several disadvantages associated with other techniques).

  17. Remembering from any angle: The flexibility of visual perspective during retrieval

    PubMed Central

    Rice, Heather J.; Rubin, David C.

    2010-01-01

    When recalling autobiographical memories, individuals often experience visual images associated with the event. These images can be constructed from two different perspectives: first person, in which the event is visualized from the viewpoint experienced at encoding, or third person, in which the event is visualized from an external vantage point. Using a novel technique to measure visual perspective, we examined where the external vantage point is situated in third-person images. Individuals in two studies were asked to recall either 10 or 15 events from their lives and describe the perspectives they experienced. Wide variation in spatial locations was observed within third-person perspectives, with the location of these perspectives depending on the event being recalled. Results suggest remembering from an external viewpoint may be more common than previous studies have demonstrated. PMID:21109466

  18. Understanding Learning: Assessment in the Turning Points School

    ERIC Educational Resources Information Center

    Center for Collaborative Education, 2005

    2005-01-01

    Turning Points helps middle schools create challenging, caring, and equitable learning communities that meet the needs of young adolescents as they reach the "turning point" between childhood and adulthood. Based on more than a decade of research and experience, this comprehensive school reform model focuses on improving student learning through…

  19. How Do Academic Disciplines Use PowerPoint?

    ERIC Educational Resources Information Center

    Garrett, Nathan

    2016-01-01

    How do academic disciplines use PowerPoint? This project analyzed PowerPoint files created by an academic publisher to supplement textbooks. An automated analysis of 30,263 files revealed clear differences by disciplines. Single-paradigm "hard" disciplines used less complex writing but had more words than multi-paradigm "soft"…

  20. A New Approach to Create Image Control Networks in ISIS

    NASA Astrophysics Data System (ADS)

    Becker, K. J.; Berry, K. L.; Mapel, J. A.; Walldren, J. C.

    2017-06-01

    A new approach was used to create a feature-based control point network that required the development of new tools in the Integrated Software for Imagers and Spectrometers (ISIS3) system to process very large datasets.

  1. Arrhythmia Associated with Buprenorphine and Methadone Reported to the Food and Drug Administration

    PubMed Central

    Kao, David P; Haigney, Mark CP; Mehler, Philip S; Krantz, Mori J

    2015-01-01

    Aim: To assess the relative frequency of reporting of adverse events involving ventricular arrhythmia, cardiac arrest, QTc prolongation, or torsade de pointes to the US Food and Drug Administration (FDA) between buprenorphine and methadone. Design: Retrospective pharmacoepidemiologic study. Setting: Adverse drug events spontaneously reported to the FDA between 1969 and June 2011, originating in 196 countries (71% of events from the US). Cases: Adverse event cases mentioning methadone (n=14,915) or buprenorphine (n=7,283) were evaluated against all other adverse event cases (n=4,796,141). Measurements: The primary outcome was the composite of ventricular arrhythmia or cardiac arrest. The secondary outcome was the composite of QTc prolongation or torsade de pointes. The proportional reporting ratio (PRR) was used to identify disproportionate reporting, defined as PRR>2, χ2>4, with ≥3 cases. Findings: There were 132 (1.8%) ventricular arrhythmia/cardiac arrest and 19 (0.3%) QTc prolongation/torsade de pointes cases associated with buprenorphine, compared with 1729 (11.6%) ventricular arrhythmia/cardiac arrest and 390 (2.6%) QTc prolongation/torsade de pointes cases involving methadone. PRRs associated with buprenorphine were not significant for ventricular arrhythmia/cardiac arrest (1.1, 95% confidence interval (CI) 0.9–1.3, χ2=1.2) or QTc prolongation/torsade de pointes (1.0, 95% CI 0.7–1.9, χ2=0.0006), but were for methadone (7.2, 95% CI 6.9–7.5, χ2=9160; 10.6, 95% CI 9.7–11.8, χ2=3305, respectively). Conclusion: In spontaneously reported adverse events, methadone is associated with disproportionate reporting of cardiac arrhythmias, whereas buprenorphine is not. Although these findings probably reflect clinically relevant differences, a causal connection cannot be presumed, and disproportionality analysis cannot quantify absolute risk per treatment episode. Population-based studies to definitively quantify differential incidence rates are warranted. PMID:26075588
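    The disproportionality criterion used in this record (PRR > 2, χ2 > 4, at least 3 cases) can be sketched from a standard 2x2 contingency table. The counts below are illustrative placeholders, not the study's data.

    ```python
    def prr(a, b, c, d):
        """Proportional reporting ratio from a 2x2 table:
        a = target-drug reports with the event
        b = target-drug reports without the event
        c = other-drug reports with the event
        d = other-drug reports without the event
        """
        return (a / (a + b)) / (c / (c + d))

    def chi2(a, b, c, d):
        """Pearson chi-square statistic (1 df, no continuity correction)."""
        n = a + b + c + d
        return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

    def signal(a, b, c, d):
        """Flag disproportionate reporting: PRR > 2, chi2 > 4, >= 3 cases."""
        return a >= 3 and prr(a, b, c, d) > 2 and chi2(a, b, c, d) > 4

    # Illustrative counts only (not from the FDA dataset)
    print(prr(30, 970, 500, 98500))     # ≈ 5.94
    print(signal(30, 970, 500, 98500))  # True
    ```

    Note that, as the record's conclusion stresses, such a signal indicates disproportionate reporting only; it does not quantify absolute risk per treatment episode.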

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jason L. Wright; Milos Manic

    Time synchronization and event time correlation are important in wireless sensor networks. In particular, time is used to create a sequence of events, or timeline, to answer questions of cause and effect. Time is also used as a basis for determining the freshness of received packets and the validity of cryptographic certificates. This paper presents a secure method of time synchronization and event time correlation for TESLA-based hierarchical wireless sensor networks. The method demonstrates that events in a TESLA network can be accurately timestamped by adding only a few pieces of data to the existing protocol.

  3. The application of vector concepts on two skew lines

    NASA Astrophysics Data System (ADS)

    Alghadari, F.; Turmudi; Herman, T.

    2018-01-01

    The purpose of this study is to show how vector concepts can be applied to two skew lines in three-dimensional (3D) coordinates, and how this application can be used. Many mathematical concepts are functionally related, but the relation between vectors and 3D geometry has not been applied in classroom learning. Moreover, studies show that female students have more difficulty learning 3D geometry than male students, a difference attributed to personal spatial intelligence. Making vector concepts relevant can help balance the learning achievement and mathematical ability of male and female students. Distances on a cube, cuboid, or pyramid can be drawn in the rectangular coordinates of points in space, and a vector can be created from two coordinate points on the lines. Two skew lines have a shortest distance and an angle between them. To calculate the shortest distance, first create two vectors representing the lines using the position-vector concept; next, determine a normal vector of these two vectors by the cross product; then create a vector from a pair of points, one on each skew line. The shortest distance is the scalar orthogonal projection of this pair-point vector onto the normal vector. To calculate the angle, take the dot product of the two vectors representing the lines and apply the inverse cosine. Applications include mathematics learning and the orthographic projection method.
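    The vector procedure this record describes (cross product for the common normal, scalar projection for the distance, dot product and inverse cosine for the angle) can be sketched numerically. The two lines in the example are a hypothetical pair of skew cube edges, not data from the study.

    ```python
    import math

    def skew_line_distance_and_angle(p1, d1, p2, d2):
        """Shortest distance and angle (degrees) between two skew lines,
        each given as a point p and a direction vector d, following the
        vector method: n = d1 x d2; distance = |(p2 - p1) . n| / |n|;
        angle from cos(theta) = |d1 . d2| / (|d1||d2|)."""
        def cross(a, b):
            return (a[1]*b[2] - a[2]*b[1],
                    a[2]*b[0] - a[0]*b[2],
                    a[0]*b[1] - a[1]*b[0])
        def dot(a, b):
            return sum(x*y for x, y in zip(a, b))
        def norm(a):
            return math.sqrt(dot(a, a))
        n = cross(d1, d2)                          # common normal of the lines
        w = tuple(q - p for p, q in zip(p1, p2))   # vector joining the lines
        distance = abs(dot(w, n)) / norm(n)        # projection onto the normal
        angle = math.acos(abs(dot(d1, d2)) / (norm(d1) * norm(d2)))
        return distance, math.degrees(angle)

    # Hypothetical example: two skew edges of a unit cube
    d, theta = skew_line_distance_and_angle((0, 0, 0), (1, 0, 0),
                                            (0, 1, 1), (0, 1, 0))
    print(d, theta)  # 1.0 90.0
    ```

    The two cube edges are perpendicular and one unit apart, matching the kind of solid-geometry distance problems the abstract mentions.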

  4. Life-span retrieval of public events: Reminiscence bump for high-impact events, recency for others.

    PubMed

    Tekcan, Ali I; Boduroglu, Aysecan; Mutlutürk, Aysu; Aktan Erciyes, Aslı

    2017-10-01

    Although substantial evidence exists showing a reliable reminiscence bump for personal events, data regarding retrieval distributions for public events have been equivocal. The primary aim of the present study was to address life-span retrieval distributions of different types of public events in comparison to personal events, and to test whether the existing accounts of the bump can explain the distribution of public events. We asked a large national sample to report the most important, happiest, and saddest personal events and the most important, happiest, saddest, most proud, most fearful, and most shameful public events. We found a robust bump corresponding to the third decade of life for the happiest and the most important positive but not for the saddest and most important negative personal events. For the most important public events, a bump emerged only for the two most frequently mentioned events. Distributions of public events cued with emotions were marked by recency. These results point to potential differences in retrieval of important personal and public events. While the life-script framework well accounts for the findings regarding important personal events, a chronologically retroactive search seem to guide retrieval of public events. Reminiscence bump observed for the two public events suggest that age-at-event affects recall of public events to the degree that the events are high-impact ones that dominate nation's collective memory. Results provide further evidence that the bump is not unitary and points to importance of event type and memory elicitation method with regard to competing explanations of the phenomenon.

  5. Criteria for use of composite end points for competing risks-a systematic survey of the literature with recommendations.

    PubMed

    Manja, Veena; AlBashir, Siwar; Guyatt, Gordon

    2017-02-01

    Composite end points are frequently used in reports of clinical trials. One rationale for the use of composite end points is to account for competing risks. In the presence of competing risks, the event rate of a specific event depends on the rates of other competing events. One proposed solution is to include all important competing events in one composite end point. Clinical trialists require guidance regarding when this approach is appropriate. Our objectives were to identify publications describing criteria for use of composite end points for competing risks and to offer guidance regarding when a composite end point is appropriate on the basis of competing risks. We searched MEDLINE, CINAHL, EMBASE, the Cochrane Central and Systematic Review databases (including the Health Technology Assessment database), and the Cochrane Methodology Register from inception to April 2015, as well as candidate textbooks, to identify all articles providing guidance on this issue. Eligible publications explicitly addressed the use of a composite outcome to address competing risks. Two reviewers independently screened the titles and abstracts for full-text review, independently reviewed full-text publications, and abstracted the specific criteria authors offered for use of composite end points to address competing risks. Of 63,645 titles and abstracts, 166 proved potentially relevant, of which 43 publications were included in the final review. Most publications note competing risks as a reason for using composite end points without further elaboration. None of the articles or textbook chapters provide specific criteria for use of composite end points for competing risks. Some advocate using composite end points to avoid bias due to competing risks, and others suggest that composite end points seldom or never be used for this purpose. 
We recommend using composite end points for competing risks only if the competing risk is plausible and if it occurs with sufficiently high frequency to influence the interpretation of the effect of intervention on the end point of interest. These criteria will seldom be met. Review of heart failure trials published in the New England Journal of Medicine revealed that many of them use the composite end point of death or hospitalization; none of the trials, however, satisfied our criteria. The existing literature fails to provide clear guidance regarding use of composite end point for competing risks. We recommend using composite end points for competing risks only if the competing risk is plausible and if it occurs sufficiently often. Published by Elsevier Inc.

  6. IA and PA network-based computation of coordinating combat behaviors in the military MAS

    NASA Astrophysics Data System (ADS)

    Xia, Zuxun; Fang, Huijia

    2004-09-01

    In a military multi-agent system, every agent needs to analyze the dependency and temporal relations among tasks or combat behaviors in order to work out its plans and obtain correct behavior sequences; this guarantees good coordination, avoids unexpected damage, and guards against squandering the chance of winning a battle through incorrect scheduling and conflicts. In this paper, an IA and PA network-based computation of coordinating combat behaviors is put forward, with particular emphasis on using a 5x5 matrix to represent and compute temporal binary relations (between two interval-events, between two point-events, or between one interval-event and one point-event); this matrix method makes the coordination computation more convenient than before.

  7. The 1980 solar maximum mission event listing

    NASA Technical Reports Server (NTRS)

    Speich, D. M.; Nelson, J. J.; Licata, J. P.; Tolbert, A. K.

    1991-01-01

    Information is contained on solar burst and transient activity observed by the Solar Maximum Mission (SMM) during 1980 pointed observations. Data from the following SMM experiments are included: (1) Gamma Ray Spectrometer, (2) Hard X-Ray Burst Spectrometer, (3) Hard X-Ray Imaging Spectrometer, (4) Flat Crystal Spectrometer, (5) Bent Crystal Spectrometer, (6) Ultraviolet Spectrometer and Polarimeter, and (7) Coronagraph/Polarimeter. Correlative optical, radio, and Geostationary Operational Environmental Satellite (GOES) x ray data are also presented. Where possible, bursts or transients observed in the various wavelengths were grouped into discrete flare events identified by unique event numbers. Each event carries a qualifier denoting the quality or completeness of the observations. Spacecraft pointing coordinates and flare site angular displacement values from Sun center are also included.

  8. Korean Affairs Report.

    DTIC Science & Technology

    1986-07-25

    scientists , technicians, and workers to emulate the Nagwon working class in rejecting conservatism and technical mysticism and work toward creating new...25 Twelve Students Charged With Rallies Released (THE KOREA TIMES, 21 Jun 86) 26 Seven SNU Activists Create Disturbances in Courtroom (THE...Tokyo. The speakers at the symposium pointed to the tense situation created on the Korean peninsula, the recent anti-U.S. struggle for independence in

  9. The ocean-atmosphere response to wind-induced thermocline changes in the tropical South Western Indian Ocean

    NASA Astrophysics Data System (ADS)

    Manola, Iris; Selten, F. M.; de Ruijter, W. P. M.; Hazeleger, W.

    2015-08-01

    In the Indian Ocean basin, the sea surface temperatures (SSTs) are most sensitive to changes in the oceanic depth of the thermocline in the region of the Seychelles Dome. Observational studies have suggested that the strong SST variations in this region influence the atmospheric evolution around the basin, while their impact could extend far into the Pacific and the extra-tropics. Here we study the adjustments of the coupled atmosphere-ocean system to a winter shallow doming event using dedicated ensemble simulations with the state-of-the-art EC-Earth climate model. The doming creates an equatorial Kelvin wave and a pair of westward moving Rossby waves, leading to higher SST 1-2 months later in the Western equatorial Indian Ocean. Atmospheric convection is strengthened and the Walker circulation responds with reduced convection over Indonesia and cooling of the SST in that region. The Pacific warm pool convection shifts eastward and an oceanic Kelvin wave is triggered at thermocline depth. The wave leads to an SST warming in the East Equatorial Pacific 5-6 months after the initiation of the Seychelles Dome event. The atmosphere responds to this warming with weak anomalous atmospheric convection. The changes in the upper tropospheric divergence in this sequence of events create large-scale Rossby waves that propagate away from the tropics along the atmospheric waveguides. We suggest repeating these types of experiments with other models to test the robustness of the results. We also suggest creating the doming event in June so that the East-Pacific warming occurs in November, when the atmosphere is most sensitive to SST anomalies and El Niño could possibly be triggered by the doming event under suitable conditions.

  10. Using integrated modeling for generating watershed-scale dynamic flood maps for Hurricane Harvey

    NASA Astrophysics Data System (ADS)

    Saksena, S.; Dey, S.; Merwade, V.; Singhofen, P. J.

    2017-12-01

    Hurricane Harvey, which was categorized as a 1000-year return period event, produced unprecedented rainfall and flooding in Houston. Although the expected rainfall was forecasted much before the event, there was no way to identify which regions were at higher risk of flooding, the magnitude of flooding, and when the impacts of rainfall would be highest. The inability to predict the location, duration, and depth of flooding created uncertainty over evacuation planning and preparation. This catastrophic event highlighted that the conventional approach to managing flood risk using 100-year static flood inundation maps is inadequate because of its inability to predict flood duration and extents for 500-year or 1000-year return period events in real-time. The purpose of this study is to create models that can dynamically predict the impacts of rainfall and subsequent flooding, so that necessary evacuation and rescue efforts can be planned in advance. This study uses a 2D integrated surface water-groundwater model called ICPR (Interconnected Channel and Pond Routing) to simulate both the hydrology and hydrodynamics for Hurricane Harvey. The methodology involves using the NHD stream network to create a 2D model that incorporates rainfall, land use, vadose zone properties and topography to estimate streamflow and generate dynamic flood depths and extents. The results show that dynamic flood mapping captures the flood hydrodynamics more accurately and is able to predict the magnitude, extent and time of occurrence for extreme events such as Hurricane Harvey. Therefore, integrated modeling has the potential to identify regions that are more susceptible to flooding, which is especially useful for large-scale planning and allocation of resources for protection against future flood risk.

  11. Asymmetric Fireballs in Symmetric Collisions

    DOE PAGES

    Bialas, A.; Bzdak, A.; Zalewski, K.

    2013-01-01

    Here, this contribution reports on the results obtained in two recently published papers demonstrating that data of the STAR Collaboration show a substantial asymmetric component in the rapidity distribution of the system created in central Au-Au collisions, implying that boost invariance is violated on an event-by-event basis, even at mid c.m. rapidity.

  12. How Incidental Sequence Learning Creates Reportable Knowledge: The Role of Unexpected Events

    ERIC Educational Resources Information Center

    Runger, Dennis; Frensch, Peter A.

    2008-01-01

    Research on incidental sequence learning typically is concerned with the characteristics of implicit or nonconscious learning. In this article, the authors aim to elucidate the cognitive mechanisms that contribute to the generation of explicit, reportable sequence knowledge. According to the unexpected-event hypothesis (P. A. Frensch, H. Haider,…

  13. Creating Reality: How TV News Distorts Events.

    ERIC Educational Resources Information Center

    Altheide, David L.

    A three-year research project, including more than one year in a network affiliate station, provided the material for an analysis of current practices in television news programming. Based on the thesis that the organization of news encourages the oversimplification of events, this analysis traces the foundation of the bias called the "news…

  14. Clarifying muddy water: probing the linkages to municipal water quality.

    Treesearch

    Sally Duncan

    2003-01-01

    In the Pacific Northwest, several recent and dramatic "muddy waters" events have created major problems for water utilities. Resulting from floods and measures to retrofit dams to reduce impacts on temperature, these events also have focused public and scientific attention on interactions among dams, forest-land use, and municipal water supplies. Far from...

  15. Single Event Effects (SEE) for Power Metal-Oxide-Semiconductor Field-Effect Transistors (MOSFETs)

    NASA Technical Reports Server (NTRS)

    Lauenstein, Jean-Marie

    2011-01-01

    Single-event gate rupture (SEGR) continues to be a key failure mode in power MOSFETs. SEGR is complex, making rate prediction difficult. The SEGR mechanism has two main components: (1) oxide damage, which reduces the field required for rupture, and (2) epilayer response, which creates a transient high field across the oxide.

  16. Using SDO Data in the Classroom to Do Real Science -- A Community College Laboratory Investigation

    NASA Astrophysics Data System (ADS)

    Dave, T. A.; Hildreth, S.; Lee, S.; Scherrer, D. K.

    2013-12-01

    The incredible accessibility of extremely high spatial and temporal resolution data from the Solar Dynamics Observatory creates an opportunity for students to do almost real-time investigation in an Astronomy Lab. We are developing a short series of laboratory exercises using SDO data, targeted for Community College students in an introductory lab class, extendable to high school and university students. The labs initially lead students to explore what SDO can do, online, through existing SDO video clips taken on specific dates. Students then investigate solar events using the Heliophysics Events Knowledgebase (HEK), and make their own online movies of events, to discuss and share with classmates. Finally, students can investigate specific events and areas, selecting specific dates, locations, wavelength regions, and time cadences to create and gather their own SDO datasets for more detailed investigation. In exploring the Sun using actual data, students actually do real science. We are in the process of beta testing the sequence of labs, and are seeking interested community college, university, and high school astronomy lab teachers who might consider trying the labs themselves.

  17. Acupuncture for treating fibromyalgia

    PubMed Central

    Deare, John C; Zheng, Zhen; Xue, Charlie CL; Liu, Jian Ping; Shang, Jingsheng; Scott, Sean W; Littlejohn, Geoff

    2014-01-01

    Background: One in five fibromyalgia sufferers use acupuncture treatment within two years of diagnosis. Objectives: To examine the benefits and safety of acupuncture treatment for fibromyalgia. Search methods: We searched CENTRAL, PubMed, EMBASE, CINAHL, National Research Register, HSR Project and Current Contents, as well as the Chinese databases VIP and Wangfang to January 2012 with no language restrictions. Selection criteria: Randomised and quasi-randomised studies evaluating any type of invasive acupuncture for fibromyalgia diagnosed according to the American College of Rheumatology (ACR) criteria, and reporting any main outcome: pain, physical function, fatigue, sleep, total well-being, stiffness and adverse events. Data collection and analysis: Two author pairs selected trials, extracted data and assessed risk of bias. Treatment effects were reported as standardised mean differences (SMD) and 95% confidence intervals (CI) for continuous outcomes using different measurement tools (pain, physical function, fatigue, sleep, total well-being and stiffness) and risk ratio (RR) and 95% CI for dichotomous outcomes (adverse events). We pooled data using the random-effects model. Main results: Nine trials (395 participants) were included. All studies except one were at low risk of selection bias; five were at risk of selective reporting bias (favouring either treatment group); two were subject to attrition bias (favouring acupuncture); three were subject to performance bias (favouring acupuncture) and one to detection bias (favouring acupuncture). Three studies utilised electro-acupuncture (EA) with the remainder using manual acupuncture (MA) without electrical stimulation. All studies used 'formula acupuncture' except for one, which used trigger points. Low quality evidence from one study (13 participants) showed EA improved symptoms with no adverse events at one month following treatment. 
    Mean pain in the non-treatment control group was 70 points on a 100 point scale; EA reduced pain by a mean of 22 points (95% confidence interval (CI) 4 to 41), or 22% absolute improvement. Control group global well-being was 66.5 points on a 100 point scale; EA improved well-being by a mean of 15 points (95% CI 5 to 26 points). Control group stiffness was 4.8 points on a 0 to 10 point scale; EA reduced stiffness by a mean of 0.9 points (95% CI 0.1 to 2 points; absolute reduction 9%, 95% CI 4% to 16%). Fatigue was 4.5 points (10 point scale) without treatment; EA reduced fatigue by a mean of 1 point (95% CI 0.22 to 2 points), absolute reduction 11% (2% to 20%). There was no difference in sleep quality (MD 0.4 points, 95% CI −1 to 0.21 points, 10 point scale), and physical function was not reported. Moderate quality evidence from six studies (286 participants) indicated that acupuncture (EA or MA) was no better than sham acupuncture, except for less stiffness at one month. Subgroup analysis of two studies (104 participants) indicated benefits of EA. Mean pain was 70 points on a 0 to 100 point scale with sham treatment; EA reduced pain by 13% (5% to 22%); (SMD −0.63, 95% CI −1.02 to −0.23). Global well-being was 5.2 points on a 10 point scale with sham treatment; EA improved well-being: SMD 0.65, 95% CI 0.26 to 1.05; absolute improvement 11% (4% to 17%). EA improved sleep, from 3 points on a 0 to 10 point scale in the sham group: SMD 0.40 (95% CI 0.01 to 0.79); absolute improvement 8% (0.2% to 16%). Low-quality evidence from one study suggested that the MA group resulted in poorer physical function: mean function in the sham group was 28 points (100 point scale); treatment worsened function by a mean of 6 points (95% CI −10.9 to −0.7). Low-quality evidence from three trials (289 participants) suggested no difference in adverse events between real (9%) and sham acupuncture (35%); RR 0.44 (95% CI 0.12 to 1.63). 
    Moderate quality evidence from one study (58 participants) found that compared with standard therapy alone (antidepressants and exercise), adjunct acupuncture therapy reduced pain at one month after treatment: mean pain was 8 points on a 0 to 10 point scale in the standard therapy group; treatment reduced pain by 3 points (95% CI −3.9 to −2.1), an absolute reduction of 30% (21% to 39%). Two people treated with acupuncture reported adverse events; there were none in the control group (RR 3.57; 95% CI 0.18 to 71.21). Global well-being, sleep, fatigue and stiffness were not reported. Physical function data were not usable. Low quality evidence from one study (38 participants) showed a short-term benefit of acupuncture over antidepressants in pain relief: mean pain was 29 points (0 to 100 point scale) in the antidepressant group; acupuncture reduced pain by 17 points (95% CI −24.1 to −10.5). Other outcomes or adverse events were not reported. Moderate-quality evidence from one study (41 participants) indicated that deep needling with or without deqi did not differ in pain, fatigue, function or adverse events. Other outcomes were not reported. Four studies reported no differences between acupuncture and control or other treatments described at six to seven months follow-up. No serious adverse events were reported, but there were insufficient adverse events to be certain of the risks. Authors' conclusions: There is low to moderate-level evidence that compared with no treatment and standard therapy, acupuncture improves pain and stiffness in people with fibromyalgia. There is moderate-level evidence that the effect of acupuncture does not differ from sham acupuncture in reducing pain or fatigue, or improving sleep or global well-being. EA is probably better than MA for pain and stiffness reduction and improvement of global well-being, sleep and fatigue. The effect lasts up to one month, but is not maintained at six months follow-up. 
    MA probably does not improve pain or physical functioning. Acupuncture appears safe. People with fibromyalgia may consider using EA alone or with exercise and medication. The small sample size, scarcity of studies for each comparison, and lack of an ideal sham acupuncture weaken the level of evidence and its clinical implications. Larger studies are warranted. PMID:23728665

  18. High resolution hybrid optical and acoustic sea floor maps (Invited)

    NASA Astrophysics Data System (ADS)

    Roman, C.; Inglis, G.

    2013-12-01

    This abstract presents a method for creating hybrid optical and acoustic sea floor reconstructions at centimeter-scale grid resolutions with robotic vehicles. Multibeam sonar and stereo vision are two common sensing modalities with complementary strengths that are well suited for data fusion. We have recently developed an automated two-stage pipeline to create such maps. The steps can be broken down into navigation refinement and map construction. During navigation refinement, a graph-based optimization algorithm is used to align 3D point clouds created with both the multibeam sonar and the stereo cameras. The process combats the typical growth in navigation error that has a detrimental effect on map fidelity and typically introduces artifacts at small grid sizes. During this process we are able to automatically register local point clouds created by each sensor to themselves and to each other where they overlap in a survey pattern. The process also estimates the sensor offsets, such as heading, pitch, and roll, that describe how each sensor is mounted to the vehicle. The end result of the navigation step is a refined vehicle trajectory that ensures the point clouds from each sensor are consistently aligned, along with the individual sensor offsets. In the mapping step, grid cells in the map are selectively populated by choosing data points from each sensor in an automated manner. The selection process is designed to pick points that preserve the best characteristics of each sensor and honor specific map quality criteria to reduce outliers and ghosting. In general, the algorithm selects dense 3D stereo points in areas of high texture and point density. In areas where the stereo vision is poor, such as in a scene with low contrast or texture, multibeam sonar points are inserted into the map. This process is automated and results in a hybrid map populated with data from both sensors. Additional cross-modality checks are made to reject outliers in a robust manner. 
The final hybrid map retains the strengths of both sensors and shows improvement over the single modality maps and a naively assembled multi-modal map where all the data points are included and averaged. Results will be presented from marine geological and archaeological applications using a 1350 kHz BlueView multibeam sonar and 1.3 megapixel digital still cameras.
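
The per-cell selection rule described above (dense stereo points where texture and point density are high, sonar elsewhere, plus a cross-modality outlier check) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the thresholds, the per-point texture score, and the function name are all assumptions.

```python
import numpy as np

def select_cell_source(stereo_pts, sonar_pts,
                       min_stereo_count=20, min_texture=0.15):
    """Choose which sensor populates one grid cell.

    stereo_pts / sonar_pts: (N, 4) arrays of [x, y, z, texture_score]
    (texture_score is a hypothetical per-point image-contrast measure).
    Returns the depth assigned to the cell, or None if it stays empty.
    """
    has_stereo = (len(stereo_pts) >= min_stereo_count and
                  np.mean(stereo_pts[:, 3]) >= min_texture)
    if has_stereo:
        chosen = stereo_pts   # dense, high-texture area: prefer stereo
    elif len(sonar_pts) > 0:
        chosen = sonar_pts    # low contrast/texture: fall back to sonar
    else:
        return None
    # cross-modality check: reject cells where the sensors disagree badly,
    # which would otherwise produce ghosting artifacts
    if len(stereo_pts) and len(sonar_pts):
        if abs(np.median(stereo_pts[:, 2]) - np.median(sonar_pts[:, 2])) > 0.5:
            return None
    return float(np.median(chosen[:, 2]))  # median suppresses outliers
```

A real pipeline would run this per grid cell after the navigation refinement has aligned both clouds into a common frame.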

  19. Measurement and reconstruction of the leaflet geometry for a pericardial artificial heart valve.

    PubMed

    Jiang, Hongjun; Campbell, Gord; Xi, Fengfeng

    2005-03-01

    This paper describes the measurement and reconstruction of the leaflet geometry for a pericardial heart valve. Tasks involved include mapping the leaflet geometries by laser digitizing and reconstructing the 3D freeform leaflet surface based on a laser-scanned profile. The challenge is to design a prosthetic valve that maximizes the benefits offered to the recipient as compared to the normally operating naturally occurring valve. This research was prompted by the fact that artificial heart valve bioprostheses do not provide long-life durability comparable to the natural heart valve, together with the anticipated benefits associated with defining the valve geometries, especially the leaflet geometries for the bioprosthetic and human valves, in order to create a replicate valve fabricated from synthetic materials. Our method applies the concept of reverse engineering in order to reconstruct the freeform surface geometry. A Brown & Sharpe coordinate measuring machine (CMM) equipped with a HyMARC laser-digitizing system was used to measure the leaflet profiles of a Baxter Carpentier-Edwards pericardial heart valve. The computer software PolyWorks was used to pre-process the raw data obtained from the scanning, which included merging images, eliminating duplicate points, and adding interpolated points. Three methods are presented in this paper to reconstruct the freeform leaflet surface: creating a mesh model from cloud points, creating a freeform surface from cloud points, and generating a freeform surface by B-splines. The mesh model created using PolyWorks can be used for rapid prototyping and visualization. Fitting a freeform surface directly to cloud points is straightforward, but the rendering of a smooth surface is usually unpredictable. A surface fitted by a group of B-splines fitted to cloud points was found to be much smoother. This method offers the possibility of manually adjusting the surface curvature locally. 
However, the process is complex and requires additional manipulation. Finally, this paper presents a reverse engineered design for the pericardial heart valve which contains three identical leaflets with reconstructed geometry.
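
As a hedged illustration of the B-spline surface-fitting step, the sketch below fits a smoothing bicubic spline surface to synthetic scattered points with SciPy. The leaflet-like test surface, the noise level, and the smoothing factor are assumptions for demonstration only; the authors worked from CMM laser scans in PolyWorks, not this code.

```python
import numpy as np
from scipy.interpolate import bisplrep, bisplev

# Synthetic "cloud points" standing in for scanned leaflet data (hypothetical).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 400)
y = rng.uniform(-1, 1, 400)
z = 0.3 * (x**2 + y**2) + rng.normal(0, 0.005, 400)  # shallow bowl + scan noise

# Fit a smoothing bicubic B-spline surface; the smoothing factor s trades
# fidelity for smoothness, playing the role of the manual curvature
# adjustment mentioned in the abstract.
tck = bisplrep(x, y, z, kx=3, ky=3, s=0.05)

# Evaluate the reconstructed freeform surface on a regular grid.
gx = np.linspace(-0.8, 0.8, 20)
gy = np.linspace(-0.8, 0.8, 20)
surface = bisplev(gx, gy, tck)  # (20, 20) array of surface heights
```

Compared with interpolating every raw point, the smoothing spline absorbs duplicate and noisy points, which is why the B-spline surface in the paper renders more smoothly than a direct fit to the cloud.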

  20. Safety, risk and mental health: decision-making processes prescribed by Australian mental health legislation.

    PubMed

    Smith-Merry, Jennifer; Caple, Andrew

    2014-03-01

    Adverse events in mental health care occur frequently and cause significant distress for those who experience them, derailing treatment and sometimes leading to death. These events are clustered around particular aspects of care and treatment and are therefore avoidable if practices in these areas are strengthened. The research reported in this article takes as its starting point coronial recommendations made in relation to mental health. We report on those points and processes in treatment and discharge where coronial recommendations are most frequently made. We then examine the legislative requirements around these points and processes in three Australian States. We find that the key areas that need to be strengthened to avoid adverse events are assessment processes, communication and information transfer, documentation, planning and training. We make recommendations for improvements in these key areas.

  1. Stochastic point-source modeling of ground motions in the Cascadia region

    USGS Publications Warehouse

    Atkinson, G.M.; Boore, D.M.

    1997-01-01

    A stochastic model is used to develop preliminary ground motion relations for the Cascadia region for rock sites. The model parameters are derived from empirical analyses of seismographic data from the Cascadia region. The model is based on a Brune point-source characterized by a stress parameter of 50 bars. The model predictions are compared to ground-motion data from the Cascadia region and to data from large earthquakes in other subduction zones. The point-source simulations match the observations from moderate events (M 100 km). The discrepancy at large magnitudes suggests further work on modeling finite-fault effects and regional attenuation is warranted. In the meantime, the preliminary equations are satisfactory for predicting motions from events of M < 7 and provide conservative estimates of motions from larger events at distances less than 100 km.
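
The Brune point source with a 50-bar stress parameter implies an omega-squared source spectrum whose corner frequency scales with seismic moment. A minimal sketch of those standard relations follows; the shear-wave velocity and the Hanks-Kanamori moment conversion are textbook assumptions, not values quoted in this abstract.

```python
import math

def moment_from_magnitude(mw):
    """Hanks-Kanamori: log10(M0) = 1.5*Mw + 16.05 (M0 in dyne-cm)."""
    return 10.0 ** (1.5 * mw + 16.05)

def brune_corner_frequency(m0_dyne_cm, stress_bars=50.0, beta_km_s=3.7):
    """Brune corner frequency: fc = 4.9e6 * beta * (stress / M0)^(1/3),
    with beta in km/s, stress drop in bars, moment in dyne-cm."""
    return 4.9e6 * beta_km_s * (stress_bars / m0_dyne_cm) ** (1.0 / 3.0)

def brune_source_spectrum(f, mw, stress_bars=50.0, beta_km_s=3.7):
    """Unscaled omega-squared acceleration source spectrum:
    proportional to M0 * (2*pi*f)^2 / (1 + (f/fc)^2)."""
    m0 = moment_from_magnitude(mw)
    fc = brune_corner_frequency(m0, stress_bars, beta_km_s)
    return m0 * (2 * math.pi * f) ** 2 / (1 + (f / fc) ** 2)
```

Larger events have lower corner frequencies, so the single-corner point source increasingly misrepresents large finite ruptures, consistent with the discrepancy at large magnitudes noted above.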

  2. Placebo effects in trials evaluating 12 selected minimally invasive interventions: a systematic review and meta-analysis

    PubMed Central

    Holtedahl, Robin; Brox, Jens Ivar; Tjomsland, Ole

    2015-01-01

    Objectives To analyse the impact of placebo effects on outcome in trials of selected minimally invasive procedures and to assess reported adverse events in both trial arms. Design A systematic review and meta-analysis. Data sources and study selection We searched MEDLINE and Cochrane library to identify systematic reviews of musculoskeletal, neurological and cardiac conditions published between January 2009 and January 2014 comparing selected minimally invasive with placebo (sham) procedures. We searched MEDLINE for additional randomised controlled trials published between January 2000 and January 2014. Data synthesis Effect sizes (ES) in the active and placebo arms in the trials’ primary and pooled secondary end points were calculated. Linear regression was used to analyse the association between end points in the active and sham groups. Reported adverse events in both trial arms were registered. Results We included 21 trials involving 2519 adult participants. For primary end points, there was a large clinical effect (ES≥0.8) after active treatment in 12 trials and after sham procedures in 11 trials. For secondary end points, 7 and 5 trials showed a large clinical effect. Three trials showed a moderate difference in ES between active treatment and sham on primary end points (ES ≥0.5) but no trials reported a large difference. No trials showed large or moderate differences in ES on pooled secondary end points. Regression analysis of end points in active treatment and sham arms estimated an R2 of 0.78 for primary and 0.84 for secondary end points. Adverse events after sham were in most cases minor and of short duration. Conclusions The generally small differences in ES between active treatment and sham suggest that non-specific mechanisms, including placebo, are major predictors of the observed effects. Adverse events related to sham procedures were mainly minor and short-lived. 
Ethical arguments frequently raised against sham-controlled trials were generally not substantiated. PMID:25636794
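
The regression step (end points in the active arm regressed on the sham arm, summarized by R²) can be reproduced in miniature. The effect-size numbers below are hypothetical, chosen only to show how closely-tracking arms yield a high R²; they are not the trial data.

```python
import numpy as np

def r_squared(x, y):
    """R^2 of a simple least-squares linear regression of y on x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    A = np.column_stack([x, np.ones_like(x)])      # slope + intercept
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    ss_res = np.sum((y - A @ coef) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical per-trial effect sizes: when the sham ES tracks the active
# ES this closely, R^2 is high and the specific-effect difference is small,
# which is the pattern the review reports (R^2 of 0.78 and 0.84).
active = [0.9, 1.1, 0.6, 1.4, 0.8, 1.0]
sham   = [0.8, 1.0, 0.5, 1.2, 0.7, 0.9]
```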

  3. Extracting rate changes in transcriptional regulation from MEDLINE abstracts.

    PubMed

    Liu, Wenting; Miao, Kui; Li, Guangxia; Chang, Kuiyu; Zheng, Jie; Rajapakse, Jagath C

    2014-01-01

    Time delays are important factors that are often neglected in gene regulatory network (GRN) inference models. Validating time delays from knowledge bases is a challenge since the vast majority of biological databases do not record temporal information of gene regulations. Biological knowledge and facts on gene regulations are typically extracted from bio-literature with specialized methods that depend on the regulation task. In this paper, we mine evidences for time delays related to the transcriptional regulation of yeast from the PubMed abstracts. Since the vast majority of abstracts lack quantitative time information, we can only collect qualitative evidences of time delays. Specifically, the speed-up or delay in transcriptional regulation rate can provide evidences for time delays (shorter or longer) in GRN. Thus, we focus on deriving events related to rate changes in transcriptional regulation. A corpus of yeast regulation related abstracts was manually labeled with such events. In order to capture these events automatically, we create an ontology of sub-processes that are likely to result in transcription rate changes by combining textual patterns and biological knowledge. We also propose effective feature extraction methods based on the created ontology to identify the direct evidences with specific details of these events. Our ontologies outperform existing state-of-the-art gene regulation ontologies in the automatic rule learning method applied to our corpus. The proposed deterministic ontology rule-based method can achieve comparable performance to the automatic rule learning method based on decision trees. This demonstrates the effectiveness of our ontology in identifying rate-changing events. We also tested the effectiveness of the proposed feature mining methods on detecting direct evidence of events. Experimental results show that the machine learning method on these features achieves an F1-score of 71.43%. 
The manually labeled corpus of events relating to rate changes in transcriptional regulation for yeast is available at https://sites.google.com/site/wentingntu/data. The created ontologies summarize both the biological causes of rate changes in transcriptional regulation and the corresponding positive and negative textual patterns from the corpus. They are demonstrated to be effective in identifying rate-changing events, which shows the benefits of combining textual patterns and biological knowledge when extracting complex biological events.
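
The reported F1-score of 71.43% is the harmonic mean of precision and recall over detected events. As a quick sketch (the true/false positive counts below are hypothetical, chosen only so the score lands near the reported value):

```python
def f1_score(tp, fp, fn):
    """F1 = 2 * precision * recall / (precision + recall),
    from true positives, false positives, and false negatives."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts: 5 events found correctly, 2 spurious, 2 missed
# gives precision = recall = 5/7, hence F1 = 5/7 ~ 71.43%.
score = f1_score(tp=5, fp=2, fn=2)
```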

  4. Impact of Stressful Life Events on Patients with Chronic Obstructive Pulmonary Disease.

    PubMed

    Yu, Tsung; Frei, Anja; Ter Riet, Gerben; Puhan, Milo A

    There is a general notion that stressful life events may cause mental and physical health problems. We aimed to describe stressful life events reported by patients with chronic obstructive pulmonary disease (COPD) and to assess their impact on health outcomes and behaviors. Two hundred and sixty-six primary care patients who participated in the ICE COLD ERIC cohort study were asked to document any stressful life events in the past 3 years. We assessed the before-after (the event) changes for symptoms of depression and anxiety, health status, dyspnea-related quality of life, exacerbations, cigarette use, and physical activity. We used linear regression analysis to estimate the crude and adjusted magnitude of the before-after changes. About 41% (110/266) of patients reported the experience of any stressful life events and "death of relatives/important persons" was most common (31%). After accounting for age, sex, living status, lung function, and anxiety/depression status at baseline, experiencing any stressful life events was associated with a 0.9-point increase on the depression scale (95% CI 0.3 to 1.4), a 0.8-point increase on the anxiety scale (95% CI 0.3 to 1.3), and a 0.8-point decrease in the physical activity score (95% CI -1.6 to 0). Experiencing stressful life events was associated with a small to moderate increase in symptoms of depression and anxiety in COPD, but no discernable effect was found for other physical outcomes. However, confirmation of these results in other COPD cohorts and identification of patients particularly vulnerable to stressful life events are needed. © 2017 S. Karger AG, Basel.

  5. 1ST International Conference on Small Satellites: New Technologies, Achievements, Problems And Prospects For International Co-Operation In The New Millenium.

    DTIC Science & Technology

    1998-01-01

    deployment of the first two systems, Iridium and Globalstar. This event forces us to reconsider prospects for creating new systems of a similar class...Korolev, Moscow Region, Russian Federation. New electro-optical equipment for Earth remote sensing is now being created and developed, which...PC for control and data preprocessing; • software. The modern level of microelectronics development makes it possible to create an advanced SMASSIR with new

  6. Know me - a journey in creating a personal electronic health record.

    PubMed

    Buckley, Amanda; Fox, Suzanne

    2015-01-01

    KnowMe is a patient-created personal story of key life events, both medical and non-medical, that enables clinicians to understand what matters to the patient, not what's the matter with them. By shifting the Electronic Health Record (EHR) focus to knowing when a patient was at their best, what's important to them, their personal health goals, and care preferences, clinicians and patients can work collaboratively in creating a treatment plan that aligns resources tailored to their needs.

  7. The Coast Artillery Journal. Volume 82, Number 1, January-February 1939

    DTIC Science & Technology

    1939-02-01

    commerce has been created, assisted and jealously guarded by a powerful fleet, and history shows that England has always resorted to war whenever her...part in modern warfare. Creating an interest in warning nets and blackouts directed the public mind toward the entire subject of passive defense...causes of current events in Palestine, which have created a condition that borders on anarchy and has forced the introduction of martial law, lie in

  8. LiDAR Mapping of Earthquake Uplifted Paleo-shorelines, Southern Wairarapa Coast, North Island, New Zealand

    NASA Astrophysics Data System (ADS)

    Valenciano, J.; Angenent, J.; Marshall, J. S.; Clark, K.; Litchfield, N. J.

    2017-12-01

    The Hikurangi subduction margin along the east coast of the North Island, New Zealand accommodates oblique convergence of the Pacific Plate westward beneath the Australian plate at 45 mm/yr. Pronounced forearc uplift occurs at the southern end of the margin along the Wairarapa coast, onshore of the subducting Hikurangi plateau. Along a narrow coastal lowland, a series of uplifted Holocene marine terraces and beach ridges preserve a geologic record of prehistoric coseismic uplift events. In January 2017, we participated in the Research Experience for Undergraduates (REU) program of the NSF SHIRE Project (Subduction at Hikurangi Integrated Research Experiment). We visited multiple coastal sites for reconnaissance fieldwork to select locations for future in-depth study. For the coastline between Flat Point and Te Kaukau Point, we used airborne LiDAR data provided by Land Information New Zealand (LINZ) to create ArcGIS digital terrain models for mapping and correlating uplifted paleo-shorelines. Terrace elevations derived from the LiDAR data were calibrated through the use of Real Time Kinematic (RTK) GPS surveying at one field site (Glenburn Station). Prior field mapping and radiocarbon dating results (Berryman et al., 2001; Litchfield and Clark, 2015) were used to guide our LiDAR mapping efforts. The resultant maps show between four and seven uplifted terraces and associated beach ridges along this coastal segment. At some sites, terrace mapping and lateral correlation are impeded by discontinuous exposures and the presence of landslide debris, alluvial fan deposits, and sand dunes. Tectonic uplift along the southern Hikurangi margin is generated by a complex interaction between deep megathrust slip and shallow upper-plate faulting. Each uplifted Holocene paleo-shoreline is interpreted to represent a single coseismic uplift event. 
Continued mapping, surveying, and age dating may help differentiate between very large margin-wide megathrust earthquakes (M8.0-9.0+) and smaller, more localized upper-plate thrust events (M7.0-8.0). Both of these earthquake types pose a significant seismic and tsunami hazard for New Zealand residents.

  9. An evaluation of the feasibility and usability of a proof of concept mobile app for adverse event reporting post influenza vaccination

    PubMed Central

    Wilson, Kumanan; Atkinson, Katherine M.; Westeinde, Jacqueline; Bell, Cameron; Marty, Kim; Fergusson, Dean; Deeks, Shelley L.; Crowcroft, Natasha; Bettinger, Julie A.

    2016-01-01

    ABSTRACT The Canadian National Vaccine Safety network (CANVAS) gathers and analyzes safety data on individuals receiving the influenza vaccine during the early stages of annual influenza vaccination campaigns with data collected via participant surveys through the Internet. We sought to examine whether it was feasible to use a mobile application (app) to facilitate AEFI reporting for the CANVAS network. To explore this, we developed a novel smartphone app, recruited participants from a hospital influenza immunization clinic and by word of mouth and instructed them to download and utilize the app. The app reminded participants to complete the CANVAS AEFI surveillance surveys (“AEFI surveys”) on day 8 and 30, a survey capturing app usability metrics at day 30 (“usability survey”) and provided a mechanism to report AEFI events spontaneously throughout the whole study period. All survey results and spontaneous reports were recorded on a privacy compliant, cloud server. A software plug-in, Lookback, was used to record the on-screen experience of the app sessions. Of the 76 participants who consented to participate, 48(63%) successfully downloaded the app and created a profile. In total, 38 unique participants completed all of the required surveillance surveys; transmitting 1104 data points (survey question responses and spontaneous reports) from 83 completed surveys, including 21 usability surveys and one spontaneous report. In total, we received information on new or worsening health conditions after receiving the influenza vaccine from 11(28%) participants. Of the usability survey responses, 86% agreed or strongly agreed that they would prefer to use a mobile app based reporting system instead of a web-based system. The single spontaneous report received was from a participant who had also reported using the Day 8 survey. Of Lookback observable sessions, an accurate transmission proportion of 100% (n=290) was reported for data points. 
We demonstrated that a mobile app can be used for AEFI reporting, although download and survey completion proportions suggest potential barriers to adoption. Future studies should examine implementation of mobile reporting in a broader audience and impact on the quality of reporting of adverse events following immunization. PMID:26905396

  10. Shallow Chamber & Conduit Behavior of Silicic Magma: A Thermo- and Fluid- Dynamic Parameterization Model of Physical Deformation as Constrained by Geodetic Observations: Case Study; Soufriere Hills Volcano, Montserrat

    NASA Astrophysics Data System (ADS)

    Gunn de Rosas, C. L.

    2013-12-01

    The Soufrière Hills Volcano, Montserrat (SHV) is an active, mainly andesitic and well-studied stratovolcano situated at the northern end of the Lesser Antilles Arc subduction zone in the Caribbean Sea. The goal of our research is to create a high-resolution 3D subsurface model of the shallow and deeper aspects of the magma storage and plumbing system at SHV. Our model will integrate inversions using continuous and campaign geodetic observations at SHV from 1995 to the present as well as local seismic records taken at various unrest intervals to construct a best-fit geometry, pressure point source, and inflation rate and magnitude. We will also incorporate heterogeneous media in the crust and use the most contemporary understanding of deep crustal- or even mantle-depth 'hot-zone' genesis and chemical evolution of silicic and intermediate magmas to inform the character of the deep edifice influx. Our heat transfer model will be constructed with a modified 'thin shell' enveloping the magma chamber to simulate the insulating or conducting influence of heat-altered chamber boundary conditions. The final forward model should elucidate observational data preceding and following unrest events, the behavioral suite of magma transport in the subsurface environment and the feedback mechanisms that may contribute to eruption triggering. Preliminary hypotheses suggest wet, low-viscosity residual melts derived from 'hot zones' will ascend rapidly to shallower stall-points and that their products (eventually erupted lavas as well as stalled plutonic masses) will experience and display two discrete periods of shallow evolution: a rapid depressurization crystallization event followed by a slower conduction-controlled heat transfer and cooling crystallization. These events have particular implications for shallow magma behaviors, notably inflation, compressibility and pressure values. Visualization of the model with its inversion constraints will be carried out with COMSOL. 
Conclusions about the subsurface behavioral suite at SHV will have high applicability to other silicic and intermediate volcanic edifices and may aid in the hazard mitigation associated with volcanic unrest.
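
Geodetic inversions for a "pressure point source" of the kind mentioned above are commonly built on the Mogi (1958) point-source kernel; whether this study uses exactly that kernel is an assumption on our part. A minimal sketch of its vertical surface displacement:

```python
import numpy as np

def mogi_uz(r, depth, dV, nu=0.25):
    """Vertical surface displacement of a Mogi point pressure source.

    uz(r) = (1 - nu) * dV / pi * depth / (depth^2 + r^2)^(3/2)
    r: radial distance from the source axis, depth: source depth,
    dV: source volume change; use consistent length units throughout.
    """
    r = np.asarray(r, float)
    return (1.0 - nu) * dV / np.pi * depth / (depth**2 + r**2) ** 1.5
```

An inversion then adjusts depth and dV (and source position) until the predicted uz best matches the continuous and campaign GPS displacements.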

  11. An evaluation of the feasibility and usability of a proof of concept mobile app for adverse event reporting post influenza vaccination.

    PubMed

    Wilson, Kumanan; Atkinson, Katherine M; Westeinde, Jacqueline; Bell, Cameron; Marty, Kim; Fergusson, Dean; Deeks, Shelley L; Crowcroft, Natasha; Bettinger, Julie A

    2016-07-02

    The Canadian National Vaccine Safety network (CANVAS) gathers and analyzes safety data on individuals receiving the influenza vaccine during the early stages of annual influenza vaccination campaigns with data collected via participant surveys through the Internet. We sought to examine whether it was feasible to use a mobile application (app) to facilitate AEFI reporting for the CANVAS network. To explore this, we developed a novel smartphone app, recruited participants from a hospital influenza immunization clinic and by word of mouth and instructed them to download and utilize the app. The app reminded participants to complete the CANVAS AEFI surveillance surveys ("AEFI surveys") on day 8 and 30, a survey capturing app usability metrics at day 30 ("usability survey") and provided a mechanism to report AEFI events spontaneously throughout the whole study period. All survey results and spontaneous reports were recorded on a privacy compliant, cloud server. A software plug-in, Lookback, was used to record the on-screen experience of the app sessions. Of the 76 participants who consented to participate, 48(63%) successfully downloaded the app and created a profile. In total, 38 unique participants completed all of the required surveillance surveys; transmitting 1104 data points (survey question responses and spontaneous reports) from 83 completed surveys, including 21 usability surveys and one spontaneous report. In total, we received information on new or worsening health conditions after receiving the influenza vaccine from 11(28%) participants. Of the usability survey responses, 86% agreed or strongly agreed that they would prefer to use a mobile app based reporting system instead of a web-based system. The single spontaneous report received was from a participant who had also reported using the Day 8 survey. Of Lookback observable sessions, an accurate transmission proportion of 100% (n=290) was reported for data points. 
We demonstrated that a mobile app can be used for AEFI reporting, although download and survey completion proportions suggest potential barriers to adoption. Future studies should examine implementation of mobile reporting in a broader audience and impact on the quality of reporting of adverse events following immunization.

  12. High salmon density and low discharge create periodic hypoxia in coastal rivers

    Treesearch

    Christopher J. Sergeant; J. Ryan Bellmore; Casey McConnell; Jonathan W. Moore

    2017-01-01

    Dissolved oxygen (DO) is essential to the survival of almost all aquatic organisms. Here, we examine the possibility that abundant Pacific salmon (Oncorhynchus spp.) and low streamflow combine to create hypoxic events in coastal rivers. Using high-frequency DO time series from two similar watersheds in southeastern Alaska, we summarize DO regimes...

  13. Land-markings: 12 Journeys through 9/11 Living Memorials

    Treesearch

    Erika S. Svendsen; Lindsay K. Campbell

    2006-01-01

    Living memorials are spaces created, used, or reappropriated by people as they employ the landscape to memorialize individuals, places, and events. Ranging from single tree plantings, to the creation of new parks, to the rededication of existing forests, hundreds of groups across the country created a vast network of sites that continues to grow. "Land-markings:...

  14. Creating Micro-Videos to Demonstrate Technology Learning and Digital Literacy

    ERIC Educational Resources Information Center

    Frydenberg, Mark; Andone, Diana

    2016-01-01

    Purpose: Short videos, also known as micro-videos, have emerged as a platform for sharing ideas, experiences and life events via online social networks. This paper aims to share preliminary results of a study, involving students from two universities who created six-second videos using the Vine mobile app to explain or illustrate technological…

  15. An Image of Possibility: Illustrating a Pedagogic Encounter with Culture

    ERIC Educational Resources Information Center

    Michael, Maureen K.

    2011-01-01

    An Image of Possibility is an interplay between image-making and interpretation. It explores author-created illustration as an art-based tool for educational inquiry and is designed further to inform the creative research practice of the author. The illustration "Meeting People" is created by the author to render an event of learning and culture…

  16. Living Memorials: Understanding the Social Meanings of Community-Based Memorials to September 11, 2001

    Treesearch

    Erika S. Svendsen; Lindsay K. Campbell

    2010-01-01

    Living memorials are landscaped spaces created by people to memorialize individuals, places, and events. Hundreds of stewardship groups across the United States of America created living memorials in response to the September 11, 2001 terrorist attacks. This study sought to understand how stewards value, use, and talk about their living, community-based memorials....

  17. Search for Point Sources of Ultra-High-Energy Cosmic Rays above 4.0 × 1019 eV Using a Maximum Likelihood Ratio Test

    NASA Astrophysics Data System (ADS)

    Abbasi, R. U.; Abu-Zayyad, T.; Amann, J. F.; Archbold, G.; Atkins, R.; Bellido, J. A.; Belov, K.; Belz, J. W.; Ben-Zvi, S. Y.; Bergman, D. R.; Boyer, J. H.; Burt, G. W.; Cao, Z.; Clay, R. W.; Connolly, B. M.; Dawson, B. R.; Deng, W.; Farrar, G. R.; Fedorova, Y.; Findlay, J.; Finley, C. B.; Hanlon, W. F.; Hoffman, C. M.; Holzscheiter, M. H.; Hughes, G. A.; Hüntemeyer, P.; Jui, C. C. H.; Kim, K.; Kirn, M. A.; Knapp, B. C.; Loh, E. C.; Maestas, M. M.; Manago, N.; Mannel, E. J.; Marek, L. J.; Martens, K.; Matthews, J. A. J.; Matthews, J. N.; O'Neill, A.; Painter, C. A.; Perera, L.; Reil, K.; Riehle, R.; Roberts, M. D.; Sasaki, M.; Schnetzer, S. R.; Seman, M.; Simpson, K. M.; Sinnis, G.; Smith, J. D.; Snow, R.; Sokolsky, P.; Song, C.; Springer, R. W.; Stokes, B. T.; Thomas, J. R.; Thomas, S. B.; Thomson, G. B.; Tupa, D.; Westerhoff, S.; Wiencke, L. R.; Zech, A.

    2005-04-01

    We present the results of a search for cosmic-ray point sources at energies in excess of 4.0×1019 eV in the combined data sets recorded by the Akeno Giant Air Shower Array and High Resolution Fly's Eye stereo experiments. The analysis is based on a maximum likelihood ratio test using the probability density function for each event rather than requiring an a priori choice of a fixed angular bin size. No statistically significant clustering of events consistent with a point source is found.
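
The unbinned test weights each event by its own probability density instead of counting events in a fixed angular bin. A minimal sketch under simplifying assumptions: the per-event source densities Q (from each event's angular distance to the candidate source and its angular error) and background densities R are taken as given, and the maximization is a simple grid search.

```python
import numpy as np

def log_likelihood_ratio(n_s, Q, R):
    """ln[L(n_s)/L(0)] for an unbinned point-source search, where
    L(n_s) = prod_i [ (n_s/N) * Q[i] + (1 - n_s/N) * R[i] ]."""
    Q, R = np.asarray(Q, float), np.asarray(R, float)
    f = n_s / len(Q)  # fraction of events attributed to the source
    return np.sum(np.log(f * Q + (1.0 - f) * R)) - np.sum(np.log(R))

def best_fit_ns(Q, R):
    """Maximize the ratio over a grid of candidate source-event counts."""
    grid = np.linspace(0.0, len(Q), 200)
    vals = [log_likelihood_ratio(n, Q, R) for n in grid]
    i = int(np.argmax(vals))
    return grid[i], vals[i]
```

When every event looks like background (Q equal to R), the ratio stays at zero for any n_s, which is the "no statistically significant clustering" outcome reported above.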

  18. The Origins of the Cold War: A Unit of Study for Grades 9-12.

    ERIC Educational Resources Information Center

    King, Lisa

    This unit is one of a series that represents specific moments in history from which students focus on the meanings of landmark events. The events of 1945 are regarded widely as a turning point in 20th century history, a point when the United States unequivocally took its place as a world power, at a time when Americans had a strong but…

  19. Creating a Highly Reliable Neonatal Intensive Care Unit Through Safer Systems of Care.

    PubMed

    Panagos, Patoula G; Pearlman, Stephen A

    2017-09-01

    Neonates requiring intensive care are at high risk for medical errors due to their unique characteristics and high acuity. Designing a safer work environment begins with safe processes. Creating a culture of safety demands the involvement of all organizational levels and an interdisciplinary approach. Adverse events can result from suboptimal communication and lack of a shared mental model. This chapter describes tools to promote better patient safety in the NICU through monitoring adverse events, improving communication and using information technology. Unplanned extubation is an example of a neonatal safety concern that can be reduced by employing quality improvement methodology. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. An Efficient Method to Create Digital Terrain Models from Point Clouds Collected by Mobile LiDAR Systems

    NASA Astrophysics Data System (ADS)

    Gézero, L.; Antunes, C.

    2017-05-01

    Digital terrain models (DTMs) play an essential role in all types of road maintenance, water supply and sanitation projects. The demand for such information is more significant in developing countries, where the lack of infrastructure is greater. In recent years, the use of Mobile LiDAR Systems (MLS) proved to be a very efficient technique in the acquisition of precise and dense point clouds. These point clouds can be a solution to obtain the data for the production of DTMs in remote areas, due mainly to the safety, precision, speed of acquisition and the detail of the information gathered. However, filtering the point clouds and devising algorithms to separate "terrain points" from "non-terrain points" quickly and consistently remain a challenge that has caught the interest of researchers. This work presents a method to create the DTM from point clouds collected by MLS. The method is based on two interactive steps. The first step of the process reduces the point cloud to a set of points that represent the terrain's shape, with the distance between points inversely proportional to the terrain variation. The second step is based on the Delaunay triangulation of the points resulting from the first step. The achieved results encourage a wider use of this technology as a solution for large-scale DTM production in remote areas.
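
The two steps can be sketched as follows. This is a simplified illustration, not the paper's algorithm: the thinning runs on a single profile sorted along x, the slope threshold and spacings are invented, and separating "terrain" from "non-terrain" points is assumed to have happened already.

```python
import numpy as np
from scipy.spatial import Delaunay

def thin_by_variation(points, flat_step=5.0, rough_step=1.0, slope_thresh=0.1):
    """Step 1 (simplified): keep points more densely where the terrain
    varies. Points are sorted along x; the spacing shrinks from flat_step
    to rough_step when the local slope |dz/dx| exceeds slope_thresh, so
    spacing is (inversely) tied to terrain variation."""
    pts = points[np.argsort(points[:, 0])]
    kept = [pts[0]]
    for p in pts[1:]:
        dx = p[0] - kept[-1][0]
        slope = abs(p[2] - kept[-1][2]) / max(dx, 1e-9)
        step = rough_step if slope > slope_thresh else flat_step
        if dx >= step:
            kept.append(p)
    return np.array(kept)

def build_dtm(points):
    """Step 2: Delaunay triangulation of the kept points in the x-y plane."""
    return Delaunay(points[:, :2])
```

The triangulation turns the thinned points into the TIN from which the DTM raster or contours can be derived.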

  1. A true blind for subjects who receive spinal manipulation therapy.

    PubMed

    Kawchuk, Gregory N; Haugen, Rick; Fritz, Julie

    2009-02-01

    To determine if short-duration anesthesia (propofol and remifentanil) can blind subjects to the provision or withholding of spinal manipulative therapy (SMT). Placebo control. Day-procedure ward, University of Alberta Hospital. Human subjects with uncomplicated low back pain (LBP) (n=6). In each subject, propofol and remifentanil were administered intravenously. Once unconsciousness was achieved (3-5min), subjects were placed in a lateral recumbent position and then randomized to either a control group (n=3) or an experimental group (with SMT, n=3); subjects received a single SMT to the lumbar spine. Subjects were given a standardized auditory and visual cue and then allowed to recover from anesthesia in a supine position (3-5min). Before anesthesia and 30 minutes after recovery, a blinded evaluator asked each subject to quantify their LBP by using an 11-point scale. This same evaluator then assessed the ability of each subject to recall specific memories while under presumed anesthesia including events related to treatment and specific auditory and visual cues. In either the experimental or control group, subjects could not recall any event while under anesthesia. Some SMT subjects reported pain reduction greater than the minimally important clinical difference and greater than control subjects. No adverse events were reported. Short-duration, low-risk general anesthesia can create effective blinding of subjects to the provision or withholding of SMT. An anesthetic blind for SMT subjects solves many, if not all, problems associated with prior SMT blinding strategies. Although further studies are needed to refine this technique, the potential now exists to conduct the first placebo-controlled randomized controlled trial to assess SMT efficacy.

  2. The Charged Particle Environment on the Surface of Mars induced by Solar Energetic Particles - Five Years of Measurements with the MSL/RAD instrument

    NASA Astrophysics Data System (ADS)

    Ehresmann, B.; Hassler, D.; Zeitlin, C.; Guo, J.; Lee, C. O.; Wimmer-Schweingruber, R. F.; Appel, J. K.; Boehm, E.; Boettcher, S. I.; Brinza, D. E.; Burmeister, S.; Lohf, H.; Martin-Garcia, C.; Matthiae, D.; Rafkin, S. C.; Reitz, G.

    2017-12-01

    NASA's Mars Science Laboratory (MSL) mission has now been operating in Gale crater on the surface of Mars for five years. On board MSL, the Radiation Assessment Detector (MSL/RAD) is measuring the Martian surface radiation environment, providing insights on its intensity and composition. This radiation field is mainly composed of primary Galactic Cosmic Rays (GCRs) and secondary particles created by the GCRs' interactions with the Martian atmosphere and soil. However, on shorter time scales the radiation environment can be dominated by contributions from Solar Energetic Particle (SEP) events. Due to the modulating effect of the Martian atmosphere, the shape and intensity of these SEP spectra will differ significantly between interplanetary space and the Martian surface. Understanding how SEP events influence the surface radiation field is crucial to assess associated health risks for potential human missions to Mars. Here, we present updated MSL/RAD results for charged particle fluxes measured on the surface during SEP activity from the five years of MSL operations on Mars. The presented results incorporate updated analysis techniques for the MSL/RAD data and yield the most robust particle spectra to date. Furthermore, we compare the MSL/RAD SEP-induced fluxes to measurements from other spacecraft in the inner heliosphere and, in particular, in Martian orbit. Analyzing changes of SEP intensities from interplanetary space to the Martian surface gives insight into the modulating effect of the Martian atmosphere, while comparing timing profiles of SEP events between Mars and different points in interplanetary space can increase our understanding of SEP propagation in the heliosphere.

  3. Turbulent Mixing and Vertical Heat Transfer in the Surface Mixed Layer of the Arctic Ocean: Implication of a Cross-Pycnocline High-Temperature Anomaly

    NASA Astrophysics Data System (ADS)

    Kawaguchi, Yusuke; Takeda, Hiroki

    2017-04-01

    This study focuses on the mixing processes in the vicinity of the surface mixed layer (SML) of the Arctic Ocean. Turbulence activity and vertical heat transfer are quantitatively characterized in the Northwind Abyssal Plain, based on the RV Mirai Arctic cruise, during the transition from late summer to early winter 2014. During the cruise, noticeable storm events were observed, which passed over the ship's location and contributed to the deepening of the SML. According to the ship-based microstructure observation, within the SML, the strong wind events produced enhanced dissipation rates of turbulent kinetic energy on the order of ε = 10⁻⁶-10⁻⁴ W kg⁻¹. The thermal variance dissipation rate χ increased toward the base of the SML, reaching O(10⁻⁷) K² s⁻¹, resulting in a vertical heat flux of O(10) W m⁻². During the occasional energetic mixing events, the near-surface warm water was transferred downward and penetrated through the SML base, creating a cross-pycnocline high-temperature anomaly (CPHTA) at approximately 20-30 m depth. Near the CPHTA, the vertical heat flux was anomalously magnified to O(10-100) W m⁻². After the fixed-point observation, the SML heat content in the regions of marginal and thick ice zones was monitored using an autonomous drifting buoy, UpTempO. During most of the ice-covered period, the ocean-to-ice turbulent heat flux was dominant, rather than the diapycnal heat transfer across the SML bottom interface.

  4. Creating, Using and Updating Thesauri Files for AutoMap and ORA

    DTIC Science & Technology

    2012-07-26

    occurrences or phenomena that happen. An Event could be 9-11, the JFK Assassination, the Super Bowl, a wedding, a funeral, or an inauguration. Specific events...a better place without Caesar (Belief). To kill Caesar (Task) they form a group of assassins (Organization). To accomplish their task they need to...know about Caesar’s daily routine (Knowledge) and how to get their knives (Resources) into the senate. Finally, the assassination (Event) takes place

  5. Slip-Sliding-Away: A Review of the Literature on the Constraining Qualities of PowerPoint

    ERIC Educational Resources Information Center

    Kernbach, Sebastian; Bresciani, Sabrina; Eppler, Martin J.

    2015-01-01

    PowerPoint is a dominant communication tool in business and education. It allows for creating professional-looking presentations easily, but without understanding its constraining qualities it can be used inappropriately. Therefore we conducted a systematic literature review structuring the literature on PowerPoint in three chronological phases…

  6. Teach Graphic Design Basics with PowerPoint

    ERIC Educational Resources Information Center

    Lazaros, Edward J.; Spotts, Thomas H.

    2007-01-01

    While PowerPoint is generally regarded as simply software for creating slide presentations, it includes often overlooked--but powerful--drawing tools. Because it is part of the Microsoft Office package, PowerPoint comes preloaded on many computers and thus is already available in many classrooms. Since most computers are not preloaded with good…

  7. Pointing with Power or Creating with Chalk

    ERIC Educational Resources Information Center

    Rudow, Sasha R.; Finck, Joseph E.

    2015-01-01

    This study examines the attitudes of students on the use of PowerPoint and chalk/white boards in college science lecture classes. Students were asked to complete a survey regarding their experiences with PowerPoint and chalk/white boards in their science classes. Both multiple-choice and short answer questions were used. The multiple-choice…

  8. Efficient, Decentralized Detection of Qualitative Spatial Events in a Dynamic Scalar Field

    PubMed Central

    Jeong, Myeong-Hun; Duckham, Matt

    2015-01-01

    This paper describes an efficient, decentralized algorithm to monitor qualitative spatial events in a dynamic scalar field. The events of interest involve changes to the critical points (i.e., peaks, pits and passes) and edges of the surface network derived from the field. Four fundamental types of event (appearance, disappearance, movement and switch) are defined. Our algorithm is designed to rely purely on qualitative information about the neighborhoods of nodes in the sensor network and does not require information about nodes’ coordinate positions. Experimental investigations confirm that our algorithm is efficient, with O(n) overall communication complexity (where n is the number of nodes in the sensor network), an even load balance and low operational latency. The accuracy of event detection is comparable to established centralized algorithms for the identification of critical points of a surface network. Our algorithm is relevant to a broad range of environmental monitoring applications of sensor networks. PMID:26343672

  9. Efficient, Decentralized Detection of Qualitative Spatial Events in a Dynamic Scalar Field.

    PubMed

    Jeong, Myeong-Hun; Duckham, Matt

    2015-08-28

    This paper describes an efficient, decentralized algorithm to monitor qualitative spatial events in a dynamic scalar field. The events of interest involve changes to the critical points (i.e., peaks, pits and passes) and edges of the surface network derived from the field. Four fundamental types of event (appearance, disappearance, movement and switch) are defined. Our algorithm is designed to rely purely on qualitative information about the neighborhoods of nodes in the sensor network and does not require information about nodes' coordinate positions. Experimental investigations confirm that our algorithm is efficient, with O(n) overall communication complexity (where n is the number of nodes in the sensor network), an even load balance and low operational latency. The accuracy of event detection is comparable to established centralized algorithms for the identification of critical points of a surface network. Our algorithm is relevant to a broad range of environmental monitoring applications of sensor networks.
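The per-node test underlying surface-network critical points can be sketched as follows, assuming each node knows only its own scalar value and its 1-hop neighbours' values in cyclic order — the qualitative, coordinate-free information the decentralized algorithm relies on. The function name and thresholds are illustrative; the paper's full algorithm additionally handles edges and the four event types.

```python
def classify(value, ring):
    """Classify a node from its neighbours' values in cyclic order.

    A node whose value exceeds all neighbours is a peak; below all, a pit;
    four or more sign alternations around the ring indicate a pass (saddle).
    """
    signs = [1 if v > value else -1 for v in ring]
    # Count sign changes around the closed ring of neighbours.
    changes = sum(1 for a, b in zip(signs, signs[1:] + signs[:1]) if a != b)
    if changes == 0:
        return "peak" if signs[0] == -1 else "pit"
    if changes >= 4:
        return "pass"   # the field alternates above/below twice around the node
    return "regular"

print(classify(5.0, [1, 2, 1, 3]))   # all neighbours lower -> "peak"
print(classify(2.0, [3, 1, 3, 1]))   # alternating ring -> "pass"
```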

  10. Effect of bromocriptine-QR (a quick-release formulation of bromocriptine mesylate) on major adverse cardiovascular events in type 2 diabetes subjects.

    PubMed

    Gaziano, J Michael; Cincotta, Anthony H; Vinik, Aaron; Blonde, Lawrence; Bohannon, Nancy; Scranton, Richard

    2012-10-01

    Bromocriptine-QR (a quick-release formulation of bromocriptine mesylate), a dopamine D2 receptor agonist, is a US Food and Drug Administrration-approved treatment for type 2 diabetes mellitus (T2DM). A 3070-subject randomized trial demonstrated a significant, 40% reduction in relative risk among bromocriptine-QR-treated subjects in a prespecified composite cardiovascular (CV) end point that included ischemic-related (myocardial infarction and stroke) and nonischemic-related (hospitalization for unstable angina, congestive heart failure [CHF], or revascularization surgery) end points, but did not include cardiovascular death as a component of this composite. The present investigation was undertaken to more critically evaluate the impact of bromocriptine-QR on cardiovascular outcomes in this study subject population by (1) including CV death in the above-described original composite analysis and then stratifying this new analysis on the basis of multiple demographic subgroups and (2) analyzing the influence of this intervention on only the "hard" CV end points of myocardial infarction, stroke, and CV death (major adverse cardiovascular events [MACEs]). Three thousand seventy T2DM subjects on stable doses of ≤2 antidiabetes medications (including insulin) with HbA1c ≤10.0 (average baseline HbA1c=7.0) were randomized 2:1 to bromocriptine-QR (1.6 to 4.8 mg/day) or placebo for a 52-week treatment period. Subjects with heart failure (New York Heart Classes I and II) and precedent myocardial infarction or revascularization surgery were allowed to participate in the trial. Study outcomes included time to first event for each of the 2 CV composite end points described above. The relative risk comparing bromocriptine-QR with the control for the cardiovascular outcomes was estimated as a hazard ratio with 95% confidence interval on the basis of Cox proportional hazards regression. 
The statistical significance of any between-group difference in the cumulative percentage of CV events over time (derived from a Kaplan-Meier curve) was determined by a log-rank test on the intention-to-treat population. Study subjects were in reasonable metabolic control, with an average baseline HbA1c of 7.0±1.1, blood pressure of 128/76±14/9, and total and LDL cholesterol of 179±42 and 98±32, respectively, with 88%, 77%, and 69% of subjects being treated with antidiabetic, antihypertensive, and antihyperlipidemic agents, respectively. Ninety-one percent of the expected person-year outcome ascertainment was obtained in this study. Respecting the CV-inclusive composite cardiovascular end point, there were 39 events (1.9%) among 2054 bromocriptine-QR-treated subjects versus 33 events (3.2%) among 1016 placebo subjects, yielding a significant, 39% reduction in relative risk in this end point with bromocriptine-QR exposure (P=0.0346; log-rank test) that was not influenced by age, sex, race, body mass index, duration of diabetes, or preexisting cardiovascular disease. In addition, regarding the MACE end point, there were 14 events (0.7%) among 2054 bromocriptine-QR-treated subjects and 15 events (1.5%) among 1016 placebo-treated subjects, yielding a significant, 52% reduction in relative risk in this end point with bromocriptine-QR exposure (P<0.05; log-rank test). These findings reaffirm and extend the original observation of relative risk reduction in cardiovascular adverse events among type 2 diabetes subjects treated with bromocriptine-QR and suggest that further investigation into this impact of bromocriptine-QR is warranted. URL: http://clinicaltrials.gov. Unique Identifier: NCT00377676.

  11. Educational Utilization of Microsoft Powerpoint for Oral and Maxillofacial Cancer Presentations.

    PubMed

    Carvalho, Francisco Samuel Rodrigues; Chaves, Filipe Nobre; Soares, Eduardo Costa Studart; Pereira, Karuza Maria Alves; Ribeiro, Thyciana Rodrigues; Fonteles, Cristiane Sa Roriz; Costa, Fabio Wildson Gurgel

    2016-01-01

    Electronic presentations have become useful tools for surgeons, other clinicians and patients, facilitating medical and legal support and scientific research. Microsoft® PowerPoint is by far and away the most commonly used computer-based presentation package. Setting up surgical clinical cases with PowerPoint makes it easy to register and follow patients for the purpose of discussion of treatment plan or scientific presentations. It facilitates communication between professionals, supervising clinical cases and teaching. It is often useful to create a template to standardize the presentation, offered by the software through the slide master. The purpose of this paper was to show a simple and practical method for creating a Microsoft® PowerPoint template for use in presentations concerning oral and maxillofacial cancer.

  12. Microhole Tubing Bending Report

    DOE Data Explorer

    Oglesby, Ken

    2012-01-01

    A downhole tubing bending study was made and is reported herein. It contains a report and 2 Excel spreadsheets to calculate tubing bending and to estimate contact points of the tubing to the drilled hole wall (creating a new support point).

  13. Statistical density modification using local pattern matching

    DOEpatents

    Terwilliger, Thomas C.

    2007-01-23

    A computer implemented method modifies an experimental electron density map. A set of selected known experimental and model electron density maps is provided and standard templates of electron density are created from the selected experimental and model electron density maps by clustering and averaging values of electron density in a spherical region about each point in a grid that defines each selected known experimental and model electron density map. Histograms are also created from the selected experimental and model electron density maps that relate the value of electron density at the center of each of the spherical regions to a correlation coefficient of a density surrounding each corresponding grid point in each one of the standard templates. The standard templates and the histograms are applied to grid points on the experimental electron density map to form new estimates of electron density at each grid point in the experimental electron density map.

  14. Nanotexturing of surfaces to reduce melting point.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Ernest J.; Zubia, David; Mireles, Jose

    2011-11-01

    This investigation examined the use of nano-patterned structures on Silicon-on-Insulator (SOI) material to reduce the bulk material melting point (1414 C). It has been found that sharp-tipped and other similar structures have a propensity to move to the lower energy states of spherical structures and as a result exhibit lower melting points than the bulk material. Such a reduction of the melting point would offer a number of interesting opportunities for bonding in microsystems packaging applications. Nano patterning process capabilities were developed to create the required structures for the investigation. One of the technical challenges of the project was understanding and creating the specialized conditions required to observe the melting and reshaping phenomena. Through systematic experimentation and review of the literature these conditions were determined and used to conduct phase change experiments. Melting temperatures as low as 1030 C were observed.

  15. The politics of learning to teach: The juxtaposition of reform, risk-taking, and survival for a prospective science teacher

    NASA Astrophysics Data System (ADS)

    McLoughlin, Andrea Sabatini

    1998-12-01

    It has proven difficult for teachers to enact and sustain the changes to thinking and pedagogy called for in science education reforms. It may be especially difficult for prospective teachers to create coherent professional identities as they learn to teach in the borderland between educational change and the existing context of education. Field experiences remain a pivotal point in teacher education, as prospective teachers mature from the perspective they have lived as students to the vantage point they are constructing as developing teachers. This qualitative, naturalistic case study examined a reform-oriented preservice science teacher's beliefs and actions during a year of field practica, including student teaching. Interviews, observations, and written documents were collected to examine the extent to which the prospective teacher's thoughts and actions continued to reflect reform ideals across that time. Inductive data analysis indicated that tacit beliefs held by the participant interacted with significant events of the field experiences to direct her learning to teach process in non-educative ways. Implications include: (a) deeper examination of the beliefs and experiences of prospective teachers would allow teacher educators the ability to understand and guide professional development in deeper and more productive ways, (b) the establishment of an atmosphere of experimentation/inquiry and a more cohesive, collaborative approach to teacher education are needed, especially during field experiences, if teacher education programs are to foster the productive and educative experiences supportive of reform ideals, (c) the preparation of prospective teachers who intend to implement reform ideals should include developing understandings of the dynamics of the change process, and (d) the exploration/confrontation of the power structures inherent in the existing educational system is essential if they are to be prevented from undermining reform efforts. 
As science teacher educators more fully explore prospective teachers' beliefs about teaching and learning, challenge the thinking and events that tend to reproduce the status quo, and look for the common points of departure that will help prospective teachers to construct an empowered and implementable new vision of themselves and their classrooms, science education reform will surely move closer to sustainability.

  16. Capturing rogue waves by multi-point statistics

    NASA Astrophysics Data System (ADS)

    Hadjihosseini, A.; Wächter, Matthias; Hoffmann, N. P.; Peinke, J.

    2016-01-01

    As an example of a complex system with extreme events, we investigate ocean wave states exhibiting rogue waves. We present a statistical method of data analysis based on multi-point statistics which for the first time allows extreme rogue wave events to be captured in a statistically satisfactory manner. The key to the success of the approach is mapping the complexity of multi-point data onto the statistics of hierarchically ordered height increments for different time scales, for which we can show that a stochastic cascade process with Markov properties is governed by a Fokker-Planck equation. Conditional probabilities as well as the Fokker-Planck equation itself can be estimated directly from the available observational data. With this stochastic description surrogate data sets can in turn be generated, which makes it possible to work out arbitrary statistical features of the complex sea state in general, and extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence probability of extreme rogue wave events, and even for forecasting the occurrence of individual rogue waves based on precursory dynamics.
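The first step of such a multi-point analysis — turning a sea-surface elevation series into height increments at several time scales, whose scale-ordered statistics feed the Markov/Fokker-Planck description — can be sketched as below. The synthetic random-phase series is a stand-in for real wave data, and all parameter values are assumptions.

```python
import math
import random

def increments(h, tau):
    """Height increments h(t + tau) - h(t) at time scale tau."""
    return [h[i + tau] - h[i] for i in range(len(h) - tau)]

rng = random.Random(7)
# Toy "sea state": a random-phase sum of a few harmonics plus noise.
h = [sum(math.sin(0.02 * k * t + p) for k, p in ((1, 0.3), (2, 1.1), (5, 2.0)))
     + 0.1 * rng.gauss(0, 1) for t in range(5000)]

# The spread of the increments grows with the time scale, the raw material
# for the hierarchically ordered increment statistics described above.
for tau in (1, 8, 64):
    d = increments(h, tau)
    mean = sum(d) / len(d)
    var = sum((x - mean) ** 2 for x in d) / len(d)
    print(tau, round(math.sqrt(var), 3))
```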

  17. Higher moments of net-proton multiplicity distributions in a heavy-ion event pile-up scenario

    NASA Astrophysics Data System (ADS)

    Garg, P.; Mishra, D. K.

    2017-10-01

    High-luminosity modern accelerators, like the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory (BNL) and Large Hadron Collider (LHC) at European Organization for Nuclear Research (CERN), inherently have event pile-up scenarios which significantly contribute to physics events as a background. While state-of-the-art tracking algorithms and detector concepts take care of these event pile-up scenarios, several offline analytical techniques are used to remove such events from the physics analysis. It is still difficult to identify the remaining pile-up events in an event sample for physics analysis. Since the fraction of these events is significantly small, it may not be as serious an issue for other analyses as it is for an event-by-event analysis. Particular care is needed when the observables are characteristics of the multiplicity distribution itself. In the present work, we demonstrate how a small fraction of residual pile-up events can change the moments and their ratios of an event-by-event net-proton multiplicity distribution, which are sensitive to the dynamical fluctuations due to the QCD critical point. For this study, we assume that the individual event-by-event proton and antiproton multiplicity distributions follow Poisson, negative binomial, or binomial distributions. We observe a significant effect in cumulants and their ratios of net-proton multiplicity distributions due to pile-up events, particularly at lower energies. It might be crucial to estimate the fraction of pile-up events in the data sample while interpreting the experimental observable for the critical point.
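A toy Monte Carlo (not the paper's analysis) illustrates the effect: net-proton numbers are drawn as the difference of two Poisson variates, and a small fraction of events is replaced by the sum of two merged events to mimic residual pile-up. The rates (lam_p = 4, lam_pbar = 1) and the 5% pile-up fraction are assumed values chosen for illustration.

```python
import math
import random

def cumulants(xs):
    """First four cumulants C1..C4 from central moments."""
    n = len(xs)
    mean = sum(xs) / n
    m2, m3, m4 = (sum((x - mean) ** k for x in xs) / n for k in (2, 3, 4))
    return mean, m2, m3, m4 - 3 * m2 ** 2

def poisson(lam, rng):
    """Knuth's inversion sampler; adequate for small lambda."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def net_proton(lam_p, lam_pbar, rng):
    return poisson(lam_p, rng) - poisson(lam_pbar, rng)

rng = random.Random(42)
clean, piled = [], []
for _ in range(20000):
    x = net_proton(4.0, 1.0, rng)
    clean.append(x)
    if rng.random() < 0.05:      # 5% of events are two merged events
        x += net_proton(4.0, 1.0, rng)
    piled.append(x)

# For a pure Skellam distribution C2 = lam_p + lam_pbar = 5.0;
# the pile-up admixture inflates the measured cumulants.
print("C2 clean vs piled:", round(cumulants(clean)[1], 2),
      round(cumulants(piled)[1], 2))
```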

  18. Loggerhead Turtles (Caretta caretta) Use Vision to Forage on Gelatinous Prey in Mid-Water

    PubMed Central

    Narazaki, Tomoko; Sato, Katsufumi; Abernathy, Kyler J.; Marshall, Greg J.; Miyazaki, Nobuyuki

    2013-01-01

    Identifying characteristics of foraging activity is fundamental to understanding an animal’s lifestyle and foraging ecology. Despite its importance, monitoring the foraging activities of marine animals is difficult because direct observation is rarely possible. In this study, we use an animal-borne imaging system and three-dimensional data logger simultaneously to observe the foraging behaviour of large juvenile and adult sized loggerhead turtles (Caretta caretta) in their natural environment. Video recordings showed that the turtles foraged on gelatinous prey while swimming in mid-water (i.e., defined as epipelagic water column deeper than 1 m in this study). By linking video and 3D data, we found that mid-water foraging events share the common feature of a marked deceleration phase associated with the capture and handling of the sluggish prey. Analysis of high-resolution 3D movements during mid-water foraging events, including presumptive events extracted from 3D data using deceleration in swim speed as a proxy for foraging (detection rate = 0.67), showed that turtles swam straight toward prey in 171 events (i.e., turning point absent) but made a single turn toward the prey an average of 5.7±6.0 m before reaching the prey in 229 events (i.e., turning point present). Foraging events with a turning point tended to occur during the daytime, suggesting that turtles primarily used visual cues to locate prey. In addition, an incident of a turtle encountering a plastic bag while swimming in mid-water was recorded. The fact that the turtle’s movements while approaching the plastic bag were analogous to those of a true foraging event, having a turning point and deceleration phase, also supports the use of vision in mid-water foraging. Our study shows that integrated video and high-resolution 3D data analysis provides unique opportunities to understand foraging behaviours in the context of the sensory ecology involved in prey location. PMID:23776603

  19. SU(2) x U(1) vacuum and the Centauro events

    NASA Technical Reports Server (NTRS)

    Kazanas, D.; Balasubrahmanyan, V. K.; Streitmatter, R. E.

    1984-01-01

    It is proposed that the fireballs invoked to explain the Centauro events are bubbles of a metastable superdense state of nuclear matter, created in high energy (E ≈ 10¹⁵ eV) cosmic ray collisions at the top of the atmosphere. If these bubbles are created with a Lorentz factor gamma ≈ 10 in their CM frame, the objections against the origin of these events in cosmic ray interactions are overcome. Assuming further that the Centauro events are due to the explosive decay of these metastable bubbles, a relationship between their lifetime, tau, and the threshold energy for bubble formation, E sub th, is derived. The minimum lifetime consistent with such an interpretation is tau ≈ 10⁻⁸ s, while E sub th appears to be insensitive to the value of tau and always close to 10¹⁵ eV. Finally it is speculated that if the available CM energy is thermalized in such collisions, these bubbles might be manifestations of excitations of the SU(2) x U(1) false vacuum. The absence of neutral pions in the Centauro events is then explained by the decay of these excitations.

  20. Comprehensive Essays for World History Finals.

    ERIC Educational Resources Information Center

    Feldman, Martha J.

    1997-01-01

    Describes a novel approach to comprehensive questions in world history examinations. Recommends using current events as illustrative reference points for complex subjects such as nationalism, liberalism, and international trade. Students receive information packets on the events for several weeks and must relate the subjects to these events. (MJP)

  1. Every Journey Begins with a Step

    ERIC Educational Resources Information Center

    Tavangar, Homa Sabet

    2016-01-01

    Events like Britain's vote to leave the European Union reflect the fear and sense of injustice many people feel in the face of globalization. Such events, Tavangar argues, send a message to K-12 educators: 21st century education must respond to the challenges of the global economy and create opportunities for everyone to find meaningful work in an…

  2. Dynamic landscape management

    Treesearch

    Valerie Rapp

    2002-01-01

    Pacific Northwest forests and all their species evolved with fires, floods, windstorms, landslides, and other disturbances. The dynamics of disturbance were basic to how forests changed and renewed. Disturbance regimes, as scientists call the long-term patterns of these events—what kind of event, how often, how large, and how severe—created the landscape patterns seen...

  3. Increasing Risk Awareness: The Coastal Community Resilience Index

    ERIC Educational Resources Information Center

    Thompson, Jody A.; Sempier, Tracie; Swann, LaDon

    2012-01-01

    As the number of people moving to the Gulf Coast increases, so does the risk of exposure to floods, hurricanes, and other storm-related events. In an effort to assist communities in preparing for future storm events, the Coastal Community Resilience Index was created. The end result is for communities to take actions to address the weaknesses they…

  4. Sibling Negotiations and the Construction of Literacy Events in an Urban Area of Tanzania

    ERIC Educational Resources Information Center

    Frankenberg, Sofia Johnson; Holmqvist, Rolf; Rubenson, Birgitta; Rindstedt, Camilla

    2012-01-01

    This study presents findings from analyses of naturally occurring literacy events, where children jointly focus on reading and writing letters of the alphabet, illustrating social constructions of learning created through language and embodied action. Video recorded data from two different families living in an urban low-income area in Tanzania is…

  5. Building Resilience to Trauma: Creating a Safe and Supportive Early Childhood Classroom

    ERIC Educational Resources Information Center

    Berson, Ilene R.; Baggerly, Jennifer

    2009-01-01

    Children around the world are being exposed to traumatic events at a troubling rate. In large, nationally representative studies of children in the United States, researchers have reported that 71% of children have been exposed to at least one potentially traumatic event in the past year, and almost 70% of children have experienced multiple…

  6. Dynamic landscape management.

    Treesearch

    Valerie Rapp

    2003-01-01

    Pacific Northwest forests and all their species evolved with fires, floods, windstorms, landslides, and other disturbances. The dynamics of disturbance were basic to how forests changed and renewed. Disturbance regimes, as scientists call the long-term patterns of these events—what kind of event, how often, how large, and how severe—created the landscape patterns seen...

  7. Murder and Mayhem. "The Great Gatsby": The Facts Behind the Fiction. Learning Page Lesson Plan.

    ERIC Educational Resources Information Center

    Rohrbach, Margie; Koszoru, Janie

    To appreciate historical fiction, students need to understand the factual context and recognize how popular culture reflects the values, mores, and events of the time period. Since a newspaper records significant events and attitudes representative of a period, students create their own newspapers, utilizing primary source materials from several…

  8. Extreme Magnetic Storms: Their Characteristics and Possible Consequences for Humanity

    NASA Astrophysics Data System (ADS)

    Falkowski, B. J.; Tsurutani, B.; Lakhina, G. S.; Deng, Y.; Mannucci, A. J.

    2015-12-01

    The solar and interplanetary conditions necessary to create an extreme magnetic storm will be discussed. The Carrington 1859 event was not the largest possible. It will be shown that different facets of fast ICMEs/extreme magnetic storms will have different limitations. Some possible adverse effects of such extreme space weather events on society will be addressed.

  9. Effect of non-linear fluid pressure diffusion on modeling induced seismicity during reservoir stimulation

    NASA Astrophysics Data System (ADS)

    Gischig, V.; Goertz-Allmann, B. P.; Bachmann, C. E.; Wiemer, S.

    2012-04-01

    Success of future enhanced geothermal systems relies on an appropriate pre-estimate of seismic risk associated with fluid injection at high pressure. A forward-model based on a semi-stochastic approach was created, which is able to compute synthetic earthquake catalogues. It proved to be able to reproduce characteristics of the seismic cloud detected during the geothermal project in Basel (Switzerland), such as radial dependence of stress drop and b-values as well as higher probability of large magnitude earthquakes (M>3) after shut-in. The modeling strategy relies on a simplistic fluid pressure model used to trigger failure points (so-called seeds) that are randomly distributed around an injection well. The seed points are assigned principal stress magnitudes drawn from Gaussian distributions representative of the ambient stress field. Once the effective stress state at a seed point meets a pre-defined Mohr-Coulomb failure criterion due to a fluid pressure increase a seismic event is induced. We assume a negative linear relationship between b-values and differential stress. Thus, for each event a magnitude can be drawn from a Gutenberg-Richter distribution with a b-value corresponding to differential stress at failure. The result is a seismic cloud evolving in time and space. Triggering of seismic events depends on appropriately calculating the transient fluid pressure field. Hence an effective continuum reservoir model able to reasonably reproduce the hydraulic behavior of the reservoir during stimulation is required. While analytical solutions for pressure diffusion are computationally efficient, they rely on linear pressure diffusion with constant hydraulic parameters, and only consider well head pressure while neglecting fluid injection rate. They cannot be considered appropriate in a stimulation experiment where permeability irreversibly increases by orders of magnitude during injection. 
We here propose a numerical continuum model of non-linear pressure diffusion. Permeability increases both reversibly and, if a certain pressure threshold is reached, irreversibly in the form of a smoothed step function. The models are able to reproduce realistic wellhead pressure magnitudes for injection rates common during reservoir stimulation. We connect this numerical model with the semi-stochastic seismicity model, and demonstrate the role of non-linear pressure diffusion in earthquake probability estimates. We further use the model to explore various injection histories to assess the dependence of seismicity on injection strategy. This allows us to qualitatively explore the probability of larger-magnitude earthquakes (M>3) for different injection volumes, injection times, and injection build-up and shut-in strategies.
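The core sampling step of such a semi-stochastic model can be sketched as follows. The stress distributions, Mohr-Coulomb parameters, and the linear b-value relation below are illustrative assumptions for this sketch, not the calibrated Basel values; the pore pressure is a single constant rather than a transient field.

```python
import numpy as np

rng = np.random.default_rng(0)

def b_value(diff_stress, b_max=1.5, b_min=0.6, sigma_max=100.0):
    """Assumed negative linear relation between b and differential stress (MPa)."""
    frac = np.clip(diff_stress / sigma_max, 0.0, 1.0)
    return b_max - (b_max - b_min) * frac

def sample_magnitude(b, m_min=0.0):
    """Inverse-transform sample from Gutenberg-Richter: P(M >= m) = 10**(-b*(m - m_min))."""
    u = rng.uniform()
    return m_min - np.log10(u) / b

# Seed points with principal stresses drawn from Gaussians (illustrative values, MPa)
n = 10_000
s1 = rng.normal(90.0, 10.0, n)   # maximum principal stress
s3 = rng.normal(50.0, 10.0, n)   # minimum principal stress
pore = 20.0                      # fluid pressure reaching the seeds (MPa)
mu, c = 0.6, 5.0                 # Mohr-Coulomb friction coefficient and cohesion

# Simplified Mohr-Coulomb check on effective stresses: a seed fails when peak
# shear stress exceeds cohesion plus friction on the mean effective normal stress.
tau = 0.5 * (s1 - s3)
sn_eff = 0.5 * (s1 + s3) - pore
failed = tau > c + mu * sn_eff

# Each failed seed gets a magnitude from a GR law whose b depends on its stress
mags = np.array([sample_magnitude(b_value(d)) for d in (s1 - s3)[failed]])
print(f"{failed.sum()} induced events; largest magnitude {mags.max():.2f}")
```

In the actual model the seeds are also distributed in space and triggered as the transient pressure front reaches them; the constant `pore` value here only illustrates the failure-plus-sampling step.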

  10. Northeast Artificial Intelligence Consortium Annual Report for 1987. Volume 7. Part A. Time Oriented Problem Solving

    DTIC Science & Technology

    1989-03-01

    obvious categories axioms F1 , P2 assert the existence of ’points’ on of model of the basic theory, it is either either side of a rational point: start[a...of its role function; e.g., E1 is the f1 direct-component of E0. The type End never appears as a direct component of another type; nor does any type...observed event must be a component-of an End event; call these End events N 1, N2, ... Nn. Let there be a minimum covering model for F1 , ... r. in which

  11. B4 2 After, 3D Deformation Field From Matching Pre- To Post-Event Aerial LiDAR Point Clouds, The 2010 El Mayor-Cucapah M7.2 Earthquake Case

    NASA Astrophysics Data System (ADS)

    Hinojosa-Corona, A.; Nissen, E.; Limon-Tirado, J. F.; Arrowsmith, R.; Krishnan, A.; Saripalli, S.; Oskin, M. E.; Glennie, C. L.; Arregui, S. M.; Fletcher, J. M.; Teran, O. J.

    2013-05-01

Aerial LiDAR surveys reconstruct the sinuosity of terrain relief with remarkable fidelity. In this research we explore the 3D deformation field at the surface after a large earthquake (M7.2) by comparing pre- to post-event aerial LiDAR point clouds. The April 4, 2010 earthquake produced a NW-SE surface rupture ~110 km long with right-lateral normal slip up to 3 m in magnitude over a very favorable target: the scarcely vegetated and unaltered desert mountain ranges of the sierras El Mayor and Cucapah, in northern Baja California, close to the US-México border, in the boundary region between the Pacific and North American plates. The pre-event LiDAR, with lower point density (0.013-0.033 pts m-2), required filtering and post-processing before comparison with the denser (9-18 pts m-2), more accurate post-event dataset. The 3D surface displacement field was determined using an adaptation of the Iterative Closest Point (ICP) algorithm, implemented in the open-source Point Cloud Library (PCL). The LiDAR datasets are first split into a grid of windows, and for each one, ICP iteratively converges on the rigid-body transformation (comprising translations and rotations) that best aligns the pre- to post-event points. When the pre- and post-event point clouds were independently perturbed with synthetic right-lateral and reverse displacements of known magnitude along a proposed fault, ICP recovered the synthetically introduced translations. Windows with dimensions of 100-200 m gave the best results for datasets with these densities. The simplified surface rupture, photo-interpreted and mapped in the field, delineates very well the vertical displacement patterns unveiled by ICP. The method revealed block rotations along the simplified surface rupture, some clockwise and others counterclockwise. As ground truth, displacements from ICP have values similar to those measured in the field along the main rupture by Fletcher and collaborators. 
The vertical component was better estimated than the horizontal, which, as expected, performed poorly in flat areas. Hybrid approaches, such as simple differencing, could be applied in these areas. Outliers were removed from the results. ICP detected material extraction from quarries worked between the two LiDAR collection dates, expressed as negative vertical displacement close to those sites. To improve the accuracy of the 3D displacement field, we intend to reprocess the pre-event source survey data to reduce the systematic error introduced by the sensor. A multidisciplinary approach will be needed to draw tectonic inferences, about the processes at depth expressed at the surface, from the 3D displacement field revealed by ICP.
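A minimal point-to-point ICP of the kind described (nearest-neighbor matching plus a least-squares rigid fit per window) can be sketched as follows. This is a simplified stand-in for the PCL implementation the authors used, and the synthetic "terrain" window and displacement are invented for illustration.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(A, B):
    """Least-squares rigid transform (R, t) mapping point set A onto B (Kabsch)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

def icp(src, dst, iters=30, tol=1e-8):
    """Iterative Closest Point: align src to dst, return accumulated (R, t)."""
    tree = cKDTree(dst)
    cur = src.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        _, idx = tree.query(cur)                 # closest-point correspondences
        R, t = best_fit_transform(cur, dst[idx])
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = np.linalg.norm(cur - dst[idx], axis=1).mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total

# Synthetic test window: a rough surface displaced by a known offset
rng = np.random.default_rng(1)
pre = rng.uniform(0, 100, (2000, 3))
pre[:, 2] = np.sin(pre[:, 0] / 10) + 0.01 * pre[:, 1]   # gentle relief
true_t = np.array([0.5, -0.3, 0.2])                      # illustrative offset
post = pre + true_t
R, t = icp(pre, post)
print("recovered translation:", t.round(3))
```

Recovering a known synthetic displacement, as above, mirrors the validation step the abstract describes before trusting the window-by-window displacements.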

  12. A Simulation of Alternatives for Wholesale Inventory Replenishment

    DTIC Science & Technology

    2016-03-01

    algorithmic details. The last method is a mixed-integer, linear optimization model. Comparative Inventory Simulation, a discrete event simulation model, is...simulation; event graphs; reorder point; fill-rate; backorder; discrete event simulation; wholesale inventory optimization model 15. NUMBER OF PAGES...model. Comparative Inventory Simulation, a discrete event simulation model, is designed to find fill rates achieved for each National Item

  13. Comparison between European and Iranian cutoff points of triglyceride/high-density lipoprotein cholesterol concentrations in predicting cardiovascular disease outcomes.

    PubMed

    Gharipour, Mojgan; Sadeghi, Masoumeh; Dianatkhah, Minoo; Nezafati, Pouya; Talaie, Mohammad; Oveisgharan, Shahram; Golshahi, Jafar

    2016-01-01

High triglyceride (TG) and low high-density lipoprotein cholesterol (HDL-C) are important cardiovascular risk factors. The exact prognostic value of the TG/HDL-C ratio, a marker for cardiovascular events, is currently unknown among Iranians, so this study sought to determine the optimal cutoff point for the TG/HDL-C ratio in predicting cardiovascular disease events in the Iranian population. The Isfahan Cohort Study (ICS) is an ongoing, longitudinal, population-based study originally conducted on adults aged ≥ 35 years living in urban and rural areas of three districts in central Iran. After 10 years of follow-up, 5431 participants were re-evaluated using a standard protocol similar to the one used at baseline. At both measurements, participants underwent medical interviews, physical examinations, and fasting blood measurements. "High-risk" subjects were defined by the discrimination power of the indices, which was assessed using receiver operating characteristic (ROC) analysis; the optimal cutoff point value for each index was then derived. The mean age of the participants was 50.7 ± 11.6 years. The TG/HDL-C ratio, at a threshold of 3.68, was used to screen for cardiovascular events among the study population. Subjects were divided into two groups ("low" and "high" risk) according to the TG/HDL-C concentration ratio at baseline. A higher proportion of individuals were classified as high risk using the European cutoff points (63.7%) than using the ICS cutoff points (49.5%). The unadjusted hazard ratio (HR) was greatest in high-risk individuals identified by the ICS cutoff points (HR = 1.54, 95% CI [1.33-1.79]) vs the European cutoff points (HR = 1.38, 95% CI [1.17-1.63]). There were no remarkable changes after adjusting for differences in sex and age (HR = 1.58, 95% CI [1.36-1.84] vs HR = 1.44, 95% CI [1.22-1.71]) for the ICS and European cutoff points, respectively. 
The threshold of TG/HDL ≥ 3.68 is the optimal cutoff point for predicting cardiovascular events in Iranian individuals. Copyright © 2016 National Lipid Association. Published by Elsevier Inc. All rights reserved.
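The abstract does not state how the ROC-optimal cutoff was chosen; a common criterion, shown here as an illustrative sketch on synthetic data (not the study's cohort), is the cutoff maximizing Youden's J statistic (sensitivity + specificity - 1).

```python
import numpy as np

def optimal_cutoff(values, outcomes):
    """Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1.
    values: marker (e.g. TG/HDL-C ratio); outcomes: 1 = event occurred."""
    values = np.asarray(values, float)
    outcomes = np.asarray(outcomes, int)
    best_j, best_cut = -1.0, None
    for cut in np.unique(values):
        pred = values >= cut                    # "high risk" classification
        tp = np.sum(pred & (outcomes == 1))
        fn = np.sum(~pred & (outcomes == 1))
        tn = np.sum(~pred & (outcomes == 0))
        fp = np.sum(pred & (outcomes == 0))
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# Synthetic illustration: subjects with events tend to have higher ratios
rng = np.random.default_rng(2)
healthy = rng.normal(2.5, 1.0, 500)
events = rng.normal(4.5, 1.0, 100)
vals = np.concatenate([healthy, events])
outs = np.concatenate([np.zeros(500, int), np.ones(100, int)])
cut, j = optimal_cutoff(vals, outs)
print(f"optimal cutoff ~ {cut:.2f} (J = {j:.2f})")
```

With these synthetic distributions the recovered cutoff lands between the two group means, analogous to how the study's 3.68 threshold separates low- and high-risk subjects.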

  14. Keep Them Guessing

    NASA Astrophysics Data System (ADS)

    Riendeau, Diane

    2013-10-01

    Discrepant events are surprising occurrences that challenge learners' preconceptions.1 These events puzzle students because the results are contrary to what they believe should happen.2 Due to the unexpected outcome, students experience cognitive disequilibrium,3 and this often leads to a desire to solve the problem. Discrepant events are great motivators. Often, even disinterested students are drawn into the discussion by their need to understand what just happened.1 Although it is best to have the students experience the events firsthand, there are many videos on YouTube that can be used to create a sense of wonder in students.

  15. A method for co-creation of an evidence-based patient workbook to address alcohol use when quitting smoking in primary care: a case study.

    PubMed

    Minian, Nadia; Noormohamed, Aliya; Zawertailo, Laurie; Baliunas, Dolly; Giesbrecht, Norman; Le Foll, Bernard; Rehm, Jürgen; Samokhvalov, Andriy; Selby, Peter L

    2018-01-01

The purpose of this paper is to describe a patient engagement event designed to create an educational workbook with smokers who drink alcohol at harmful levels. The goal was to create a workbook that combined scientific evidence with patients' values, preferences, and needs. Fourteen adult smokers who drink alcohol were invited to the Centre for Addiction and Mental Health (CAMH) to take part in a four-hour event to help design the workbook with the CAMH research team. Participants provided their opinions and ideas to create an outline for the workbook, including activities, images, and titles. The workbook - called Self-Awareness - is currently being offered in a smoking cessation program in 221 primary care clinics across Ontario to help smokers quit or reduce their harmful alcohol use. The patient engagement event was a useful way to co-create educational materials that incorporate both scientific research and patient needs. Background: Evidence-based medicine is the integration of best research evidence with clinical expertise and patient values. There are few methodologies on how to design evidence-based programs and resources to include patient values. The latter is an important aspect of patient-centered care, and is essential for patients to trust the recommendations and empower them as consumers to make informed choices. This manuscript describes a participatory research approach to design patient-facing educational materials that incorporate both evidence-based and community-sensitive principles. These materials are intended to support smokers to reduce or stop harmful alcohol consumption. Methods: Adult smokers who report consuming alcohol were invited to a co-creation meeting at the Centre for Addiction and Mental Health's Nicotine Dependence Service to guide the adaptation of evidence-based materials. The four-hour event consisted of individual reflections, group discussions, and consensus-building interactions. 
Detailed notes were taken and then incorporated into the material. Results: Fourteen individuals participated in the event. The end product was a descriptive outline of an educational resource - entitled Self-Awareness - incorporating material from evidence-based workbooks and patient-driven features. Participants collaboratively selected the resource's content, structure, and titles. Conclusions: This model describes a participatory research method that emphasizes the value of the patient perspective; preliminary evidence finds this adaptation approach can increase the adoption of resources. The process described in this article could be replicated in other settings to co-create evidence-based resources, interventions, and programs that reflect the needs of the community. Trial registration: ClinicalTrials.gov NCT03108144. Retrospectively registered 11 April 2017.

  16. Measles elimination – review of event notifications sent to National IHR Focal Point between 2010 and 2016

    PubMed

    Izdebski, Radosław; Henszel, Łukasz; Janiec, Janusz; Radziszewski, Franciszek

The Member States of the World Health Organization (WHO), in accordance with the International Health Regulations (2005), were obliged to appoint National IHR Focal Points (N IHR FP), whose tasks include obtaining information concerning public health emergencies of international concern occurring abroad or within the country. The aim of this work is to review the notifications related to measles received by the National IHR Focal Point in Poland between 2010 and 2016 from WHO, ECDC, the National IHR Focal Points of other WHO Member States, and the State Sanitary Inspection. During this period the N IHR FP was informed about 79 events related to measles: 36 related to outbreaks in different countries, 27 concerning individual cases, 14 related to exposure to a measles case during air travel, and two concerning the implementation of MMR vaccination programs. Despite progress in implementing the measures included in the measles elimination programs in Europe, there was a significant increase in the number of measles cases and outbreaks, particularly in 2010-2011.

  17. Woodchip bioreactors effectively treat aquaculture effluent

    USDA-ARS?s Scientific Manuscript database

    Nutrients, in particular nitrogen and phosphorus, can create eutrophication problems in any watershed. Preventing water quality impairment requires controlling nutrients from both point-source and non-point source discharges. Woodchip bioreactors are one relatively new approach that can be utilized ...

  18. A Meeting of the Minds: Learning about the Eastern Hemisphere and Creating Citizens of the World

    ERIC Educational Resources Information Center

    Sheehan, Kevin; Laifer, Larry

    2011-01-01

    Working at the sixth grade level, the authors write about their effort to interest students in current events and their historical roots. This article outlines a series of learning experiences and assessments that the authors created for sixth grade students at Lockhart School in Massapequa, New York. These learning experiences culminated in a…

  19. An approach to medical knowledge sharing in a hospital information system using MCLink.

    PubMed

    Shibuya, Akiko; Inoue, Ryusuke; Nakayama, Masaharu; Kasahara, Shin; Maeda, Yukihiro; Umesato, Yoshimasa; Kondo, Yoshiaki

    2013-08-01

Clinicians often need access to electronic information resources that answer questions arising in daily clinical practice. This information generally comes from publicly available resources. However, clinicians also need institution-specific knowledge (e.g., institution-specific guidelines, choice of drug, choice of laboratory test, information on adverse events, and advice from professional colleagues), and this information needs to be available in real time. This study characterizes these needs in order to build a prototype hospital information system (HIS) that can help clinicians get timely answers to questions. We previously designed medical knowledge units called Medical Cells (MCs). We developed a portal server of MCs that can create and store medical information such as institution-specific information. We then developed a prototype HIS that embeds MCs as links (MCLinks); these links are based on specific terms (e.g., drug, laboratory test, and disease). This prototype HIS presents clinicians with institution-specific information. The HIS clients (e.g., clinicians, nurses, pharmacists, and laboratory technicians) can also create an MCLink in the HIS using the portal server in the hospital. The prototype HIS allowed institution-specific information to be shared efficiently with, and used by, clinicians at the point of care. This study included institution-specific information resources and advice from professional colleagues, both of which might have an important role in supporting good clinical decision making.
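The term-triggered linking idea can be sketched as follows. This is a hypothetical illustration of the concept only: the class names (`MedicalCell`, `MCRegistry`), the example URL, and the naive string matching are all invented here, not taken from the paper's system.

```python
import re
from dataclasses import dataclass

@dataclass
class MedicalCell:
    term: str    # trigger term, e.g. a drug or laboratory-test name
    title: str   # short description of the knowledge unit
    url: str     # location of the unit on the institutional portal server

class MCRegistry:
    """Maps clinical terms to institution-specific knowledge units."""

    def __init__(self):
        self._cells = {}

    def register(self, cell: MedicalCell):
        self._cells[cell.term.lower()] = cell

    def annotate(self, text: str) -> str:
        """Replace registered terms appearing in HIS text with portal links.
        A real HIS would tokenize and respect term boundaries; this is a
        naive case-insensitive substitution for illustration."""
        for term, cell in self._cells.items():
            if term in text.lower():
                text = re.sub(re.escape(term),
                              f"[{cell.title}]({cell.url})",
                              text, flags=re.IGNORECASE)
        return text

reg = MCRegistry()
reg.register(MedicalCell("warfarin",
                         "Institutional warfarin dosing guideline",
                         "https://portal.example.org/mc/warfarin"))
print(reg.annotate("Start warfarin 5 mg daily."))
```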

  20. Using Photographs to Probe Students' Understanding of Physical Concepts: The Case of Newton's 3rd Law

    NASA Astrophysics Data System (ADS)

    Eshach, Haim

    2010-08-01

    The starting point of the present research is the following question: since we live in an age that makes increasing use of visual representations of all sorts, is not the visual representation a learner constructs a window into his/her understanding of what is or is not being learned? Following this direction of inquiry, the present preliminary study introduces and evaluates a novel technique for pinpointing learners’ misconceptions, namely, one that has learners create and interpret their own photographs (CIP). 27 high-school students and 26 pre-service teacher trainees were asked to assume the role of textbook designers and create a display—photograph plus attached verbal explanation—which, in their opinion, best depicted Newton’s 3rd law. Subsequent analysis of the participants’ photographs yielded the following six misconception categories: 3rd law not depicted; 3rd law depicts a sequence of events; tendency to introduce irrelevant entities in explanations; the word ‘reaction’ used colloquially; tendency to restrict the application of the third law to dynamic situations; and informal explanations in which the word “force” is absent. The findings indicate that, indeed, the CIP method can be effectively employed to elicit, detect, and investigate learners’ misconceptions. The CIP method joins the growing efforts to utilize the yet relatively untapped potential of visual tools for science education purposes.

  1. Loss estimation and damage forecast using database provided

    NASA Astrophysics Data System (ADS)

    Pyrchenko, V.; Byrova, V.; Petrasov, A.

    2009-04-01

A wide spectrum of natural hazards is observed in Russian territory. This necessitates the investigation of numerous events of dangerous natural processes and research into the mechanisms of their development and their interaction with each other (synergetic amplification or the emergence of new hazards), with the purpose of forecasting possible losses. Employees of the Laboratory of Analysis of Geological Risk, IEG RAS, have created a database on occurrences of natural hazards in the territory of Russia, which contains information on 1310 such cases during 1991-2008. The wide range of sources used posed certain difficulties in building the database and demanded the development of a special new technique for unifying information received at different times. One element of this technique is a classification of the negative consequences of natural hazards, considering the death toll, the injured and other victims, and direct economic damage. This database has made it possible to track the dynamics of natural hazards and the emergency situations (ES) caused by them over the considered period, and also to determine the patterns of their development across the territory of Russia in time and space. It provides the opportunity to create theoretical, methodological, and methodical bases for forecasting possible losses, with a certain degree of probability, for the territory of Russia and for its separate regions, which in the future would support adequate, timely, and efficient pre-emptive decision-making.

  2. Statistical Examination of Tornado Report and Warning Near-Storm Environments

    NASA Astrophysics Data System (ADS)

    Anderson-Frey, Alexandra K.

This study makes use of a 13-year dataset of 14,814 tornado events and 44,961 tornado warnings in the continental United States, along with near-storm environmental data associated with each of those tornado events and warnings, to build a methodology that can be used to create nuanced climatologies of near-storm environmental data. Two key parameter spaces are identified as being particularly useful in this endeavor: mixed-layer convective available potential energy (MLCAPE) versus 0-6-km vector shear magnitude (SHR6) and mixed-layer lifting condensation level (MLLCL) versus 0-1-km storm-relative helicity (SRH1). In addition, the Significant Tornado Parameter (STP) is identified as a useful composite parameter that can highlight near-storm environments that are particularly favorable for the development of significant tornadoes. Two particular statistical methods for the analysis and characterization of near-storm environments are described and applied: Kernel Density Estimation (KDE), which is applied to bulk (proximity sounding-like) parameter values associated with each event or warning, and Self-Organizing Maps (SOMs), which are applied to fully two-dimensional plots of STP in an area surrounding each event or warning. The KDE approach characterizes and identifies differences in the environments of tornadoes forming in quasi-linear convective systems versus those forming in right-moving supercells; specific environmental traits are also identified for different geographical regions, seasons, and times of day. Tornado warning performance is found to be best in environments with particularly large values of MLCAPE and SHR6. The early evening transition (EET) period is of particular interest: MLCAPE and MLLCL heights are in the process of falling, and SHR6 and SRH1 are in the process of increasing. 
Accordingly, tornadoes rated 2 or greater on the enhanced Fujita scale (EF2+) are also most common during the EET, probability of detection (POD) is relatively high, and false-alarm ratio (FAR) is relatively low. Overall, when comparing the distribution of environments for events versus those for warnings, there is no "smoking gun" indicating a systematic problem with forecasting that explains the high overall false-alarm ratio, which instead seems to stem from the inability to know which storms in a given environment will be tornadic. The SOM approach establishes nine statistically distinct clusters of spatial distributions of STP values in the 480 km x 480 km region surrounding each tornado event or warning. For tornado events, distinct patterns are associated more with particular times of day, geographical locations, and times of year, and the use of two-dimensional data rather than point proximity sounding information means that these patterns can be identified and characterized with still more detail; for instance, the archetypal springtime dryline environment in the Great Plains emerges readily from the data. Although high values of STP tend to be associated with relatively high POD and relatively low FAR, the majority of tornado events occur within a pattern of low STP, with relatively high FAR and low POD. The two-dimensional plots produced by the SOM approach provide an intuitive way to create distinct climatologies of tornadic near-storm environments. Having established a methodology through the use of KDE and SOM, this research then examines the topic of tornado outbreaks [defined as ten or more (E)F1+ tornadoes that occur with no more than 6 h or 2,000 km between subsequent tornadoes]. Outbreak tornadoes in a given geographical region have greater SRH1 and SHR6 than isolated tornadoes in the same region, and also have considerably higher POD than isolated tornadoes. 
When SOMs are created for all (E)F1+ tornadoes, the percentage of outbreak tornadoes in a given node is found to depend more strongly on the magnitude of the STP value surrounding the tornado than its orientation. For the SOM of outbreak tornadoes, outbreaks occurring in environments with higher magnitudes of STP will generally also have the highest casualty rates, regardless of the specific two-dimensional pattern of STP. Two specific tornado outbreaks are then examined through this methodology, which allows the events to be placed into their climatological context with more nuance than typical proximity sounding-based approaches would allow.
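A kernel density estimate over a two-parameter space of the kind described (e.g. MLCAPE versus SHR6) can be sketched with `scipy.stats.gaussian_kde`. The two "storm mode" distributions below are synthetic illustrations in loosely plausible units, not the study's data.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Synthetic "proximity sounding" parameters for two illustrative event classes
# (rows: MLCAPE in J/kg, SHR6 in m/s)
rng = np.random.default_rng(3)
supercell = np.vstack([rng.normal(2000, 600, 300), rng.normal(25, 5, 300)])
qlcs = np.vstack([rng.normal(800, 400, 300), rng.normal(22, 5, 300)])

kde_sup = gaussian_kde(supercell)
kde_qlcs = gaussian_kde(qlcs)

# Evaluate each smoothed climatology on a common grid of the parameter space
cape = np.linspace(0, 4000, 50)
shear = np.linspace(5, 45, 50)
xx, yy = np.meshgrid(cape, shear)
grid = np.vstack([xx.ravel(), yy.ravel()])
dens_sup = kde_sup(grid).reshape(xx.shape)
dens_qlcs = kde_qlcs(grid).reshape(xx.shape)

# The mode of each density shows where each storm mode is most favored,
# and differencing the two surfaces highlights where the environments diverge
i, j = np.unravel_index(dens_sup.argmax(), dens_sup.shape)
print(f"supercell density peaks near MLCAPE={xx[i, j]:.0f} J/kg, SHR6={yy[i, j]:.0f} m/s")
```

Comparing `dens_sup` and `dens_qlcs` on the same grid is the kind of climatological contrast the KDE approach in the study draws between supercell and QLCS tornado environments.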

  3. Automated Detection and Closing of Holes in Aerial Point Clouds Using AN Uas

    NASA Astrophysics Data System (ADS)

    Fiolka, T.; Rouatbi, F.; Bender, D.

    2017-08-01

3D terrain models are an important instrument in areas like geology, agriculture, and reconnaissance. An automated UAS with a line-based LiDAR can create terrain models quickly and easily, even of large areas. But the resulting point cloud may contain holes and therefore be incomplete. This can happen due to occlusions, a flight route missed because of wind, or simply changes in ground height that alter the swath of the LiDAR system. This paper proposes a method to detect holes in 3D point clouds generated during the flight and to adjust the course in order to close them. First, a grid-based search for holes in the horizontal ground plane is performed. Then a check for vertical holes, mainly created by building walls, is done. Due to occlusions and steep LiDAR angles, closing the vertical gaps may be difficult or even impossible. Therefore, the current approach deals with holes in the ground plane and only marks the vertical holes in such a way that the operator can decide on further actions regarding them. The aim is to efficiently create point clouds which can be used for the generation of complete 3D terrain models.
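The grid-based ground-plane search can be sketched as follows. The cell size, the synthetic point cloud, and the idea of returning hole-cell centers as candidate re-flight waypoints are illustrative assumptions for this sketch, not the paper's parameters.

```python
import numpy as np

def find_ground_holes(points, cell=5.0, min_pts=1):
    """Grid-based search for holes in the horizontal ground plane:
    bin points into XY cells and report cells with too few returns."""
    xy = points[:, :2]
    lo = xy.min(axis=0)
    idx = np.floor((xy - lo) / cell).astype(int)
    nx, ny = idx.max(axis=0) + 1
    counts = np.zeros((nx, ny), int)
    np.add.at(counts, (idx[:, 0], idx[:, 1]), 1)   # occupancy per grid cell
    holes = np.argwhere(counts < min_pts)
    # Return hole-cell centers in world coordinates (candidate waypoints)
    return lo + (holes + 0.5) * cell

# Synthetic cloud with a rectangular gap cut out (e.g., an occlusion)
rng = np.random.default_rng(4)
pts = rng.uniform(0, 100, (20000, 3))
gap = (pts[:, 0] > 40) & (pts[:, 0] < 60) & (pts[:, 1] > 40) & (pts[:, 1] < 60)
cloud = pts[~gap]
waypoints = find_ground_holes(cloud, cell=5.0)
print(f"{len(waypoints)} hole cells found, first at {waypoints[0].round(1)}")
```

In an online setting the occupancy grid would be updated as returns stream in during the flight, and the reported cell centers would feed the course-adjustment step.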

  4. Elliptic Flow, Initial Eccentricity and Elliptic Flow Fluctuations in Heavy Ion Collisions at RHIC

    NASA Astrophysics Data System (ADS)

    Nouicer, Rachid; Alver, B.; Back, B. B.; Baker, M. D.; Ballintijn, M.; Barton, D. S.; Betts, R. R.; Bickley, A. A.; Bindel, R.; Busza, W.; Carroll, A.; Chai, Z.; Decowski, M. P.; García, E.; Gburek, T.; George, N.; Gulbrandsen, K.; Halliwell, C.; Hamblen, J.; Hauer, M.; Henderson, C.; Hofman, D. J.; Hollis, R. S.; Holzman, B.; Iordanova, A.; Kane, J. L.; Khan, N.; Kulinich, P.; Kuo, C. M.; Li, W.; Lin, W. T.; Loizides, C.; Manly, S.; Mignerey, A. C.; Nouicer, R.; Olszewski, A.; Pak, R.; Reed, C.; Roland, C.; Roland, G.; Sagerer, J.; Seals, H.; Sedykh, I.; Smith, C. E.; Stankiewicz, M. A.; Steinberg, P.; Stephans, G. S. F.; Sukhanov, A.; Tonjes, M. B.; Trzupek, A.; Vale, C.; van Nieuwenhuizen, G. J.; Vaurynovich, S. S.; Verdier, R.; Veres, G. I.; Walters, P.; Wenger, E.; Wolfs, F. L. H.; Wosiek, B.; Woźniak, K.; Wysłouch, B.

    2008-12-01

We present measurements of elliptic flow and event-by-event fluctuations established by the PHOBOS experiment. Elliptic flow scaled by participant eccentricity is found to be similar for the Au+Au and Cu+Cu systems when collisions with the same number of participants or the same particle area density are compared. The agreement of elliptic flow between Au+Au and Cu+Cu collisions provides evidence that the matter created in the initial stage of relativistic heavy ion collisions has transverse granularity similar to that of the participant nucleons. The event-by-event fluctuation results reveal that the initial collision geometry is translated into the final-state azimuthal particle distribution, leading to an event-by-event proportionality between the observed elliptic flow and the initial eccentricity.
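The participant eccentricity used to scale elliptic flow is computed from the transverse positions of the participant nucleons. A sketch of the standard definition, applied to a toy Gaussian "almond" of participants (illustrative positions, not a Glauber simulation), is:

```python
import numpy as np

def participant_eccentricity(x, y):
    """Participant eccentricity of a set of participant-nucleon transverse
    positions, computed in their center-of-mass frame:
    eps = sqrt((s_y^2 - s_x^2)^2 + 4*s_xy^2) / (s_x^2 + s_y^2)."""
    x = x - x.mean()
    y = y - y.mean()
    sx2, sy2 = np.var(x), np.var(y)
    sxy = np.mean(x * y)
    return np.sqrt((sy2 - sx2) ** 2 + 4 * sxy ** 2) / (sx2 + sy2)

# Toy event: participants elongated along y, narrow along x
rng = np.random.default_rng(5)
n_part = 100
x = rng.normal(0, 1.0, n_part)
y = rng.normal(0, 2.0, n_part)
ecc = participant_eccentricity(x, y)
print(f"participant eccentricity ~ {ecc:.2f}")
# continuum Gaussian limit for these widths: (4 - 1) / (4 + 1) = 0.6
```

Event-by-event fluctuations of this quantity, driven by the finite number of participants, are what the measured elliptic flow fluctuations are compared against.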

  5. How to Create "Thriller" PowerPoints[R] in the Classroom!

    ERIC Educational Resources Information Center

    Berk, Ronald A.

    2012-01-01

    PowerPoint[R] presentations in academia have a reputation for being less than engaging in this era of learner-centered teaching. The Net Generation also presents a formidable challenge to using PowerPoint[R]. Although the research on the basic elements is rather sparse, the multimedia elements of movement, music, and videos have a stronger…

  6. A Point to Share: Streamlining Access Services Workflow through Online Collaboration, Communication, and Storage with Microsoft SharePoint

    ERIC Educational Resources Information Center

    Diffin, Jennifer; Chirombo, Fanuel; Nangle, Dennis; de Jong, Mark

    2010-01-01

    This article explains how the document management team (circulation and interlibrary loan) at the University of Maryland University College implemented Microsoft's SharePoint product to create a central hub for online collaboration, communication, and storage. Enhancing the team's efficiency, organization, and cooperation was the primary goal.…

  7. The Importance of Impacts within the Solar System - A Short History

    NASA Astrophysics Data System (ADS)

    Yeomans, D. K.

    2005-08-01

While early meteorite falls had been observed by Chinese and European observers and lunar craters were identified in the early seventeenth century, the important role of impacts in determining the surface features of the Moon and Earth would not be widely recognized for more than three centuries. Despite the fact that Earth's volcanic craters were dissimilar in both size and shape from lunar craters, a volcanic origin for the lunar craters was favored. The impact origin for these craters was not seriously discussed until the early twentieth century. Until then, near-Earth asteroids were unknown and it was difficult to explain why the observed lunar craters had circular rims when those created by impacts should have oblong rims reflecting the oblique approach angle of most impactors. Although Öpik first pointed out in 1916 that lunar impactors coming in at any angle would create explosive events that could explain the near circularity of their crater rims, his paper was buried in an obscure journal. In the first half of the twentieth century, the consensus view of astronomers was that volcanic activity was responsible for lunar craters, while geologists leaned toward an impact origin. Thus, each group dismissed the mechanism that was most familiar to them. At a time when most astronomers stubbornly refused to acknowledge any impact craters on the Moon or Earth, the geologist and entrepreneur Daniel Barringer doggedly championed the impact formation of Meteor Crater near Flagstaff, Arizona. It was not until 1980 that Alvarez et al. suggested and provided evidence for an impact extinction event that corresponded with the boundary between the Cretaceous and Tertiary periods some 65 million years ago. The question of an engineering solution for the mitigation of an Earth-threatening object (i.e., Project Icarus) was first studied in 1967 by an undergraduate engineering class at MIT.

  8. NASA's Earth Science Use of Commercially Available Remote Sensing Datasets: Cover Image

    NASA Technical Reports Server (NTRS)

    Underwood, Lauren W.; Goward, Samuel N.; Fearon, Matthew G.; Fletcher, Rose; Garvin, Jim; Hurtt, George

    2008-01-01

    The cover image incorporates high-resolution stereo pairs acquired from the DigitalGlobe(R) QuickBird sensor. It shows a digital elevation model of Meteor Crater, Arizona at approximately 1.3-meter point spacing. Image analysts used the Leica Photogrammetry Suite to produce the DEM. The outside portion was computed from two QuickBird panchromatic scenes acquired in October 2006, while an Optech laser scan dataset was used for the crater's interior elevations. The crater's terrain model and image drape were created in a NASA Constellation Program project focused on simulating lunar surface environments for prototyping and testing lunar surface mission analysis and planning tools. This work exemplifies NASA's Scientific Data Purchase legacy and commercial high-resolution imagery applications, as scientists use commercial high-resolution data to examine lunar-analog Earth landscapes for advanced planning and trade studies for future lunar surface activities. Other applications include landscape dynamics related to volcanism, hydrologic events, climate change, and ice movement.

  9. Older Adults Co-Creating Meaningful Individualized Social Activities Online for Healthy Ageing.

    PubMed

    Blusi, Madeleine; Nilsson, Ingeborg; Lindgren, Helena

    2018-01-01

    Social isolation and loneliness among older people is a growing problem with negative effects on physical and mental health. In co-creation with older adults, individualized social activities were designed in which older adults, through computer-mediated communication, were able to participate in social activities without leaving their homes. Four types of activities were designed: outdoor activity, music event, visiting a friend, and leisure activity. A participatory action research design was applied, in which end users together with scientists from two research fields developed, tested, and evaluated online participation in the activities. Usability and safety of the systems were major concerns among the older adults. The evaluation pointed out that the levels of simplicity, usability, and audio-video quality determined the level of satisfaction with the human interaction during the activity, thereby affecting the meaningfulness of the activity. The research presented in this paper constitutes the first step in a long-term research process aiming at developing a digital coaching system that gives older adults personalized support for increasing participation in meaningful social activities.

  10. The termination phase of psychoanalysis in a narcissistic personality.

    PubMed

    Warnes, H

    This paper describes a patient whose termination phase of analysis activated an intense mourning reaction that helped to overcome the stalemate of therapy. After I attempted to demonstrate how her narcissistic armouring yielded when the termination of analysis was agreed upon, the psychological re-enactment of a split-off (disavowed) trauma of an early loss (her father) and the failure of essential attributes in maternal care became manifest behind her narcissistic defenses. The reconstruction of these events was possible during the process of mourning. At the termination phase she behaved as if she "had lost the war"; from the point of view of her masochism it was a Pyrrhic victory, "a victory through defeat". Unlike her mother, I let her go, but she then refused to go, which created a situation that activated a profound mourning reaction leading to important structural changes. A review of the pertinent psychoanalytic literature on termination is presented, along with clinical material derived from the termination phase of a patient with a narcissistic personality.

  11. Whole-genome landscape of pancreatic neuroendocrine tumours.

    PubMed

    Scarpa, Aldo; Chang, David K; Nones, Katia; Corbo, Vincenzo; Patch, Ann-Marie; Bailey, Peter; Lawlor, Rita T; Johns, Amber L; Miller, David K; Mafficini, Andrea; Rusev, Borislav; Scardoni, Maria; Antonello, Davide; Barbi, Stefano; Sikora, Katarzyna O; Cingarlini, Sara; Vicentini, Caterina; McKay, Skye; Quinn, Michael C J; Bruxner, Timothy J C; Christ, Angelika N; Harliwong, Ivon; Idrisoglu, Senel; McLean, Suzanne; Nourse, Craig; Nourbakhsh, Ehsan; Wilson, Peter J; Anderson, Matthew J; Fink, J Lynn; Newell, Felicity; Waddell, Nick; Holmes, Oliver; Kazakoff, Stephen H; Leonard, Conrad; Wood, Scott; Xu, Qinying; Nagaraj, Shivashankar Hiriyur; Amato, Eliana; Dalai, Irene; Bersani, Samantha; Cataldo, Ivana; Dei Tos, Angelo P; Capelli, Paola; Davì, Maria Vittoria; Landoni, Luca; Malpaga, Anna; Miotto, Marco; Whitehall, Vicki L J; Leggett, Barbara A; Harris, Janelle L; Harris, Jonathan; Jones, Marc D; Humphris, Jeremy; Chantrill, Lorraine A; Chin, Venessa; Nagrial, Adnan M; Pajic, Marina; Scarlett, Christopher J; Pinho, Andreia; Rooman, Ilse; Toon, Christopher; Wu, Jianmin; Pinese, Mark; Cowley, Mark; Barbour, Andrew; Mawson, Amanda; Humphrey, Emily S; Colvin, Emily K; Chou, Angela; Lovell, Jessica A; Jamieson, Nigel B; Duthie, Fraser; Gingras, Marie-Claude; Fisher, William E; Dagg, Rebecca A; Lau, Loretta M S; Lee, Michael; Pickett, Hilda A; Reddel, Roger R; Samra, Jaswinder S; Kench, James G; Merrett, Neil D; Epari, Krishna; Nguyen, Nam Q; Zeps, Nikolajs; Falconi, Massimo; Simbolo, Michele; Butturini, Giovanni; Van Buren, George; Partelli, Stefano; Fassan, Matteo; Khanna, Kum Kum; Gill, Anthony J; Wheeler, David A; Gibbs, Richard A; Musgrove, Elizabeth A; Bassi, Claudio; Tortora, Giampaolo; Pederzoli, Paolo; Pearson, John V; Waddell, Nicola; Biankin, Andrew V; Grimmond, Sean M

    2017-03-02

    The diagnosis of pancreatic neuroendocrine tumours (PanNETs) is increasing owing to more sensitive detection methods, and this increase is creating challenges for clinical management. We performed whole-genome sequencing of 102 primary PanNETs and defined the genomic events that characterize their pathogenesis. Here we describe the mutational signatures they harbour, including a deficiency in G:C > T:A base excision repair due to inactivation of MUTYH, which encodes a DNA glycosylase. Clinically sporadic PanNETs contain a larger-than-expected proportion of germline mutations, including previously unreported mutations in the DNA repair genes MUTYH, CHEK2 and BRCA2. Together with mutations in MEN1 and VHL, these mutations occur in 17% of patients. Somatic mutations, including point mutations and gene fusions, were commonly found in genes involved in four main pathways: chromatin remodelling, DNA damage repair, activation of mTOR signalling (including previously undescribed EWSR1 gene fusions), and telomere maintenance. In addition, our gene expression analyses identified a subgroup of tumours associated with hypoxia and HIF signalling.

  12. International Halley Watch: Discipline specialists for large scale phenomena

    NASA Technical Reports Server (NTRS)

    Brandt, J. C.; Niedner, M. B., Jr.

    1986-01-01

    The largest scale structures of comets, their tails, are extremely interesting from a physical point of view, and some of their properties are among the most spectacular displayed by comets. Because the tail(s) is an important component of a comet, the Large-Scale Phenomena (L-SP) Discipline was created as one of eight different observational methods in which Halley data would be encouraged and collected from all around the world under the auspices of the International Halley Watch (IHW). The L-SP Discipline Specialist (DS) Team resides at NASA/Goddard Space Flight Center under the leadership of John C. Brandt, Malcolm B. Niedner, and their team of image-processing and computer specialists; Jurgen Rahe at NASA Headquarters completes the formal DS science staff. The team has adopted the study of disconnection events (DEs) as its principal science target, and it is because of the rapid changes which occur in connection with DEs that such extensive global coverage was deemed necessary to assemble a complete record.

  13. Stress in mangrove forests: early detection and preemptive rehabilitation are essential for future successful worldwide mangrove forest management

    USGS Publications Warehouse

    Lewis, Roy R; Milbrandt, Eric C; Brown, Benjamin; Krauss, Ken W.; Rovai, Andre S.; Beever, James W.; Flynn, Laura L

    2016-01-01

    Mangrove forest rehabilitation should begin much sooner than at the point of catastrophic loss. We describe the need for “mangrove forest heart attack prevention”, and how that might be accomplished in a general sense by embedding plot and remote sensing monitoring within coastal management plans. The major cause of mangrove stress at many sites globally is often linked to reduced tidal flows and exchanges. Blocked water flows not only reduce flushing from the seaward side but also result in higher salinity and reduced sediment supply when flows are blocked landward. Long-term degradation of function leads to acute mortality that is triggered by acute events but created by a systematic propensity for long-term neglect of mangroves. Often, mangroves are lost within a few years; however, vulnerability is re-set decades earlier, when seemingly innocuous hydrological modifications (e.g., road construction, blocked tidal channels) are made that remain undetected without reasonably large-scale monitoring.

  14. INTERFRAGMENTARY SURFACE AREA AS AN INDEX OF COMMINUTION SEVERITY IN CORTICAL BONE IMPACT

    PubMed Central

    Beardsley, Christina L.; Anderson, Donald D.; Marsh, J. Lawrence; Brown, Thomas D.

    2008-01-01

    A monotonic relationship is expected between energy absorption and fracture surface area generation for brittle solids, based on fracture mechanics principles. It was hypothesized that this relationship is demonstrable in bone, to the point that on a continuous scale, comminuted fractures created with specific levels of energy delivery could be discriminated from one another. Using bovine cortical bone segments in conjunction with digital image analysis of CT fracture data, the surface area freed by controlled impact fracture events was measured. The results demonstrated a statistically significant (p<0.0001) difference in measured de novo surface area between three specimen groups, over a range of input energies from 0.423 to 0.702 J/g. Local material properties were also incorporated into these measurements via CT Hounsfield intensities. This study confirms that comminution severity of bone fractures can indeed be measured on a continuous scale, based on energy absorption. This lays a foundation for similar assessments in human injuries. PMID:15885492

  15. Seismically reactivated Hattian slide in Kashmir, Northern Pakistan

    NASA Astrophysics Data System (ADS)

    Schneider, Jean F.

    2009-07-01

    The Pakistan 2005 earthquake, of magnitude 7.6, caused severe damage to landscape and infrastructure, in addition to numerous casualties. The event reactivated the Hattian slide, creating a rock avalanche in a location where earlier mass movements had already occurred, as indicated by satellite imagery and ground investigation. The slide originated on Dana Hill, in the upper catchment area of Hattian on Karli Stream, a tributary of the Jhelum River, Pakistan, and buried the hamlet of Dandbeh and several nearby farms. A natural dam accumulated, impounding two lakes, the larger of which threatened parts of downstream Hattian Village with flooding. An access road and artificial spillways had to be constructed in a very short time to minimize the flooding risk. As this example shows, when pointing out the risk of large-scale damage to population and infrastructure by way of hazard indication maps of seismically active regions, and preparing to alleviate that risk, it is advisable to consider the complete Holocene history of the slopes involved.

  16. Magnetic Topology of a Long-Lived Coronal Condensation Site Lasting Eight Months

    NASA Astrophysics Data System (ADS)

    Sun, X.; Yu, S.; Liu, W.

    2017-12-01

    It is well known that cool material, such as prominences or coronal rain, can form in situ by condensation of hot coronal plasma due to a runaway radiative cooling instability (a.k.a. thermal non-equilibrium). Recent observations and numerical simulations suggest that such condensations are quite common, but in quiet-Sun regions they occur preferentially in locations where the magnetic field is weak (e.g., null points) or discontinuous (e.g., current sheets). Such events usually have short lifetimes of hours to days. Surprisingly, we observed a high-latitude condensation site lasting over eight months in 2014, with recurrent and episodic condensations fueling a funnel-shaped prominence. We analyze the coronal magnetic topology to investigate the necessary conditions for such a long-lived condensation site. We find that the site was directly above a poleward photospheric flux surge when the polar field polarity was close to its solar cycle reversal. The large-scale magnetic cancellation front may have sustained interchange reconnection at this location, creating suitable conditions for coronal plasma condensation.

  17. The Canterbury Tales: Lessons from the Canterbury Earthquake Sequence to Inform Better Public Communication Models

    NASA Astrophysics Data System (ADS)

    McBride, S.; Tilley, E. N.; Johnston, D. M.; Becker, J.; Orchiston, C.

    2015-12-01

    This research evaluates the public earthquake education information available prior to the Canterbury Earthquake Sequence (2010-present) and examines communication lessons to create recommendations for improving the implementation of such campaigns in the future. The research comes from the perspective of a practitioner who worked on these campaigns in Canterbury prior to the earthquake sequence and who was also the Public Information Manager Second in Command during the earthquake response in February 2011. Documents addressing seismic risk that were created prior to the earthquake sequence were analyzed, using a "best practice matrix" created by the researcher, for how closely they aligned with best-practice academic research. Readability tests and word counts were also employed to assist with triangulation of the data, as was practitioner involvement. This research also outlines the lessons learned by practitioners and explores their experiences in creating these materials and how they perceive them now, given all that has happened since the inception of the booklets. The findings showed that these documents lacked many of the attributes of best practice. The overly long, jargon-filled text contained few positive outcome-expectancy messages and probably failed to persuade anyone that earthquakes were a real threat in Canterbury. Paradoxically, the booklets may even have created fatalism in the publics who read them. While the overall intention was positive, namely for scientists to explain earthquakes, tsunami, landslides, and other risks in order to encourage the public to prepare for these events, the implementation could be greatly improved. The final component of the research highlights points of improvement for implementing more successful campaigns in the future. The value of preparedness and science information campaigns lies not only in preparing the population but also in the development of crisis communication plans. These plans are prepared in advance of a major emergency, and the symbiotic development of strategies, messages, themes, and organizational structures in the preparedness stage can support successful implementation of a crisis communication plan during an emergency.

  18. Seismicity of the Bering Glacier Region: Inferences from Relocations Using Data from STEEP

    NASA Astrophysics Data System (ADS)

    Panessa, A. L.; Pavlis, G. L.; Hansen, R. A.; Ruppert, N.

    2008-12-01

    We relocated earthquakes recorded from 1990 to 2007 in the area of the Bering Glacier in southeastern Alaska to test the hypothesis that faults in this area are linked to glaciers. We used waveform correlation to improve arrival time measurements for data from all broadband channels, including all the data from the STEEP experiment. We used a novel form of correlation based on interactive array processing of common receiver gathers linked to a three-dimensional grid of control points. This procedure produced 8,556 gathers that we processed interactively to produce improved arrival time estimates. The interactive procedure allowed us to select which events in each gather were sufficiently similar to warrant correlation. Redundancy in the result was resolved in a secondary correlation that aligned event stacks of the same station-event pair associated with multiple control points. This procedure yielded only 2,240 waveforms that correlated and modified only 524 arrivals in a total database of 12,263 arrivals. The correlation procedure changed arrival times on 145 of 509 events in this database. Events with arrivals constrained by correlation were not clustered but were randomly distributed throughout the study area. We used a version of Progressive Multiple Event Location (PMEL) that analyzed data at each control point to invert for relative locations and a set of path anomalies for each control point. We applied the PMEL procedure with different velocity models and constraints and compared the results to a HypoDD solution produced from the original arrival time data. The relocations are all significant improvements over the standard single-event catalog locations. The relocations suggest the seismicity in this region is mostly linked to fold-and-thrust deformation in the Yakutat block. There is a suggestion of a north-dipping trend to much of the seismicity, but the dominant trend is a fairly diffuse cloud of events largely confined to the Yakutat block south of the Bagley Icefield. This is consistent with the recently published tectonic model of Berger et al. (2008).

  19. CPAP for Prevention of Cardiovascular Events in Obstructive Sleep Apnea.

    PubMed

    McEvoy, R Doug; Antic, Nick A; Heeley, Emma; Luo, Yuanming; Ou, Qiong; Zhang, Xilong; Mediano, Olga; Chen, Rui; Drager, Luciano F; Liu, Zhihong; Chen, Guofang; Du, Baoliang; McArdle, Nigel; Mukherjee, Sutapa; Tripathi, Manjari; Billot, Laurent; Li, Qiang; Lorenzi-Filho, Geraldo; Barbe, Ferran; Redline, Susan; Wang, Jiguang; Arima, Hisatomi; Neal, Bruce; White, David P; Grunstein, Ron R; Zhong, Nanshan; Anderson, Craig S

    2016-09-08

    Obstructive sleep apnea is associated with an increased risk of cardiovascular events; whether treatment with continuous positive airway pressure (CPAP) prevents major cardiovascular events is uncertain. After a 1-week run-in period during which the participants used sham CPAP, we randomly assigned 2717 eligible adults between 45 and 75 years of age who had moderate-to-severe obstructive sleep apnea and coronary or cerebrovascular disease to receive CPAP treatment plus usual care (CPAP group) or usual care alone (usual-care group). The primary composite end point was death from cardiovascular causes, myocardial infarction, stroke, or hospitalization for unstable angina, heart failure, or transient ischemic attack. Secondary end points included other cardiovascular outcomes, health-related quality of life, snoring symptoms, daytime sleepiness, and mood. Most of the participants were men who had moderate-to-severe obstructive sleep apnea and minimal sleepiness. In the CPAP group, the mean duration of adherence to CPAP therapy was 3.3 hours per night, and the mean apnea-hypopnea index (the number of apnea or hypopnea events per hour of recording) decreased from 29.0 events per hour at baseline to 3.7 events per hour during follow-up. After a mean follow-up of 3.7 years, a primary end-point event had occurred in 229 participants in the CPAP group (17.0%) and in 207 participants in the usual-care group (15.4%) (hazard ratio with CPAP, 1.10; 95% confidence interval, 0.91 to 1.32; P=0.34). No significant effect on any individual or other composite cardiovascular end point was observed. CPAP significantly reduced snoring and daytime sleepiness and improved health-related quality of life and mood. Therapy with CPAP plus usual care, as compared with usual care alone, did not prevent cardiovascular events in patients with moderate-to-severe obstructive sleep apnea and established cardiovascular disease. 
    (Funded by the National Health and Medical Research Council of Australia and others; SAVE ClinicalTrials.gov number, NCT00738179; Australian New Zealand Clinical Trials Registry number, ACTRN12608000409370.)

  20. powerbox: Arbitrarily structured, arbitrary-dimension boxes and log-normal mocks

    NASA Astrophysics Data System (ADS)

    Murray, Steven G.

    2018-05-01

    powerbox creates density grids (or boxes) with an arbitrary two-point distribution (i.e. power spectrum). The software works in any number of dimensions, creates Gaussian or Log-Normal fields, and measures power spectra of output fields to ensure consistency. The primary motivation for creating the code was the simple creation of log-normal mock galaxy distributions, but the methodology can be used for other applications.
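    The core recipe behind such codes can be sketched without the powerbox package itself: draw complex Gaussian Fourier modes whose variance follows the target power spectrum, enforce Hermitian symmetry so the real-space field is real, and inverse-transform. The sketch below is a minimal, stdlib-only 1-D illustration (the O(n^2) direct DFT and the simple pk normalization are simplifications for clarity, not powerbox's actual implementation); exponentiating the Gaussian field gives a crude log-normal mock.

```python
import cmath
import math
import random

def gaussian_field_1d(n, pk, seed=0):
    """Real 1-D Gaussian random field with (unnormalized) power spectrum pk(k).

    Fourier modes are drawn as complex Gaussians with variance pk(k);
    Hermitian symmetry makes the inverse transform real.
    """
    rng = random.Random(seed)
    coeffs = [0j] * n                       # k = 0 mode left at zero (zero mean)
    for k in range(1, n // 2 + 1):
        sigma = math.sqrt(pk(k) / 2.0)
        c = complex(rng.gauss(0, sigma), rng.gauss(0, sigma))
        coeffs[k] = c
        coeffs[n - k] = c.conjugate()       # Hermitian symmetry
    if n % 2 == 0:                          # Nyquist mode must be purely real
        coeffs[n // 2] = complex(rng.gauss(0, math.sqrt(pk(n // 2))), 0)
    # Direct inverse DFT: O(n^2), fine for a small demo (powerbox uses FFTs).
    field = []
    for x in range(n):
        s = sum(coeffs[k] * cmath.exp(2j * math.pi * k * x / n) for k in range(n))
        field.append(s.real / n)
    return field

field = gaussian_field_1d(32, pk=lambda k: k ** -2.0)
lognormal_mock = [math.exp(g) for g in field]   # strictly positive "density"
```

A two-point spectrum steeper in k concentrates power on large scales, so the generated field looks smoother; flattening pk adds small-scale structure.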

  1. Reduced Antiplatelet Effect of Aspirin Does Not Predict Cardiovascular Events in Patients With Stable Coronary Artery Disease.

    PubMed

    Larsen, Sanne Bøjet; Grove, Erik Lerkevang; Neergaard-Petersen, Søs; Würtz, Morten; Hvas, Anne-Mette; Kristensen, Steen Dalby

    2017-08-05

    Increased platelet aggregation during antiplatelet therapy may predict cardiovascular events in patients with coronary artery disease. The majority of these patients receive aspirin monotherapy. We aimed to investigate whether high platelet-aggregation levels predict cardiovascular events in stable coronary artery disease patients treated with aspirin. We included 900 stable coronary artery disease patients with either previous myocardial infarction, type 2 diabetes mellitus, or both. All patients received single antithrombotic therapy with 75 mg aspirin daily. Platelet aggregation was evaluated 1 hour after aspirin intake using the VerifyNow Aspirin Assay (Accriva Diagnostics) and Multiplate Analyzer (Roche; agonists: arachidonic acid and collagen). Adherence to aspirin was confirmed by serum thromboxane B2. The primary end point was the composite of nonfatal myocardial infarction, ischemic stroke, and cardiovascular death. At 3-year follow-up, 78 primary end points were registered. The primary end point did not occur more frequently in patients with high platelet-aggregation levels (first versus fourth quartile) assessed by VerifyNow (hazard ratio: 0.5 [95% CI, 0.3-1.1], P=0.08) or Multiplate using arachidonic acid (hazard ratio: 1.0 [95% CI, 0.5-2.1], P=0.92) or collagen (hazard ratio: 1.4 [95% CI, 0.7-2.8], P=0.38). Similar results were found for the composite secondary end point (nonfatal myocardial infarction, ischemic stroke, stent thrombosis, and all-cause death) and the single end points. Thromboxane B2 levels did not predict any end points. Renal insufficiency was the only clinical risk factor predicting the primary and secondary end points. This study is the largest to investigate platelet aggregation in stable coronary artery disease patients receiving aspirin as single antithrombotic therapy. We found that high platelet-aggregation levels did not predict cardiovascular events. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  2. STELLAR ROTATION EFFECTS IN POLARIMETRIC MICROLENSING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sajadian, Sedighe, E-mail: sajadian@ipm.ir

    2016-07-10

    It is well known that the polarization signal in microlensing events of hot stars is larger than that of main-sequence stars. Most hot stars rotate rapidly around their stellar axes. The stellar rotation creates ellipticity and gravity-darkening effects that break the spherical symmetry of the source's shape and the circular symmetry of the source's surface brightness, respectively. Hence, it causes a net polarization signal for the source star. This polarization signal should be considered in polarimetric microlensing of fast rotating stars. For moderately rotating stars, lensing can magnify or even characterize small polarization signals due to the stellar rotation through polarimetric observations. The gravity-darkening effect due to a rotating source star creates asymmetric perturbations in polarimetric and photometric microlensing curves whose maximum occurs when the lens trajectory crosses the projected position of the rotation pole on the sky plane. The stellar ellipticity creates a time shift (i) in the position of the second peak of the polarimetric curves in transit microlensing events and (ii) in the peak position of the polarimetric curves with respect to the photometric peak position in bypass microlensing events. By measuring this time shift via polarimetric observations of microlensing events, we can evaluate the ellipticity of the projected source surface on the sky plane. Given the characteristics of the FOcal Reducer and low dispersion Spectrograph (FORS2) polarimeter at the Very Large Telescope, the probability of observing this time shift is very small. The more accurate polarimeters of the next generation may well measure these time shifts and evaluate the ellipticity of microlensing source stars.

  3. Projections of Flood Risk using Credible Climate Signals in the Ohio River Basin

    NASA Astrophysics Data System (ADS)

    Schlef, K.; Robertson, A. W.; Brown, C.

    2017-12-01

    Estimating future hydrologic flood risk under non-stationary climate is a key challenge to the design of long-term water resources infrastructure and flood management strategies. In this work, we demonstrate how projections of large-scale climate patterns can be credibly used to create projections of long-term flood risk. Our study area is the northwest region of the Ohio River Basin in the United States Midwest. In the region, three major teleconnections have been previously demonstrated to affect synoptic patterns that influence extreme precipitation and streamflow: the El Nino Southern Oscillation, the Pacific North American pattern, and the Pacific Decadal Oscillation. These teleconnections are strongest during the winter season (January-March), which also experiences the greatest number of peak flow events. For this reason, flood events are defined as the maximum daily streamflow to occur in the winter season. For each gage in the region, the location parameter of a log Pearson type 3 distribution is conditioned on the first principal component of the three teleconnections to create a statistical model of flood events. Future projections of flood risk are created by forcing the statistical model with projections of the teleconnections from general circulation models selected for skill. We compare the results of our method to the results of two other methods: the traditional model chain (i.e., general circulation model projections to downscaling method to hydrologic model to flood frequency analysis) and that of using the historic trend. We also discuss the potential for developing credible projections of flood events for the continental United States.
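    The statistical idea of conditioning a distribution parameter on a climate index can be sketched in a few lines. In this hedged illustration a lognormal stands in for the log Pearson type 3 distribution used in the study, and the coefficients a, b, sigma are made-up values, not fitted results; only the structure (location parameter linear in PC1) follows the abstract.

```python
import math
from statistics import NormalDist

def winter_flood_quantile(pc1, a=9.0, b=0.3, sigma=0.5, p=0.99):
    """Nonstationary flood quantile conditioned on a teleconnection index.

    pc1: first principal component of the ENSO/PNA/PDO indices (assumed input).
    The location parameter of the log-flow distribution is linear in pc1;
    a lognormal replaces the log Pearson type 3 for illustration.
    """
    mu = a + b * pc1                    # conditioned location parameter
    z = NormalDist().inv_cdf(p)         # standard-normal quantile
    return math.exp(mu + z * sigma)     # flow exceeded with probability 1 - p

# A stronger positive climate signal shifts the whole flood distribution upward.
q_neutral = winter_flood_quantile(pc1=0.0)
q_strong = winter_flood_quantile(pc1=1.5)
```

Feeding this model GCM-projected teleconnection series, rather than a full downscaling-to-hydrologic-model chain, is what distinguishes the paper's approach from the traditional method it is compared against.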

  4. Intraseasonal Cold Air Outbreak over East Asia and the preceding atmospheric condition over the Barents-Kara Sea

    NASA Astrophysics Data System (ADS)

    Hori, M. E.; Inoue, J.

    2011-12-01

    Frequent occurrence of cold air outbreaks is a dominant feature of the East Asian winter monsoon. A contributing factor for these cold air outbreaks is the role of stationary Rossby waves over the Eurasian continent, which intensify the surface Siberian High and the accompanying cold air outflow. Reduced sea ice and increased turbulent heat flux are hypothesized as a source of such stationary waves (Honda et al. 2009). In particular, the winter of 2009/2010 saw a strong correlation between a high pressure anomaly over the Barents/Kara Sea and the subsequent cold air buildup over the Eurasian continent and its advection towards East Asia (Hori et al. 2011). The lag correlation of surface temperature over Japan with the 850 hPa geopotential height shows a cyclonic anomaly appearing over the Barents/Kara Sea which creates cold air advection over the Eurasian continent. The pressure anomaly subsequently shifted westward to mature into a blocking high, which created a wave-train pattern downstream, advecting the cold air buildup eastward toward East Asia and Japan (Fig. 1). We further examine this mechanism for other years, including the 2005/2006 and 2010/2011 winters and other winters with extreme cold air outbreaks. Overall, the existence of an anticyclonic anomaly over the Barents/Kara Sea correlated well with the seasonal dominance of cold air over the Eurasian continent, thereby creating a contrast between a warm Arctic and a cold Eurasian continent. On the intraseasonal timescale, the existence of this anticyclone corresponds to persistent atmospheric blocking in the high latitudes. In the presentation, we address the underlying chain of events leading up to a strong cold air outbreak over East Asia from an atmosphere - sea ice - land surface interaction point of view for particular cold winter years.

  5. Prediction of Febrile Neutropenia after Chemotherapy Based on Pretreatment Risk Factors among Cancer Patients

    PubMed Central

    Aagaard, Theis; Roen, Ashley; Daugaard, Gedske; Brown, Peter; Sengeløv, Henrik; Mocroft, Amanda; Lundgren, Jens; Helleberg, Marie

    2017-01-01

    Background Febrile neutropenia (FN) is a common complication of chemotherapy associated with a high burden of morbidity and mortality. Reliable prediction of individual risk based on pretreatment risk factors allows for stratification of preventive interventions. We aimed to develop such a risk stratification model to predict FN in the 30 days after initiation of chemotherapy. Methods We included consecutive treatment-naïve patients with solid cancers and diffuse large B-cell lymphomas at Copenhagen University Hospital, 2010–2015. Data were obtained from the PERSIMUNE repository of electronic health records. FN was defined as neutrophils ≤0.5 × 10E9/L at the time of either a blood culture sample or death. Time from initiation of chemotherapy to FN was analyzed using Fine-Gray models with death as a competing event. Risk factors investigated were: age, sex, body surface area, haemoglobin, albumin, neutrophil-to-lymphocyte ratio, Charlson Comorbidity Index (CCI), and chemotherapy drugs. Parameter estimates were scaled and summed to create the risk score. The scores were grouped into four categories: low, intermediate, high, and very high risk. Results Among 8,585 patients, 467 experienced FN, with an incidence rate per 30 person-days of 0.05 (95% CI, 0.05–0.06). Age (1 point if > 65 years), albumin (1 point if < 39 g/L), CCI (1 point if > 2), and chemotherapy (range -5 to 6 points/drug) predicted FN. The median score at inclusion was 2 points (range –5 to 9). The cumulative incidence and the incidence rates and hazard ratios of FN are shown in Figure 1 and Table 1, respectively. Conclusion We developed a risk score to predict FN in the first month after initiation of chemotherapy. The score is easy to use and provides good differentiation of risk groups; the score needs independent validation before routine use. Disclosures All authors: No reported disclosures.
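    The point-based scoring described in the results can be sketched directly. The three patient-level rules (age, albumin, CCI) are taken from the abstract; the per-drug chemotherapy points and the cut-points separating the four risk groups are not published there, so the `chemo_points` argument and the group boundaries below are illustrative assumptions.

```python
def fn_risk_points(age, albumin_g_per_l, charlson_index, chemo_points=0):
    """Pretreatment febrile-neutropenia risk score (illustrative sketch).

    Per the abstract: 1 point each for age > 65 years, albumin < 39 g/L,
    and Charlson Comorbidity Index > 2, plus a regimen-specific term
    (the study assigns -5 to 6 points per drug; values not published).
    """
    score = 0
    if age > 65:
        score += 1
    if albumin_g_per_l < 39:
        score += 1
    if charlson_index > 2:
        score += 1
    return score + chemo_points

def risk_group(score):
    """Map a score to one of four groups; these cut-points are assumed."""
    if score <= 0:
        return "low"
    if score <= 2:
        return "intermediate"
    if score <= 4:
        return "high"
    return "very high"

patient_score = fn_risk_points(age=70, albumin_g_per_l=35, charlson_index=3)
```

Keeping the score additive over simple thresholds is what makes it, as the authors note, easy to use at the bedside.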

  6. The Event Chain of Survival in the Context of Music Festivals: A Framework for Improving Outcomes at Major Planned Events.

    PubMed

    Lund, Adam; Turris, Sheila

    2017-08-01

    Despite the best efforts of event producers and on-site medical teams, there are sometimes serious illnesses, life-threatening injuries, and fatalities related to music festival attendance. Producers, clinicians, and researchers are actively seeking ways to reduce the mortality and morbidity associated with these events. After analyzing the available literature on music festival health and safety, several major themes emerged. Principally, stakeholder groups planning in isolation from one another (i.e., in silos) create fragmentation, gaps, and overlap in plans for major planned events (MPEs). The authors hypothesized that one approach to minimizing this fragmentation may be to create a framework to "connect the dots," or join together the many silos of professionals responsible for safety, security, health, and emergency planning at MPEs. Adapted from the well-established literature regarding the management of cardiac arrests, both in and out of hospital, the "chain of survival" concept is applied to the disparate groups providing services that support event safety in the context of music festivals. The authors propose this framework for describing, understanding, coordinating, and planning around the integration of safety, security, health, and emergency services for events. The adapted Event Chain of Survival contains six interdependent links: (1) event producers; (2) police and security; (3) festival health; (4) on-site medical services; (5) ambulance services; and (6) off-site medical services. The authors argue that adapting and applying this framework in the context of MPEs in general, and music festivals specifically, has the potential to break down the current disconnected approach to event safety, security, health, and emergency planning. It offers a means of shifting the focus from a purely reactive stance to a more proactive, collaborative, and integrated approach. Improving health outcomes for music festival attendees, reducing gaps in planning, promoting consistency, and improving efficiency by reducing duplication of services will ultimately require coordination and collaboration from the beginning of event production to post-event reporting. Lund A, Turris SA. The Event Chain of Survival in the context of music festivals: a framework for improving outcomes at major planned events. Prehosp Disaster Med. 2017;32(4):437-443.

  7. Designing an eMap to Teach Multimedia Applications Online

    ERIC Educational Resources Information Center

    Ruffini, Michael F.

    2004-01-01

    Teachers and students use multimedia software to create interactive presentations and content projects. Popular multimedia programs include Microsoft's PowerPoint[R], Knowledge Adventure's HyperStudio[R], and Macromedia's Director MX 2004[R]. Creating multimedia projects engages students in active learning and thinking as they complete projects…

  8. Algorithm for Screening Phasor Measurement Unit Data for Power System Events and Categories and Common Characteristics for Events Seen in Phasor Measurement Unit Relative Phase-Angle Differences and Frequency Signals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, A.; Santoso, S.; Muljadi, E.

    2013-08-01

    A network of multiple phasor measurement units (PMU) was created, set up, and maintained at the University of Texas at Austin to obtain actual power system measurements for power system analysis. Power system analysis in this report covers a variety of time ranges, such as short-term analysis for power system disturbances and their effects on power system behavior and long-term power system behavior using modal analysis. The first objective of this report is to screen the PMU data for events. The second objective of the report is to identify and describe common characteristics extracted from power system events as measured by PMUs. The numerical characteristics for each category and how these characteristics are used to create selection rules for the algorithm are also described. Trends in PMU data related to different levels and fluctuations in wind power output are also examined.
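
The event-screening step described above can be illustrated with a minimal sketch. This is a hypothetical threshold-based screen on a PMU frequency signal; the nominal frequency and deviation threshold are illustrative values, not the selection rules developed in the report.

```python
# Hypothetical sketch: flag PMU frequency samples that deviate from
# nominal by more than a threshold -- a simple proxy for an event.
# Threshold and nominal values are illustrative, not from the report.
def screen_events(freq_samples, nominal=60.0, threshold=0.05):
    """Return indices of samples whose frequency deviates from
    `nominal` by more than `threshold` Hz."""
    return [i for i, f in enumerate(freq_samples)
            if abs(f - nominal) > threshold]

signal = [60.00, 60.01, 59.97, 59.90, 60.00]  # made-up 60 Hz samples
print(screen_events(signal))  # -> [3]
```

A real screening algorithm would work on relative phase-angle differences as well as frequency, and use per-category selection rules as the report describes.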

  9. Investigation of standardized administration of anti-platelet drugs and its effect on the prognosis of patients with coronary heart disease.

    PubMed

    Ding, Chao; Zhang, Jianhua; Li, Rongcheng; Wang, Jiacai; Hu, Yongcang; Chen, Yanyan; Li, Xiannan; Xu, Yan

    2017-10-01

    The aim of the present study was to explore the effect of adherence to standardized administration of anti-platelet drugs on the prognosis of patients with coronary heart disease. A total of 144 patients newly diagnosed with coronary heart disease at Lu'an Shili Hospital of Anhui Province (Lu'an, China) between June 2010 and June 2012 were followed up. Kaplan-Meier curves and the Cox regression model were used to evaluate the effects of standardized administration of anti-platelet drugs on primary and secondary end-point events. Of the patients with coronary heart disease, 109 (76%) patients took standard anti-platelet drugs following discharge. Kaplan-Meier curve and Cox regression analysis showed that standardized administration of anti-platelet drugs reduced the risk of primary end-point events (including all-cause mortality, non-lethal myocardial infarction and stroke) of patients with coronary heart disease [hazard ratio (HR)=0.307; 95% confidence interval (CI): 0.099-0.953; P=0.041] and all-cause mortality (HR=0.162; 95% CI: 0.029-0.890; P=0.036); however, standardized administration had no predictive value with regard to secondary end-point events. Standardized administration of anti-platelet drugs obviously reduced the risk of primary end-point events in patients with coronary heart disease, and further analysis showed that only all-cause mortality exhibited a statistically significant reduction.
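
The Kaplan-Meier method used above can be sketched briefly. This is a minimal, from-scratch Kaplan-Meier survival estimate for (time, event) pairs; it is illustrative only, and the example data are invented (the study additionally fitted a Cox regression model, which is not shown here).

```python
# Minimal Kaplan-Meier estimator. times: follow-up times;
# events: 1 = end-point event occurred, 0 = censored observation.
# Example data are invented for illustration.
def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct event time, where S(t) is
    the product-limit survival estimate."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)   # events at time t
        at_t = sum(1 for tt, _ in data if tt == t)     # leave risk set at t
        if deaths:
            survival *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, survival))
        n_at_risk -= at_t
        i += at_t
    return curve

print(kaplan_meier([1, 2, 2, 3, 4], [1, 0, 1, 1, 0]))
```

Censored observations (events = 0) reduce the risk set without stepping the survival curve down, which is the key difference from a naive event-rate calculation.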

  10. Efficacy and safety of a biodegradable polymer sirolimus-eluting stent in primary percutaneous coronary intervention: a randomized controlled trial

    PubMed Central

    Li, Qiang; Tong, Zichuan; Wang, Lefeng; Zhang, Jianjun; Ge, Yonggui; Wang, Hongshi; Li, Weiming; Xu, Li; Ni, Zhuhua

    2013-01-01

    Introduction With long-term follow-up, whether biodegradable polymer drug-eluting stents (DES) are efficient and safe in primary percutaneous coronary intervention (PCI) remains a controversial issue. This study aims to assess the long-term efficacy and safety of DES in PCI for ST-segment elevation myocardial infarction (STEMI). Material and methods A prospective, randomized single-blind study with 3-year follow-up was performed to compare biodegradable polymer DES with durable polymer DES in 332 STEMI patients treated with primary PCI. The primary end point was major adverse cardiac events (MACE) at 3 years after the procedure, defined as the composite of cardiac death, recurrent infarction, and target vessel revascularization. The secondary end points included in-segment late luminal loss (LLL) and binary restenosis at 9 months and cumulative stent thrombosis (ST) event rates up to 3 years. Results The rate of the primary end points and the secondary end points including major adverse cardiac events, in-segment late luminal loss, binary restenosis, and cumulative thrombotic event rates were comparable between biodegradable polymer DES and durable polymer DES in these 332 STEMI patients treated with primary PCI at 3 years. Conclusions Biodegradable polymer DES has similar efficacy and safety profiles at 3 years compared with durable polymer DES in STEMI patients treated with primary PCI. PMID:24482648

  11. Turning Points: Priorities for Teacher Education in a Democracy

    ERIC Educational Resources Information Center

    Romano, Rosalie M.

    2009-01-01

    Every generation has its moment, some turning point that will mark its place in the historical record. Such points provide the direction of our history and our future. Turning points are, characteristically, times of turmoil based on a fundamental change in models or events--what Thomas Kuhn called a "paradigm shift." In terms of a democratic…

  12. Gravity Does it: Redshift of Light from the Galaxies Yes, Expanding Universe NO!

    NASA Astrophysics Data System (ADS)

    Malhotra, Satish

    2018-04-01

    In the history of physics, ideas on space and time have changed the course of physics a number of times; this is another such event. We postulate 'space and time' as a flow of quantum gravity energy having the absolute velocity c (the same as the velocity of light), where time is the delay in the spread of space (the delay relative to an infinite-velocity flow, in which there would be no time). Such a flow has to have a reverse cycle, as the energy creating it (howsoever large it might be) has to be limited, and limited energy can only create a limited spread of space-and-time energy; the reverse cycle is that of the creation of fundamental particles. This explanation of the universe tells us that the idea of an expanding universe is only an appearance. The argument, in brief, is as follows. One, the universe is so large that we cannot see the edges, or the light from the edges; the reality is non-observable. Two, the process is dark, beyond observation: the process of the creation of charge (the reflection of light starts with it). The space energy flow process is in the invisible range (before charge emerged); it is the elusive dark energy of the universe. We never connected space and time to a flow of energy, and so did not find its connection either to its limitedness or to its dark nature (dark energy). Three, the space energy flow has a reverse process which leads to the formation of fundamental particles; we have not included it in the totality of the processes of the universe. The former is the dark energy, and the initial part of the reverse process (till it reaches the state of ionisation) is dark matter. In the continuity of the cycle of space flow and its reversal to matter forms, ionisation happens at a particular point and visibility comes with it; ionisation here is a later event (part of the reverse process, entering visibility). It is this reverse process which creates fundamental particles (no big bang creation).
With no idea of space as energy flow and no idea of the reverse process, physicists could never take a step in the direction of a correct understanding of 'dark energy' or 'dark matter'.

  13. Community-based participatory research: development of an emergency department-based youth violence intervention using concept mapping.

    PubMed

    Snider, Carolyn E; Kirst, Maritt; Abubakar, Shakira; Ahmad, Farah; Nathens, Avery B

    2010-08-01

    Emergency departments (EDs) see a high number of youths injured by violence. In Ontario, the most common cause of injury for youths visiting EDs is assault. Secondary prevention strategies using the teachable moment (i.e., events that can lead individuals to make positive changes in their lives) are ideal for use by clinicians. An opportunity exists to take advantage of the teachable moment in the ED in an effort to prevent future occurrences of injury in at-risk youths. However, little is known about perceptions of youths, parents, and community organizations about such interventions in EDs. The aims of this study were to engage youths, parents, and frontline community workers in conceptualizing a hospital-based violence prevention intervention and to identify outcomes relevant to the community. Concept mapping is an innovative, mixed-method research approach. It combines structured qualitative processes such as brainstorming and group sorting, with various statistical analyses such as multidimensional scaling and hierarchical clustering, to develop a conceptual framework, and allows for an objective presentation of qualitative data. Concept mapping involves multiple structured steps: 1) brainstorming, 2) sorting, 3) rating, and 4) interpretation. For this study, the first three steps occurred online, and the fourth step occurred during a community meeting. Over 90 participants were involved, including youths, parents, and community youth workers. A two-dimensional point map was created and clusters formed to create a visual display of participant ideas on an ED-based youth violence prevention intervention. Issues related to youth violence prevention that were rated of highest importance and most realistic for hospital involvement included mentorship, the development of youth support groups in the hospital, training doctors and nurses to ask questions about the violent event, and treating youth with respect. 
Small-group discussions on the various clusters developed job descriptions, a list of essential services, and suggestions on ways to create a more youth-friendly environment in the hospital. A large-group discussion revealed outcomes that participants felt should be measured to determine the success of an intervention program. This study has been the springboard for the development of an ED-based youth violence intervention that is supported by the community and affected youth. Using information generated by youth that is grounded in their experience through participatory research methods is feasible for the development of successful and meaningful youth violence prevention interventions.

  14. Image Makers: Reporters or Sources.

    ERIC Educational Resources Information Center

    Petruzzello, Marion C.

    To explore how news sources are used by media to create a social image of women during key suffrage events of 1858, 1920, and 1970, the front page stories of the "New York Times" were reviewed for 1 week prior to and 1 week following each of these events: May 14, 1858, the Eighth National Women's Rights Convention in New York City;…

  15. The Dark Knight Rises: In "42" Jackie Robinson Saves the American Dream

    ERIC Educational Resources Information Center

    Beck, Bernard

    2014-01-01

    The movie "42" shows memorable events that have faded from our view in recent years. The events are important to the evolution of a multicultural society in America because of the importance of baseball to the common national culture that all the American people have created. Jackie Robinson's significance as a cultural hero is…

  16. A Case Study on Using Prediction Markets as a Rich Environment for Active Learning

    ERIC Educational Resources Information Center

    Buckley, Patrick; Garvey, John; McGrath, Fergal

    2011-01-01

    In this paper, prediction markets are presented as an innovative pedagogical tool which can be used to create a Rich Environment for Active Learning (REAL). Prediction markets are designed to make forecasts about specific future events by using a market mechanism to aggregate the information held by a large group of traders about that event into a…

  17. Sports-Related Concussion Occurrence at Various Time Points During High School Athletic Events: Part 2.

    PubMed

    Covassin, Tracey; Petit, Kyle M; Savage, Jennifer L; Bretzin, Abigail C; Fox, Meghan E; Walker, Lauren F; Gould, Daniel

    2018-06-01

    Sports-related concussion (SRC) injury rates, and identifying those athletes at the highest risk, have been a primary research focus. However, no studies have evaluated at which time point during an athletic event athletes are most susceptible to SRCs. To determine the clinical incidence of SRCs during the start, middle, and end of practice and competition among high school male and female athletes in the state of Michigan. Descriptive epidemiological study. There were 110,774 male and 71,945 female student-athletes in grades 9 through 12 (mean time in high school, 2.32 ± 1.1 years) who participated in sponsored athletic activities (13 sports) during the 2015-2016 academic year. An SRC was diagnosed and managed by a medical professional (ie, MD, DO, PA, NP). SRC injuries were reported by certified athletic trainers, athletic administrators, and coaches using the Michigan High School Athletic Association Head Injury Reporting System. Time of SRC was defined as the beginning, middle, or end of practice/competition. Clinical incidence was calculated by dividing the number of SRCs in a time point (eg, beginning) by the total number of participants in a sport per 100 student-athletes (95% CI). Risk ratios were calculated by dividing one time point by another time point. There were 4314 SRCs reported, with the highest in football, women's basketball, and women's soccer. The total clinical incidence for all sports was 2.36 (95% CI, 2.29-2.43) per 100 student-athletes. The most common time for SRCs was the middle, followed by the end of all events. Athletes had a 4.90 (95% CI, 4.44-5.41) and 1.50 (95% CI, 1.40-1.60) times greater risk during the middle of all events when compared with the beginning and end, respectively. There was a 3.28 (95% CI, 2.96-3.63) times greater risk at the end of all events when compared with the beginning. Athletes were at the greatest risk for SRCs at the middle of practice and competition when compared with the beginning and end. 
The current study suggests that medical attention is particularly important during the middle of all athletic events. Intervention measures to limit SRCs may be most beneficial during the middle of athletic events.
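
The clinical-incidence and risk-ratio arithmetic described above is straightforward to reproduce. The counts below are illustrative round numbers, not the study's actual per-time-point data.

```python
# Sketch of the incidence and risk-ratio arithmetic from the study;
# the case and participant counts here are invented for illustration.
def clinical_incidence(cases, participants):
    """SRCs per 100 student-athletes."""
    return 100 * cases / participants

def risk_ratio(incidence_a, incidence_b):
    """How many times greater the risk is in group A than in group B."""
    return incidence_a / incidence_b

beginning = clinical_incidence(300, 100000)   # 0.30 per 100 athletes
middle = clinical_incidence(1470, 100000)     # 1.47 per 100 athletes
print(round(risk_ratio(middle, beginning), 2))  # -> 4.9
```

With these made-up counts the middle-of-event risk ratio comes out at 4.9, matching the magnitude the study reports for middle versus beginning.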

  18. Multilingual Analysis of Twitter News in Support of Mass Emergency Events

    NASA Astrophysics Data System (ADS)

    Zielinski, A.; Bügel, U.; Middleton, L.; Middleton, S. E.; Tokarchuk, L.; Watson, K.; Chaves, F.

    2012-04-01

    Social media are increasingly becoming an additional source of information for event-based early warning systems in the sense that they can help to detect natural crises and support crisis management during or after disasters. Within the European FP7 TRIDEC project we study the problem of analyzing multilingual twitter feeds for emergency events. Specifically, we consider tsunami and earthquakes, as one possible originating cause of tsunami, and propose to analyze twitter messages for capturing testified information at affected points of interest in order to obtain a better picture of the actual situation. For tsunami, these could be the so-called Forecast Points, i.e. agreed-upon points chosen by the Regional Tsunami Warning Centers (RTWC) and the potentially affected countries, which must be considered when calculating expected tsunami arrival times. Generally, local civil protection authorities and the population are likely to respond in their native languages. Therefore, the present work focuses on English as "lingua franca" and on under-resourced Mediterranean languages in endangered zones, particularly in Turkey, Greece, and Romania. We investigated ten earthquake events and defined four language-specific classifiers that can be used to detect natural crisis events by filtering out irrelevant messages that do not relate to the event. Preliminary results indicate that such a filter has the potential to support earthquake detection and could be integrated into seismographic sensor networks. One hindrance in our study is the lack of geo-located data for asserting the geographical origin of the tweets and thus observing correlations of events across languages. One way to overcome this deficit consists in identifying geographic names contained in tweets that correspond to, or are located in the vicinity of, specific points-of-interest such as the forecast points of the tsunami scenario. 
We also intend to use twitter analysis for situation picture assessment, e.g. for planning relief actions. At present, a multilingual corpus of Twitter messages related to crises is being assembled, and domain-specific language resources such as multilingual terminology lists and language-specific Natural Language Processing (NLP) tools are being built up to help cross the language barrier. The final goal is to extend this work to the main languages spoken around the Mediterranean and to classify and extract relevant information from tweets, translating the main keywords into English.

  19. Automatic visualization of 3D geometry contained in online databases

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; John, Nigel W.

    2003-04-01

    In this paper, the application of the Virtual Reality Modeling Language (VRML) for efficient database visualization is analyzed. With the help of JAVA programming, three examples of automatic visualization from a database containing 3-D geometry are given. The first example is used to create basic geometries. The second example is used to create cylinders with a defined start point and end point. The third example is used to process data from an old copper mine complex in Cheshire, United Kingdom. Interactive 3-D visualization of all geometric data in an online database is achieved with JSP technology.
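
The second example, a cylinder spanning two given points, requires deriving placement parameters, since VRML cylinders are defined aligned with the local y axis. The sketch below shows the geometry involved (translation to the midpoint, height from the segment length, and an axis-angle rotation from the y axis onto the segment direction); it is a hedged illustration, not the paper's JAVA code.

```python
import math

# Illustrative sketch (not the paper's implementation): compute the
# VRML Transform parameters for a cylinder spanning p1 -> p2.
# Assumes p1 != p2; VRML cylinders are y-axis aligned by default.
def cylinder_between(p1, p2):
    dx, dy, dz = (b - a for a, b in zip(p1, p2))
    height = math.sqrt(dx * dx + dy * dy + dz * dz)
    center = tuple((a + b) / 2 for a, b in zip(p1, p2))
    # Rotate the y axis (0,1,0) onto the segment direction d:
    # rotation axis = y x d, rotation angle = arccos(d_y / |d|).
    axis = (dz, 0.0, -dx)
    angle = math.acos(dy / height)
    return center, height, axis, angle

center, h, axis, angle = cylinder_between((0, 0, 0), (0, 0, 2))
print(center, h)  # -> (0.0, 0.0, 1.0) 2.0
```

The returned values map directly onto a VRML `Transform { translation ... rotation ... }` wrapping a `Cylinder { height ... }` node.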

  20. A Coordinated Initialization Process for the Distributed Space Exploration Simulation

    NASA Technical Reports Server (NTRS)

    Crues, Edwin Z.; Phillips, Robert G.; Dexter, Dan; Hasan, David

    2007-01-01

    A viewgraph presentation on the federate initialization process for the Distributed Space Exploration Simulation (DSES) is described. The topics include: 1) Background: DSES; 2) Simulation requirements; 3) Nine Step Initialization; 4) Step 1: Create the Federation; 5) Step 2: Publish and Subscribe; 6) Step 3: Create Object Instances; 7) Step 4: Confirm All Federates Have Joined; 8) Step 5: Achieve initialize Synchronization Point; 9) Step 6: Update Object Instances With Initial Data; 10) Step 7: Wait for Object Reflections; 11) Step 8: Set Up Time Management; 12) Step 9: Achieve startup Synchronization Point; and 13) Conclusions
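
Steps 5 and 9 above hinge on synchronization points: no federate may proceed until every federate has reached the same point. A minimal analogue of this pattern, using Python's `threading.Barrier` and hypothetical federate names (not from the DSES presentation), looks like this:

```python
import threading

# Minimal analogue of the "initialize" synchronization point (step 5):
# each federate does its initialization, then waits at a barrier so no
# federate proceeds to data exchange before all have initialized.
# Federate names are hypothetical, not from the DSES presentation.
N_FEDERATES = 3
init_point = threading.Barrier(N_FEDERATES)
log = []

def federate(name):
    log.append(f"{name} initialized")
    init_point.wait()            # achieve the synchronization point
    log.append(f"{name} running")

threads = [threading.Thread(target=federate, args=(f"fed{i}",))
           for i in range(N_FEDERATES)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# Every "initialized" entry is guaranteed to precede every "running" entry.
```

In HLA-based simulations like DSES the same role is played by federation-wide synchronization points registered with the runtime infrastructure, rather than in-process barriers.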

  1. Everolimus-Eluting Stents or Bypass Surgery for Left Main Coronary Artery Disease.

    PubMed

    Stone, Gregg W; Sabik, Joseph F; Serruys, Patrick W; Simonton, Charles A; Généreux, Philippe; Puskas, John; Kandzari, David E; Morice, Marie-Claude; Lembo, Nicholas; Brown, W Morris; Taggart, David P; Banning, Adrian; Merkely, Béla; Horkay, Ferenc; Boonstra, Piet W; van Boven, Ad J; Ungi, Imre; Bogáts, Gabor; Mansour, Samer; Noiseux, Nicolas; Sabaté, Manel; Pomar, José; Hickey, Mark; Gershlick, Anthony; Buszman, Pawel; Bochenek, Andrzej; Schampaert, Erick; Pagé, Pierre; Dressler, Ovidiu; Kosmidou, Ioanna; Mehran, Roxana; Pocock, Stuart J; Kappetein, A Pieter

    2016-12-08

    Patients with obstructive left main coronary artery disease are usually treated with coronary-artery bypass grafting (CABG). Randomized trials have suggested that drug-eluting stents may be an acceptable alternative to CABG in selected patients with left main coronary disease. We randomly assigned 1905 eligible patients with left main coronary artery disease of low or intermediate anatomical complexity to undergo either percutaneous coronary intervention (PCI) with fluoropolymer-based cobalt-chromium everolimus-eluting stents (PCI group, 948 patients) or CABG (CABG group, 957 patients). Anatomic complexity was assessed at the sites and defined by a Synergy between Percutaneous Coronary Intervention with Taxus and Cardiac Surgery (SYNTAX) score of 32 or lower (the SYNTAX score reflects a comprehensive angiographic assessment of the coronary vasculature, with 0 as the lowest score and higher scores [no upper limit] indicating more complex coronary anatomy). The primary end point was the rate of a composite of death from any cause, stroke, or myocardial infarction at 3 years, and the trial was powered for noninferiority testing of the primary end point (noninferiority margin, 4.2 percentage points). Major secondary end points included the rate of a composite of death from any cause, stroke, or myocardial infarction at 30 days and the rate of a composite of death, stroke, myocardial infarction, or ischemia-driven revascularization at 3 years. Event rates were based on Kaplan-Meier estimates in time-to-first-event analyses. At 3 years, a primary end-point event had occurred in 15.4% of the patients in the PCI group and in 14.7% of the patients in the CABG group (difference, 0.7 percentage points; upper 97.5% confidence limit, 4.0 percentage points; P=0.02 for noninferiority; hazard ratio, 1.00; 95% confidence interval, 0.79 to 1.26; P=0.98 for superiority). 
The secondary end-point event of death, stroke, or myocardial infarction at 30 days occurred in 4.9% of the patients in the PCI group and in 7.9% in the CABG group (P<0.001 for noninferiority, P=0.008 for superiority). The secondary end-point event of death, stroke, myocardial infarction, or ischemia-driven revascularization at 3 years occurred in 23.1% of the patients in the PCI group and in 19.1% in the CABG group (P=0.01 for noninferiority, P=0.10 for superiority). In patients with left main coronary artery disease and low or intermediate SYNTAX scores by site assessment, PCI with everolimus-eluting stents was noninferior to CABG with respect to the rate of the composite end point of death, stroke, or myocardial infarction at 3 years. (Funded by Abbott Vascular; EXCEL ClinicalTrials.gov number, NCT01205776 .).
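
The noninferiority logic reported above can be stated compactly: PCI is declared noninferior when the upper confidence limit of the event-rate difference falls below the prespecified margin. The sketch below uses the trial's reported figures (difference 0.7 points, upper 97.5% confidence limit 4.0 points, margin 4.2 points); it is a simplified illustration of the decision rule, not the trial's statistical analysis.

```python
# Simplified noninferiority decision rule using the figures reported
# in the abstract (margin 4.2 percentage points, upper confidence
# limit 4.0 percentage points). Illustration only.
def noninferior(upper_ci_limit, margin):
    """True if the upper confidence limit of the event-rate
    difference stays below the noninferiority margin."""
    return upper_ci_limit < margin

print(noninferior(4.0, 4.2))  # -> True: 4.0 < 4.2, noninferiority met
```

Note that noninferiority (4.0 < 4.2) can hold even when superiority clearly does not, as here (P=0.98 for superiority).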

  2. GLOBE Observer and the Association of Science & Technology Centers: Leveraging Citizen Science and Partnerships for an International Science Experiment to Build Climate Literacy

    NASA Astrophysics Data System (ADS)

    Riebeek Kohl, H.; Chambers, L. H.; Murphy, T.

    2016-12-01

    For more than 20 years, the Global Learning and Observations to Benefit the Environment (GLOBE) Program has sought to increase environmental literacy in students by involving them in the process of data collection and scientific research. In 2016, the program expanded to accept observations from citizen scientists of all ages through a relatively simple app. Called GLOBE Observer, the new program aims to help participants feel connected to a global community focused on advancing the scientific understanding of Earth system science while building climate literacy among participants and increasing valuable environmental data points to expand both student and scientific research. In October 2016, GLOBE Observer partnered with the Association of Science & Technology Centers (ASTC) in an international science experiment in which museums and patrons around the world collected cloud observations through GLOBE Observer to create a global cloud map in support of NASA satellite science. The experiment was an element of the International Science Center and Science Museum Day, an event planned in partnership with UNESCO and ASTC. Museums and science centers provided the climate context for the observations, while GLOBE Observer offered a uniform experience and a digital platform to build a connected global community. This talk will introduce GLOBE Observer and will present the results of the experiment, including evaluation feedback on gains in climate literacy through the event.

  3. How the IMF By induces a By component in the closed magnetosphere and how it leads to asymmetric currents and convection patterns in the two hemispheres

    NASA Astrophysics Data System (ADS)

    Tenfjord, Paul; Østgaard, Nikolai; Snekvik, Kristian; Reistad, Jone; Magnus Laundal, Karl; Haaland, Stein; Milan, Steve

    2016-04-01

    We describe the effects of the interplanetary magnetic field (IMF) By component on the coupling between the solar wind and magnetosphere-ionosphere system using AMPERE observations and MHD simulations. We show how By is induced on closed magnetospheric field lines on both the dayside and nightside. The magnetosphere imposes asymmetric forces on the ionosphere, and the effects on the ionospheric flow are characterized by distorted convection cell patterns, often referred to as "banana" and "orange" cell patterns. The flux asymmetrically added to the lobes results in a nonuniform induced By in the closed magnetosphere. We present a mechanism that predicts asymmetric Birkeland currents at conjugate foot points. Asymmetric Birkeland currents are created as a consequence of y directed tension contained in the return flow. Associated with these currents, we expect aurora and fast localized ionospheric azimuthal flows present in one hemisphere but not necessarily in the other. We present a statistical study where we show that these processes should occur on timescales of about 30 minutes after the IMF By has arrived at the magnetopause. We also present an event with simultaneous global imaging of the aurora and SuperDARN measurements from both hemispheres. The event is interpreted as an example of the proposed asymmetric current mechanism.

  4. Health throughout the lifespan: The phenomenon of the inner child reflected in events during childhood experienced by older persons

    PubMed Central

    Sjöblom, Margareta; Öhrling, Kerstin; Prellwitz, Maria; Kostenius, Catrine

    2016-01-01

    The aim of this study was to describe and gain more knowledge of the phenomenon of the inner child, reflected in events during childhood experienced by older persons. Thirteen older persons aged 70 to 91 years old were interviewed. A hermeneutical phenomenological analysis of the data revealed two main themes: the inner child becomes visible and the inner child's presence through life. The participants’ narratives showed that their understanding of the experiences included both positive and negative feelings, as well as ways to be creative, in which the inner child became visible. The participants’ experiences indicated that the inner child was present throughout the lifespan, was found in challenges that occurred in life, and could turn something bad into something good. However, the presence of the inner child could also be a source for development throughout life and could interfere with the person. The findings from this study point to older persons’ need to be recognized, acknowledged, and understood as a unique person living his or her own life. In addition, dimensions of well-being such as feeling safe, loved, supported, and creating space for fantasy and possibilities can be compared to the physical, mental, social, and existential dimensions of well-being found in WHO surveys and definitions of health. This calls for a holistic approach when caring for older persons. PMID:27317381

  5. Health throughout the lifespan: The phenomenon of the inner child reflected in events during childhood experienced by older persons.

    PubMed

    Sjöblom, Margareta; Öhrling, Kerstin; Prellwitz, Maria; Kostenius, Catrine

    2016-01-01

    The aim of this study was to describe and gain more knowledge of the phenomenon of the inner child, reflected in events during childhood experienced by older persons. Thirteen older persons aged 70 to 91 years old were interviewed. A hermeneutical phenomenological analysis of the data revealed two main themes: the inner child becomes visible and the inner child's presence through life. The participants' narratives showed that their understanding of the experiences included both positive and negative feelings, as well as ways to be creative, in which the inner child became visible. The participants' experiences indicated that the inner child was present throughout the lifespan, was found in challenges that occurred in life, and could turn something bad into something good. However, the presence of the inner child could also be a source for development throughout life and could interfere with the person. The findings from this study point to older persons' need to be recognized, acknowledged, and understood as a unique person living his or her own life. In addition, dimensions of well-being such as feeling safe, loved, supported, and creating space for fantasy and possibilities can be compared to the physical, mental, social, and existential dimensions of well-being found in WHO surveys and definitions of health. This calls for a holistic approach when caring for older persons.

  6. Neural assembly computing.

    PubMed

    Ranhel, João

    2012-06-01

    Spiking neurons can realize several computational operations when firing cooperatively. This is a prevalent notion, although the mechanisms are not yet understood. A way by which neural assemblies compute is proposed in this paper. It is shown how neural coalitions represent things (and world states), memorize them, and control their hierarchical relations in order to perform algorithms. It is described how neural groups perform statistic logic functions as they form assemblies. Neural coalitions can reverberate, becoming bistable loops. Such bistable neural assemblies become short- or long-term memories that represent the event that triggers them. In addition, assemblies can branch and dismantle other neural groups generating new events that trigger other coalitions. Hence, such capabilities and the interaction among assemblies allow neural networks to create and control hierarchical cascades of causal activities, giving rise to parallel algorithms. Computing and algorithms are used here as in a nonstandard computation approach. In this sense, neural assembly computing (NAC) can be seen as a new class of spiking neural network machines. NAC can explain the following points: 1) how neuron groups represent things and states; 2) how they retain binary states in memories that do not require any plasticity mechanism; and 3) how branching, disbanding, and interaction among assemblies may result in algorithms and behavioral responses. Simulations were carried out and the results are in agreement with the hypothesis presented. A MATLAB code is available as a supplementary material.
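
The core claim above, that a reverberating assembly forms a bistable loop acting as a memory without any plasticity mechanism, can be caricatured in a few lines. This is a toy latch abstraction, not the paper's spiking-network MATLAB model: once a triggering event activates the assembly, mutual excitation sustains the active state until another assembly dismantles it.

```python
# Toy abstraction (not the paper's spiking model): a reverberating
# neural assembly as a bistable latch. Triggering sets the state,
# reverberation sustains it with no synaptic plasticity, and a
# dismantling input from another assembly clears it.
class BistableAssembly:
    def __init__(self):
        self.active = False

    def step(self, trigger=False, dismantle=False):
        if dismantle:
            self.active = False      # another coalition disbands this one
        elif trigger:
            self.active = True       # an event ignites the assembly
        # otherwise mutual excitation sustains the current state
        return self.active

a = BistableAssembly()
print(a.step(trigger=True))    # True: event triggers the memory
print(a.step())                # True: state retained without plasticity
print(a.step(dismantle=True))  # False: disbanded by another assembly
```

Chains of such latches, where one assembly's activation triggers or dismantles others, give the hierarchical cascades of causal activity the paper describes.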

  7. Possible role of electric forces in bromine activation during polar boundary layer ozone depletion and aerosol formation events

    NASA Astrophysics Data System (ADS)

    Tkachenko, Ekaterina

    2017-11-01

    This work presents a hypothesis about the mechanism of bromine activation during polar boundary layer ozone depletion events (ODEs) as well as the mechanism of aerosol formation from the frost flowers. The author suggests that ODEs may be initiated by the electric-field gradients created at the sharp tips of ice formations as a result of the combined effect of various environmental conditions. According to the author's estimates, these electric-field gradients may be sufficient for the onset of point or corona discharges followed by generation of high local concentrations of the reactive oxygen species and initiation of free-radical and redox reactions. This process may be responsible for the formation of seed bromine which then undergoes further amplification by HOBr-driven bromine explosion. The proposed hypothesis may explain a variety of environmental conditions and substrates as well as poor reproducibility of ODE initiation observed by researchers in the field. According to the author's estimates, high wind can generate sufficient conditions for overcoming the Rayleigh limit and thus can initiate "spraying" of charged aerosol nanoparticles. These charged aerosol nanoparticles can provoke formation of free radicals, turning the ODE on. One can also envision a possible emission of halogen ion as a result of the "electrospray" process analogous to that of electrospray ionization mass-spectrometry.

  8. Dynamic water behaviour due to one trapped air pocket in a laboratory pipeline apparatus

    NASA Astrophysics Data System (ADS)

    Bergant, A.; Karadžić, U.; Tijsseling, A.

    2016-11-01

    Trapped air pockets may cause severe operational problems in hydropower and water supply systems. A locally isolated air pocket creates distinct amplitude, shape and timing of pressure pulses. This paper investigates dynamic behaviour of a single trapped air pocket. The air pocket is incorporated as a boundary condition into the discrete gas cavity model (DGCM). DGCM allows small gas cavities to form at computational sections in the method of characteristics (MOC). The growth of the pocket and gas cavities is described by the water hammer compatibility equation(s), the continuity equation for the cavity volume, and the equation of state of an ideal gas. Isentropic behaviour is assumed for the trapped gas pocket and an isothermal bath for small gas cavities. Experimental investigations have been performed in a laboratory pipeline apparatus. The apparatus consists of an upstream end high-pressure tank, a horizontal steel pipeline (total length 55.37 m, inner diameter 18 mm), four valve units positioned along the pipeline including the end points, and a downstream end tank. A trapped air pocket is captured between two ball valves at the downstream end of the pipeline. The transient event is initiated by rapid opening of the upstream end valve; the downstream end valve stays closed during the event. Predicted and measured results for a few typical cases are compared and discussed.
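The two gas laws named above can be written down directly; a minimal sketch with illustrative values (not the DGCM implementation): the trapped pocket is isentropic, p·V^κ = const, while the small DGCM cavities are isothermal, p·V = const.

```python
# Hedged sketch of the gas laws stated in the abstract. Values are
# illustrative, not from the laboratory apparatus.

kappa = 1.4                      # isentropic exponent for air
p0, V0 = 101_325.0, 1.0e-4       # initial pocket pressure [Pa], volume [m^3]

def pocket_pressure(V, p_init=p0, V_init=V0, k=kappa):
    """Isentropic trapped air pocket: p = p0 * (V0 / V)**kappa."""
    return p_init * (V_init / V) ** k

def cavity_pressure(V, p_init=p0, V_init=V0):
    """Isothermal small gas cavity: p = p0 * V0 / V."""
    return p_init * V_init / V

# Halving the pocket volume raises its pressure by 2**1.4 ~ 2.64x,
# noticeably more than the isothermal factor of 2 for small cavities.
print(pocket_pressure(V0 / 2) / p0)
print(cavity_pressure(V0 / 2) / p0)
```

The difference between the two exponents is why the trapped pocket produces the distinct amplitude and timing of pressure pulses that the paper investigates.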

  9. Remembering tips

    MedlinePlus

    Memory aids; Alzheimer disease - remembering tips; Early memory loss - remembering tips; Dementia - remembering tips ... harder for your brain to create a new memory, even while you can remember actions and events ...

  10. Automation: Decision Aid or Decision Maker?

    NASA Technical Reports Server (NTRS)

    Skitka, Linda J.

    1998-01-01

    This study clarified that automation bias is something unique to automated decision-making contexts, and is not the result of a general tendency toward complacency. By comparing performance on exactly the same events on the same tasks with and without an automated decision aid, we were able to determine that at least the omission-error part of automation bias is due to the unique context created by having an automated decision aid, and is not a phenomenon that would occur even if people were not in an automated context. However, this study also revealed that having an automated decision aid did lead to modestly improved performance across all non-error events. Participants in the non-automated condition responded with 83.68% accuracy, whereas participants in the automated condition responded with 88.67% accuracy, across all events. Automated decision aids clearly led to better overall performance when they were accurate. People performed almost exactly at the level of reliability of the automation (which across events was 88% reliable). However, it is also clear that the presence of less than 100% accurate automated decision aids creates a context in which new kinds of errors in decision making can occur. Participants in the non-automated condition responded with 97% accuracy on the six "error" events, whereas participants in the automated condition had only a 65% accuracy rate when confronted with those same six events. In short, the presence of an AMA can lead to vigilance decrements that can lead to errors in decision making.

  11. Super Clausius-Clapeyron scaling of extreme hourly precipitation and its relation to large-scale atmospheric conditions

    NASA Astrophysics Data System (ADS)

    Lenderink, Geert; Barbero, Renaud; Loriaux, Jessica; Fowler, Hayley

    2017-04-01

    Present-day precipitation-temperature scaling relations indicate that hourly precipitation extremes may have a response to warming exceeding the Clausius-Clapeyron (CC) relation; for The Netherlands the dependency on surface dew point temperature follows two times the CC relation, corresponding to 14% per degree. Our hypothesis, supported by a simple physical argument presented here, is that this 2CC behaviour arises from the physics of convective clouds. That is, we think this response is due to local feedbacks related to the convective activity, while other large-scale atmospheric forcing conditions remain similar except for the higher temperature (approximately uniform warming with height) and absolute humidity (corresponding to the assumption of unchanged relative humidity). To test this hypothesis, we analysed the large-scale atmospheric conditions accompanying summertime afternoon precipitation events, using surface observations in The Netherlands combined with a regional re-analysis. Events are precipitation measurements clustered in time and space, derived from approximately 30 automatic weather stations. The hourly peak intensities of these events again reveal a 2CC scaling with the surface dew point temperature. The temperature excess of moist updrafts initialized at the surface and the maximum cloud depth are clear functions of surface dew point temperature, confirming the key role of surface humidity in convective activity. Almost no differences in relative humidity and the dry temperature lapse rate were found across the dew point temperature range, supporting our theory that 2CC scaling is mainly due to the response of convection to increases in near-surface humidity, while other atmospheric conditions remain similar. Additionally, hourly precipitation extremes are on average accompanied by substantial large-scale upward motions and therefore large-scale moisture convergence, which appears to accelerate with surface dew point. This increase in large-scale moisture convergence appears to be a consequence of latent heat release due to the convective activity, as estimated from the quasi-geostrophic omega equation. Consequently, most hourly extremes occur in precipitation events with considerable spatial extent. Importantly, this event size appears to increase rapidly at the highest dew point temperatures, suggesting potentially strong impacts of climatic warming.
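As a rough numerical illustration of the scaling relation above (only the ~7% CC and ~14% 2CC rates come from the text; the reference intensity and dew points are made-up example values):

```python
# Illustrative sketch of CC vs 2CC scaling of hourly precipitation extremes.
# Clausius-Clapeyron gives ~7% more moisture per degree; the 2CC response
# reported for Dutch hourly extremes is ~14% per degree.

CC = 1.07    # ~7 % per degree (Clausius-Clapeyron)
CC2 = 1.14   # ~14 % per degree (2CC)

def scaled_intensity(dewpoint, ref_intensity=20.0, ref_dewpoint=15.0, rate=CC2):
    """Hourly peak intensity [mm/h] scaled from a reference dew point [deg C].

    ref_intensity and ref_dewpoint are hypothetical example values.
    """
    return ref_intensity * rate ** (dewpoint - ref_dewpoint)

# A 3-degree rise in surface dew point gives ~48% more intense hourly
# peaks under 2CC, versus ~23% under plain CC.
print(scaled_intensity(18.0) / scaled_intensity(15.0))                    # ~1.48
print(scaled_intensity(18.0, rate=CC) / scaled_intensity(15.0, rate=CC))  # ~1.23
```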

  12. On the Brink: Instability and the Prospect of State Failure in Pakistan

    DTIC Science & Technology

    2010-04-12

    unpredictable posture. Most importantly, these historical events, coupled with current political, economic, and security-related issues, have created a fragile state with the propensity to fail. Therefore, this monograph highlights...hardships that have affected the state's political stability, economic performance, and security. These unrelenting problems lie at the foundation

  13. Picture Books: Can They Help Caregivers Create an "Illusion of Safety" for Children in Unsafe Times?

    ERIC Educational Resources Information Center

    McNamee, Abigail; Mercurio, Mia Lynn

    2006-01-01

    The authors believe that children need to feel safe--they need the illusion of safety--so that they can develop in a healthy way. But it is an "illusion" because in reality safety is never guaranteed for anyone. At times, traumatic events disrupt the safe environments that people have created. Janoff-Buhlman (1992) describes the "shattering of…

  14. Concurrent Schedules of Positive and Negative Reinforcement: Differential-Impact and Differential-Outcomes Hypotheses

    ERIC Educational Resources Information Center

    Magoon, Michael A.; Critchfield, Thomas S.

    2008-01-01

    Considerable evidence from outside of operant psychology suggests that aversive events exert greater influence over behavior than equal-sized positive-reinforcement events. Operant theory is largely moot on this point, and most operant research is uninformative because of a scaling problem that prevents aversive events and those based on positive…

  15. Measure of mechanical impacts in commercial blueberry packing lines and potential damage to blueberry fruit

    USDA-ARS?s Scientific Manuscript database

    Modern blueberry packing lines create impact damage to blueberries which will result in fruit bruising. In this study, impacts created by commercial blueberry packing lines were measured quantitatively using a miniature instrumented sphere. Impacts were recorded at transfer points. Average peakG ...

  16. Greek Mythology: Cultures and Art. ArtsEdge Curricula, Lessons and Activities.

    ERIC Educational Resources Information Center

    Nickerson, Charles

    The visual arts offer aesthetic, perceptual, creative, and intellectual opportunities. This lesson points out that by creating and painting mythological characters, students will improve their ability to analyze, reorganize, critique, and create. The lesson also intends for fourth-grade students to gain insight into Greek culture through the…

  17. Cyberkids

    ERIC Educational Resources Information Center

    Clifford, Pat

    2005-01-01

    While critics draw important attention to worrisome aspects of digital cultures, they may be missing a much larger point about how young people live in a digital world, and how they create and re-create themselves and their identities in ways that are remarkably foreign to others, the "digital immigrants". In this article, "digital immigrants" refers to…

  18. Directing Diplomacy: Creating the Best Experience for Everyone in the Show.

    ERIC Educational Resources Information Center

    Mulcahy, Lisa

    2003-01-01

    Discusses the subtle psychological strategies good directors know how to employ with actors. Contends that if a director demonstrates a diplomatic attitude toward every student involved in a production, a perfect working atmosphere is created. Explores diplomacy basics; first impressions; rehearsal problems; personality issues; and talking points.…

  19. Authentic, Dialogical Knowledge Construction: A Blended and Mobile Teacher Education Programme

    ERIC Educational Resources Information Center

    Ruhalahti, Sanna; Korhonen, Anne-Maria; Rasi, Päivi

    2017-01-01

    Background: Knowledge construction and technology have been identified as critical for an understanding of the future of teacher education. Knowledge is discovered, applied and created collaboratively from authentic starting points. Today's new mobile and blended learning environments create increased opportunities for such processes, including…

  20. Towards a tipping point in responding to change: rising costs, fewer options for Arctic and global societies.

    PubMed

    Huntington, Henry P; Goodstein, Eban; Euskirchen, Eugénie

    2012-02-01

    Climate change incurs costs, but government adaptation budgets are limited. Beyond a certain point, individuals must bear the costs or adapt to new circumstances, creating political-economic tipping points that we explore in three examples. First, many Alaska Native villages are threatened by erosion, but relocation is expensive. To date, critically threatened villages have not yet been relocated, suggesting that we may already have reached a political-economic tipping point. Second, forest fires shape landscape and ecological characteristics in interior Alaska. Climate-driven changes in fire regime require increased fire-fighting resources to maintain current patterns of vegetation and land use, but these resources appear to be less and less available, indicating an approaching tipping point. Third, rapid sea level rise, for example from accelerated melting of the Greenland ice sheet, will create a choice between protection and abandonment for coastal regions throughout the world, a potential global tipping point comparable to those now faced by Arctic communities. The examples illustrate the basic idea that if costs of response increase more quickly than available resources, then society has fewer and fewer options as time passes.
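The closing idea, response costs rising faster than available resources, can be illustrated with a toy model (mine, not the authors'): the political-economic tipping point is the first year in which projected costs exceed the budget.

```python
# Toy illustration (not from the paper): adaptation costs growing faster
# than a slowly growing budget. The growth rates and starting values below
# are hypothetical.

def tipping_year(cost0, cost_growth, budget0, budget_growth, horizon=100):
    """Return the first year in which cost exceeds budget, or None."""
    cost, budget = cost0, budget0
    for year in range(horizon):
        if cost > budget:
            return year
        cost *= 1 + cost_growth
        budget *= 1 + budget_growth
    return None

# Costs grow 8%/yr while the budget grows 2%/yr: even a budget three times
# the initial cost is overtaken within a few decades.
print(tipping_year(cost0=1.0, cost_growth=0.08, budget0=3.0, budget_growth=0.02))
```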

  1. Public engagement in 3D flood modelling through integrating crowd sourced imagery with UAV photogrammetry to create a 3D flood hydrograph.

    NASA Astrophysics Data System (ADS)

    Bond, C. E.; Howell, J.; Butler, R.

    2016-12-01

    With an increase in flood and storm events affecting infrastructure, the role of weather systems in a changing climate, and their impact, is of increasing interest. Here we present a new workflow integrating crowd-sourced imagery from the public with UAV photogrammetry to create the first 3D hydrograph of a major flooding event. On December 30th 2015, Storm Frank brought high-magnitude rainfall to the Dee catchment in Aberdeenshire, resulting in the highest river level ever recorded for the Dee, with significant impact on infrastructure and river morphology. The worst of the flooding occurred during daylight hours and was digitally captured by the public on smart phones and cameras. After the flood event a UAV was used to acquire photogrammetry to create a textured elevation model of the area around Aboyne Bridge on the River Dee. A media campaign solicited crowd-sourced digital imagery from the public, resulting in over 1,000 submitted images. EXIF time and date data captured with the imagery were used to sort the images into a time series. Markers such as signs, walls, fences and roads within the images were used to determine river level height through the flood, and matched onto the elevation model to contour the change in river level. The resulting 3D hydrograph shows the build-up of water on the upstream side of the bridge that resulted in significant scouring and undermining during the flood. We have created the first known data-based 3D hydrograph for a river section, from a UAV photogrammetric model and crowd-sourced imagery. For future flood warning and infrastructure management, a solution that allows a real-time hydrograph to be created, using augmented reality to integrate the river level information in crowd-sourced imagery directly onto a 3D model, would significantly improve management planning and infrastructure resilience assessment.
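The EXIF-based time-series step could look like the following sketch (an assumed workflow, not the authors' code), using the standard EXIF timestamp format "YYYY:MM:DD HH:MM:SS"; the filenames and timestamps are invented examples.

```python
# Sketch: sort crowd-sourced flood images into a time series using their
# EXIF DateTimeOriginal strings. Filenames/timestamps are hypothetical.

from datetime import datetime

def to_time_series(images):
    """images: list of (filename, exif_datetime_string) tuples.

    Returns filenames in chronological order.
    """
    parsed = [(datetime.strptime(ts, "%Y:%m:%d %H:%M:%S"), name)
              for name, ts in images]
    return [name for _, name in sorted(parsed)]

submissions = [
    ("IMG_0042.jpg", "2015:12:30 14:05:11"),
    ("IMG_0007.jpg", "2015:12:30 09:30:02"),
    ("IMG_0099.jpg", "2015:12:30 11:47:55"),
]
print(to_time_series(submissions))
```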

  2. Volcanic Tephra ejected in south eastern Asia is the sole cause of all historic ENSO events. This natural aerosol plume has been intensified by an anthropogenic plume in the same region in recent decades which has intensified some ENSO events and altered the Southern Oscillation Index characteristics

    NASA Astrophysics Data System (ADS)

    Potts, K. A.

    2017-12-01

    ENSO events are the most significant perturbation of the climate system. Previous attempts to link ENSO with volcanic eruptions typically failed because only large eruptions across the globe, which eject tephra into the stratosphere, were considered. I analyse all volcanic eruptions in South Eastern (SE) Asia (10°S to 10°N and from 90°E to 160°E), the most volcanically active area in the world, with over 23% of all eruptions in the Global Volcanism Program database occurring here and with 5 volcanoes stated to have erupted nearly continuously for 30 years. SE Asia is also the region where the convective arm of the thermally direct Walker Circulation occurs, driven by the intense equatorial solar radiation which creates the high surface temperature. The volcanic tephra plume intercepts some of the solar radiation by absorption/reflection, which cools the surface and heats the atmosphere, creating a temperature inversion compared to periods without the plume. This reduces convection and causes the Walker Cell and Trade Winds to weaken. The reduced wind speed causes the central Pacific Ocean to warm, which creates convection there and further weakens the Walker Cell. With the reduced wind stress, the western Pacific warm pool migrates east. This creates an ENSO event, which continues until the tephra plume reduces, typically when the SE Asian monsoon commences; convection is then re-established over SE Asia and the Pacific warm pool migrates back to the west. Correlations of SE Asian tephra and the ENSO indices are typically over 0.80 at p < 0.01. In recent decades the anthropogenic SE Asian aerosol Plume (SEAP) has intensified the volcanic plume in some years from August to November. Using NASA satellite data from 1978 and the NASA MERRA-2 reanalysis dataset, I show correlation coefficients typically over 0.70 and up to 0.97 at p < 0.01 between the aerosol optical depth or index and the ENSO indices. If two events A and B correlate, five options are available: 1. A causes B; 2. B causes A; 3. C, another event, causes A and B simultaneously; 4. It is a coincidence; and 5. The relationship is complex, with feedback. The volcanic correlations only allow options 1 or 4, as ENSO cannot cause volcanoes to erupt, and are backed up by several independent satellite datasets. I conclude that volcanic and anthropogenic aerosols over SE Asia are the sole cause of all ENSO events.
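The correlation analysis described above amounts to computing a Pearson correlation coefficient between an aerosol series and an ENSO index series. A minimal sketch with synthetic stand-in numbers (not the NASA/MERRA-2 data used by the author):

```python
# Plain Pearson correlation between two series, as used to relate aerosol
# optical depth and ENSO indices. The data below are synthetic stand-ins.

from statistics import mean

def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

aerosol = [0.1, 0.4, 0.9, 0.3, 0.8, 0.2]    # synthetic aerosol optical depth
enso    = [-0.5, 0.3, 1.4, -0.2, 0.9, 0.1]  # synthetic ENSO index
print(round(pearson(aerosol, enso), 2))
```

As the abstract itself notes, a high coefficient alone cannot separate causation from coincidence; that argument is made separately.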

  3. Three-Dimensional Online Visualization and Engagement Tools for the Geosciences

    NASA Astrophysics Data System (ADS)

    Cockett, R.; Moran, T.; Pidlisecky, A.

    2013-12-01

    Educational tools often sacrifice interactivity in favour of scalability so they can reach more users. This compromise leads to tools that may be viewed as second tier when compared to more engaging activities performed in a laboratory; however, the resources required to deliver laboratory exercises at scale are often impractical. Geoscience education is well situated to benefit from interactive online learning tools that allow users to work in a 3D environment. Visible Geology (http://3ptscience.com/visiblegeology) is an innovative web-based application designed to enable visualization of geologic structures and processes through the use of interactive 3D models. The platform allows users to conceptualize difficult, yet important, geologic principles in a scientifically accurate manner by developing unique geologic models. The environment allows students to interactively practice their visualization and interpretation skills by creating and interacting with their own models and terrains. Visible Geology has been designed from a user-centric perspective, resulting in a simple and intuitive interface. The platform directs students to build their own geologic models by adding beds and creating geologic events such as tilting, folding, or faulting. The level of ownership and interactivity encourages engagement, leading learners to discover geologic relationships on their own, in the context of guided assignments. In January 2013, an interactive geologic history assignment was developed for a 700-student introductory geology class at The University of British Columbia. The assignment required students to distinguish the relative age of geologic events to construct a geologic history. Traditionally this type of exercise has been taught through the use of simple geologic cross-sections showing crosscutting relationships; from these cross-sections students infer the relative age of geologic events.
    In contrast, the Visible Geology assignment offers students a unique experience in which they first create their own geologic events, allowing them to directly see how the timing of a geologic event manifests in the model and resulting cross-sections. By creating each geologic event in the model themselves, the students gain a deeper understanding of the processes and relative order of events. The resulting models can be shared among students, and provide instructors with a basis for guiding inquiry to address misconceptions. The ease of use of the assignment, including automatic assessment, made this tool practical for deployment in this 700-person class. The outcome of this type of large-scale deployment is that students who would normally not experience a lab exercise gain exposure to interactive 3D thinking. Engaging tools and software that put the user in control of their learning experiences are critical for moving to scalable, yet engaging, online learning environments.

  4. Using patients’ experiences of adverse events to improve health service delivery and practice: protocol of a data linkage study of Australian adults age 45 and above

    PubMed Central

    Walton, Merrilyn; Smith-Merry, Jennifer; Harrison, Reema; Manias, Elizabeth; Iedema, Rick; Kelly, Patrick

    2014-01-01

    Introduction: Evidence of patients’ experiences is fundamental to creating effective health policy and service responses, yet is missing from our knowledge of adverse events. This protocol describes explorative research redressing this significant deficit; investigating the experiences of a large cohort of recently hospitalised patients aged 45 years and above in hospitals in New South Wales (NSW), Australia. Methods and analysis: The 45 and Up Study is a cohort of 265,000 adults aged 45 years and above in NSW. Patients who were hospitalised between 1 January and 30 June 2014 will be identified from this cohort using data linkage and a random sample of 20,000 invited to participate. A cross-sectional survey (including qualitative and quantitative components) will capture patients’ experiences in hospital and specifically of adverse events. Approximately 25% of respondents are likely to report experiencing an adverse event. Quantitative components will capture the nature and type of events as well as common features of patients’ experiences. Qualitative data provide contextual knowledge of their condition and care and the impact of the event on individuals. Respondents who do not report an adverse event will report their experience in hospital and be the control group. Statistical and thematic analysis will be used to present a patient perspective of their experiences in hospital; the characteristics of patients experiencing an adverse event; experiences of information sharing after an event (open disclosure) and the other avenues of redress pursued. Interviews with key policymakers and a document analysis will be used to create a map of current practice. Ethics and dissemination: Dissemination via a one-day workshop, peer-reviewed publications and conference presentations will enable effective clinical responses and service provision and policy responses to adverse events to be developed. PMID:25311039

  5. Using patients' experiences of adverse events to improve health service delivery and practice: protocol of a data linkage study of Australian adults age 45 and above.

    PubMed

    Walton, Merrilyn; Jorm, Christine; Smith-Merry, Jennifer; Harrison, Reema; Manias, Elizabeth; Iedema, Rick; Kelly, Patrick

    2014-10-13

    Evidence of patients' experiences is fundamental to creating effective health policy and service responses, yet is missing from our knowledge of adverse events. This protocol describes explorative research redressing this significant deficit; investigating the experiences of a large cohort of recently hospitalised patients aged 45 years and above in hospitals in New South Wales (NSW), Australia. The 45 and Up Study is a cohort of 265,000 adults aged 45 years and above in NSW. Patients who were hospitalised between 1 January and 30 June 2014 will be identified from this cohort using data linkage and a random sample of 20,000 invited to participate. A cross-sectional survey (including qualitative and quantitative components) will capture patients' experiences in hospital and specifically of adverse events. Approximately 25% of respondents are likely to report experiencing an adverse event. Quantitative components will capture the nature and type of events as well as common features of patients' experiences. Qualitative data provide contextual knowledge of their condition and care and the impact of the event on individuals. Respondents who do not report an adverse event will report their experience in hospital and be the control group. Statistical and thematic analysis will be used to present a patient perspective of their experiences in hospital; the characteristics of patients experiencing an adverse event; experiences of information sharing after an event (open disclosure) and the other avenues of redress pursued. Interviews with key policymakers and a document analysis will be used to create a map of the current practice. Dissemination via a one-day workshop, peer-reviewed publications and conference presentations will enable effective clinical responses and service provision and policy responses to adverse events to be developed. Published by the BMJ Publishing Group Limited.

  6. SDCLIREF - A sub-daily gridded reference dataset

    NASA Astrophysics Data System (ADS)

    Wood, Raul R.; Willkofer, Florian; Schmid, Franz-Josef; Trentini, Fabian; Komischke, Holger; Ludwig, Ralf

    2017-04-01

    Climate change is expected to impact the intensity and frequency of hydrometeorological extreme events. In order to adequately capture and analyze extreme rainfall events, in particular when assessing flood and flash flood situations, data are required at high spatial and sub-daily resolution, which are often not available in sufficient density and over extended time periods. The ClimEx project (Climate Change and Hydrological Extreme Events) addresses the alteration of hydrological extreme events under climate change conditions. In order to differentiate between a clear climate change signal and the limits of natural variability, unique single-model regional climate model ensembles (CRCM5 driven by CanESM2, RCP8.5) were created for a European and a North American domain, each comprising 50 members of 150 years (1951-2100). In combination with the CORDEX database, this newly created ClimEx ensemble is a one-of-a-kind model dataset for analyzing changes in sub-daily extreme events. For the purpose of bias-correcting the regional climate model ensembles, as well as for the baseline calibration and validation of hydrological catchment models, a new sub-daily (3h) high-resolution (500m) gridded reference dataset (SDCLIREF) was created for a domain covering the Upper Danube and Main watersheds (~100,000 km²). As the sub-daily observations lack a continuous time series for the reference period 1980-2010, a suitable method was needed to bridge the gaps in the discontinuous time series. The Method of Fragments (Sharma and Srikanthan (2006); Westra et al. (2012)) was applied to transform daily observations into sub-daily rainfall events, to extend the time series and densify the station network.
    Prior to applying the Method of Fragments and creating the gridded dataset using rigorous interpolation routines, observations operated by several institutions in three countries (Germany, Austria, Switzerland) were collected and subjected to quality control. Among other things, the quality control checked for steps, extensive dry seasons, temporal consistency and maximum hourly values. The resulting SDCLIREF dataset provides a robust precipitation reference for hydrometeorological applications at unprecedented spatio-temporal resolution. References: Sharma, A.; Srikanthan, S. (2006): Continuous Rainfall Simulation: A Nonparametric Alternative. In: 30th Hydrology and Water Resources Symposium, 4-7 December 2006, Launceston, Tasmania. Westra, S.; Mehrotra, R.; Sharma, A.; Srikanthan, R. (2012): Continuous rainfall simulation. 1. A regionalized subdaily disaggregation approach. In: Water Resour. Res. 48 (1). DOI: 10.1029/2011WR010489.
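The Method of Fragments cited above can be sketched in simplified form (my illustration of the basic idea, not the SDCLIREF implementation, which selects donor days by similarity): a daily total is disaggregated into sub-daily values by rescaling the relative temporal pattern ("fragments") of an observed donor day.

```python
# Simplified Method-of-Fragments sketch: disaggregate a daily rainfall total
# into eight 3-hourly values using the pattern of a donor day. The donor
# values below are illustrative, not observed data.

def disaggregate(daily_total, fragment_day):
    """fragment_day: observed sub-daily amounts (e.g. eight 3h values)."""
    total = sum(fragment_day)
    fragments = [v / total for v in fragment_day]   # pattern sums to 1
    return [daily_total * f for f in fragments]

donor = [0.0, 0.0, 1.2, 4.8, 3.0, 2.4, 0.6, 0.0]    # donor day, 3h amounts [mm]
subdaily = disaggregate(daily_total=24.0, fragment_day=donor)
print(subdaily)        # preserves the daily total and the donor's shape
print(sum(subdaily))   # ~24.0
```

The full method additionally matches donor days by season and neighbouring-station behaviour so that realistic patterns are borrowed; only the rescaling core is shown here.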

  7. Slip reactivation model for the 2011 Mw9 Tohoku earthquake: Dynamic rupture, sea floor displacements and tsunami simulations.

    NASA Astrophysics Data System (ADS)

    Galvez, P.; Dalguer, L. A.; Rahnema, K.; Bader, M.

    2014-12-01

    The 2011 Mw9 Tohoku earthquake was recorded by a vast GPS and seismic network, giving seismologists an unprecedented chance to unveil the complex rupture processes of a mega-thrust event. In fact, more than one thousand near-field strong-motion stations across Japan (K-NET and KiK-net) revealed complex ground motion patterns attributed to source effects, allowing detailed information about the rupture process to be captured. The seismic stations surrounding the Miyagi region (e.g. MYGH013) show two clearly distinct waveforms separated by 40 seconds. This observation is consistent with the kinematic source model obtained from the inversion of strong-motion data performed by Lee et al. (2011). In this model two rupture fronts separated by 40 seconds emanate close to the hypocenter and propagate towards the trench. This feature is clearly observed by stacking the slip-rate snapshots on fault points aligned in the EW direction passing through the hypocenter (Gabriel et al., 2012), suggesting slip reactivation during the main event. Repeated slip in large earthquakes may occur due to frictional melting and thermal fluid pressurization effects. Kanamori & Heaton (2002) argued that during faulting of large earthquakes the temperature rises high enough to cause melting and further reduction of the friction coefficient. We created a 3D dynamic rupture model to reproduce this slip reactivation pattern using SPECFEM3D (Galvez et al., 2014), based on slip-weakening friction with two sudden, sequential stress drops. Our model starts like an M7-8 earthquake that only weakly breaks the trench; then, after 40 seconds, a second rupture emerges close to the trench, producing additional slip capable of fully breaking the trench and transforming the earthquake into a mega-thrust event. The resulting sea floor displacements are in agreement with 1 Hz GPS displacements (GEONET).
    The seismograms agree roughly with seismic records along the coast of Japan. The simulated sea floor displacement reaches 8-10 meters of uplift close to the trench, which may be the cause of the devastating tsunami that followed the Tohoku earthquake. To investigate the impact of such a huge uplift, we ran tsunami simulations with the slip reactivation model using sam(oa)², a state-of-the-art finite-volume framework (O. Meister et al., 2012), to simulate the resulting tsunami waves.

  8. The Chelyabinsk event

    NASA Astrophysics Data System (ADS)

    Borovička, Jiri

    2015-08-01

    On February 15, 2013, 3:20 UT, an asteroid of the size of about 19 meters and mass of 12,000 metric tons entered the Earth's atmosphere unexpectedly near the border of Kazakhstan and Russia. It was the largest confirmed Earth impactor since the Tunguska event in 1908. The body moved approximately westwards with a speed of 19 km/s, on a trajectory inclined 18 degrees to the surface, creating a fireball of steadily increasing brightness. Eleven seconds after the first sightings, the fireball reached its maximum brightness. At that point, it was located less than 40 km south from Chelyabinsk, a Russian city of population more than one million, at an altitude of 30 km. For people directly underneath, the fireball was 30 times brighter than the Sun. The cosmic body disrupted into fragments; the largest of them was visible for another five seconds before it disappeared at an altitude of 12.5 km, when it was decelerated to 3 km/s. Fifty six second later, that ~ 600 kg fragment landed in Lake Chebarkul and created an 8 m wide hole in the ice. More material remained, however, in the atmosphere forming a dust trail up to 2 km wide and extending along the fireball trajectory from altitude 18 to 70 km. People observing the dust trail from Chelyabinsk and other places were surprised by the arrival of a very strong blast wave 90 - 150 s after the fireball passage (depending on location). The wave, produced by the supersonic flight of the body, broke ~10% of windows in Chelyabinsk (~40% of buildings were affected). More than 1600 people were injured, mostly from broken glass. Small meteorites landed in an area 60 km long and several km wide and caused no damage. The meteorites were classified as LL ordinary chondrites and were interesting by the presence of two phases, light and dark. 
The dust left in the atmosphere circled the Earth within a few days and formed a ring around the northern hemisphere. The whole event was well documented by video cameras, seismic and infrasonic records, and satellite observations. The total energy was 500 kt TNT (2×10¹⁵ J). Details of the atmospheric fragmentation and implications for the asteroid impact hazard will be discussed in this review.

  9. Drones application on snow and ice surveys in alpine areas

    NASA Astrophysics Data System (ADS)

    La Rocca, Leonardo; Bonetti, Luigi; Fioletti, Matteo; Peretti, Giovanni

    2015-04-01

The first effects of climate change are now clear in Europe, and in Italy in particular, where extreme meteorological events have caused natural disasters that irreparably damaged the territory and its habitats. Directive 2007/60/EC highlights that "effective natural hazards prevention and mitigation requires coordination between Member States, above all on natural hazards prevention." A climate change adaptation strategy is identified on the basis of the guidelines of the European Community programme 2007-2013. Following the directives provided in the financial instrument for civil protection "Union Civil Protection Mechanism" under Decision No. 1313/2013/EU of the European Parliament and Council, a cross-cutting approach that draws on a large number of EU policy implementation tools is proposed as the climate change adaptation strategy. Over the last 7 years a network of trans-Alpine authorities was created between Italy and Switzerland to define, based on non-structural remedies, an adaptive strategy for climate change effects on the natural environment. The Interreg IT-CH STRADA project (STRategie di ADAttamento al cambiamento climatico) was born to bring together all the non-structural remedies to climate change effects involving snow and avalanches, mountain sources and extreme hydrological events, and to manage transnational hydrological resources, involving stakeholders from both Italy and Switzerland. The STRADA project involved the civil protection authorities and the research centers in charge of snow, hydrology and civil protection. The snow-meteorological center of the Regional Agency for Environment Protection (CNM of ARPA Lombardia) and the Civil Protection of Lombardy Region created a research team to develop tools for avalanche prediction and to observe and predict snow cover in the Alpine area. To this end, many aerial photographs were taken with drones in unusual landscapes.
The results of all surveys were very interesting from a scientific point of view. All flights were performed by remote-controlled aero models with high-resolution cameras. The aero models were able to take off from and land on snow-covered or icy surfaces thanks to their specific aerodynamic configuration and engines. All winter surveys were flown at low altitude to obtain a three-dimensional reconstruction, a high-resolution Digital Elevation Model (DEM), of the snow and ice cover; in summer, a DEM was produced of the areas where snow accumulates during the period of maximum avalanche risk. The difference between the winter and summer DEMs (the difference between the two point clouds) yields the snow depth, which was used as input data for the snow avalanche model for the Aprica site (Bergamo, Italy).
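The snow-depth retrieval described in this record (winter DEM minus summer DEM) amounts to a cell-by-cell raster difference. A minimal sketch follows; the grid values are made up for illustration and do not come from the Aprica survey:

```python
import numpy as np

# Hypothetical 3x3 DEM grids (elevation in meters) over the same footprint:
# the winter DEM includes the snow cover, the summer DEM is bare ground.
dem_winter = np.array([[1502.4, 1503.1, 1504.0],
                       [1501.8, 1502.6, 1503.5],
                       [1501.0, 1501.9, 1502.8]])
dem_summer = np.array([[1500.0, 1500.5, 1501.2],
                       [1499.9, 1500.4, 1501.1],
                       [1499.8, 1500.3, 1501.0]])

# Snow depth is the difference between the two co-registered surfaces.
snow_depth = dem_winter - dem_summer

# Clamp small negative values (co-registration noise) to zero.
snow_depth = np.clip(snow_depth, 0.0, None)

print(snow_depth.round(2))
```

In practice the two point clouds must first be co-registered on stable, snow-free terrain before differencing, otherwise alignment errors masquerade as snow depth.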

  10. Comparison of Organ Dosimetry for Astronaut Phantoms: Earth-Based vs. Microgravity-Based Anthropometry and Body Positioning

    NASA Technical Reports Server (NTRS)

    VanBaalen, Mary; Bahadon, Amir; Shavers, Mark; Semones, Edward

    2011-01-01

The purpose of this study is to use NASA radiation transport codes to compare astronaut organ dose equivalents resulting from solar particle events (SPE), geomagnetically trapped protons, and free-space galactic cosmic rays (GCR) using phantom models representing Earth-based and microgravity-based anthropometry and positioning. Methods: The University of Florida hybrid adult phantoms were scaled to represent male and female astronauts with 5th, 50th, and 95th percentile heights and weights as measured on Earth. Another set of scaled phantoms, incorporating microgravity-induced changes, such as spinal lengthening, leg volume loss, and the assumption of the neutral body position, was also created. A ray-tracer was created and used to generate body self-shielding distributions for dose points within a voxelized phantom under isotropic irradiation conditions, which closely approximates the free-space radiation environment. Simplified external shielding consisting of an aluminum spherical shell was used to consider the influence of a spacesuit or the shielding of a hull. These distributions were combined with depth dose distributions generated from the NASA radiation transport codes BRYNTRN (SPE and trapped protons) and HZETRN (GCR) to yield dose equivalent. Many points were sampled per organ. Results: The organ dose equivalent rates were on the order of 1.5-2.5 mSv per day for GCR (1977 solar minimum) and 0.4-0.8 mSv per day for trapped proton irradiation with shielding of 2 g cm⁻² aluminum equivalent. The organ dose equivalents for SPE irradiation varied considerably, with the skin and eye lens having the highest organ dose equivalents and deep-seated organs, such as the bladder, liver, and stomach, having the lowest. Conclusions: The greatest differences between the Earth-based and microgravity-based phantoms are observed for smaller ray thicknesses, since the most drastic changes involved limb repositioning and not overall phantom size.
Improved self-shielding models reduce the overall uncertainty in organ dosimetry for mission-risk projections and assessments for astronauts

  11. 77 FR 65815 - Special Local Regulations; Marine Events in the Seventh Coast Guard District

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-31

    ... imaginary line connecting the following points: Starting at Point 1 in position 24[deg]32'08'' N, 81[deg]50'19'' W; thence east to Point 2 in position 24[deg]32'23'' N, 81[deg]48'58'' W; thence northeast to Point 3 in position 24[deg]33'14'' N, 81[deg]48'47'' W; thence northeast to Point 4 in position 24[deg...

  12. Protecting the Homeland Report of the Defense Science Board Task Force on Defensive Information Operations. 2000 Summer Study. Volume II

    DTIC Science & Technology

    2001-03-01

between attacks and other events such as accidents, system failures, or hacking by thrill-seekers. This challenge is exacerbated by the speed of events in...International Telegraph and Telephone (CCITT) international standards body and is referred to as Signaling System #7 (SS7). Commercial Intelligent...point to fixed infrastructure; Signaling Transfer Point (STP): packet switch in the CCITT #7 network; STP...SS7 system databases...network

  13. Users’ Workshop on Combat Stress (5th) Held at Fort Sam Houston, Texas from 9 to 13 December 1985.

    DTIC Science & Technology

    1986-07-01

a point that has been implicit throughout this talk, the notion of additive or cumulative aspects of stressful experiences. I have focused primarily...a tendency to dwell on the unpleasant emotion-charged events in the wake of their shooting experience. Officers report reliving and reenacting the...examined in police officers. Existing immediate observation of the link between the occurrence of stress-related symptoms and a shooting event points

  14. Children's Vantage Point of Recalling Traumatic Events.

    PubMed

    Dawson, Katie S; Bryant, Richard A

    2016-01-01

This study investigated the recollections of child survivors of the 2004 Asian tsunami in terms of their vantage point and posttraumatic stress disorder (PTSD) responses. Five years after the tsunami, 110 children (aged 7-13 years) living in Aceh, Indonesia were assessed for source of memories of the tsunami (personal memory or second-hand source), vantage point of the memory, and were administered the Children's Revised Impact of Event Scale-13. Fifty-three children (48%) met criteria for PTSD. Two-thirds of children reported direct memories of the tsunami and one-third reported having memories based on reports from other people. More children (97%) who reported an indirect memory of the tsunami recalled the event from an onlooker's perspective to some extent than those who recalled the event directly (63%). Boys were more likely to rely on stories from others to reconstruct their memory of the tsunami, and to adopt an observer perspective. Boys who adopted an observer's perspective had less severe PTSD than those who adopted a field perspective. These findings suggest that, at least in the case of boys, an observer perspective of trauma can be associated with lower levels of PTSD.

  15. Children’s Vantage Point of Recalling Traumatic Events

    PubMed Central

    Dawson, Katie S.; Bryant, Richard A.

    2016-01-01

This study investigated the recollections of child survivors of the 2004 Asian tsunami in terms of their vantage point and posttraumatic stress disorder (PTSD) responses. Five years after the tsunami, 110 children (aged 7–13 years) living in Aceh, Indonesia were assessed for source of memories of the tsunami (personal memory or second-hand source), vantage point of the memory, and were administered the Children’s Revised Impact of Event Scale-13. Fifty-three children (48%) met criteria for PTSD. Two-thirds of children reported direct memories of the tsunami and one-third reported having memories based on reports from other people. More children (97%) who reported an indirect memory of the tsunami recalled the event from an onlooker’s perspective to some extent than those who recalled the event directly (63%). Boys were more likely to rely on stories from others to reconstruct their memory of the tsunami, and to adopt an observer perspective. Boys who adopted an observer’s perspective had less severe PTSD than those who adopted a field perspective. These findings suggest that, at least in the case of boys, an observer perspective of trauma can be associated with lower levels of PTSD. PMID:27649299

  16. Impact Detection for Characterization of Complex Multiphase Flows

    NASA Astrophysics Data System (ADS)

    Chan, Wai Hong Ronald; Urzay, Javier; Mani, Ali; Moin, Parviz

    2016-11-01

    Multiphase flows often involve a wide range of impact events, such as liquid droplets impinging on a liquid pool or gas bubbles coalescing in a liquid medium. These events contribute to a myriad of large-scale phenomena, including breaking waves on ocean surfaces. As impacts between surfaces necessarily occur at isolated points, numerical simulations of impact events will require the resolution of molecular scales near the impact points for accurate modeling. This can be prohibitively expensive unless subgrid impact and breakup models are formulated to capture the effects of the interactions. The first step in a large-eddy simulation (LES) based computational methodology for complex multiphase flows like air-sea interactions requires effective detection of these impact events. The starting point of this work is a collision detection algorithm for structured grids on a coupled level set / volume of fluid (CLSVOF) solver adapted from an earlier algorithm for cloth animations that triangulates the interface with the marching cubes method. We explore the extension of collision detection to a geometric VOF solver and to unstructured grids. Supported by ONR/A*STAR. Agency of Science, Technology and Research, Singapore; Office of Naval Research, USA.

  17. Mixed-Poisson Point Process with Partially-Observed Covariates: Ecological Momentary Assessment of Smoking.

    PubMed

    Neustifter, Benjamin; Rathbun, Stephen L; Shiffman, Saul

    2012-01-01

    Ecological Momentary Assessment is an emerging method of data collection in behavioral research that may be used to capture the times of repeated behavioral events on electronic devices, and information on subjects' psychological states through the electronic administration of questionnaires at times selected from a probability-based design as well as the event times. A method for fitting a mixed Poisson point process model is proposed for the impact of partially-observed, time-varying covariates on the timing of repeated behavioral events. A random frailty is included in the point-process intensity to describe variation among subjects in baseline rates of event occurrence. Covariate coefficients are estimated using estimating equations constructed by replacing the integrated intensity in the Poisson score equations with a design-unbiased estimator. An estimator is also proposed for the variance of the random frailties. Our estimators are robust in the sense that no model assumptions are made regarding the distribution of the time-varying covariates or the distribution of the random effects. However, subject effects are estimated under gamma frailties using an approximate hierarchical likelihood. The proposed approach is illustrated using smoking data.
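A model of the general form described above (the notation here is mine, not taken from the paper) writes each subject's event intensity as a multiplicative random frailty times a log-linear function of the time-varying covariates:

```latex
\lambda_i(t) = Z_i \,\exp\!\bigl\{\beta^\top x_i(t)\bigr\},
\qquad Z_i \sim \mathrm{Gamma}(\alpha, \alpha),
```

where $x_i(t)$ are the (partially observed) covariates for subject $i$, $\beta$ the regression coefficients, and $Z_i$ a mean-one gamma frailty ($\mathbb{E}[Z_i] = 1$, $\mathrm{Var}[Z_i] = 1/\alpha$) capturing between-subject variation in baseline event rates.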

  18. Reciprocal influences between negative life events and callous-unemotional traits.

    PubMed

    Kimonis, Eva R; Centifanti, Luna C M; Allen, Jennifer L; Frick, Paul J

    2014-11-01

    Children with conduct problems and co-occurring callous-unemotional (CU) traits show more severe, stable, and aggressive antisocial behaviors than those without CU traits. Exposure to negative life events has been identified as an important contributing factor to the expression of CU traits across time, although the directionality of this effect has remained unknown due to a lack of longitudinal study. The present longitudinal study examined potential bidirectional effects of CU traits leading to experiencing more negative life events and negative life events leading to increases in CU traits across 3 years among a sample of community-based school-aged (M = 10.9, SD = 1.71 years) boys and girls (N = 98). Repeated rating measures of CU traits, negative life events and conduct problems completed by children and parents during annual assessments were moderately to highly stable across time. Cross-lagged models supported a reciprocal relationship of moderate magnitude between child-reported CU traits and "controllable" negative life events. Parent-reported CU traits predicted "uncontrollable" life events at the earlier time point and controllable life events at the later time point, but no reciprocal effect was evident. These findings have important implications for understanding developmental processes that contribute to the stability of CU traits in youth.

  19. The Five A's: what do patients want after an adverse event?

    PubMed

    Cox, Wendy

    2007-01-01

    After an adverse event, Five A's: Acknowledgment, Apology, All the Facts, Assurance and Appropriate Compensation, serve to meet the essential needs of patients and their families. This simple mnemonic creates a clear framework of understanding for the actions health professionals need to take to manage errors and adverse events in an empathic and patient-oriented fashion. While not all patients demand or need compensation, most need at least the first four A's. Patient-centered communication using this simple framework following an adverse event will foster a climate of understanding and frank discussion, addressing the emotional and physical needs of the whole patient and family.

  20. A mediation skills model to manage disclosure of errors and adverse events to patients.

    PubMed

    Liebman, Carol B; Hyman, Chris Stern

    2004-01-01

    In 2002 Pennsylvania became the first state to impose on hospitals a statutory duty to notify patients in writing of a serious event. If the disclosure conversations are carefully planned, properly executed, and responsive to patients' needs, this new requirement creates possible benefits for both patient safety and litigation risk management. This paper describes a model for accomplishing these goals that encourages health care providers to communicate more effectively with patients following an adverse event or medical error, learn from mistakes, respond to the concerns of patients and families after an adverse event, and arrive at a fair and cost-effective resolution of valid claims.

  1. An analysis of post-event processing in social anxiety disorder.

    PubMed

    Brozovich, Faith; Heimberg, Richard G

    2008-07-01

Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construction of repetitive self-focused thoughts that pertain to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration and potential reconstruction of his or her performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.

  2. Dynamic Pointing Triggers Shifts of Visual Attention in Young Infants

    ERIC Educational Resources Information Center

    Rohlfing, Katharina J.; Longo, Matthew R.; Bertenthal, Bennett I.

    2012-01-01

    Pointing, like eye gaze, is a deictic gesture that can be used to orient the attention of another person towards an object or an event. Previous research suggests that infants first begin to follow a pointing gesture between 10 and 13 months of age. We investigated whether sensitivity to pointing could be seen at younger ages employing a technique…

  3. Reactive oxygen metabolites (ROMs) are associated with cardiovascular disease in chronic hemodialysis patients.

    PubMed

    Bossola, Maurizio; Vulpio, Carlo; Colacicco, Luigi; Scribano, Donata; Zuppi, Cecilia; Tazza, Luigi

    2012-02-11

The aim of our study was to measure reactive oxygen metabolites (ROMs) in chronic hemodialysis (HD) patients and evaluate their possible association with cardiovascular disease (CVD) and mortality. We measured ROMs in 76 HD patients and correlated them with CVD, cardiovascular (CV) events during follow-up, and all-cause and CVD-related mortality. The levels of ROMs presented a median value of 270 (interquartile range, 238.2-303.2) CARR U. We created a ROC curve (ROMs levels vs. CVD) and identified a cut-off point of 273 CARR U. Patients with ROMs levels ≥273 CARR U were significantly older, had higher C-reactive protein levels and lower creatinine concentrations. The prevalence of CVD was higher in patients with ROMs levels ≥273 CARR U (87.1%) than in those with ROMs levels <273 CARR U (17.7%; p<0.0001). ROMs levels were significantly higher in patients with CVD (317±63.8) than in those without (242.7±49.1; p<0.0001). At multiple regression analysis, age, creatinine and C-reactive protein were independent factors associated with ROMs. At multiple logistic regression analysis, the association between ROMs and CVD was independent (OR: 1.02, 95% CI: 1.00-1.05; p=0.03). Twenty-six patients developed cardiovascular (CV) events during the follow-up; of these, seven were in the group with ROMs levels <273 CARR U and 19 in the group with ROMs levels ≥273 CARR U. Logistic regression analysis showed that both age (OR: 1.06, 95% CI: 1.01-1.12; p=0.013) and ROMs levels (OR: 1.10, 95% CI: 1.00-1.02; p=0.045) were independently associated with CV events in the follow-up. ROMs are independently associated with CVD and predict CV events in chronic HD patients.
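The 273 CARR U threshold comes from a ROC analysis of ROMs levels against CVD status. One common way to pick such a cut-off (not necessarily the authors' exact method) is the Youden index, sketched here on made-up data:

```python
def youden_cutoff(scores, labels):
    """Return the threshold maximizing sensitivity + specificity - 1 (Youden's J)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]  # diseased
    neg = [s for s, y in zip(scores, labels) if y == 0]  # disease-free
    best_cut, best_j = None, -1.0
    for cut in sorted(set(scores)):
        sens = sum(s >= cut for s in pos) / len(pos)  # true-positive rate
        spec = sum(s < cut for s in neg) / len(neg)   # true-negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut

# Hypothetical ROMs levels (CARR U) and CVD status (1 = disease present).
roms = [220, 240, 250, 260, 273, 280, 300, 320, 340, 360]
cvd  = [0,   0,   0,   0,   1,   1,   0,   1,   1,   1]
print(youden_cutoff(roms, cvd))  # → 273
```

On these invented data the Youden-optimal threshold happens to fall at 273; with real patient data the scan over candidate thresholds works the same way.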

  4. Ionospheric response to a recurrent magnetic storm during an event of High Speed Stream in October 2016.

    NASA Astrophysics Data System (ADS)

    Nicoli Candido, C. M.; Resende, L.; Becker-Guedes, F.; Batista, I. S.

    2017-12-01

In this work we investigate the response of the low-latitude ionosphere to recurrent geomagnetic activity caused by events of high speed streams (HSSs)/corotating interaction regions (CIRs) during the low descending phase of solar activity in solar cycle 24. Intense magnetic field regions called corotating interaction regions, or CIRs, are created by the interaction of fast and slow streams ejected by long-duration coronal holes on the Sun. This interaction leads to an increase in the mean interplanetary magnetic field (IMF), which causes moderate and recurrent geomagnetic activity when it interacts with the Earth's magnetosphere. The ionosphere can be affected by these phenomena in several ways, such as an increase (or decrease) of the plasma ionization, intensification of plasma instabilities during post-sunset/post-midnight hours and subsequent development of plasma irregularities/spread-F, as well as occurrence of plasma scintillation. Therefore, we investigate the low-latitude ionospheric response during a moderate geomagnetic storm associated with a high speed stream event that occurred during the decreasing phase of solar activity in 2016. An additional ionization increase is observed in the Es layer during the main peak of the geomagnetic storm. We investigate two different mechanisms that may have caused this extra ionization: the role of prompt penetration of the interplanetary electric field, IEFEy, at the equatorial region, and the effect of energetic electron precipitation on the E and F layer variations. Finally, we used data from Digisondes installed at the equatorial region, São Luís, and at conjugate points at Brazilian latitudes, Boa Vista and Cachoeira Paulista. We analyzed ionospheric parameters such as the critical frequency of the F layer, foF2, the F layer peak height, hmF2, the F layer bottomside, h'F, the blanketing frequency of the sporadic layer, fbEs, the virtual height of the Es layer, h'Es, and the top frequency of the Es layer, ftEs, during this event.

  5. Twelve Years of Interviews with the Inupiat people of Arctic Alaska: Report from a Community Workshop

    NASA Astrophysics Data System (ADS)

    Eisner, W. R.; Hinkel, K. M.; Cuomo, C.

    2015-12-01

    On 20 August 2015, a workshop was held in Barrow, Alaska, which presented the highlights of 12 years of research connecting local indigenous knowledge of landscape processes with scientific research on arctic lakes, tundra changes, and permafrost stability. Seventy-six Iñupiat elders, hunters, and other knowledge-holders from the North Slope villages of Barrow, Atqasuk, Wainwright, Nuiqsut, and Anaktuvuk Pass were interviewed, and over 75 hours of videotaped interviews were produced. The interviews provided information and observations on landforms, lakes, erosion, permafrost degradation and thermokarst, changes in the environment and in animal behavior, human modification of lakes, tundra damage from 4-wheel off-road vehicles, tundra trail expansion, and other phenomena. Community concerns regarding the impact of environmental change on food procurement, animal migration, human travel routes, and the future of subsistence practices were also prominent themes. Following an interview, each videotaped session was logged. Each time an elder pointed to a location on a map and explained a landscape event/observation or told a story, the time-stamp in the video was recorded. Each logged event consisted of a code and a short account of the observation. From these reference sheets, a Geographic Information System (GIS) dataset was created. A logged account for each videotape, with geographic coordinates, event code, and event description is available for each videotape. The goal of the workshop was to report on our findings, thank the community for their support, and collaboratively develop plans for archiving and disseminating this data. A complete video library and searchable, printed and digital issues of the logging dataset for archiving in the communities were also produced. 
Discussions with administrative personnel at the Tuzzy Library in Barrow and the Inupiat Heritage Center have enabled us to set standards and develop a timeline for turning over the library of videos and GIS data to the North Slope community.
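The logging workflow described in this record (a map location the elder pointed to, a video time-stamp, an event code, and a short account, combined into a GIS dataset) can be sketched as building simple point-event features. The field names and the sample record here are invented for illustration, not drawn from the actual dataset:

```python
from dataclasses import dataclass, asdict

@dataclass
class PointEvent:
    lon: float            # longitude of the location pointed to on the map
    lat: float            # latitude
    video_timestamp: str  # time-stamp in the interview video
    code: str             # event code from the logging sheet
    description: str      # short account of the observation

def to_geojson_feature(event):
    """Render one logged observation as a GeoJSON point feature."""
    props = asdict(event)
    lon, lat = props.pop("lon"), props.pop("lat")
    return {"type": "Feature",
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "properties": props}

# Hypothetical logged observation near Barrow (Utqiagvik).
ev = PointEvent(-156.79, 71.29, "00:42:13", "THK",
                "Thermokarst slumping observed near lake margin")
feature = to_geojson_feature(ev)
print(feature["geometry"]["coordinates"])  # → [-156.79, 71.29]
```

A list of such features wrapped in a `FeatureCollection` can be loaded directly into most GIS software as a point-event layer.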

  6. CryoSat-2 Processing and Model Interpretation of Greenland Ice Sheet Volume Changes

    NASA Astrophysics Data System (ADS)

    Nilsson, J.; Gardner, A. S.; Sandberg Sorensen, L.

    2015-12-01

CryoSat-2 was launched in late 2010, tasked with monitoring changes in the Earth's land and sea ice. It carries a novel radar altimeter allowing the satellite to monitor changes in highly complex terrain, such as smaller ice caps, glaciers and the marginal areas of the ice sheets. Here we present the development and validation of an independent elevation retrieval processing chain, and the respective elevation changes, based on ESA's L1B data. Overall we find a large improvement in both accuracy and precision over Greenland relative to ESA's L2 product when comparing against both airborne data and crossover analysis. The seasonal component and spatial sampling of the surface elevation changes were also compared against ICESat-derived changes from 2003-2009. The comparison showed good agreement between the two products on a local scale. However, a global sampling bias was detected in the seasonal signal due to the clustering of CryoSat-2 data in higher-elevation areas. The retrieval processing chain presented here does not correct for changes in surface scattering conditions and appears to be insensitive to the 2012 melt event (Nilsson et al., 2015). This is in contrast to the elevation changes derived from ESA's L2 elevation product, which were found to be sensitive to the effects of the melt event. The positive elevation bias created by the event introduced a discrepancy between the two products with a magnitude of roughly 90 km³/year. This difference can be attributed directly to differences in the retracking procedure, pointing to the importance of retracking the radar waveforms for altimetric volume change studies. Reference: Nilsson, Johan; Vallelonga, Paul Travis; Simonsen, Sebastian Bjerregaard; Sørensen, Louise Sandberg; Forsberg, René; Dahl-Jensen, Dorthe; Hirabayashi, Motohiro; Goto-Azuma, Kumiko; Hvidberg, Christine S.; Kjær, Helle A.; Satow, Kazuhide: Greenland 2012 melt event effects on CryoSat-2 radar altimetry.

  7. Reassembly of Point Pleasant Bridge : documentation of structural damage and identification of laboratory specimens

    DOT National Transportation Integrated Search

    1970-07-01

    The collapse of the Point Pleasant Bridge created many unique problems and posed many new questions to the bridge engineering profession. One question that was paramount was, "What caused the bridge to collapse?". Arrangements were made with the Corp...

  8. Triangular Libration Points in the CR3BP with Radiation, Triaxiality and Potential from a Belt

    NASA Astrophysics Data System (ADS)

    Singh, Jagadish; Taura, Joel John

    2017-07-01

In this paper the equations of motion of the circular restricted three-body problem are modified to include radiation of the bigger primary, triaxiality of the smaller primary, and the gravitational potential created by a belt. We find that, due to the perturbations, the locations of the triangular libration points and their linear stability are affected. The points move towards the bigger primary under the resultant effect of the perturbations. The triangular libration points are stable for 0 < μ < μ_c and unstable for μ_c ≤ μ ≤ 1/2, where μ_c is the critical mass ratio affected by the perturbations. The radiation of the bigger primary and the triaxiality of the smaller primary have destabilizing tendencies, whereas the potential created by the belt has a stabilizing tendency. This model could be applied to the study of the motion of a dust particle near a radiating-triaxial binary system surrounded by a belt.
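For reference, in the classical unperturbed CR3BP the critical mass ratio bounding the linear stability of the triangular points is Routh's value; the μ_c of this abstract generalizes it to account for radiation, triaxiality and the belt potential:

```latex
\mu_c = \frac{1}{2}\left(1 - \sqrt{\frac{23}{27}}\right) \approx 0.03852,
```

so that $L_4$ and $L_5$ are linearly stable for $0 < \mu < \mu_c$ and unstable for $\mu_c \le \mu \le \tfrac{1}{2}$.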

  9. First impressions and beyond: marketing your practice in touch points--Part I.

    PubMed

    Bisera, Cheryl

    2012-01-01

Often medical administrators or providers call in a marketing expert when they feel the practice is lacking the growth they want. What's on their mind is usually how to bring in more patients, and they automatically look to external marketing strategies. However, one of the most important elements of successfully marketing a practice is making sure you haven't created a turnstile, where new patients are coming often but not returning or being converted into loyal, referring patients. When new patients are leaving as quickly as they are coming, you aren't building solid growth. Loyal, referring patients are powerful marketing assets: they are in the community speaking well of you and your practice from first-hand experience. You can create this atmosphere of loyal, referring patients by providing positive touch points that fulfill the needs of your patients. Touch points are the groundwork supporting other types of marketing. This article covers three important touch points that are crucial to a positive patient experience.

  10. Predictors of long-term recurrent vascular events after ischemic stroke at young age: the Italian Project on Stroke in Young Adults.

    PubMed

    Pezzini, Alessandro; Grassi, Mario; Lodigiani, Corrado; Patella, Rosalba; Gandolfo, Carlo; Zini, Andrea; Delodovici, Maria Luisa; Paciaroni, Maurizio; Del Sette, Massimo; Toriello, Antonella; Musolino, Rossella; Calabrò, Rocco Salvatore; Bovi, Paolo; Adami, Alessandro; Silvestrelli, Giorgio; Sessa, Maria; Cavallini, Anna; Marcheselli, Simona; Bonifati, Domenico Marco; Checcarelli, Nicoletta; Tancredi, Lucia; Chiti, Alberto; Del Zotto, Elisabetta; Spalloni, Alessandra; Giossi, Alessia; Volonghi, Irene; Costa, Paolo; Giacalone, Giacomo; Ferrazzi, Paola; Poli, Loris; Morotti, Andrea; Rasura, Maurizia; Simone, Anna Maria; Gamba, Massimo; Cerrato, Paolo; Micieli, Giuseppe; Melis, Maurizio; Massucco, Davide; De Giuli, Valeria; Iacoviello, Licia; Padovani, Alessandro

    2014-04-22

    Data on long-term risk and predictors of recurrent thrombotic events after ischemic stroke at a young age are limited. We followed 1867 patients with first-ever ischemic stroke who were 18 to 45 years of age (mean age, 36.8±7.1 years; women, 49.0%), as part of the Italian Project on Stroke in Young Adults (IPSYS). Median follow-up was 40 months (25th to 75th percentile, 53). The primary end point was a composite of ischemic stroke, transient ischemic attack, myocardial infarction, or other arterial events. One hundred sixty-three patients had recurrent thrombotic events (average rate, 2.26 per 100 person-years at risk). At 10 years, cumulative risk was 14.7% (95% confidence interval, 12.2%-17.9%) for primary end point, 14.0% (95% confidence interval, 11.4%-17.1%) for brain ischemia, and 0.7% (95% confidence interval, 0.4%-1.3%) for myocardial infarction or other arterial events. Familial history of stroke, migraine with aura, circulating antiphospholipid antibodies, discontinuation of antiplatelet and antihypertensive medications, and any increase of 1 traditional vascular risk factor were independent predictors of the composite end point in multivariable Cox proportional hazards analysis. A point-scoring system for each variable was generated by their β-coefficients, and a predictive score (IPSYS score) was calculated as the sum of the weighted scores. The area under the receiver operating characteristic curve of the 0- to 5-year score was 0.66 (95% confidence interval, 0.61-0.71; mean, 10-fold internally cross-validated area under the receiver operating characteristic curve, 0.65). Among patients with ischemic stroke aged 18 to 45 years, the long-term risk of recurrent thrombotic events is associated with modifiable, age-specific risk factors. The IPSYS score may serve as a simple tool for risk estimation.
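A point-scoring system of the kind described (integer points per predictor derived from β-coefficients, summed to a total score) can be sketched as follows. The predictor names follow the abstract, but the integer weights are illustrative placeholders, not the published IPSYS coefficients:

```python
# Illustrative weights only -- NOT the published IPSYS point values.
WEIGHTS = {
    "familial_history_of_stroke": 1,
    "migraine_with_aura": 1,
    "antiphospholipid_antibodies": 1,
    "discontinued_antiplatelet_or_antihypertensive": 1,
    "increase_in_traditional_risk_factor": 1,
}

def risk_score(patient):
    """Sum the weights of the risk factors present for one patient."""
    return sum(w for factor, w in WEIGHTS.items() if patient.get(factor))

# Hypothetical patient with two of the five risk factors present.
patient = {"migraine_with_aura": True,
           "antiphospholipid_antibodies": True}
print(risk_score(patient))  # → 2
```

In the published score each weight would instead be derived from the corresponding Cox β-coefficient, so the factors contribute unequally.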

  11. Critical Events in the Lives of Interns

    PubMed Central

    Graham, Mark; Schmidt, Hilary; Stern, David T.; Miller, Steven Z.

    2008-01-01

    BACKGROUND Early residency is a crucial time in the professional development of physicians. As interns assume primary care for their patients, they take on new responsibilities. The events they find memorable during this time could provide us with insight into their developing professional identities. OBJECTIVE To evaluate the most critical events in the lives of interns. PARTICIPANTS Forty-one internal medicine residents at one program participated in a two-day retreat in the fall of their first year. Each resident provided a written description of a recent high point, low point, and patient conflict. MEASUREMENTS We used a variant of grounded theory to analyze these critical incidents and determine the underlying themes of early internship. Independent inter-rater agreement of >90% was achieved for the coding of excerpts. MAIN RESULTS The 123 critical incidents were clustered into 23 categories. The categories were further organized into six themes: confidence, life balance, connections, emotional responses, managing expectations, and facilitating teamwork. High points were primarily in the themes of confidence and connections. Low points were dispersed more generally throughout the conceptual framework. Conflicts with patients were about negotiating the expectations inherent in the physician–patient relationship. CONCLUSION The high points, low points, and conflicts reported by early residents provide us with a glimpse into the lives of interns. The themes we have identified reflect critical challenges interns face in the development of their professional identity. Program directors could use this process and conceptual framework to guide the development and promotion of residents’ emerging professional identities. PMID:18972091

  12. Creating a Model of Acceptance: Preservice Teachers Interact with Non-English-Speaking Latino Parents Using Culturally Relevant Mathematics and Science Activities at Family Learning Events

    ERIC Educational Resources Information Center

    Ramirez, Olga; McCollough, Cherie A.; Diaz, Zulmaris

    2016-01-01

    The following describes a culturally relevant mathematics and science content program implemented by preservice teachers (PSTs) at Family Math/Science Learning Events (FM/SLEs) conducted through two different university programs in south Texas. These experiences are required course activities designed to inform PSTs of the importance of…

  13. When the Stars Align: On the Contributions of Gilbert Gottlieb and Peter C. M. Molenaar to Developmental Science Theory and Method

    ERIC Educational Resources Information Center

    Lerner, Richard M.; Batanova, Milena; Ettekal, Andrea Vest; Hunter, Cristina

    2015-01-01

    When truly spectacular events occur in the performing arts or in team sports, when the sets of artists or athletes respectively creating these events are discussed, a common phrase used in America to explain the "good fortune" that was involved in such unique occurrences is that "the stars aligned." In this commentary on:…

  14. Tapir: A web interface for transit/eclipse observability

    NASA Astrophysics Data System (ADS)

    Jensen, Eric

    2013-06-01

    Tapir is a set of tools, written in Perl, that provides a web interface for showing the observability of periodic astronomical events, such as exoplanet transits or eclipsing binaries. The package provides tools for creating finding charts for each target and airmass plots for each event. The code can access target lists that are stored on-line in a Google spreadsheet or in a local text file.

  15. Electromagnetic radiation as a probe of the initial state and of viscous dynamics in relativistic nuclear collisions

    NASA Astrophysics Data System (ADS)

    Vujanovic, Gojko; Paquet, Jean-François; Denicol, Gabriel S.; Luzum, Matthew; Jeon, Sangyong; Gale, Charles

    2016-07-01

    The penetrating nature of electromagnetic signals makes them suitable probes to explore the properties of the strongly interacting medium created in relativistic nuclear collisions. We examine the effects of the initial conditions and shear relaxation time on the spectra and flow coefficients of electromagnetic probes, using an event-by-event 3+1-dimensional viscous hydrodynamic simulation (music).

  16. The fragmentation of Kosmos 2163

    NASA Technical Reports Server (NTRS)

    1992-01-01

    On 6 Dec. 1991 Kosmos 2163, a maneuverable Soviet spacecraft which had been in orbit for 58 days, experienced a major breakup at an altitude of approximately 210 km. Although numerous pieces of debris were created, the fragments decayed rapidly leaving no long-term impact on the near-Earth environment. The assessed cause of the event is the deliberate detonation of an explosive device. Details of this event are presented.

  17. Flux transfer events produced by the onset of merging at multiple X lines

    NASA Astrophysics Data System (ADS)

    Ku, Hwar C.; Sibeck, David G.

    2000-02-01

    We present two-dimensional magnetohydrodynamic model predictions for the signatures of flux transfer events (FTEs) at the dayside magnetopause produced by the onset of merging along two or more extended X lines. We consider three scenarios: (1) equal merging rates with identical resistivities south (ɛ1) and north (ɛ2) of the equator, (2) unequal merging rates with ɛ1>ɛ2, and (3) equal merging rates in the presence of a background northward magnetosheath flow velocity (Vz=0.15Bsph/ρmshμ0). Realistic ratios of magnetosheath to magnetospheric parameters are chosen to conform with in situ observations and our previous simulations [Ku and Sibeck, 1997, 1998a, b]: ρmsh/ρsph=10, Bmsh/Bsph=0.5, and Tmsh/Tsph=0.175. In case 1, a stationary magnetic island forms between two bulges accelerating away from the subsolar point. In case 2, the magnetic island formed between the two X lines pursues the bulge created near the line with smaller resistivity. In case 3 the island formed between the two X lines pursues the bulge by moving in the direction of the background flow. All three scenarios produce events with characteristics similar to those in the single X line model: strong asymmetric bipolar magnetic fields and plasma velocities normal to the magnetopause in the magnetosheath and no significant signatures in the magnetosphere (with the magnitude of the trailing pulse exceeding that of the leading edge). However, the islands produced in cases 2 and 3 also generate more symmetric signatures typical of FTEs in both the magnetosheath and magnetosphere. By comparison with observations, the transition parameter plots that we present can be used to discriminate between the merging scenarios.

  18. Probabilistic tsunami hazard assessment in Greece for seismic sources along the segmented Hellenic Arc

    NASA Astrophysics Data System (ADS)

    Novikova, Tatyana; Babeyko, Andrey; Papadopoulos, Gerassimos

    2017-04-01

    Greece and adjacent coastal areas are characterized by a high population exposure to tsunami hazard. The Hellenic Arc is the most active geotectonic structure for the generation of earthquakes and tsunamis. We performed probabilistic tsunami hazard assessment for selected locations of the Greek coastline which are the forecasting points officially used in the tsunami warning operations by the Hellenic National Tsunami Warning Center and the NEAMTWS/IOC/UNESCO. In our analysis we considered seismic sources for tsunami generation along the western, central and eastern segments of the Hellenic Arc. We first created a synthetic catalog spanning 10,000 years for all significant earthquakes with magnitudes in the range from 6.0 to 8.5, the real events being included in this catalog. For each event in the synthetic catalog a tsunami was generated and propagated using a Boussinesq model. The probability of occurrence for each event was determined from the Gutenberg-Richter magnitude-frequency distribution. The results of our study are expressed as hazard curves and hazard maps. The hazard curves were obtained for the selected sites and present the annual probability of exceedance as a function of peak coastal tsunami amplitude. Hazard maps represent the distribution of peak coastal tsunami amplitudes corresponding to a fixed annual probability. In these forms our results can be easily compared to those obtained in other studies and further employed for the development of tsunami risk management plans. This research is a contribution to the EU-FP7 tsunami research project ASTARTE (Assessment, Strategy And Risk Reduction for Tsunamis in Europe), grant agreement no: 603839, 2013-10-30.
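The synthetic-catalog step can be sketched as follows: draw magnitudes from a doubly truncated Gutenberg-Richter distribution over a long synthetic period and count the years in which an amplitude threshold is exceeded. The b-value, yearly event rate, and the toy magnitude-to-amplitude scaling below are illustrative assumptions, not the study's calibrated values (which came from Boussinesq propagation, not a closed-form scaling):

```python
# Sketch of a probabilistic hazard workflow: synthetic Gutenberg-Richter
# catalog over 10,000 years -> annual exceedance probability estimate.
# b-value, event rate, and amplitude scaling are illustrative assumptions.
import math
import random

def poisson(rng, lam):
    """Knuth's method for a Poisson-distributed yearly event count."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def sample_magnitude(rng, m_min=6.0, m_max=8.5, b=1.0):
    """Inverse-CDF draw from a doubly truncated Gutenberg-Richter law."""
    beta = b * math.log(10.0)
    c = 1.0 - math.exp(-beta * (m_max - m_min))
    return m_min - math.log(1.0 - rng.random() * c) / beta

def annual_exceedance(years=10_000, rate=0.2, threshold=1.0, seed=1):
    """Fraction of synthetic years with any toy amplitude above threshold."""
    rng = random.Random(seed)
    hit_years = 0
    for _ in range(years):
        n = poisson(rng, rate)
        # Toy amplitude scaling 10**(M - 7.5); a real study would use
        # hydrodynamic simulation here.
        amps = (10.0 ** (sample_magnitude(rng) - 7.5) for _ in range(n))
        if any(a > threshold for a in amps):
            hit_years += 1
    return hit_years / years
```

Evaluating `annual_exceedance` at several thresholds yields the hazard curve for one site; repeating per site and inverting at a fixed annual probability gives the hazard map values.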

  19. To make or buy patient safety solutions: a resource dependence and transaction cost economics perspective.

    PubMed

    Fareed, Naleef; Mick, Stephen S

    2011-01-01

    For almost a decade, public and private organizations have pressured hospitals to improve their patient safety records. Since 2008, the Centers for Medicare & Medicaid Services has no longer been reimbursing hospitals for secondary diagnoses not reported at the point of admission. This ruling has motivated some hospitals to engage in safety-oriented programs to decrease adverse events. This study examined which hospitals may engage in patient safety solutions and whether they create these patient safety solutions within their structures or use suppliers in the market. We used a theoretical model that incorporates the key constructs of resource dependence theory and transaction cost economics theory to predict a hospital's reaction to Centers for Medicare & Medicaid Services "never event" regulations. We present propositions that speculate on how forces conceptualized from the resource dependence theory may affect adoption of patient safety innovations and, when they do, whether the adopting hospitals will do so internally or externally according to the transaction cost economics theory. On the basis of forces identified by the resource dependence theory, we predict that larger, teaching, safety net, horizontally integrated, highly interdependent, and public hospitals in concentrated, high public payer presence, competitive, and resource-rich environments will be more likely to engage in patient safety innovations. Following the logic of the transaction cost economics theory, we predict that of the hospitals that react positively to the never event regulation, most will internalize their innovations in patient safety solutions rather than approach the market, a choice that helps hospitals economize on transaction costs. This study helps hospital managers in their strategic thinking and planning in relation to current and future regulations related to patient safety. For researchers and policy analysts, our propositions provide the basis for empirical testing.

  20. The Effect of Task Duration on Event-Based Prospective Memory: A Multinomial Modeling Approach

    PubMed Central

    Zhang, Hongxia; Tang, Weihai; Liu, Xiping

    2017-01-01

    Remembering to perform an action when a specific event occurs is referred to as Event-Based Prospective Memory (EBPM). This study investigated how EBPM performance is affected by task duration by having university students (n = 223) perform an EBPM task that was embedded within an ongoing computer-based color-matching task. For this experiment, we separated the overall task’s duration into the filler task duration and the ongoing task duration. The filler task duration is the length of time between the intention and the beginning of the ongoing task, and the ongoing task duration is the length of time between the beginning of the ongoing task and the appearance of the first Prospective Memory (PM) cue. The filler task duration and ongoing task duration were further divided into three levels: 3, 6, and 9 min. The two factors were orthogonally manipulated between subjects, and a multinomial processing tree model was used to separate the effects of the different task durations on the two EBPM components. A mediation model was then created to verify whether task duration influences EBPM via self-reminding or discrimination. The results reveal three points. (1) Lengthening the duration of ongoing tasks had a negative effect on EBPM performance while lengthening the duration of the filler task had no significant effect on it. (2) As the filler task was lengthened, both the prospective and retrospective components show a decreasing and then increasing trend. Also, when the ongoing task duration was lengthened, the prospective component decreased while the retrospective component significantly increased. (3) The mediating effect of discrimination between the task duration and EBPM performance was significant. We concluded that different task durations influence EBPM performance through different components with discrimination being the mediator between task duration and EBPM performance. PMID:29163277

  1. EzyAmp signal amplification cascade enables isothermal detection of nucleic acid and protein targets.

    PubMed

    Linardy, Evelyn M; Erskine, Simon M; Lima, Nicole E; Lonergan, Tina; Mokany, Elisa; Todd, Alison V

    2016-01-15

    Advancements in molecular biology have improved the ability to characterize disease-related nucleic acids and proteins. Recently, there has been an increasing desire for tests that can be performed outside of centralised laboratories. This study describes a novel isothermal signal amplification cascade called EzyAmp (enzymatic signal amplification) that is being developed for detection of targets at the point of care. EzyAmp exploits the ability of some restriction endonucleases to cleave substrates containing nicks within their recognition sites. EzyAmp uses two oligonucleotide duplexes (partial complexes 1 and 2) which are initially cleavage-resistant as they lack a complete recognition site. The recognition site of partial complex 1 can be completed by hybridization of a triggering oligonucleotide (Driver Fragment 1) that is generated by a target-specific initiation event. Binding of Driver Fragment 1 generates a completed complex 1, which, upon cleavage, releases Driver Fragment 2. In turn, binding of Driver Fragment 2 to partial complex 2 creates completed complex 2 which when cleaved releases additional Driver Fragment 1. Each cleavage event separates fluorophore-quencher pairs resulting in an increase in fluorescence. At this stage a cascade of signal production becomes independent of further target-specific initiation events. This study demonstrated that the EzyAmp cascade can facilitate detection and quantification of nucleic acid targets with sensitivity down to attomolar (aM) concentrations. Further, the same cascade detected VEGF protein with a sensitivity of 20 nM, showing that this universal method for amplifying signal may be linked to the detection of different types of analytes in an isothermal format. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
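The cross-feedback loop described above — cleavage of completed complex 1 releases a driver for complex 2, and cleavage of completed complex 2 releases additional drivers for complex 1 — can be illustrated with a toy discrete-time model. The pool sizes and the per-cleavage gain below are invented numbers, chosen only to show how the signal keeps growing without further target-specific initiation events:

```python
# Toy discrete-time model of a two-complex cross-feedback cascade in the
# spirit of EzyAmp. Pool sizes and per-cleavage gain are ASSUMED values.

def simulate(steps=10, df1=1, pool1=10_000, pool2=10_000, gain=2):
    """Return total fluorescence units (one per cleavage) after `steps` rounds."""
    df2 = 0
    fluorescence = 0
    for _ in range(steps):
        c1 = min(df1, pool1)          # completed complex 1 cleavages this round
        pool1 -= c1; df1 -= c1; df2 += c1
        c2 = min(df2, pool2)          # completed complex 2 cleavages this round
        pool2 -= c2; df2 -= c2; df1 += gain * c2
        fluorescence += c1 + c2       # each cleavage separates a fluorophore-quencher pair
    return fluorescence

print(simulate(steps=3))  # 14: signal doubles each round from a single trigger
```

With a gain of 2 the signal grows exponentially from one triggering fragment until one of the substrate pools is exhausted, which is the qualitative behaviour that makes such cascades useful for isothermal detection.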

  2. The Long-term Middle Atmospheric Influence of Very Large Solar Proton Events

    NASA Technical Reports Server (NTRS)

    Jackman, Charles H.; Marsh, Daniel R.; Vitt, Francis M.; Garcia, Rolando R.; Randall, Cora E.; Fleming, Eric L.; Frith, Stacey M.

    2008-01-01

    Long-term variations in ozone have been caused by both natural and human-related processes. The human, or anthropogenic, influence on ozone originates from chlorofluorocarbons and halons (chlorine and bromine) and has led to international regulations greatly limiting the release of these substances. Certain natural ozone influences are also important in polar regions and are caused by the impact of solar charged particles on the atmosphere. Such natural variations have been studied in order to better quantify the human influence on polar ozone. Large-scale explosions on the Sun near solar maximum lead to emissions of charged particles (mainly protons and electrons), some of which enter the Earth's magnetosphere and rain down on the polar regions. "Solar proton events" have been used to describe these phenomena since the protons associated with these solar events sometimes create a significant atmospheric disturbance. We have used the National Center for Atmospheric Research (NCAR) Whole Atmosphere Community Climate Model (WACCM) to study the long-term (> few months) influences of solar proton events from 1963 through 2004 on stratospheric ozone and temperature. There were extremely large solar proton events in 1972, 1989, 2000, 2001, and 2003. These events caused very distinctive polar changes in layers of the Earth's atmosphere known as the stratosphere (12-50 km; ~7-30 miles) and mesosphere (50-90 km; 30-55 miles). The solar protons connected with these events created hydrogen- and nitrogen-containing compounds, which led to polar ozone destruction. The nitrogen-containing compounds, called odd nitrogen, lasted much longer than the hydrogen-containing compounds and led to long-lived stratospheric impacts. An extremely active period for these events occurred in the five-year period 2000-2004 and caused increases in odd nitrogen which lasted for several months after individual events.
Associated stratospheric ozone decreases of >10% were calculated to last for up to five months past the largest events. However, the computed total column ozone and stratospheric temperature changes connected with the solar events were not found to be statistically significant. Thus, solar proton events do not likely contribute significantly to measured total column ozone fluctuations and stratospheric temperature changes.

  3. Astronomy and catastrophes through myth and old texts.

    NASA Astrophysics Data System (ADS)

    Bon, E.; Ćirković, M.; Stojić, Igor; Gavrilović, Nataša

    In old myths and iconographies there are motifs that indicate at least one cataclysmic event that influenced many old religions and myths and that could be linked to the impact of a celestial object. We investigate the hypothesis of coherent catastrophism put forward in recent years by Clube, Bailey, Napier and others from both astrobiological and culturological points of view. The conventional idea that the quasi-periodic break-up of celestial bodies influences terrestrial conditions can today be placed in both a wider (astrobiological) and a deeper (historico-culturological) context. In particular, we point out the link between the Neolithic history of astronomy and the origin of Mithraism. We speculate that the main icon of the Mithraic religion could pinpoint an event that happened around 4000 BC, when the spring equinox entered the constellation of Taurus. We also link some motifs in other old religions and myths to the same event, or to similar events that inspired those myths.

  4. Kinematic Signatures of Telic and Atelic Events in ASL Predicates

    ERIC Educational Resources Information Center

    Malaia, Evie; Wilbur, Ronnie B.

    2012-01-01

    This article presents an experimental investigation of kinematics of verb sign production in American Sign Language (ASL) using motion capture data. The results confirm that event structure differences in the meaning of the verbs are reflected in the kinematic formation: for example, in the telic verbs (throw, hit), the end-point of the event is…

  5. Prevalence and Effects of Life Event Exposure among Undergraduate and Community College Students

    ERIC Educational Resources Information Center

    Anders, Samantha L.; Frazier, Patricia A.; Shallcross, Sandra L.

    2012-01-01

    The purposes of this study were to assess lifetime and recent exposure to various life events among undergraduate and community college students and to assess the relation between event exposure and a broad range of outcomes (i.e., mental and physical health, life satisfaction, grade point average). Undergraduate students from a midwestern…

  6. The Relationship of Life Events to Academic Performance in College Students.

    ERIC Educational Resources Information Center

    Knapp, Samuel

    Numerous studies have shown a correlation between life events and physical health, mental health, and behavioral measures such as impaired grade point average. Most of these studies have measured stressfulness by summing the total number of events experienced in a given time period. However, Vinokur and Selzer have shown that the amount of…

  7. Beyond Jeopardy and Lectures: Using "Microsoft PowerPoint" as a Game Design Tool to Teach Science

    ERIC Educational Resources Information Center

    Siko, Jason; Barbour, Michael; Toker, Sacip

    2011-01-01

    To date, research involving homemade PowerPoint games as an instructional tool has not shown statistically significant gains in student performance. This paper examines the results of a study comparing the performance of students in a high school chemistry course who created homemade PowerPoint games as a test review with the students who used a…

  8. Possible multihazard events (tsunamis, earthquakes, landslides) expected on the North Bulgarian Black sea coast

    NASA Astrophysics Data System (ADS)

    Ranguelov, B.; Gospodinopv, D.

    2009-04-01

    Earthquakes: The area is famous for its seismic regime. The region usually shows irregular occurrence of strong events: there are episodes of activation separated by long periods of seismic quiescence. The most important episode dates to the 1st century BC when, according to the chronicler Strabo, the ancient Greek colony "Bisone sank in the waters of the sea". The seismic source is known as the Shabla-Kaliakra zone, with the best-documented seismic event being that of 31 March 1901. This event had a magnitude of 7.2 (estimated by the macroseismic transformation formula) with a source depth of about 10-20 km. The epicenter was located offshore. The observed macroseismic intensity on land reached a maximum of X degrees MSK. This event produced a number of secondary effects - landslides, rockfalls, subsidence, extensive destruction of the houses in the area, and a tsunami (up to 3 meters height observed at Balchik port). This event is selected as the reference event. Tsunamis: Such earthquakes (magnitude greater than 7.0) almost always trigger tsunamis. They can be generated by the earthquake rupture process or, more frequently, by secondary triggered phenomena - landslides (submarine or surface) and/or other geodynamic phenomena such as rock falls and degradation of gas hydrates. The most famous water-level change is that described by Strabo, related to the great catastrophe. The area also shows other evidence of tsunamis, most recently a non-seismic tsunami on 7 May 2007 with observed water-level changes of up to about 3 meters. Landslides: The north Bulgarian Black Sea coast is covered by many active landslides. They differ in size, depth and activation time. Most of them are located near the coastline, posing a serious danger to the beaches, tourist infrastructure, population and historical heritage.
The most famous landslide (subsidence) is related to the 1st century BC seismic event, when a huge mass slid into the sea, burying Bisone and creating the Chirakman peak. The event of 1901 also created landslides, subsidence of a huge land block with dimensions of about 1x1 km, and rock falls with large boulders. Landslides could also be submarine, creating in this way turbidites and/or mud flows from bottom deposits such as sapropel breccia and mud volcano deposits. The time-dependent scenario: The initial data on the time development of the hazard phenomena are based on their main physical properties - size, location, velocity of the process, intensity (magnitude), etc. A table of the main parameters, possible consequences and generally threatened objects was compiled. The time development of the disasters in the case of the reference event (magnitude 7.2) is presented in a time-chart diagram. Conclusions: A time-dependent scenario for a reference M7.2 seismic event is developed, and the consecutive and simultaneous action of all expected hazards and their multi-risk effects are investigated. The results obtained show the complex possible consequences and the interrelated dependencies. Acknowledgments: This study is supported by the SCHEMA and TRANSFER EU Projects.

  9. 78 FR 35567 - Safety Zone; Lower Mississippi River, Mile Marker 219 to Mile Marker 229, in the Vicinity of Port...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-13

    ... Mississippi River in this area have risen very rapidly, creating a faster than normal current that is a hazard... Rouge gauge has reached 33 feet and continues to rise. This elevated water level is creating a faster... event the Baton Rouge Gage is reading below 33 feet and falling. The Captain of the Port, New Orleans or...

  10. The Thanksgiving Primer: A Complete Guide to Re-creating the First Harvest Festival for Your Family, Friends or Church. Revised and Enlarged.

    ERIC Educational Resources Information Center

    Travers, Carolyn Freeman, Ed.

    This booklet contains a collection of answers to the most frequently-asked questions on subjects related to the first Thanksgiving, with answers prepared by Plimoth Plantation Museum staff. The book is intended to serve as a guide for re-creating the original 17th century event. The contents include: (1) "The American Thanksgiving: The…

  11. Creating Joint Attentional Frames and Pointing to Evidence in the Reading and Writing Process

    ERIC Educational Resources Information Center

    Unger, John A.; Liu, Rong; Scullion, Vicki A.

    2015-01-01

    This theory-into-practice paper integrates Tomasello's concept of Joint Attentional Frames and well-known ideas related to the work of Russian psychologist, Lev Vygotsky, with more recent ideas from social semiotics. Classroom procedures for incorporating student-created Joint Attentional Frames into literacy lessons are explained by links to…

  12. Enhancing Science Instruction through Student-Created PowerPoint Presentations

    ERIC Educational Resources Information Center

    Gerido, Leona; Curran, Mary Carla

    2014-01-01

    Technology use in science classes can enhance lessons and reinforce scientific content. The creation of multimedia projects is a great way to engage students in lessons about estuarine ecosystems. In this activity, students can learn about estuarine organisms and use their creativity to write a story, create artwork, and develop a multimedia…

  13. What the Federal Government Owes Student Borrowers

    ERIC Educational Resources Information Center

    Combe, Paul

    2009-01-01

    This nation's federal student-loan system has reached a tipping point that, with the new leadership in Washington, offers a rare opportunity to create real change. To create a more consumer-focused student-loan program with both public and private capital, the Education Department, lenders, colleges in both programs, guarantors, and others should…

  14. Automating gene library synthesis by structure-based combinatorial protein engineering: examples from plant sesquiterpene synthases.

    PubMed

    Dokarry, Melissa; Laurendon, Caroline; O'Maille, Paul E

    2012-01-01

    Structure-based combinatorial protein engineering (SCOPE) is a homology-independent recombination method to create multiple crossover gene libraries by assembling defined combinations of structural elements ranging from single mutations to domains of protein structure. SCOPE was originally inspired by DNA shuffling, which mimics recombination during meiosis, where mutations from parental genes are "shuffled" to create novel combinations in the resulting progeny. DNA shuffling utilizes sequence identity between parental genes to mediate template-switching events (the annealing and extension of one parental gene fragment on another) in PCR reassembly reactions to generate crossovers and hence recombination between parental genes. In light of the conservation of protein structure and degeneracy of sequence, SCOPE was developed to enable the "shuffling" of distantly related genes with no requirement for sequence identity. The central principle involves the use of oligonucleotides to encode for crossover regions to choreograph template-switching events during PCR assembly of gene fragments to create chimeric genes. This approach was initially developed to create libraries of hybrid DNA polymerases from distantly related parents, and later developed to create a combinatorial mutant library of sesquiterpene synthases to explore the catalytic landscapes underlying the functional divergence of related enzymes. This chapter presents a simplified protocol of SCOPE that can be integrated with different mutagenesis techniques and is suitable for automation by liquid-handling robots. Two examples are presented to illustrate the application of SCOPE to create gene libraries using plant sesquiterpene synthases as the model system. In the first example, we outline how to create an active-site library as a series of complex mixtures of diverse mutants. 
In the second example, we outline how to create a focused library as an array of individual clones to distil minimal combinations of functionally important mutations. Through these examples, the principles of the technique are illustrated and the suitability of automating various aspects of the procedure for given applications is discussed. Copyright © 2012 Elsevier Inc. All rights reserved.

  15. Impact of dual antiplatelet therapy after coronary artery bypass surgery on 1-year outcomes in the Arterial Revascularization Trial.

    PubMed

    Benedetto, Umberto; Altman, Douglas G; Gerry, Stephen; Gray, Alastair; Lees, Belinda; Flather, Marcus; Taggart, David P

    2017-09-01

    There is still little evidence to support routine dual antiplatelet therapy (DAPT) with P2Y12 antagonists following coronary artery bypass grafting (CABG). The Arterial Revascularization Trial (ART) was designed to compare 10-year survival after bilateral versus single internal thoracic artery grafting. We aimed to gain insight into the effect of DAPT (with clopidogrel) following CABG on 1-year outcomes by performing a post hoc ART analysis. Among patients enrolled in the ART (n = 3102), 609 (21%) and 2308 (79%) were discharged on DAPT or aspirin alone, respectively. The primary end-point was the incidence of major adverse cerebrovascular and cardiac events (MACCE) at 1 year including cardiac death, myocardial infarction, cerebrovascular accident and reintervention; safety end-point was bleeding requiring hospitalization. Propensity score (PS) matching was used to create comparable groups. Among 609 PS-matched pairs, MACCE occurred in 34 (5.6%) and 34 (5.6%) in the DAPT and aspirin alone groups, respectively, with no significant difference between the 2 groups [hazard ratio (HR) 0.97, 95% confidence interval (CI) 0.59-1.59; P = 0.90]. Only 188 (31%) subjects completed 1 year of DAPT, and in this subgroup, MACCE rate was 5.8% (HR 1.11, 95% CI 0.53-2.30; P = 0.78). In the overall sample, bleeding rate was higher in the DAPT group (2.3% vs 1.1%; P = 0.02), although this difference was no longer significant after matching (2.3% vs 1.8%; P = 0.54). Based on these findings, when compared with aspirin alone, DAPT with clopidogrel prescribed at discharge was not associated with a significant reduction of adverse cardiac and cerebrovascular events at 1 year following CABG. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
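The propensity-score matching step used to create comparable groups can be sketched as greedy 1:1 nearest-neighbour matching on estimated scores. The scores here are assumed to be pre-estimated (e.g. from a logistic regression on baseline covariates); the numbers and the caliper of 0.05 are illustrative, not from the ART analysis:

```python
# Minimal sketch of greedy 1:1 nearest-neighbour propensity-score matching
# with a caliper. Scores and caliper are ASSUMED illustrative values.

def match_pairs(treated, control, caliper=0.05):
    """Greedily pair each treated score with the nearest unused control score."""
    available = list(range(len(control)))
    pairs = []
    for t_idx, t_score in enumerate(treated):
        if not available:
            break
        best = min(available, key=lambda c: abs(control[c] - t_score))
        if abs(control[best] - t_score) <= caliper:
            pairs.append((t_idx, best))   # (treated index, control index)
            available.remove(best)
    return pairs

# Two treated patients matched against three controls:
print(match_pairs([0.30, 0.55], [0.28, 0.90, 0.52]))  # [(0, 0), (1, 2)]
```

Outcomes are then compared within the matched pairs only, which is why the bleeding-rate difference can lose significance after matching even though it was significant in the overall sample.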

  16. Animation of Mapped Photo Collections for Storytelling

    NASA Astrophysics Data System (ADS)

    Fujita, Hideyuki; Arikawa, Masatoshi

    Our research goal is to facilitate the sharing of stories with digital photographs. Some map websites now collect stories associated with people's relationships to places. Users map collections of places and include their intangible emotional associations with each location along with photographs, videos, etc. Though this framework of mapping stories is important, it is not sufficiently expressive to communicate stories in a narrative fashion. For example, when the number of mapped places is particularly large, it is neither easy for viewers to interpret the map nor easy for the creator to express a story as a series of events in the real world. This is because each narrative, in the form of a sequence of texts, a sequence of photographs, a movie, or audio, is mapped to just one point. As a result, it is up to the viewer to decide which points on the map to read, and in what order. The conventional framework is well suited to mapping fragments or snapshots of a story, but not to conveying the whole story as a narrative using the entire map as the setting. We therefore propose a new framework, Spatial Slideshow, for mapping personal photo collections and representing them as stories such as route guides, sightseeing guides, historical topics, fieldwork records, and personal diaries. It is a fusion of personal photo mapping and photo storytelling. Each story is conveyed through a sequence of mapped photographs, presented as a synchronized animation of a map and an enhanced photo slideshow. The main technical novelty of this paper is a method for creating three-dimensional animations of photographs that induce the visual effect of motion from photo to photo. We believe the proposed framework can significantly facilitate the grassroots development of spatial content driven by visual communication about real-world locations and events.

  17. A Novel Web Application to Analyze and Visualize Extreme Heat Events

    NASA Astrophysics Data System (ADS)

    Li, G.; Jones, H.; Trtanj, J.

    2016-12-01

    Extreme heat is the leading cause of weather-related deaths in the United States annually and is expected to increase with our warming climate. However, most of these deaths are preventable with proper tools and services to inform the public about heat waves. In this project, we have investigated the key indicators of a heat wave, the vulnerable populations, and the data visualization strategies through which those populations most effectively absorb heat wave data. A map-based web app incorporating these strategies has been created that allows users to search and visualize historical heat waves in the United States. This app uses daily maximum temperature data from the NOAA Global Historical Climatology Network, which contains about 2.7 million data points from over 7,000 stations per year. The point data are spatially aggregated into county-level data using county geometry from the US Census Bureau and stored in a Postgres database with the PostGIS spatial extension. GeoServer, a powerful map server, is used to serve the image and data layers (WMS and WFS). The JavaScript-based web-mapping platform Leaflet is used to display the temperature layers. A number of search and display functions have been implemented. Users can search for extreme heat events by county or by date. The "by date" option allows a user to select a date and a Tmax threshold, which highlights all areas on the map that meet those parameters. The "by county" option allows the user to select a county on the map, which retrieves a list of heat wave dates and daily Tmax measurements. This visualization is clean, user-friendly, and novel: while these time, space, and temperature measurements can be obtained by querying meteorological datasets, no existing tool packages the information together in an easily accessible, non-technical manner, especially at a time when climate change urges a better understanding of heat waves.
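    The core of the "by date" search described above is a station-to-county aggregation followed by a threshold filter. A minimal in-memory sketch of that step (the real app does this in PostGIS and serves results through GeoServer; the station IDs, FIPS codes, and readings below are made up for illustration):

    ```python
    # Sketch of the "by date" query: aggregate station Tmax readings to the
    # county level, then keep counties at or above a user-chosen threshold.
    # All data values here are hypothetical.

    from collections import defaultdict

    # (station_id, county_fips, date, tmax_fahrenheit)
    readings = [
        ("USW001", "04013", "2016-06-19", 118.0),
        ("USW002", "04013", "2016-06-19", 115.0),
        ("USW003", "04019", "2016-06-19", 111.0),
    ]

    def counties_over(readings, date, threshold_f):
        """Return {county_fips: mean station Tmax} for counties whose
        aggregated Tmax on the given date meets the threshold."""
        by_county = defaultdict(list)
        for _sid, fips, d, tmax in readings:
            if d == date:
                by_county[fips].append(tmax)
        # Aggregate each county's stations (here: mean of station maxima).
        return {fips: sum(v) / len(v)
                for fips, v in by_county.items()
                if sum(v) / len(v) >= threshold_f}

    print(counties_over(readings, "2016-06-19", 113.0))  # {'04013': 116.5}
    ```

    Whether a county's value should be the mean, the max, or an area-weighted statistic of its stations is a design choice the record does not specify; the mean is used here only to make the aggregation step concrete.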

  18. Solar Hard X-ray Observations with NuSTAR

    NASA Astrophysics Data System (ADS)

    Smith, David M.; Krucker, S.; Hudson, H. S.; Hurford, G. J.; White, S. M.; Mewaldt, R. A.; Stern, D.; Grefenstette, B. W.; Harrison, F. A.

    2011-05-01

    High-sensitivity imaging of coronal hard X-rays allows detection of freshly accelerated nonthermal electrons at the acceleration site. A few such observations have been made with Yohkoh and RHESSI, but a leap in sensitivity could help pin down the time, place, and manner of reconnection. In 2012, the Nuclear Spectroscopic Telescope Array (NuSTAR), a NASA Small Explorer for high energy astrophysics that uses grazing-incidence optics to focus X-rays up to 80 keV, will be launched. NuSTAR is capable of solar pointing, and three weeks will be dedicated to solar observing during the baseline two-year mission. NuSTAR will be 200 times more sensitive than RHESSI in the hard X-ray band. This will allow the following new observations, among others: 1) Extrapolation of the micro/nanoflare distribution by two orders of magnitude down in flux 2) Search for hard X-rays from network nanoflares (soft X-ray bright points) and evaluation of their role in coronal heating 3) Discovery of hard X-ray bremsstrahlung from the electron beams driving type III radio bursts, and measurement of their electron spectrum 4) Hard X-ray studies of polar soft X-ray jets and impulsive solar energetic particle events at the edge of coronal holes, and comparison of these events with observations of 3He and other particles in interplanetary space 5) Study of coronal bremsstrahlung from particles accelerated by coronal mass ejections as they are first launched 6) Study of particles at the coronal reconnection site when flare footpoints are occulted; and 7) Search for hypothetical axion particles created in the solar core via the hard X-ray signal from their conversion to X-rays in the coronal magnetic field. NuSTAR will also serve as a pathfinder for a future dedicated space mission with enhanced capabilities, such as a satellite version of the FOXSI sounding rocket.

  19. Geolocalization of Influenza Outbreak Within an Acute Care Population: A Layered-Surveillance Approach.

    PubMed

    Kannan, Vijay Christopher; Hodgson, Nicole; Lau, Andrew; Goodin, Kate; Dugas, Andrea Freyer; LoVecchio, Frank

    2016-11-01

    We seek to use a novel layered-surveillance approach to localize influenza clusters within an acute care population. The first layer of this system is a syndromic surveillance screen to guide rapid polymerase chain reaction testing. The second layer is geolocalization and cluster analysis of these patients. We posit that any identified clusters could represent at-risk populations who could serve as high-yield targets for preventive medical interventions. This was a prospective observational surveillance study. Patients were screened with a previously derived clinical decision guideline that has 90% sensitivity and 30% specificity for influenza. Patients received points for the following signs and symptoms within the past 7 days: cough (2 points), headache (1 point), subjective fever (1 point), and documented fever at triage (temperature >38°C [100.4°F]) (1 point). Patients scoring 3 points or higher were indicated for influenza testing. Patients were tested with Xpert Flu (Cepheid, Sunnyvale, CA), a rapid polymerase chain reaction test. Positive results were mapped with ArcGIS (ESRI, Redlands, CA) and analyzed with kernel density estimation to create heat maps. In total, 1,360 patients with retrievable addresses within the greater Phoenix metro area were tested with Xpert Flu. One hundred sixty-seven (12%) of them tested positive for influenza A and 23 (2%) tested positive for influenza B. The influenza A virus exhibited a clear cluster pattern within this patient population. The densest cluster was located in an approximately 1-square-mile region southeast of our hospital. Our layered-surveillance approach was effective in localizing a cluster of the influenza A outbreak. This region may house a high-yield target population for public health intervention. Further collaborative efforts will be made between our hospital and the Maricopa County Department of Public Health to perform a series of community vaccination events before the next influenza season. We hope these efforts will ultimately serve to reduce the burden of this disease on our patient population, and that this system will serve as a framework for future investigations locating at-risk populations. Copyright © 2016 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.
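    The screening layer above is a simple additive point score, which can be sketched directly from the weights given in the record (cough 2, headache 1, subjective fever 1, documented triage fever >38°C 1; a total of 3 or more triggers testing). The function and parameter names are illustrative, not from the study:

    ```python
    # Sketch of the study's influenza screening score. Weights and the
    # >=3 testing threshold come from the record; names are hypothetical.

    def flu_screen_score(cough, headache, subjective_fever, triage_temp_c):
        """Symptoms refer to the past 7 days; triage_temp_c may be None
        if no temperature was documented at triage."""
        score = 0
        if cough:
            score += 2
        if headache:
            score += 1
        if subjective_fever:
            score += 1
        if triage_temp_c is not None and triage_temp_c > 38.0:
            score += 1  # documented fever at triage (>38°C / 100.4°F)
        return score

    def indicated_for_testing(**symptoms):
        return flu_screen_score(**symptoms) >= 3

    # Cough (2) + headache (1) = 3 points -> testing indicated.
    print(indicated_for_testing(cough=True, headache=True,
                                subjective_fever=False,
                                triage_temp_c=37.2))  # True
    ```

    A low threshold like this deliberately trades specificity (30%) for sensitivity (90%), which suits a surveillance screen whose misses are costlier than its false positives.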

  20. Translating Deming's 14 Points for Education.

    ERIC Educational Resources Information Center

    Melvin, Charles A., III

    1991-01-01

    A consortium of four Wisconsin school districts has decided to stop tinkering with the educational system and apply W. Edwards Deming's 14 management points to total system improvement. The rewritten precepts involve creating and adopting a fitting purpose, infusing quality into the educational "product," and working toward zero defects…
